WO2025203020A1 - System and method to generate image visualization - Google Patents
System and method to generate image visualization
- Publication number
- WO2025203020A1 (PCT/IL2025/050263)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- detector
- image data
- parameter
- subject
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/547—Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/12—Arrangements for detecting or locating foreign bodies
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/42—Arrangements for detecting radiation specially adapted for radiation diagnosis
- A61B6/4208—Arrangements for detecting radiation specially adapted for radiation diagnosis characterised by using a particular type of detector
- A61B6/4241—Arrangements for detecting radiation specially adapted for radiation diagnosis characterised by using a particular type of detector using energy resolving detectors, e.g. photon counting
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4405—Constructional features of apparatus for radiation diagnosis the apparatus being movable or portable, e.g. handheld or mounted on a trolley
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/48—Diagnostic techniques
- A61B6/482—Diagnostic techniques involving multiple energy imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/027—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis characterised by the use of a particular data acquisition trajectory, e.g. helical or spiral
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/40—Arrangements for generating radiation specially adapted for radiation diagnosis
- A61B6/405—Source units specially adapted to modify characteristics of the beam during the data acquisition process
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4429—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
- A61B6/4435—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
- A61B6/4447—Tiltable gantries
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
- A61B6/5241—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT combining overlapping images of the same imaging modality, e.g. by stitching
Definitions
- the weighting or thresholding process may include various techniques, such as those discussed herein. For example, a similar area in two or more images or image elements may be determined. The area or region may, for example, be one that is determined to be neither too bright nor too dark. Too bright or too dark may be relative, such as a deviation from a mean of the one or more images beyond a selected amount (e.g., 10% of the mean). One of the images may then be scaled to be similar to the other, such as by scaling the brightness of the selected image or the image elements therein. In the composite or fused image, a weight may be given to pixels in each of the images to determine whether each pixel is included and/or how much weight it is given in the new composite image pixel.
- a second path may include that the image element is determined not to be beyond the threshold value.
- a no path 410 may be followed.
- the determination that an image element is not beyond a threshold value may be that the image element is within a threshold value range.
- an image element that is beyond a threshold may be replaced with an image element that is not beyond a threshold.
- the first and second projections may be acquired with different parameters of the detector 178.
- the two projections 300, 304 may have differing contrasts or exposures. Therefore, an image element in the first projection 300 that is within a threshold may be used to replace an image element in the second projection 304 that is outside of, or beyond, the threshold. A direct replacement of an image element between the two projections may thus occur, such as due to the mapping of the image elements in block 382.
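As an illustration only, the replacement step described above might look like the following minimal NumPy sketch; the threshold value range, the array names, and the assumption of a pixel-to-pixel mapping (block 382) are hypothetical, not taken from the disclosure:

```python
import numpy as np

# Assumed 8-bit threshold value range; elements outside it are "beyond threshold".
LOW, HIGH = 20, 235

def replace_beyond_threshold(proj_a: np.ndarray, proj_b: np.ndarray) -> np.ndarray:
    """Keep proj_a elements inside the threshold range (the "no" path 410);
    replace elements beyond the threshold with the co-located elements of
    proj_b, relying on the pixel-to-pixel mapping of block 382."""
    beyond = (proj_a < LOW) | (proj_a > HIGH)
    return np.where(beyond, proj_b, proj_a)
```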
- processing of the image elements may also or alternatively include a scaling of an image element. For example, if an image element is determined to be beyond a threshold it may be scaled by a selected amount, such as by increasing or decreasing its whiteness level. The scaling may be based upon various parameters or schemes, such as the element's deviation from the threshold value, a comparison of the image element to the second or another image projection, or the like. Generally, scaling may allow two or more images or projections to be made similar enough to allow generation of the composite by compensating for variations in intensity of selected regions (e.g., "good" areas) of each of the images. This may be done by selecting a region or area that is considered "good" in the two or more images, such as when two or more of the images do not have the same mean brightness level.
- a ratio of the mean brightness level may be determined and at least one of the images may be scaled accordingly so the two regions are similar. This allows the two or more images to be fused.
- the fusing to generate the composite may be by means of weighting, as previously described.
- the scaling of the image element may allow for a changing of the value of the image element without directly replacing it with another image element.
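A compact sketch of the mean-ratio scaling and per-pixel weighting described above, assuming NumPy arrays; the choice of "good" region mask and the weight map are assumptions for illustration, not prescribed by the disclosure:

```python
import numpy as np

def scale_to_match(img: np.ndarray, ref: np.ndarray, good: np.ndarray) -> np.ndarray:
    """Scale img so its mean brightness in a 'good' region (boolean mask)
    matches ref's there, using the ratio of the mean brightness levels."""
    ratio = ref[good].mean() / max(img[good].mean(), 1e-6)
    return img * ratio

def fuse(img_a: np.ndarray, img_b: np.ndarray, w_a: np.ndarray) -> np.ndarray:
    """Per-pixel weighted composite; w_a in [0, 1] may come from the
    threshold comparison (e.g., lower weight for out-of-range elements)."""
    return w_a * img_a + (1.0 - w_a) * img_b
```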
- the processed image elements may also be combined in block 418. Therefore, processed image elements from block 422 and the kept image elements from block 414 may be combined in block 418.
- the combined image elements may be mapped to a projection area due to the mapping in block 382.
- a generation of a composite image visualization may be made with the combined image elements and may be performed in block 426.
- the generation of the composite image in block 426 may include generating a composite image or projection that includes the kept image elements from block 414 at their respective coordinates and the processed image elements from block 422 at their respective coordinates. Therefore, the generation of the composite image may include generating an image with image elements at the coordinates determined in the map image elements from block 382, such as by the techniques of scaling and weighting as discussed above.
- the composite image may then be output in block 430.
- Outputting the composite image in block 430 may include various processes.
- Outputting the composite image in block 430 may include saving the composite image in a memory system, such as a memory system of the imaging assembly 96.
- the output of the composite image in block 430 may also include transfer or generation of the visual display of the composite image. Therefore, a display of a selected image may be made in block 434.
- the selected image may be the composite image output in block 430.
- the processing subprocess 390 may generate the composite image that is output for various purposes, such as display in block 434. It is understood that other image projections may also be displayed, such as the acquired projections from subprocess 370.
- the process 360 may then end in block 438.
- the process ending in block 438 may include other processes such as performing a procedure on the subject 30, analyzing or segmenting the composite image, or other appropriate procedures.
- the ending of the process 360 in block 438 may not require a ceasing of all actions, but rather the generation of the composite image and an output thereof, as discussed above.
- the process 360 may be operated to perform various procedures.
- the imaging system may be operated either alone or in combination with other systems to generate the composite image, to allow for an image that includes image data or an image visualization that may not be possible with only a single projection.
- the composite projection or image visualization may be used for various purposes with greater efficiency than any one of the originally acquired projections, as discussed above.
- Example 1. A system to generate an image visualization of a subject, the system comprising: a detector configured to acquire image data at a first detector parameter and a second detector parameter; a positioning system configured to position the detector at a first pose relative to the subject; a source configured to emit at the detector; and a processor module configured to execute instructions to evaluate a first image data acquired at the first detector parameter and a second image data acquired at the second detector parameter and output a composite image visualization based thereon.
- Example 4. The system of Example 1, wherein the detector includes a dual energy detector.
- Example 5. The system of Example 1, wherein the detector includes at least a first sensitivity parameter as the first detector parameter and a second sensitivity parameter as the second detector parameter.
- Example 7. The system of Example 1, wherein the source includes an x-ray source.
- Example 8. The system of Example 1, wherein the source and the detector are configured to move relative to the subject.
- Example 9. The system of Example 1, wherein the source is configured to emit a beam of x-rays having a spectrum with a selected peak x-ray voltage.
- Example 10. The system of Example 1, wherein the processor module is configured to execute further instructions to determine a value of an image element in both the first image data and the second image data relative to a threshold.
- Example 11. The system of Example 10, wherein the composite image visualization includes image elements selected or scaled based on the comparison of the image elements of the first image data and the second image data to the threshold.
- Example 12. A method to generate an image visualization of a subject, the method comprising: acquiring a first image data at a first detector parameter; acquiring a second image data at a second detector parameter; emitting an energy beam from a source; evaluating the first image data acquired at the first detector parameter and the second image data acquired at the second detector parameter; and outputting a composite image visualization based on the evaluation.
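For orientation only, the steps of Example 12 could be arranged as one routine, as in the sketch below. The acquire_projection callable, the parameter names, and the threshold logic are hypothetical stand-ins; the disclosure does not prescribe this implementation:

```python
import numpy as np

def generate_image_visualization(acquire_projection) -> np.ndarray:
    """Acquire two projections at two detector parameters from one pose,
    evaluate both against a threshold, and output a composite."""
    first = acquire_projection(parameter="first")    # hypothetical callable
    second = acquire_projection(parameter="second")
    beyond = (first < 0.05) | (first > 0.95)         # assumed normalized range
    return np.where(beyond, second, first)           # composite visualization
```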
- Instructions may be executed by a processor and may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects.
- the term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules.
- the term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above.
- the term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules.
- the term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
- the computer programs may include: (i) assembly code; (ii) object code generated from source code by a compiler; (iii) source code for execution by an interpreter; (iv) source code for compilation and execution by a just-in-time compiler; (v) descriptive text for parsing, such as HTML (hypertext markup language) or XML (extensible markup language), etc.
- source code may be written in C, C++, C#, Objective-C, Haskell, Go, SQL, Lisp, Java®, JavaScript®, HTML5, Ada, ASP (active server pages), Perl, Scala, Erlang, Ruby, Flash®, Visual Basic®, Lua, or Python®.
- wireless communications described in the present disclosure can be conducted in full or partial compliance with IEEE standard 802.11-2012, IEEE standard 802.16-2009, and/or IEEE standard 802.20-2008.
- IEEE 802.11-2012 may be supplemented by draft IEEE standard 802.11ac, draft IEEE standard 802.11ad, and/or draft IEEE standard 802.11ah.
- the terms 'processor,' 'processor module,' 'module,' and 'controller' may be used interchangeably herein (unless specifically noted otherwise) and each may be replaced with the term 'circuit.' Any of these terms may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Radiology & Medical Imaging (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- High Energy & Nuclear Physics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
Abstract
A method and system are disclosed for acquiring image data of a subject. The image data can be collected with an imaging system in a selected manner and/or motion. More than one projection may be combined to generate a selected view or visualization of the subject.
Description
SYSTEM AND METHOD TO GENERATE IMAGE VISUALIZATION
FIELD
[0001] The present disclosure relates to imaging a subject, and particularly to a system and method to acquire image data for generating a selected image visualization.
BACKGROUND
[0002] This section provides background information related to the present disclosure which is not necessarily prior art.
[0003] A subject, such as a human patient, may undergo a procedure. The procedure may include a surgical procedure to correct or augment an anatomy of the subject. The augmentation of the anatomy can include various procedures, such as movement or augmentation of bone, insertion of an implant (i.e. an implantable device), or other appropriate procedures.
[0004] A surgeon can perform the procedure on the subject with images of the subject that are based on projections of the subject. The images may be generated with imaging systems such as a magnetic resonance imaging (MRI) system, computed tomography (CT) system, fluoroscopy (e.g. C-Arm imaging systems), or other appropriate imaging systems.
SUMMARY
[0005] This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
[0006] According to various embodiments, a system to acquire image data of a subject with an imaging system may use x-rays. The subject may be a living patient (e.g. a human patient). The subject may also be a non-living subject, such as an enclosure, a casing, etc. The imaging system may include a moveable source and/or detector that is moveable relative to the subject.
[0007] An imaging system may include the movable source and/or detector to create a plurality of projections of a subject. The plurality of projections may be acquired in a linear path of movement of the source and/or detector. The plurality of projections may then be combined, such as by stitching together, to generate or form a long view (also referred to as a long film). The long view may be a two-dimensional view of the subject. The various projections may also or alternatively be combined with selected techniques to include a range of image data that is not present in a single projection. Thus, the plurality of projections may include more than one projection acquired at the same pose (e.g., location and orientation) relative to the subject.
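As a rough illustration of stitching projections from a linear acquisition path into a long view, consider the hypothetical NumPy sketch below; the fixed row overlap and the linear cross-fade blend are assumptions for illustration, not details from the disclosure:

```python
import numpy as np

def stitch_long_view(projections: list[np.ndarray], overlap: int) -> np.ndarray:
    """Stitch same-width 2D projections, acquired along a linear path,
    into one long view by cross-fading the assumed overlapping rows."""
    ramp = np.linspace(0.0, 1.0, overlap)[:, None]  # blend weights, 0 -> 1
    long_view = projections[0].astype(float)
    for nxt in projections[1:]:
        nxt = nxt.astype(float)
        blend = (1.0 - ramp) * long_view[-overlap:] + ramp * nxt[:overlap]
        long_view = np.vstack([long_view[:-overlap], blend, nxt[overlap:]])
    return long_view
```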
[0008] In various embodiments, the imaging system may acquire a plurality of projections at the same perspective relative to the subject. Each projection may be acquired at a varying or different parameter of a detector. For example, a single detector may have parameters that detect different spectra. A different spectrum may be achieved between projections such as by changing a voltage to cause an emission from a source. Moreover, multiple energies may be detected at the detector to acquire varying projections at the same perspective.
[0009] The imaging system may include a detector that is configurable to detect at different parameters. The parameters may be based on the source emissions and the operation of the source. The detector, however, may be operated to acquire varying image data, such as more than one projection at a selected pose relative to the subject. The multiple projections may be combined to generate an image visualization that includes image data from more than one projection.
[0010] Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
DRAWINGS
[0011] The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
[0012] Fig. 1 is an environmental view of an imaging system in an operating theatre;
[0013] Fig. 2 is a detailed schematic view of an imaging system with a dual energy source system;
[0014] Fig. 3 is an illustration of multiple image projections and a composite image visualization, according to various embodiments; and
[0015] Fig. 4 is a flowchart of a process for acquiring and processing image visualizations, according to various embodiments.
[0016] Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
DETAILED DESCRIPTION
[0017] Example embodiments will now be described more fully with reference to the accompanying drawings.
[0018] The subject disclosure is directed to an exemplary embodiment of a surgical procedure on a subject, such as a human patient. It is understood, however, that the system and methods described herein are merely exemplary and not intended to limit the scope of the claims included herein. In various embodiments, it is understood that the systems and methods may be incorporated into and/or used on non-animate objects. The systems may be used to, for example, image and register coordinate systems between two systems for use on manufacturing systems, maintenance systems, and the like. For example, automotive assembly may use one or more robotic systems including individual coordinate systems that may be registered together for coordinated or concerted actions. Accordingly, the exemplary illustration of a surgical procedure herein is not intended to limit the scope of the appended claims.
[0019] Various members or portions thereof may also be tracked relative to the subject. For example, a tracking system may be incorporated into a navigation system to allow tracking and navigation of one or more instruments (which may be the members) that may be tracked relative to the subject. The subject may also be tracked. The navigation system may include one or more tracking systems that track various portions, such as tracking devices, associated with instruments. The tracking system may include a localizer that is configured to determine the pose of the tracking device in a navigation system coordinate system. The pose may include any number of degrees of freedom, such as a three-dimensional location (e.g., x, y, z) and an orientation (e.g., yaw, pitch, and roll). Techniques, systems, or processes to determine the navigation system coordinate system may include those described in various references including U.S. Pat. No. 8,737,708; U.S. Pat. No. 9,737,235; U.S. Pat. No. 8,503,745; U.S. Pat. No. 8,175,681; and U.S. Pat. No. 11,135,025; all incorporated herein by reference. In particular, a localizer may be able to track an object within a volume relative to the subject. The navigation volume, in which a device may be tracked, may include or be referred to as the navigation coordinate system or navigation space. A determination or correlation between two coordinate systems may allow for or also be referred to as a registration between two coordinate systems.
[0020] Image data may be acquired for use and/or to generate images, which may also be referred to as image visualizations, of selected portions of a subject. The images may be displayed for viewing by a user, such as a surgeon. In various embodiments, superimposed on at least a portion of the image may be a graphical representation of a tracked portion or member, such as an instrument. The graphical representation may be generated (e.g., by a processor module executing instructions) entirely as a graphic that represents the instrument. According to various embodiments, the graphical representation may be superimposed on the image at an appropriate pose due to registration of an image space (also referred to as an image coordinate system) to a subject space. Methods to register a subject space defined by a subject to an image space may include those disclosed in U.S. Pat. No. 8,737,708; U.S. Pat. No. 9,737,235; U.S. Pat. No. 8,503,745; and U.S. Pat. No. 8,175,681; all incorporated herein by reference.
[0021] During a selected procedure, the first coordinate system may be registered to the subject space or subject coordinate system, such as by imaging of the subject. In various embodiments, the first coordinate system may be registered to the subject by imaging the subject with a fiducial portion that is fixed relative to the first member or system, such as a robotic system or other instrument. The known position of the fiducial relative to any portion, such as the robotic system or the subject, may be used to register the subject space relative to any coordinate system in which the fiducial may be determined (e.g., by imaging or detecting (e.g., touching)). A registration of a second coordinate system may allow for tracking of additional elements not fixed to a first portion, such as a robot that has a known coordinate system.
[0022] The tracking of an instrument during a procedure, such as a surgical or operative procedure, allows for navigation of a procedure. The navigation may be used to determine a pose of one or more portions, such as an instrument. The pose may include any number of degrees of freedom, such as a three-dimensional location (e.g., x, y, z) and an orientation (e.g., yaw, pitch, and roll). When image data is used to define an image space it can be correlated or registered to a physical space defined by a subject, such as a patient. According to various embodiments, therefore, the patient defines a patient space in which an instrument can be tracked and navigated. The image space defined by the image data can be registered to the patient space defined by the patient. The registration can occur with the use of fiducials that can be identified in the image data and in the patient space.
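One common way to realize such a fiducial-based registration is a least-squares rigid fit between fiducial positions identified in both spaces. The sketch below (the standard Kabsch/SVD construction) is a generic illustration under that assumption, not the registration method of the incorporated references:

```python
import numpy as np

def register_rigid(pts_subject: np.ndarray, pts_image: np.ndarray):
    """Find rotation R and translation t minimizing ||R p + t - q|| over
    corresponding fiducials p (subject space, Nx3) and q (image space, Nx3)."""
    p_c = pts_subject.mean(axis=0)
    q_c = pts_image.mean(axis=0)
    H = (pts_subject - p_c).T @ (pts_image - q_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_c - R @ p_c
    return R, t                                     # x_image = R @ x_subject + t
```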
[0023] Fig. 1 is a diagrammatic view illustrating an overview of a procedure room or arena. In various embodiments, the procedure room may include a surgical suite in which may be placed a robotic system 20 and a navigation system 26 that can be used for various procedures. The robotic system 20 may include a Mazor X™ robotic guidance system, sold by Mazor Robotics Ltd. having a place of business in Israel and/or Medtronic, Inc. having a place of business in Minnesota, USA, and/or as disclosed in U.S. Pat. No. 11,135,025, incorporated herein by reference. The robotic system 20 may be used to assist in guiding a selected instrument, such as drills, screws, an ultrasound (US) probe 33, etc., relative to a subject 30.
[0024] The robotic system 20 may include a mount 34 that fixes a portion, such as a robotic base 38, relative to the subject 30. The robotic system 20 may include one or more arms 40 that are moveable or pivotable relative to the subject 30, such as including an end effector 44. The end effector 44 may be any appropriate portion, such as a tube, guide, or passage member. Affixed to and/or in place of the end effector may be the imaging system that may be the US probe 33. A robotic processor module 53 may be used (e.g., by executing instructions) to move the end effector and determine a pose of the end effector, such as relative to the base 34. The pose of the base 34 may be known in a coordinate system, such as the patient space of the patient 30 and/or the image coordinate system, due to a registration as discussed above and as disclosed, for example, in U.S. Pat. No. 11,135,025, incorporated herein by reference.
[0025] The navigation system 26 can be used to navigate the various portions due to the tracked pose of one or more tracking devices. The tracking devices may include a robot tracking device 54, a subject tracking device 58, an imaging system tracking device 62, a tool tracking device 66, and/or a US probe tracking device 81. Each of the tracking devices may be used to track one or more portions, including those illustrated as being attached to the respective tracking devices.
[0026] An imaging device or system 80 may be an additional or alternative imaging system that may be used to acquire pre-, intra-, or post-operative or real-time image data of a subject, such as the subject 30, and may be tracked with the image system tracking device 62. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject. In the example shown, the imaging device 80 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado, USA. The imaging device 80 may have a generally annular gantry housing 82 in which an image capturing portion is moveably placed. The imaging device 80 can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference, or any appropriate portions thereof. It is further appreciated that the imaging device 80 may include in addition or alternatively a fluoroscopic C-arm. Other exemplary imaging devices may include fluoroscopes such as bi-plane fluoroscopic systems, ceiling mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc. Other appropriate imaging devices can also include MRI, CT, ultrasound, etc. The various imaging systems may include or use one or more imaging modalities, such as x-ray, magnetic resonance, ultrasound, positron emission tomography (PET) scans, combinations thereof, etc.
[0027] The position of the imaging system 33, 80, and/or portions therein such as the image capturing portion, can be precisely known relative to any other portion of the imaging device 33, 80. Also, the respective tracking devices may be used to track one or more portions of the respective imaging systems. The precise positioning and/or tracking can allow the imaging system 33, 80 and/or the navigation system 26 to know its position relative to the patient 30 or other references. In addition, as discussed herein, the precise knowledge of the position of the image capturing portion can be used in conjunction with a tracking system to determine the position of the image capturing portion and the image data relative to the tracked subject, such as the patient 30. The pose (e.g., distance from a selected portion of the US probe 33 and/or the tracking device 81) may be determined or predetermined and saved for recall with a calibration process and/or jig, such as that disclosed in U.S. Pat. Nos. 7,831,082; 8,320,653; and 9,138,204, all incorporated herein by reference.
[0028] The imaging device 80 can be tracked with a tracking device 62. Also, the tracking device 81 can be associated directly with the US probe 33. The US probe 33 may, therefore, be directly tracked with a navigation system as discussed herein. In addition or alternatively, the US probe 33 may be positioned and tracked with the robotic system 20. Regardless, image data defining an image space acquired of the patient 30 can, according to various embodiments, be registered (e.g., manually, inherently, or automatically) relative to an object space. The object space can be the space defined by a patient 30 in the navigation system 26.
[0029] The patient 30 can also be tracked as the patient moves with a patient tracking device, DRF, or tracker 58. Alternatively, or in addition thereto, the patient 30 may be fixed within navigation space defined by the navigation system 26 to allow for registration. As discussed further herein, registration of the image space to the patient space or subject space allows for navigation of the instrument 68 with the image data. When navigating the instrument 68, a position of the instrument 68 can be illustrated relative to image data acquired of the patient 30 on a display device 84. An additional and/or alternative display device 84' may also be present to display an image. Various tracking systems, such as one including an optical localizer 88 or an electromagnetic (EM) localizer 92, can be used to track the instrument 68.
[0030] More than one tracking system can be used to track the instrument 68 in the navigation system 26. According to various embodiments, these tracking systems can include an electromagnetic (EM) tracking system having the EM localizer 94, an optical tracking system having the optical localizer 88, and/or other appropriate tracking systems not illustrated, such as an ultrasound tracking system, or other appropriate tracking systems. One or more of the tracking systems can be used to track selected tracking devices, as discussed herein, sequentially or simultaneously. It will be understood, unless discussed otherwise, that a tracking device can be a portion trackable with a selected tracking system. A tracking device need not refer to the entire member or structure to which the tracking device is affixed or associated.
[0031] Image data acquired from the imaging system 33, 80, or any appropriate imaging system, can be acquired at and/or forwarded from an image device controller 96, which may include a processor module 97, to a navigation computer and/or processor system 102 that can be a part of a controller or work station 98 having the display 84 and a user interface 106. The processor system 102 may be a processor module, as discussed herein, including integral memory or a communication system to access external memory for executing instructions and/or operated as a specific integrated circuit (e.g., ASIC). It will also be understood that the image data is not necessarily first retained in the controller 96, but may also be directly transmitted to the work station 98. The work station 98 can provide facilities for displaying the image data as an image 108 on the display 84, saving, digitally manipulating, or printing a hard copy image of the received image data. The user interface 106, which may be a keyboard, mouse, touch pen, touch screen or other suitable device, allows the user 72 to provide inputs to control the imaging device 80, via the image device controller 96, or adjust the display settings of the display 84. The work station 98 may also direct the image device controller 96 to adjust the image capturing portion of the imaging device 80 to obtain various two-dimensional images along different planes in order to generate one or more representative two-dimensional and three-dimensional image data that may be used to generate two-dimensional images, three-dimensional images, or a plurality of either or both.
[0032] With continuing reference to FIG. 1, the navigation system 26 can further include any one or more tracking systems, such as the tracking system including either or both of the electromagnetic (EM) localizer 94 and/or the optical localizer 88. The tracking systems may include a controller and interface portion 110. The controller 110 can be connected to the processor portion 102, which can include a processor included within a computer. The EM tracking system may include the STEALTHSTATION® AXIEM™ Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado; or can be the EM tracking system described in U.S. Pat. No. 7,751,865; U.S. Pat. No. 5,983,126; U.S. Pat. No. 5,913,820; or U.S. Pat. No. 5,592,939; all of which are herein incorporated by reference. It will be understood that the navigation system 26 may also be or include any appropriate tracking system, including a STEALTHSTATION® TREON® or S7™ tracking system having an optical localizer, which may be used as the optical localizer 88, and sold by Medtronic Navigation, Inc. of Louisville, Colorado. Other tracking systems include acoustic, radiation, radar, etc., systems. The navigation system 26 and/or tracking system may be a hybrid system that includes components from multiple tracking systems. The tracking systems can be used according to generally known or described techniques in the above incorporated references. Details will not be included herein except as needed to clarify selected operation of the subject disclosure.
[0033] Various portions of the navigation system 26, such as the instrument 68, and others as will be described in detail below, can be equipped with at least one, and generally multiple, of the tracking devices 66. The instrument can also include more than one type or modality of tracking device 66, such as an EM tracking device and/or an optical tracking device. According to various embodiments, the navigation system 26 can be used to track the instrument 68 relative to the patient 30. The instrument 68 can be tracked with the tracking system, as discussed above. Image data of the patient 30, or an appropriate subject, can be used to assist the user 72 in guiding the instrument 68. The image data, which may include one or more image data or images, may be registered to the patient 30. The image data defines an image space that is registered to the patient space defined by the patient 30. The registration can be performed as discussed herein, automatically, manually, or combinations thereof.
[0034] Generally, registration allows a translation map to be generated of the physical location of the instrument 68 relative to the image space of the image data. The translation map allows the tracked position of the instrument 68 to be displayed on the display device 84 relative to the image 108. A graphical representation 68i, also referred to as an icon, can be used to illustrate the location of the instrument 68 relative to the image data 108.
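Once registered, such a translation map can be applied as a homogeneous transform to place the tracked instrument's graphical representation. A minimal sketch follows; the transform values and tip coordinates are hypothetical placeholders, not values from the disclosure:

```python
import numpy as np

def translation_map(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Map a tracked tip from patient space into image space for display as icon 68i.
T_img_from_pat = translation_map(np.eye(3), np.array([5.0, -2.0, 30.0]))  # dummy values
tip_patient = np.array([10.0, 42.5, 118.0, 1.0])    # homogeneous point, mm
tip_image = T_img_from_pat @ tip_patient
```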
[0035] With continuing reference to Fig. 1, a subject registration system or method can use the tracking device 58. The tracking device 58 may include portions or members 120 that may be trackable, but may also act as or be operable as a fiducial assembly. The fiducial assembly 120 can include a clamp or other fixation portion 124 and the imageable fiducial body 120. It is understood, however, that the members 120 may be separate from the tracking device 58. The fixation portion 124 can be provided to fix any appropriate portion, such as a portion of the anatomy. As illustrated in Fig. 1, the fiducial assembly 120 can be interconnected with a vertebra 126 and/or a portion of the vertebra 126 which may form a spine. In various embodiments, the connection may be to a spinous process.
[0036] With additional reference to Fig. 2, the imaging system 80 may move, as a whole or in part, relative to the subject 30. For example, the source 174 and the detector 178 can move in a 360° motion around the patient 30. The movement of the source 174 and the detector 178 as a source/detector unit 198 within the gantry 82 may allow the source 174 to remain generally 180° opposed (such as with a fixed inner gantry or rotor or moving system) to the detector 178. Thus, the detector 178 may be referred to as moving around the subject 30 (e.g. in a circle or spiral), and it is understood that the source 174 remains opposed thereto, unless disclosed otherwise.
[0037] Also, the gantry 82 can move isometrically (also referred to as "wag") relative to the subject 30, generally in the direction of arrow 200 around an axis 202, such as through a cart 160, as illustrated in Fig. 1. The gantry 82 can also tilt relative to a long axis 206 of the patient 30, as illustrated by arrows 210. In tilting, a plane of the gantry 82 may tilt or form a non-orthogonal angle with the axis 206 of the subject 30.
[0038] The gantry 82 may also move longitudinally in the direction of arrows 214 along the line 206 relative to the subject 30 and/or the cart 160. Also, the cart 160 may move to move the gantry 82. Further, the gantry 82 can move up and down generally in the direction of arrows 218 relative to the cart 160 and/or the subject 30, generally transverse to the axis 206 and parallel with the axis 202.
[0039] The movement of the imaging system 80, in whole or in part, is to allow for positioning of the source/detector unit (SDU) 198 relative to the subject 30. The imaging device 80 can be precisely controlled to move the SDU 198 relative to the subject 30 to generate precise image data of the subject 30.
[0040] The source 174, as discussed herein, may include one or more sources of x-rays for imaging the subject 30. In various embodiments, the source 174 may include a single source that may be powered by more than one power source to generate and/or emit x-rays at different energy characteristics or in different amounts. In various embodiments, the same characteristics may be used with low vs. high amounts, which is different from having two different characteristics at the same total amount. For example, the source may emit x-rays at two or more different powers, such as by maintaining the same spectrum, e.g., a 120 kilovolt peak (kVp), but changing the milliamp-seconds (mAs) used. In other words, the emitted x-rays have an energy spectrum, and the kVp is the peak voltage that defines that spectrum. For example, the source may emit at 120 kVp with 100 mAs for one image and 200 mAs for a second image, and with 300 mAs for a third image in case more than two images are being used. Changing the mAs changes the tube current-time product, so the characteristics of the x-rays are the same but the total amount is different. Further, more than one x-ray source may be the source 174, and the sources may be powered to emit x-rays with differing energy characteristics at selected times. Also, the source may emit with a single selected power but a different spectrum, such as 140 kVp or 80 kVp, while maintaining the same mAs; thus the characteristics and the dose may change.
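To make the kVp/mAs distinction concrete, here is a small hypothetical sketch of the two acquisition schemes; the values mirror the examples above and nothing here is a required protocol:

```python
from dataclasses import dataclass

@dataclass
class Exposure:
    kvp: float  # peak kilovoltage: sets the x-ray spectrum (characteristics)
    mas: float  # tube current-time product: sets the total amount of x-rays

# Same spectrum, different amounts (characteristics constant, dose varies).
same_spectrum = [Exposure(120, 100), Exposure(120, 200), Exposure(120, 300)]

# Different spectra at the same mAs (characteristics and dose both change).
dual_energy = [Exposure(80, 100), Exposure(140, 100)]
```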
[0041] With continuing reference to Fig. 2, according to various embodiments, the source 174 can include a single x-ray tube assembly 250 that can be connected to a switch 254 that can interconnect a first power source 258 via a connection or power line 262. As discussed above, x-rays can be emitted from the x-ray tube 250. According to various embodiments, the x-rays may be emitted in a cone shape 270 towards the detector 178, generally in the direction from the x-ray tube 250 indicated by the beam arrow or vector 274. The switch 254 can switch power on or off to the tube 250 to emit x-rays of selected characteristics, as is understood by one skilled in the art.
[0042] The vector or portion 274 may be a central vector or ray within the cone 270 of x-rays. An x-ray beam may be emitted as the cone 270 or other appropriate geometry. The vector 274 may include a selected line or axis relevant for further interaction with the beam, such as with a filter member, as discussed further herein.
[0043] The subject 30 can be positioned within the x-ray cone 270 to allow for acquiring image data of the subject 30 based upon the emission of x-rays in the direction of vector 274 towards the detector 178. Generally, the subject may attenuate, e.g., scatter or absorb, x-rays. The unattenuated x-rays that pass through the subject 30 reach the detector 178.
[0044] The imaging system 80, according to various embodiments, may operate the x-ray tube 250 to generate projections of the subject 30. The projections may be based on x-rays detected with an x-ray detector. The x-ray detector may be a Flat Panel Detector (FPD). The projections may be two-dimensional (2D) x-ray projections of the subject 30, including selected portions of the subject 30, or any area, region or volume of interest, in light of the x-rays impinging upon or being detected at the detector 178. The 2D x-ray projections can be reconstructed, as discussed herein, to generate and/or display selected images or image visualizations including 2D images, three-dimensional (3D) volumetric models of the subject 30, selected portions of the subject 30, or any area, region or volume of interest, or a plurality of either to generate four-dimensional (4D) images that may be time varying. As discussed herein, the 2D x-ray projections can be image data acquired with the imaging system 80, while the 3D volumetric models can be generated or model image data.
[0045] For reconstructing or forming the 3D volumetric image, appropriate algebraic techniques include expectation maximization (EM), ordered subsets EM (OS-EM), simultaneous algebraic reconstruction technique (SART), and total variation minimization (TVM), as generally understood by those skilled in the art. Applying such techniques to the 2D projections allows for efficient and complete volumetric reconstruction. Generally, an algebraic technique can include an iterative process to perform a reconstruction of the subject 30 for display as the image 108.
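As a non-limiting illustration of the iterative algebraic family named above, the following sketch applies a simultaneous (SIRT/SART-style) update; the dense system matrix, relaxation factor, and iteration count are assumptions for illustration rather than a disclosed implementation:

```python
import numpy as np

def algebraic_reconstruction(A, b, n_iters=50, relax=0.25):
    """Iteratively solve A @ x ~= b for the volume x.

    A: (n_rays, n_voxels) float system matrix of ray/voxel weights.
    b: (n_rays,) measured projection values.
    """
    x = np.zeros(A.shape[1])
    row_sums = A.sum(axis=1)       # total weight along each ray
    col_sums = A.sum(axis=0)       # total weight through each voxel
    row_sums[row_sums == 0] = 1.0  # guard rays that miss the volume
    col_sums[col_sums == 0] = 1.0  # guard voxels no ray touches
    for _ in range(n_iters):
        residual = (b - A @ x) / row_sums         # normalized ray error
        x += relax * (A.T @ residual) / col_sums  # back-projected correction
    return x
```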
[0046] The imaging system 80, or portions thereof such as the SDU 198, may be moved relative to the subject 30, such as around a long axis of the subject 30. Accordingly, as illustrated in Fig. 2, the imaging system 80 may be positioned such that an anterior-to-posterior projection may be acquired with the cone 270 or a lateral projection may be acquired with a cone 270'. Accordingly, a plurality of image projections of the subject 30 may be acquired at different perspectives.
[0047] However, as discussed above, the imaging system 80 may also acquire a plurality of projections at a single perspective. For example, with an anterior-to-posterior projection, the switch 254 may be operated to switch from the first power source to a second power source 280. Therefore, the beam 270 may be operated at two different powers. The image data acquired of the subject 30 may then also be acquired at two different powers and include at least two different projections. As discussed above, other parameters may also be altered at the source. Also, the source may emit x-rays within a spectrum, such as based on a selected power (e.g., 40 kVp or 120 kVp), or may be unfiltered.
[0048] Further, the detector 178 may be operated to collect varying image data of the subject 30. For example, the detector 178 may be a dual power detector, such as a detector that may specifically operate to detect x-rays at two different powers within a single broad spectrum, or a single detector able to detect two different powers. For example, a detector such as the detector included with the X-35 series x-ray inspection systems sold by Mettler Toledo, having a place of business in Columbus, OH, may be provided. The detector 178 may alternatively include a detector able to discretely detect x-rays at varying spectra. For example, detectors including those included in the dynamic CMOS x-ray flat detector sold by Teledyne Dalsa, having a place of business in Santa Clara, CA, may be provided. The detectors may be discretely operated or controlled to detect x-rays at differing spectra.
[0049] As discussed above, the imaging system may be operated with the selected controller. Therefore, the controller may operate the detector 178 to detect x-rays at discrete spectra and/or selected differing powers. Thus, a plurality of projections may be acquired at a single perspective of the detector 178 relative to the subject 30. The plurality of projections may be evaluated and/or combined to achieve a selected image visualization, as discussed further herein.
[0050] Turning to Fig. 3, according to various embodiments, a plurality of projections such as at least a first projection 300 and a second projection 304, may be acquired of the subject 30. The two projections 300, 304 may be acquired at the same perspective relative to the subject 30. In various embodiments the projections 300, 304
may be anterior-to-posterior image projections and may include a portion of a spine of the subject 30, but this is merely exemplary.
[0051] In various embodiments, the two projections 300, 304 may be acquired with varied or different parameters of the detector 178 such that the different projections have different x-ray detection characteristics. For example, the projection 300 may be acquired with the detector 178 set at a high exposure parameter and the second projection 304 may be acquired at a low exposure parameter. The two projections 300, 304 may differ relative to one another and include various different details based upon the various parameters of the detector 178 during the image data projection acquisition. The two parameters may be set to allow acquisition of the two projections 300, 304 simultaneously or substantially simultaneously, for example, within 10 milliseconds of each other. Generally, acquiring simultaneously or substantially simultaneously allows the two projections 300, 304 to include image data of the subject in the same or substantially the same pose relative to the detector.
[0052] The two projections 300, 304 may be combined in various manners to achieve a combined, also referred to as a composite, or selected visualization 310. The image visualization 310 may be generated based upon the two projections 300, 304 to include various details and image information that is not singly available or readily apparent in the separate projections 300, 304 but may be generated in the selected visualization 310.
[0053] The first projection 300 may be a projection, or illustrative of image data, acquired by the imaging system 80 with the detector 178 at selected image data acquisition parameters. According to various embodiments, the projection 300 may also represent an image that is generated based upon the image data acquired with the imaging system 80 operating at the first parameters. According to various embodiments, for example, the image projection 300 may be acquired with the detector operating at parameters such as to detect x-rays having an energy spectrum of up to 100 kilo-electron volts (keV) or having a selected full-well capacity of up to 50 micro-Gray. The total intensity that causes saturation ("blindness") of a detector, i.e., the sensitivity of the detector, is sometimes called the full-well capacity of the detector. Therefore, the image projection 300 may include a projection that is illustrative of an "over" or "high" exposure. As is understood by one skilled in the art, the projection 300 may be acquired at a high energy or high dosage such that the projection or image appears overexposed when viewed, including a low-data or high-exposure area 320.
[0054] While the entire image projection 300 may be produced at a high exposure, a high exposure area 320 may include relatively little image data, as illustrated by way of example in Fig. 3. The second projection 304, however, may be acquired with the detector 178 at alternative or different parameters, such as a different full-well capacity, e.g., detecting x-rays in an energy range of up to 50 micro-Gray. This may cause the appearance of lower energy, as the detector is much less sensitive in this case, even if the x-ray parameters of the source were exactly the same. In this manner, the image or projection 304 may be a lower exposure, or appear to be at a lower exposure, as illustrated in Fig. 3. Again, the lower exposure is merely exemplary, and the image data may be acquired with a lower contrast and appear darker in a generated image, as illustrated in Fig. 3.
[0055] The lower intensity or lower energy projection 304 may be acquired at the exact same or substantially the same perspective as the first image projection 300, but at the different detector parameters. The exact same or
substantially same position may include a perspective or pose change of a selected minimal amount, such as a change of one centimeter (cm) or less, including no measurable change, a percentage change (e.g., 5% or less), or other appropriate amount. Thus, the overexposed region 320 in the first projection 300 may relate to a lower exposed or selected exposure region 324 in the second projection 304.
[0056] In a similar manner, the first projection 300 may include a second or moderate exposure region 330 that may relate to an underexposed or dark region 334 in the second projection 304. Again, as the two projections 300, 304 may be acquired at the exact same or substantially the same perspective relative to the subject 30, the two regions 330, 334 may represent the same, including exactly the same, portion of the subject 30.
[0057] A composite or selected visualization image 310 may include image data from either or both of the projections 300, 304 or altered image data based upon the image data acquired in the projections 300, 304. In various embodiments, the composite image may be an image that is a fused image of two or more images or projections, such as the projections 300, 304. According to various embodiments, the projections 300, 304 may include various pixels or image elements. Each image element may relate to a specific portion of the image data and, if taken at the same perspective, represent the same portion of the subject 30. The image elements may relate to any discrete portion of the image and/or portion of a portion of an image, and may also be referred to as a pixel or voxel.
[0058] As discussed further herein, for example, a pixel value may be scaled based upon a comparison or threshold identified relative to either or both of the projections 300, 304. Accordingly, the composite image 310 may include portions that are copied from either of the projections 300, 304, scaled based upon a scaling relative to a threshold between the projections 300, 304, or based upon a difference between the projections 300, 304. As illustrated in Fig. 3, for example, the portion 324 from the projection 304 may be processed (e.g., as discussed herein) into the portion 324' in the composite image 310. Similarly, the portion 330 from the projection 300 may be processed (e.g., as discussed herein) into the portion 330' in the composite image 310.
[0059] As discussed above, various processor modules, such as the processor module 102 of the navigation system 26 and/or the processor module 97 of the imaging system 80, may be included in and/or accessed by a selected system. The one or more processor modules may execute instructions, as discussed further herein, to operate the detector 178 in the selected manner, generate the composite image 310, or combinations thereof. Accordingly, the respective or selected processor module may be operated to assist in generating an image or image visualization that may include a selected composite or scaling of image elements based upon the one or more image projections, and generally including at least the two image projections 300, 304, to generate the composite image.
[0060] The imaging system 80, or any appropriate imaging system, may include a detector such as the detector 178 and may acquire image projections of the subject 30 in a selected manner. For example, the image detector may include the image detectors as discussed above. According to various embodiments the image detector 178, therefore, may include a plurality of layers that may be selectively or operatively sensitive to a single source of x-rays from the source 250. The x-rays in the beam 270 may include a selected energy or energy spectrum, as discussed above. The detector 178 may include two or more layers that may be sensitive to the different portions of the spectrum in the beam 270. For example, the detector 178 may include a first layer 178a and a second
layer 178b. It is understood, according to various embodiments, that the detector 178 may include any appropriate number of layers. In various embodiments, different portions of the detector 178 may also operate at different detecting parameters (e.g., adjacent or alternating detector elements).
[0061] Nevertheless, the two layers 178a, 178b are described merely for simplicity of the subject application. Each of the layers 178a, 178b may be used to generate the two projections 300, 304, as illustrated in Fig. 3. Therefore, the detector 178 may be positioned at a certain or single position or perspective relative to the subject 30 to detect the beam 270 from the source 250 at each of the layers 178a, 178b. A detection in each of the layers 178a, 178b may be used to generate the two separate projections 300, 304. In this manner, the two projections 300, 304 may be at a single perspective relative to the subject 30 and the detector 178 need not be moved to acquire both of the projections. It is understood that the two projections may be acquired at any appropriate time, such as before, during, or after a selected portion of a procedure. Thus, any further image analysis or processing may be performed on image data that is acquired from a memory system after being acquired with the imaging system 80 and/or acquired immediately before the processing by the imaging system 80.
[0062] A process 360, as illustrated in Fig. 4, may be used to generate the composite image 310, as discussed above. The process 360 may include various subprocesses and/or portions that may allow for manual input and/or portions that may be executed or performed by executing instructions with a processor module. Therefore, the process 360 will be understood to allow for operation of various instruments or portions, such as the imaging system 80, and/or automatic generation of the composite image 310, according to various embodiments.
[0063] The process 360 may begin in the process start block 364. After starting the process in block 364, an image data acquisition subprocess may occur in block 370. The image data acquisition subprocess 370 may include operating the imaging system 80, acquiring pre-acquired images or image data from a memory system, or other appropriate image data acquisitions. The acquisition subprocess 370 may include the acquisition of a first image projection at a first parameter in block 374 and the acquisition of a second projection at a second parameter in block 378. As discussed above, the two parameters may include a selection of operation or parameter settings for the detector 178. In various embodiments, the detector 178 may include a plurality of layers to allow for the acquisition of the first and second projections at the first and second parameters substantially simultaneously with a single x-ray beam. Other acquisitions may occur in an appropriate manner, as is understood by one skilled in the art. Nevertheless, the detector 178 may be operated to acquire the first and second image projections at substantially the same perspective, as discussed above.
[0064] The first and second projections acquired in blocks 374 and 378 may include the first and second projections 300, 304, as discussed above. The two projections 300, 304 may be acquired at any appropriate time, stored in the memory system, and/or acquired substantially immediately or during the process 360. Nevertheless, the acquisition of the two projections may occur in the subprocess 370.
[0065] The first and second projections may then be mapped together in block 382. As discussed above, according to various embodiments, the detector 178 may acquire the two projections from blocks 374, 378 at essentially a single or same perspective. Therefore, a mapping of the first and second image projections to one another may include a determination that the two images are acquired at the same
perspective such that an image element at a first position or first coordinate of the first projection acquired in block 374 is the same image element at the same coordinate in the second image projection acquired in block 378. However, if the two projections are not acquired at exactly the same perspective (e.g., there is a change in pose of the detector 178), the mapping in block 382 may allow for a mapping of the image elements between the two acquired images. Thus, as discussed further herein, an analysis of an image element in one image projection may be related to an image element in another image projection. Thus, the mapping in block 382 may allow for a determination of the relation of image elements in both of the image projections acquired in blocks 374, 378.
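As a non-limiting sketch of the mapping in block 382, assume the only pose difference between acquisitions is a small in-plane translation; with an unchanged pose the shift is zero and the mapping is the identity (the translation model and function name are illustrative assumptions):

```python
import numpy as np

def map_image_elements(coords, pose_shift=(0.0, 0.0)):
    """Map (row, col) image-element coordinates from the first projection
    into the second projection's coordinate frame."""
    return np.asarray(coords, dtype=float) + np.asarray(pose_shift, dtype=float)

# With no detector pose change, an element keeps its coordinate:
print(map_image_elements([(120, 85)]))              # [[120.  85.]]
# With a 2-pixel shift between acquisitions:
print(map_image_elements([(120, 85)], (2.0, 0.0)))  # [[122.  85.]]
```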
[0066] The mapped images may then be processed in an image processing subprocess 390. The image processing subprocess 390 may allow for processing the image projections, such as to generate the composite image 310. Thus, the image processing subprocess 390 may be performed by executing instructions with a processor module, such as the imaging processor module 97, as discussed above.
[0067] A threshold value, or set of values, may be selected or recalled in block 394. If there are several projections, generally a set of multiple values is used rather than only a single value. A threshold value may be a selected parameter of one or more image elements in one or more projections. According to various embodiments, a brightness value may be a threshold value, a relative contrast value may be a threshold value, or other appropriate values or parameters may be selected. The parameters may depend on the image type (e.g., 8-bit vs. 16-bit, or whether the image is scaled to a [0, 1] range). The threshold value may then be used to assist in evaluating the projections 300, 304 in processing the same to generate the composite image 310.
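As a non-limiting sketch of selecting a threshold in block 394, the cutoff may be scaled to the image type noted above; the 90%-of-full-scale figure is an illustrative assumption:

```python
import numpy as np

def brightness_threshold(img):
    """Return a saturation-style brightness threshold scaled to the
    image's data type (8-bit, 16-bit, or [0, 1]-scaled float)."""
    if img.dtype == np.uint8:
        return 0.9 * 255
    if img.dtype == np.uint16:
        return 0.9 * 65535
    return 0.9  # assumes float data scaled to the [0, 1] range
```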
[0068] The selected threshold may then be used to determine the value of an image element relative to the threshold in block 398. The image element may include image elements in both the first projection 300 and the second projection 304. It is understood, however, that any appropriate number of projections may be acquired, and two projections are merely exemplary. Nevertheless, a determination of a value of each of the image elements relative to the threshold value may be made for all of the image elements acquired in the image acquisition subprocess 370. The processor module may evaluate the value selected as the threshold value in block 394 and compare it to each of the image elements of all the acquired image projections in block 398.
[0069] Thereafter, a determination may be made whether each image element is beyond a threshold in block 402. The determination may allow for at least one of two paths to be followed. If the image element is determined to be beyond a threshold value, a yes path 406 may be followed. Again, the threshold value may be any appropriate selected value, as discussed above. A determination that an image element is beyond a threshold value may include that the image element includes a value greater than a threshold value or less than a threshold value. Further, if an element is determined to be beyond the threshold value, it may be weighted based upon its differential from the threshold value, proximity to other image elements, other determinations relative to the threshold value, or the like.
[0070] The weighting or thresholding process may include various techniques, such as those discussed herein. For example, a similar area in two or more images or image elements may be determined. The area or region may, for example, be one that is determined to be neither too bright nor too dark. Too bright or too dark may be relative and may be a deviation from a mean of the one or more images, such as within 10% of the mean. One of the images may then be scaled so it is similar to the other, such as by scaling the brightness of the selected image or image elements therein. In the composite or fused image, a weight may be given to pixels in each of the images to determine whether each pixel is included and/or how much weight it is given in the new composite image pixel. For example, a 50% weight may be given to pixels that are considered "good" in both images; a 20% weight for pixels that are "good enough" in one image and an 80% weight for the same pixel from the other image; and a 0% weight for bad pixels in one image and a 100% weight for the same pixel in the other image (i.e., the 0% weight pixel is not taken or used in the composite image). The relative ratings of "good", "good enough", and "bad" may be based on a user analysis and/or automatic analysis, such as a deviation of a pixel from a mean. In various embodiments, for example, a good pixel or region may be a region that includes identifiable features or relevant features. Further, any appropriate number of relative ranges may be used, and three is merely an example.
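The weighting just described may be sketched as follows; the per-pixel ratings, tolerance values, and the tie-breaking choice when a pixel is bad in both images are illustrative assumptions rather than a disclosed implementation:

```python
import numpy as np

def rate_pixels(img, tol_good=0.10, tol_ok=0.25):
    """Rate each pixel by relative deviation from the image mean:
    2 = "good", 1 = "good enough", 0 = "bad"."""
    dev = np.abs(img - img.mean()) / (img.mean() + 1e-9)
    return np.where(dev <= tol_good, 2, np.where(dev <= tol_ok, 1, 0))

def fuse(img_a, img_b):
    """Fuse two mapped projections with the example weights from the text:
    good/good -> 50/50, good/good-enough -> 80/20, bad -> 0/100."""
    ra, rb = rate_pixels(img_a), rate_pixels(img_b)
    weight_a = np.select(
        [
            (ra == 2) & (rb == 2),  # good in both: equal weight
            (ra == 2) & (rb == 1),  # A good, B only good enough
            (ra == 1) & (rb == 2),  # A good enough, B good
            (ra == 0),              # A bad: take B (also when both bad)
            (rb == 0),              # B bad: take A
        ],
        [0.5, 0.8, 0.2, 0.0, 1.0],
        default=0.5,
    )
    return weight_a * img_a + (1.0 - weight_a) * img_b
```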
[0071] A second path may include that the image element is determined to not be beyond the threshold value. Thus, a no path 410 may be followed. The determination that an image element is not beyond a threshold value may be that the image element includes or is within a threshold value range.
[0072] If an image element is determined to not be beyond the threshold and the no path 410 is followed, the image element may be kept in block 414. In keeping the image element in block 414 the image element may be maintained substantially unchanged or unaltered. The unaltered image element may be stored for inclusion in further processing, such as during combination with other image elements in block 418.
[0073] However, if the yes path 406 is followed, processing of the image elements may occur in block 422. The processing in block 422 may allow for processing of image elements that are not within a threshold. The threshold may be selected to assist in generating an image that includes a selected contrast of image elements relative to one another, clarity or resolution of an image, identification and/or segmentation of portions in the image, or other processes.
[0074] In the processing of the image elements in block 422, an image element that is beyond a threshold may be replaced with an image element that is not beyond a threshold. As discussed above, the first and second projections may be acquired with different parameters of the detector 178. As illustrated in Fig. 3, the two projections 300, 304 may have differing contrasts or exposures. Therefore, an image element in, for example, the first projection 300 that is within a threshold may be used to replace an image element in the second projection 304 that is outside of, or beyond, the threshold. Therefore, a direct replacement of an image element between the two projections may occur, such as due to the mapping of the image elements in block 382.
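A non-limiting sketch of the direct replacement in block 422 follows; it assumes both projections are already mapped to a common coordinate grid per block 382, and treats "beyond" as exceeding an upper threshold:

```python
import numpy as np

def replace_beyond_threshold(primary, fallback, threshold):
    """Keep primary pixels within the threshold (block 414 path) and
    replace out-of-range pixels with the mapped pixel from the other
    projection (block 422 path)."""
    out = primary.copy()
    beyond = primary > threshold
    out[beyond] = fallback[beyond]
    return out
```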
[0075] Processing of the image elements may also or alternatively include a scaling of an image element. For example, if an image element is determined to be beyond a threshold, it may be scaled a selected amount, such as by increasing or decreasing its whiteness level. The scaling may be based upon various parameters or schemes, such as its deviation from the threshold value, a comparison of the image element to the second or another image projection, or the like. Generally, scaling may allow two or more images or projections to be made similar enough to allow the generation of the composite and may compensate for variations in intensity of selected regions (e.g., "good" areas) of each of the images. This may be done by selecting a region or area that is considered "good" in the two or more images. For example, if two or more of the images do not have the same mean brightness level, a ratio of the mean brightness levels may be determined and at least one of the images may be scaled accordingly so the two regions are similar. This allows the two or more images to be fused. The fusing to generate the composite may be by means of weighting, as previously described. The scaling of the image element may allow for a changing of the value of the image element without directly replacing it with another image element.
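The ratio-of-means scaling described above may be sketched as follows; the region mask and the guard against division by zero are illustrative assumptions:

```python
import numpy as np

def scale_to_match(img, ref, good_region):
    """Scale img so that a region judged "good" in both images has the
    same mean brightness as the corresponding region of ref.

    good_region: boolean mask selecting the comparable area."""
    ratio = ref[good_region].mean() / (img[good_region].mean() + 1e-9)
    return img * ratio
```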
[0076] The processed image elements may also be combined in block 418. Therefore, processed image elements from block 422 and the kept image elements from block 414 may be combined in block 418. The combined image elements may be mapped to a projection area due to the mapping in block 382. A generation of a composite image visualization may be made with the combined image elements and may be performed in block 426.
[0077] The generation of the composite image in block 426 may include generating a composite image or projection that includes the kept image elements from block 414 at their respective coordinates and the processed image elements from block 422 at their respective coordinates. Therefore, the generation of the composite image may include generating an image with image elements at the coordinates determined in the image element mapping from block 382, such as by the techniques of scaling and weighting discussed above.
[0078] The composite image may then be output in block 430. Outputting the composite image in block 430 may include various processes. Outputting the composite image in block 430 may include saving the composite image in a memory system, such as a memory system of the imaging assembly 96. The output of the composite image in block 430 may also include transfer or generation of a visual display of the composite image. Therefore, a display of a selected image may be made in block 434. The selected image may be the composite image output in block 430. Thus, the processing subprocess 390 may generate the composite image that is output for various purposes, such as display in block 434. It is understood that other image projections may also be displayed, such as the acquired projections from the subprocess 370.
[0079] The process 360 may then end in block 438. The process ending in block 438 may include other processes, such as performing a procedure on the subject 30, analyzing or segmenting the composite image, or other appropriate procedures. The ending of the process 360 in block 438 may not require a ceasing of all actions, but rather the generation of the composite image and an output thereof, as discussed above.
[0080] The process 360 may be operated to perform various procedures. The imaging system may be operated either alone or in combination with other systems to generate the composite image to allow for an image that includes image data or an image visualization that may not be possible with only a single projection. Thus, the composite projection or image visualization may be used for various purposes with greater efficiency than either or any one of the originally acquired projections, as discussed above.
[0081] The following paragraphs provide various Examples reciting features and alternatives disclosed herein.
[0082] Example 1 - A system to generate an image visualization of a subject, the system comprising: a detector configured to acquire image data at a first detector parameter and a second detector parameter; a positioning system configured to position the detector at a first pose relative to the subject; a source configured to emit at the detector; and a processor module configured to execute instructions to evaluate a first image data acquired at the first detector parameter and a second image data acquired at
the second detector parameter and output a composite image visualization based thereon.
[0083] Example 2 - The system of Example 1, further comprising: a memory system configured to store at least one of the first image data or the second image data.
[0084] Example 3 - The system of Example 1, further comprising: a display device configured to display at least the composite image visualization; wherein the image visualization includes portions clarified due to the evaluation that are recognizable therein but not in both of the first image data or the second image data.
[0085] Example 4 - The system of Example 1, wherein the detector includes a dual energy detector.
[0086] Example 5 - The system of Example 1, wherein the detector includes at least a first sensitivity parameter as the first detector parameter and a second sensitivity parameter as the second detector parameter.
[0087] Example 6 - The system of Example 1, wherein the first detector parameter and the second detector parameter differ by at least 10 keV.
[0088] Example 7 - The system of Example 1, wherein the source includes an x-ray source.
[0089] Example 8 - The system of Example 1, wherein the source and the detector are configured to move relative to the subject.
[0090] Example 9 - The system of Example 1, wherein the source is configured to emit a beam of x-rays having a spectrum with a selected peak x-ray voltage.
[0091] Example 10 - The system of Example 1, wherein the processor module is configured to execute further instructions to determine a value of an image element in both the first image data and the second image data relative to a threshold.
[0092] Example 11 - The system of Example 10, wherein the composite image visualization includes image elements selected or scaled based on the comparison to the threshold of the image elements of the first image data and the second image data.
[0093] Example 12 - A method to generate an image visualization of a subject, the method comprising: acquiring a first image data at a first detector parameter; acquiring a second image data at a second detector parameter; emitting an energy beam from a source; evaluating the first image data acquired at the first detector parameter and the second image data acquired at the second detector parameter; and outputting a composite image visualization based on the evaluation.
[0094] Example 13 - The method of Example 12, further comprising: displaying at least the composite image visualization; wherein the image visualization includes portions clarified due to the evaluation that are recognizable therein but not in both of the first image data or the second image data.
[0095] Example 14 - The method of Example 12, further comprising: providing the detector as a dual energy detector.
[0096] Example 15 - The method of Example 12, further comprising: providing the detector having a first sensitivity parameter as the first detector parameter and a second sensitivity parameter as the second detector parameter.
[0097] Example 16 - The method of Example 12, wherein acquiring the first image data at the first detector parameter and acquiring the second image data at the second detector parameter occurs substantially simultaneously.
[0098] Example 17 - The method of Example 12, further comprising: moving the source and the detector relative to the subject.
[0099] Example 18 - The method of Example 12, further comprising: operating the source to emit a beam of x-rays having a spectrum with a selected peak x-ray voltage.
[0100] Example 19 - The method of Example 12, further comprising: determining a value of an image element in both the first image data and the second image data relative to a threshold.
[0101] Example 20 - The method of Example 19, wherein outputting the composite image visualization includes outputting image elements selected or scaled based on the comparison to the threshold of the image elements of the first image data and the second image data.
[0102] Example embodiments are provided so that this disclosure will be thorough, and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
[0103] Instructions may be executed by a processor and may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects. The term shared processor circuit encompasses
a single processor circuit that executes some or all code from multiple modules. The term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above. The term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules. The term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
[0104] The apparatuses and methods described in this application may be partially or fully implemented by a processor (also referred to as a processor module) that may include a special purpose computer (i.e., created by configuring a processor) and/or a general purpose computer to execute one or more particular functions embodied in computer programs. The computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium. The computer programs may also include or rely on stored data. The computer programs may include a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services and applications, etc.
[0105] The computer programs may include: (i) assembly code; (ii) object code generated from source code by a compiler; (iii) source code for execution by an interpreter; (iv) source code for compilation and execution by a just-in-time compiler; or (v) descriptive text for parsing, such as HTML (hypertext markup language) or XML (extensible markup language), etc. As examples only, source code may be written in C, C++, C#, Objective-C, Haskell, Go, SQL, Lisp, Java®, ASP (active server pages), Perl, Javascript®, HTML5, Ada, Scala, Erlang, Ruby, Flash®, Visual Basic®, Lua, or Python®.
[0106] Communications, including the wireless communications described in the present disclosure, can be conducted in full or partial compliance with IEEE standard 802.11-2012, IEEE standard 802.16-2009, and/or IEEE standard 802.20-2008. In various implementations, IEEE 802.11-2012 may be supplemented by draft IEEE standard 802.11ac, draft IEEE standard 802.11ad, and/or draft IEEE standard 802.11ah.
[0107] The terms processor, processor module, module, and 'controller' may be used interchangeably herein (unless specifically noted otherwise) and each may be replaced with the term 'circuit.' Any of these terms may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
[0108] Instructions may be executed by one or more processors or processor modules, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic
circuitry. Accordingly, the term “processor” or “processor module” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
[0109] The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.
Claims
1. A system to generate an image visualization of a subject, comprising: a detector configured to acquire image data at a first detector parameter and a second detector parameter; a positioning system configured to position the detector at a first pose relative to the subject; a source configured to emit at the detector; and a processor module configured to execute instructions to evaluate a first image data acquired at the first detector parameter and a second image data acquired at the second detector parameter and output a composite image visualization based thereon.
2. The system of Claim 1, further comprising: a memory system configured to store at least one of the first image data or the second image data.
3. The system of Claim 1, further comprising: a display device configured to display at least the composite image visualization; wherein the image visualization includes portions clarified due to the evaluation that are recognizable therein but not in both of the first image data or the second image data.
4. The system of Claim 1, wherein the detector includes a dual energy detector.
5. The system of Claim 1, wherein the detector includes at least a first sensitivity parameter as the first detector parameter and a second sensitivity parameter as the second detector parameter.
6. The system of Claim 1, wherein the first detector parameter and the second detector parameter differ by at least 10 keV.
7. The system of Claim 1, wherein the source includes an x-ray source.
8. The system of Claim 1, wherein the source is configured to emit a beam of x-rays having a spectrum with a selected peak x-ray voltage.
9. The system of Claim 1, wherein the processor module is configured to execute further instructions to determine a value of an image element in both the first image data and the second image data relative to a threshold; wherein the composite image visualization includes image elements selected or scaled based on the comparison to the threshold of the image elements of the first image data and the second image data.
10. A method to generate an image visualization of a subject, comprising: acquiring a first image data at a first detector parameter;
acquiring a second image data at a second detector parameter; emitting an energy beam from a source; evaluating the first image data acquired at the first detector parameter and the second image data acquired at the second detector parameter; and outputting a composite image visualization based on the evaluation.
11. The method of Claim 10, further comprising: displaying at least the composite image visualization; wherein the image visualization includes portions clarified due to the evaluation that are recognizable therein but not in both of the first image data or the second image data.
12. The method of Claim 10, further comprising: providing the detector as at least one of (1) a dual energy detector or (2) a dual sensitivity detector having a first sensitivity parameter as the first detector parameter and a second sensitivity parameter as the second detector parameter.
13. The method of Claim 10, wherein acquiring the first image data at the first detector parameter and acquiring the second image data at the second detector parameter occurs substantially simultaneously.
14. The method of Claim 10, further comprising: operating the source to emit a beam of x-rays having a spectrum with a selected peak x-ray voltage.
15. The method of Claim 12, further comprising: determining a value of an image element in both the first image data and the second image data relative to a threshold; wherein outputting the composite image visualization includes outputting image elements selected or scaled based on the comparison to the threshold of the image elements of the first image data and the second image data.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463570140P | 2024-03-26 | 2024-03-26 | |
| US63/570,140 | 2024-03-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025203020A1 true WO2025203020A1 (en) | 2025-10-02 |
Family
ID=95517100
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IL2025/050263 Pending WO2025203020A1 (en) | 2024-03-26 | 2025-03-20 | System and method to generate image visualization |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025203020A1 (en) |