WO2024214003A1 - System and method for automatically detecting orientation and anatomy in an imaging system
- Publication number
- WO2024214003A1 (PCT/IB2024/053463)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging system
- image
- determining
- data
- patient
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/48—Diagnostic techniques
- A61B6/488—Diagnostic techniques involving pre-scan acquisition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/465—Displaying means of special interest adapted to display user selection data, e.g. graphical user interface, icons or menus
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/542—Control of apparatus or devices for radiation diagnosis involving control of exposure
- A61B6/544—Control of apparatus or devices for radiation diagnosis involving control of exposure dependent on patient size
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/545—Control of apparatus or devices for radiation diagnosis involving automatic set-up of acquisition parameters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/12—Arrangements for detecting or locating foreign bodies
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/547—Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
Definitions
- the present disclosure relates to imaging a subject, and particularly to a system to automatically determine the patient orientation to populate a menu system for subsequent images.
- a subject such as a human patient, may undergo a procedure.
- the procedure may include a surgical procedure to correct or augment an anatomy of the subject.
- the augmentation of the anatomy can include various procedures, such as movement or augmentation of bone, insertion of an implant (i.e., an implantable device), or other appropriate procedures.
- various types of data are input by the technician and are used to capture image data. Data such as the imager settings and the patient settings are generally entered prior to capturing the image data. Completing the extensive list of inputs can be very time consuming and may extend surgical time and/or surgical suite time.
- a system to acquire image data of a subject may be an imaging system that uses x-rays.
- the subject may be a living patient (e.g., a human patient).
- the subject may also be a non-living subject, such as an enclosure, a casing, etc.
- the imaging system may acquire image data of an interior of the subject.
- the imaging system may include a moveable source and/or detector that is moveable relative to the subject. The positioning and movement of the system are performed automatically to reduce the overall imaging time and to reduce the subject's x-ray exposure.
- the method for controlling an imaging system includes positioning the imaging system into a first position, acquiring a first image at the first position, determining patient data from the first image, communicating patient data to a user interface, displaying the patient data on a display and acquiring a second image based on the patient data.
- a system to control an imaging system has a controller configured to execute instructions to acquire a first image at a first position, determine patient data from the first image, communicate the patient data to a user interface, and display the patient data on a display.
- Fig. 1 is an environmental view of an imaging system in an operating theatre;
- Fig. 2 is a detailed schematic view of an imaging system with a source and detector configured to move around a subject, according to various embodiments;
- Fig. 3 is a block diagrammatic view of the imaging system;
- Fig. 4A is a representation of the order of the vertebrae in a patient;
- Fig. 4B is an image of vertebrae illustrating the order of vertebrae;
- Fig. 4C is a representation of a patient in a head left supine position;
- Fig. 4D is a representation of a patient in a head right supine position;
- Fig. 4E is an image of a head left position for the patient in the supine position;
- Fig. 4F is an image of a head right position for the patient in the prone position;
- Fig. 5A is a representation of a patient data user interface;
- Fig. 5B is a representation of an image data user interface; and
- Fig. 6 is a flowchart of a method for operating the system.
- a subject may be imaged with an imaging system, as discussed further herein.
- the subject may be a living subject, such as a human patient.
- Image data may be acquired of the human patient and may be combined to provide an image of the human patient that is greater than any dimension of any single projection acquired with the imaging system. It is understood, however, that image data may be acquired of a nonliving subject, such as an inanimate subject including a housing, casing, interior of a super structure, or the like.
- image data may be acquired of an airframe for various purposes, such as diagnosing issues and/or planning repair work.
- With reference to Fig. 1, a schematic view of a procedure room 20 is illustrated.
- a user 24 such as a surgeon, can perform a procedure on a subject, such as a patient 28.
- the subject may be placed on a support, such as a table 32 for a selected portion of the procedure.
- the table 32 may not interfere with image data acquisition with an imaging system 36.
- the user 24 can use the imaging system 36 to acquire image data of the patient 28 to allow a selected system to generate or create images to assist in performing a procedure.
- Images generated with the image data may be two-dimensional (2D) images, three-dimensional (3D) images, or other appropriate types of images, such as a model (e.g., a three-dimensional (3D) image), long views, single projection views, etc.
- the display device 44 can be part of and/or connected to a processor system 48 that includes a user interface 52, such as a keyboard, mouse, stylus, a touch screen as part of the display device 44 or combinations thereof.
- a processor 56 can include one or more processors, processor modules, and/or microprocessors incorporated with the processor system 48 along with selected types of non-transitory and/or transitory memory 58.
- a connection 62 can be provided between the processor 56 and the display device 44 for data communication to allow driving the display device 44 to display or illustrate the image 40.
- the processor 56 may be any appropriate type of processor such as a general-purpose processor that executes instructions included in a program or an application specific processor such as an application specific integrated circuit.
- the imaging system 36 can include but is not limited to an O-Arm® imaging system sold by Medtronic Navigation, Inc. having a place of business in Louisville, CO, USA.
- the imaging system 36, including the O-Arm® imaging system or other appropriate imaging systems, may be in use during a selected procedure, such as the imaging system described in U.S. Patent App. Pubs. 2012/0250822, 2012/0099772, and 2010/0290690, all of which are incorporated herein by reference.
- the imaging system 36 when, for example, including the O-Arm® imaging system, may include a mobile cart 60 that includes a controller and/or control system 64.
- the control system 64 may include a processor and/or processor system 68 (similar to the processor 56), a user interface 67 such as a keyboard, a mouse, or a touch screen, a memory system 66 (e.g., a non-transitory memory), and a display device 69.
- the memory system 66 may include various instructions that are executed by the processor 68 that acts as a controller to control the imaging system 36, including various portions of the imaging system 36.
- the imaging system 36 may include further additional portions, such as an imaging gantry 70 in which is positioned a source unit (also referred to as a source assembly) 74 and a detector unit (also referred to as a detector assembly) 78.
- the detector 78 alone and/or together with the source unit may be referred to as an imaging head of the imaging system 36.
- the gantry 70 is moveably connected to the mobile cart 60.
- the gantry 70 may be O-shaped or toroid shaped, wherein the gantry 70 is substantially annular and includes walls that form a volume in which the source unit 74 and detector 78 may move.
- the mobile cart 60 may also be moved.
- the gantry 70 and/or the cart 60 may be moved while image data is acquired, including both being moved simultaneously.
- the imaging system 36 via the mobile cart 60 can be moved from one operating theater to another (e.g., another room).
- the gantry 70 can move relative to the cart 60, as discussed further herein. This allows the imaging system 36 to be mobile and moveable relative to the subject 28, thus allowing it to be used in multiple locations and with multiple procedures without requiring a capital expenditure or space dedicated to a fixed imaging system.
- the processor 68 may be a general-purpose processor or an application specific processor.
- the memory system 66 may be a non-transitory memory such as a spinning disk or solid-state non-volatile memory.
- the memory system may include instructions to be executed by the processor 68 to perform functions and determine results, as discussed herein.
- the memory system 66 may be used to store images from the imaging system 36 to allow calculations to be performed thereon.
- the memory system 66 may be used to store intermediate and final calculations, such as data for identifying body structures, distance for the imaging system to travel, a target position for the imaging system 36.
- the imaging system 36 may include an imaging system that acquires images and/or image data by emitting x-rays and detecting the x-rays after interactions and/or attenuation of the x-rays with or by the subject 28.
- the x-ray imaging may be one imaging modality. It is understood that other imaging modalities are possible, such as other high energy beams, etc.
- the source unit 74 may be an x-ray emitter that can emit x-rays at and/or through the patient 28 to be detected by the detector 78.
- the x-rays emitted by the source 74 can be emitted in a cone 90 along a selected main vector 94 and detected by the detector 78, as illustrated in Fig. 2.
- the source 74 and the detector 78 may also be referred to together as a source/detector unit 98, especially wherein the source 74 is generally diametrically opposed (e.g., 180 degrees (°) apart) from the detector 78 within the gantry 70.
- the imaging system 36 may move, as a whole or in part, relative to the subject 28.
- the source 74 and the detector 78 can move around the patient 28, e.g., a 360° motion, spiral, portion of a circle, etc.
- the movement of the source/detector unit 98 within the gantry 70 may allow the source 74 to remain generally 180° opposed (such as with a fixed inner gantry or rotor or moving system) to the detector 78.
- the detector 78 may be referred to as moving around (e.g., in a circle or spiral) the subject 28 and it is understood that the source 74 remains opposed thereto, unless disclosed otherwise.
- the gantry 70 can move isometrically (also referred to as "wag") relative to the subject 28, generally in the direction of arrow 100 around an axis 102, such as through the cart 60, as illustrated in Fig. 1.
- the gantry 70 can also tilt relative to a longitudinal axis 106 of the patient 28 illustrated by arrows 110. In tilting, a plane of the gantry 70 may tilt or form a non-orthogonal angle with the axis 106 of the subject 28.
- the gantry 70 may also move longitudinally in the direction of arrows 114 along the line 106 relative to the subject 28 and/or the cart 60. Also, the cart 60 may move to move the gantry 70. Further, the gantry 70 can move up and down generally in the Y-axis direction of arrows 118 relative to the cart 60 and/or the subject 28, generally transverse to the axis 106 and parallel with the axis 102. The gantry may also be moved in an X direction in the direction of the arrows 116 by moving the wheels 117.
- the movement of the imaging system 36 is to allow for positioning of the source/detector unit (SDU) 98 relative to the subject 28.
- the imaging system 36 can be precisely controlled to move the SDU 98 relative to the subject 28 to generate precise image data of the subject 28.
- the imaging system 36 can be connected to the processor 56 via a connection 120, which can include a wired or wireless connection or physical media transfer from the imaging system 36 to the processor 56.
- image data collected with the imaging system 36 can be transferred to the processing system 56 for navigation, display, reconstruction, etc.
- the source 74 may include one or more sources of x-rays for imaging the subject 28.
- the source 74 may include a single source that may be powered by more than one power source to generate and/or emit x-rays at different energy characteristics.
- more than one x-ray source may be the source 74 that may be powered to emit x-rays with differing energy characteristics at selected times.
- the imaging system 36 can be used with an un-navigated or navigated procedure.
- a localizer and/or digitizer including either or both of an optical localizer 130 and/or an electromagnetic localizer 138 can be used to generate a field and/or receive and/or send a signal within a navigation domain relative to the subject 28.
- the navigated space or navigational domain relative to the subject 28 can be registered to the image 40.
- the correlation allows registration of a navigation space, defined within the navigational domain, to an image space defined by the image 40.
- a patient tracker or dynamic reference frame 140 can be connected to the subject 28 to allow for a dynamic registration and maintenance of registration of the subject 28 to the image 40.
- the patient tracking device or dynamic registration device 140 and an instrument 144 can then be tracked relative to the subject 28 to allow for a navigated procedure.
- the instrument 144 can include a tracking device, such as an optical tracking device 148 and/or an electromagnetic tracking device 152 to allow for tracking of the instrument 144 with either or both of the optical localizer 130 or the electromagnetic localizer 138.
- a navigation/probe interface device 158 may have communications (e.g., wired or wireless) with the instrument 144 (e.g., via a communication line 156), with the electromagnetic localizer 138 (e.g., via a communication line 162), and/or the optical localizer 130 (e.g., via a communication line 166).
- the interface 158 can also communicate with the processor 56 with a communication line 168 and may communicate information (e.g., signals) regarding the various items connected to the interface 158. It will be understood that any of the communication lines can be wired, wireless, physical media transmission or movement, or any other appropriate communication. Nevertheless, the appropriate communication systems can be provided with the respective localizers to allow for tracking of the instrument 144 relative to the subject 28 to allow for illustration of a tracked location of the instrument 144 relative to the image 40 for performing a procedure.
- the instrument 144 may be any appropriate instrument, such as a ventricular or vascular stent, spinal implant, neurological stent or stimulator, ablation device, or the like.
- the instrument 144 can be an interventional instrument or can include or be an implantable device. Tracking the instrument 144 allows for viewing a location (including x,y,z position and orientation) of the instrument 144 relative to the subject 28 with use of the registered image 40 without direct viewing of the instrument 144 within the subject 28.
- the imaging system 36 such as the gantry 70, can include an optical tracking device 174 and/or an electromagnetic tracking device 178 to be tracked with the respective optical localizer 130 and/or electromagnetic localizer 138.
- the imaging system 36 can be tracked relative to the subject 28, as can the instrument 144, to allow for initial registration, automatic registration, or continued registration of the subject 28 relative to the image 40. Registration and navigated procedures are discussed in U.S. Patent No. 8,238,631, incorporated herein by reference.
- an icon 180 may be displayed relative to, including overlaid on, the image 40.
- the image 40 may be an appropriate image and may include a 2D image, a 3D image, or any appropriate image as discussed herein.
- the source 74 can include a single assembly that may include a single x-ray tube 190 that can be connected to a switch 194 that can interconnect a first power source 198 via a connection or power line 200.
- x-rays can be emitted from the x-ray tube 190 generally in the cone shape 90 towards the detector 78 and generally in the direction from the x-ray tube 190 as indicated by arrow, beam arrow, beam or vector 94.
- the switch 194 can switch power on or off to the tube 190 to emit x-rays of selected characteristics, as is understood by one skilled in the art.
- the vector 94 may be a central vector or ray within the cone 90 of x-rays.
- An x-ray beam may be emitted as the cone 90 or other appropriate geometry.
- the vector 94 may include a selected line or axis relevant for further interaction with the beam, such as with a filter member, as discussed further herein.
- the subject 28 can be positioned within the x-ray cone 90 to allow for acquiring image data of the subject 28 based upon the emission of x-rays in the direction of vector 94 towards the detector 78.
- the x-ray tube 190 may be used to generate two-dimensional (2D) x-ray projections of the subject 28, including selected portions of the subject 28, or any area, region or volume of interest, in light of the x-rays impinging upon or being detected on a 2D or flat panel detector, as the detector 78.
- the 2D x-ray projections can be reconstructed, as discussed herein, to generate and/or display three-dimensional (3D) volumetric models of the subject 28, selected portions of the subject 28, or any area, region or volume of interest.
- the 2D x-ray projections can be image data acquired with the imaging system 36, while the 3D volumetric models can be generated or model image data.
- Reconstruction techniques may include Expectation Maximization (EM), Ordered Subsets EM (OS-EM), the Simultaneous Algebraic Reconstruction Technique (SART), and Total Variation Minimization (TVM).
- Various reconstruction techniques may additionally or alternatively include machine learning systems and algebraic techniques.
- the application to perform a 3D volumetric reconstruction based on the 2D projections allows for efficient and complete volumetric reconstruction.
- an algebraic technique can include an iterative process to perform a reconstruction of the subject 28 for display as the image 40.
- a pure or theoretical image data projection such as those based on or generated from an atlas or stylized model of a “theoretical” patient, can be iteratively changed until the theoretical projection images match the acquired 2D projection image data of the subject 28.
- the stylized model can be appropriately altered to become the 3D volumetric reconstruction model of the acquired 2D projection image data of the selected subject 28, and can be used in a surgical intervention, such as navigation, diagnosis, or planning.
- the theoretical model can be associated with theoretical image data used to construct the theoretical model. In this way, the model or the image 40 can be built based upon image data acquired of the subject 28 with the imaging system 36.
- the source 74 may include various elements or features that may be moved relative to the x-ray tube 190.
- a collimator 220 may be positioned relative to the x-ray tube 190 to assist in forming the cone 90 relative to the subject 28.
- the collimator 220 may include various features such as movable members that may assist in positioning one or more filters within the cone 90 of the x-rays prior to reaching the subject 28.
- One or more movement systems 224 may be provided to move all and/or various portions of the collimator 220.
- various filters may be used to shape the x-ray beam, such as shaping the cone 90, into a selected shape prior to reaching the subject 28.
- the x-rays may be formed into a thin fan or plane to reach and pass through the subject 28 and be detected by the detector 78.
- the controller 310 may be one of or both of the processor 56 and the processor 68.
- the controller 310 is in communication with a user interface 312.
- the user interface 312 may be one of or both the user interfaces 52, 67.
- the user interface 312 may also be used together with a display device 314 that allows selections to be made and data to be entered.
- the display device 314 may also be referred to as a pendant.
- the controller 310 is also in communication with the display device 314 which may be one or both of the display devices 44, 69.
- the controller 310 may process various signals at the processor 56, the processor 68 or combinations thereof.
- the user interface 312 may provide input to the controller 310 from the user interface 67 or 52.
- the display device 314 may display various features, images, or data at the display device 44 or 69 described above.
- the controller 310 includes a memory system 316 that may be one of the memory system 66, the memory system 58 or a combination thereof.
- the memory system 316 is used to store various data including, but not limited to, the data described above relative to the memory system 66, 58.
- the memory system 316 may also be used to store imaging system data, such as settings and patient data, both of which will be described in further detail below.
- a timer 318 is used to time various functions, including the movement of the imaging system 36.
- the controller 310 is used to position the imaging system 36 having an O-arm.
- the imaging system 36 may have a position detector 320 associated therewith.
- the position detector 320 is used for determining the relative position of the O-arm or movable structure of the imaging system 36. A position relative to the subject 28 may also be obtained.
- the position detector 320 may include encoders that are used to determine the amount of movement from a predetermined or an initial position.
- the position of various portions relative to others may also be determined with any one or more appropriate position determination systems. As discussed herein, the position of one or more portions are used to assist in determining an appropriate setup (e.g., initial) of the imaging system.
- the controller 310 may also include a patient data module 330.
- the patient data module 330 may be used to calculate or determine various patient data based upon an image from the imaging system 36. In this example, a two-dimensional image may be used to determine various patient data.
- the patient data module 330 includes a patient size module 330A.
- the patient size module 330A may include and/or be used to determine the size and/or geometry of the patient as a whole.
- the patient size module 330A may be used to determine a width of the patient from side to side, such as a shoulder or abdomen width.
- the patient size module 330A may further determine an anterior-to-posterior (AP) thickness of the patient.
- the patient size module 330A measures the sizes of the patient from the image acquired by the imaging system.
- a body structure recognition module 330B may also be incorporated into the patient data module 330.
- the body structure recognition module 330B may recognize various structures within the body that are within the image from the imaging system 36. Examples of different types of body structures will be provided below. For example, the vertebrae, and the orientation and order of the vertebrae, may allow various data regarding the position of the patient to be determined. That is, a patient orientation module 330C uses the body structure recognized by the body structure recognition module 330B to determine the patient orientation. Examples of patient orientation are prone or supine. The patient orientation module 330C also recognizes head first or feet first relative to an O-arm.
- An existing device or implanted device module 330D may provide a location of an existing device or devices within the body.
- artificial knees, hips, shoulders, spinal implants and pacemakers are some of the types of existing devices that may be identified and located relative to the patient.
- the implanted device module 330D may provide a coordinate for an existing device without identifying the device.
- the type of device may also be recognized.
- Types of recognition may include neural networks and machine learning that form a trained classifier for determining the types of existing devices within a body. Other types of recognition, including using atlas data, segmentation, or tables and databases of the shapes and geometries of implantable devices, may be used.
- the controller 310 may also include an imaging system module 340.
- the imaging system module 340 provides data for the settings of the particular imager and the x-ray tube therein. That is, different imagers require different types of settings, and therefore the exact types of data may vary.
- a voltage module 340A may determine the amount of voltage required by the imaging system. Data from the patient data module 330 may be used in this calculation. The patient's AP thickness and width, and the type of body structure to be imaged (tissue, hard bone, soft bone), allow the voltage module 340A to determine an amount of voltage to be used at the imaging system.
- the imaging system module 340 may also include a tube current module 340B.
- the tube current module 340B may provide a tube current so an adequate image is obtained.
- the tube current depends upon the size of the patient and the type of body structure to be imaged, and can be calculated accordingly.
- a pulse width module 340C is used to determine the pulse width of the beam generated by the imaging system. Again, various patient data, such as the size of the patient (width and thickness), the body structure that will be altered in the procedure, and any implantable devices, may have an effect on the pulse width.
- a collimation module 340D is used to determine the type of collimation for the imaging system. Again, the collimation module 340D may change the collimation of the imaging system 36 based upon various patient data, including the size, the body structure to be modified, the patient orientation, and any existing devices located within the patient. Collimation changes the shape of the beam used for imaging. Collimation can be used to exclude highly attenuating (e.g., metallic structures) or lightly attenuating (e.g., air) objects so that the technique factors (kVp, pulse width, mA, beam filtration) can be optimized to visualize anatomy.
- the area of interest module 340E is also disposed within the imaging system module.
- the area of interest module 340E determines the area of interest to be scanned based upon the body structure, the patient size and orientation determined at the patient data module 330.
- the area of interest module 340E thus provides the desired position of the detector and the emitter of the imaging system to obtain the desired image of the body structure of interest.
- the body structure may include, but is not limited to, vertebrae, an end plate, corners of an end plate, a full vertebra, a partial vertebra, a skull, a limb, or an organ.
- the vertebrae of a patient are illustrated in Fig. 4A.
- the spinal column body structure includes cervical vertebrae 410, thoracic vertebrae 412, lumbar vertebrae 414, intervertebral disks 416, the sacrum 418, and the coccyx 420.
- the orientation of the patient may be determined based on the known position of the imager.
- the spinous processes 430 extend from each of the vertebrae and allow the orientation in terms of supine and prone positions to be recognized by the controller 310.
- In Fig. 4B, a representation of vertebrae C1-T1 is shown.
- An image 440 of this type would be recognized by the body structure recognition module as a chest category.
- Fig. 4C illustrates the patient in a head left supine position.
- Fig. 4D illustrates a head right supine position.
- the orientations illustrated in Figs. 4C and 4D may be identified by labeling the vertebrae, the order of the vertebrae, and the spinous process as described above, and by the position of the imager when the image is taken.
- in Fig. 4E, a head left position or orientation of the patient 28 is illustrated based upon the position of the spinous process 430 and the order of the labeled vertebrae C1-T1.
- in Fig. 4F, the patient is illustrated in a head right position based upon the order of the vertebrae.
- Figs. 4E and 4F illustrate a supine and a prone position, respectively. That is, in Fig. 4F, the spinous process 430 is in an up position. If the rotor is in a LAT position, the patient is prone relative to the detector. If the rotor is in an AP position, the patient is lateral relative to the detector.
- In Fig. 5A, a patient data user interface that displays patient data on one of the display devices 44, 69 is illustrated. As mentioned above, various types of patient data may be provided and displayed beyond those set forth in Fig. 5A.
- the user interface 508 has a patient width 510, a patient thickness 512, a patient orientation such as prone or supine 514, a head right or head left indication 516, and a location of existing devices 518, such as prior implants.
- the area of interest 520 may also be a user interface selection.
- an image data user interface 528 is a three-dimensional image user interface that includes data and/or settings used for taking three-dimensional images.
- a two-dimensional image may be obtained, the patient data determined therefrom, and ultimately the image data settings provided based upon the patient data.
- the imaging system power 530, the tube current 532, the pulse width 534, the collimation 536 and the area of interest 538 are determined automatically based on the above.
- the area of interest includes an area of the body that will be imaged. The area of interest allows the detector and the emitter of the imaging system to be aligned properly.
- In Fig. 6, a method for operating the imaging system 36 is set forth.
- a two-dimensional image is obtained from the imaging system in block 610.
- in block 612, the imaging system position is also obtained from the imaging system. That is, the position of the detector and/or the emitter is provided to the controller.
- Block 614 obtains patient data from the two-dimensional image of block 610.
- Block 614 is broken down into a plurality of sub blocks.
- the patient size is obtained.
- the patient size may include the width of the patient and the AP thickness of the patient.
- the patient size may be determined by evaluating measurements determined from the image data in block 610.
- a body structure of the patient, and its position, may be determined.
- the position of the imaging system relative to a particular one or more body structures of the patient may be determined, such as by evaluating the image data from block 610 and the determined position of the imaging system from block 612. Examples are illustrated in Fig. 4A.
- the orientation of the body may be obtained in block 614C. That is, the position of the skull may be identified in the image from block 610; if the position of the skull is known, or the position and order of the vertebrae are known, the patient orientation may be determined.
- if the spinous process is identified, the prone or supine position of the patient may be determined.
- a location of existing devices may be obtained.
- the location of existing devices including, but not limited to, implants, plates, replacement joints and pacemakers may be determined in the image data from block 610.
- because the image obtained in block 610 may be used to identify various features and portions therein, and the position of the imaging system is determined in block 612, the relative position of the imaging system to the patient may be determined, and the patient data in block 614 may be determined.
- the patient data in block 614 is used to populate the patient data user interface. That is, the various patient data determined using the initial or test images in block 610 may be used to populate various fields of the patient data user interface as noted above. The user may then review the completed fields for various purposes, such as verification thereof.
- the patient data may also be used to determine the various data of the imaging system in block 616 to be used to take a next (e.g., second) or diagnostic image at the controller, such as a three-dimensional image and/or a plurality of 2D images that may be reconstructed into a 3D image.
- Block 616 has various sub blocks that determine the settings for the imaging system. In the various sub blocks parameters of the imaging system may be determined or recalled based on the patient data from block 614.
- an imaging system voltage may be determined. That is, the amount of voltage based upon the patient data may be determined and obtained. For example, larger patients may require more voltage to take an adequate image with the proper contrast. The voltage amount may be recalled and/or determined based on a selected determined patient size.
- an imaging system tube current may be determined by the controller.
- the tube current may use the patient size data and the orientation in a similar manner to that described above relative to the system power.
- the power of the system uses both the voltage and the current.
- the pulse width of the beam may be determined. Depending upon the contrast desired, the imaging system pulse width may vary.
- an imaging system collimation may be provided, particularly to collimate a beam, which may include altering a beam, such as a beam of x-rays.
- Collimation of the beam can be used to exclude highly attenuating (e.g., metallic structures) or lightly attenuating (e.g., air) objects so that the technique factors (kVp, pulse width, mA, beam filtration) can be optimized to visualize anatomy.
- the collimation may be done, such as with a collimator, to shape and/or direct the beam.
- the beam may be directed to not engage or not pass through highly attenuating items or lightly attenuating items to optimize or enhance image data acquired through the selected subject, also referred to as field of interest.
- an imaging system region of interest may be identified.
- the region of interest may be identified from the patient size data and the body structure described above.
- the imaging system region of interest allows the detector and emitter of the imaging system to be oriented in a selected position (e.g., a direction for emitting x-rays) relative to the patient to acquire image data for the selected purpose, for example, image data to reconstruct a three-dimensional image and/or to acquire three-dimensional image data.
- the imaging system 36, or only portions thereof, may be moved as discussed above.
- the patient data and the imaging system data are communicated to a user interface.
- As illustrated in Figs. 5A and 5B, two different user interfaces may be generated. However, the user interfaces may also be combined or presented in a list that may be scrolled through in order to see the various data.
- the user interface may also include a menu and sub menu display type and/or a graphical display that may include graphics of the imaging system 36 and/or the patient 28.
- the patient data and the imaging system data may be used by the imaging system to acquire a selected or required subsequent image data.
- the patient data and the imaging system data, therefore, may be determined fully automatically and/or with little manual input as an input for setting up the imaging system to acquire subsequent image data (e.g., diagnostic image data) as discussed herein.
- the patient data and the imaging system data may be displayed in a user interface for obtaining a subsequent image data acquisition, which may be a three-dimensional image.
- examples of the three-dimensional image user interfaces are set forth in Figs. 5A and 5B.
- the user interface may be displayed on a screen display of the imager.
- a diagnostic image scan, which may be a three-dimensional scan, is performed with the data as included in the user interface.
- the imaging system data, as well as the patient data may be used to obtain the three-dimensional image in block 622.
- the imaging system data and/or the patient data allows the imaging system 36 to be set up and operated to generate the selected image data of the patient 28. This may include the area of interest, at a selected quality and/or contrast, for analysis and/or diagnosis.
- the imaging system data and the patient data may be substantially automatically obtained without manual intervention. This may increase consistency and/or reduce operating room time and/or radiation exposure.
- the described techniques may be implemented in hardware, software, firmware, or any combination thereof.
- Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
- the instructions may be executed by one or more processors or processor modules, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
- the term "processor" may refer to any of the foregoing structures or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Radiology & Medical Imaging (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- High Energy & Nuclear Physics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Human Computer Interaction (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202480029932.0A CN121079040A (en) | 2023-04-12 | 2024-04-09 | System and method for automatically detecting orientation and anatomy in an imaging system |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363458694P | 2023-04-12 | 2023-04-12 | |
| US202363458697P | 2023-04-12 | 2023-04-12 | |
| US63/458,697 | 2023-04-12 | ||
| US63/458,694 | 2023-04-12 | ||
| US18/608,449 US20240341707A1 (en) | 2023-04-12 | 2024-03-18 | System and method for automatically detecting orientation and anatomy in an imaging system |
| US18/608,449 | 2024-03-18 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024214003A1 (en) | 2024-10-17 |
Family
ID=90825571
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/053463 Pending WO2024214003A1 (en) | 2023-04-12 | 2024-04-09 | System and method for automatically detecting orientation and anatomy in an imaging system |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN121079040A (en) |
| WO (1) | WO2024214003A1 (en) |
2024
- 2024-04-09: CN application CN202480029932.0A published as CN121079040A, pending
- 2024-04-09: PCT application PCT/IB2024/053463 published as WO2024214003A1, pending
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020054662A1 (en) * | 2000-10-02 | 2002-05-09 | Verdonck Bert Leo Alfons | Method and X-ray apparatus for optimally imaging anatomical parts of the human anatomy |
| US20100290690A1 (en) | 2009-05-13 | 2010-11-18 | Medtronic Navigation, Inc. | System And Method For Automatic Registration Between An Image And A Subject |
| US8238631B2 (en) | 2009-05-13 | 2012-08-07 | Medtronic Navigation, Inc. | System and method for automatic registration between an image and a subject |
| US20120099772A1 (en) | 2010-10-20 | 2012-04-26 | Medtronic Navigation, Inc. | Gated Image Acquisition and Patient Model Construction |
| US20120250822A1 (en) | 2011-04-01 | 2012-10-04 | Medtronic Navigation, Inc. | X-Ray Imaging System and Method |
| WO2018018087A1 (en) * | 2016-07-27 | 2018-02-01 | Charles Sturt University | A method and system for automating radiation dose parameters |
| JP2021137259A (en) * | 2020-03-04 | 2021-09-16 | キヤノンメディカルシステムズ株式会社 | Medical diagnostic system, medical diagnostic apparatus, and medical information processing apparatus |
Also Published As
| Publication number | Publication date |
|---|---|
| CN121079040A (en) | 2025-12-05 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24720583; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024720583; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2024720583; Country of ref document: EP; Effective date: 20251112 |