WO2024194367A1 - Image reconstruction method for a medical system - Google Patents
- Publication number
- WO2024194367A1 (PCT/EP2024/057476)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tracking
- medical
- images
- imaging system
- medical imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
Definitions
- the present invention relates to an image reconstruction method for a medical system, a corresponding computer program product and computer-readable medium, and a (medical) system.
- motion artifacts present a general problem. Therefore, tracking is often employed to detect motion, and the detected motion can be used to reduce motion artifacts. Known approaches require a high degree of periodicity in the motion pattern. A subject's motion might be tracked using a stationary tracking device.
- An example is the Loop-X system, which is freely movable on wheels and has a tiltable gantry.
- the additional degrees of freedom are very challenging in terms of image reconstruction, particularly motion compensation, as the position of the imaging system at the time of capturing each respective image and the motion of a subject during a scan need to be taken into account.
- the present invention accordingly has the object of providing improved image quality with reduced motion artifacts.
- the present invention can be used for medical imaging, particularly allowing for periodic or non-periodic motion of the imaged subject and/or the imaging device, e.g. in connection with a Loop-X imaging system or navigation in medical procedures and/or together with systems for image-guided radiotherapy such as VERO® and ExacTrac®, both products of Brainlab AG.
- the invention provides an image reconstruction method, medical system, computer program product, and computer readable medium according to the independent claims. Preferred embodiments are laid down in the dependent claims.
- the method may comprise: acquiring a plurality of medical images I_1, ..., I_n; by means of the tracking device, for each of the plurality of medical images I_i, acquiring corresponding tracking coordinates in C2 using one or more markers; transforming the medical images based on the tracking coordinates and the calibration between C1 and C2; and reconstructing 3D images based on the transformed images with respect to a reference position, which may for example be associated with a reference medical image.
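The transforming and reconstructing steps can be sketched as follows. This is a minimal, hypothetical illustration (all function names invented) using a 1D, translation-only motion model in which each projection is shifted by the tracked marker displacement relative to the reference position; a real implementation would apply full rigid or deformable 3D transformations derived from the C1/C2 calibration.

```python
def compensate(projections, tracking_positions, reference_index=0):
    """Toy motion compensation: shift each 1D projection by the marker
    displacement (in detector pixels) relative to a reference position.

    `projections` is a list of equal-length pixel-intensity lists;
    `tracking_positions` gives, per projection, the tracked marker
    position along the detector axis, assumed already mapped from C2
    into C1 via the calibration.
    """
    ref = tracking_positions[reference_index]
    out = []
    for proj, pos in zip(projections, tracking_positions):
        shift = int(round(pos - ref))  # displacement w.r.t. the reference
        n = len(proj)
        # circular shift so the tracked feature aligns across projections
        out.append([proj[(i + shift) % n] for i in range(n)])
    return out

# With the marker at pixel 2 in the reference projection and at pixel 1
# in the second, the second projection is shifted back into alignment.
aligned = compensate([[0, 0, 1, 0], [0, 1, 0, 0]], [2, 1])
```

After such alignment, the transformed projections could be fed to a standard reconstruction (e.g. filtered backprojection) with respect to the reference position.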
- the claimed method is highly reliable and efficient in reducing motion artefacts.
- the method is suitable for periodic and non-periodic movement.
- the method of the present disclosure may comprise temporal binning of medical images acquired by the medical imaging device, i.e., assigning the medical images to specific breathing cycle phases and reconstructing multiple 3D images, each corresponding to a respective breathing cycle phase, particularly using intrinsic or extrinsic binning techniques. The respective breathing cycle phases may optionally be specified within specific temporal window sizes, which results in effective pixel-based binning in the spatial domain of the reconstructed 3D images.
- the tracking device may track a marker, and a current breathing phase and a 3D image corresponding to the current breathing phase may be selected based on a correlation of a marker movement and a breathing phase, particularly for use in tool tracking, particularly, wherein the method comprises displaying an overlay of a tool model with the 3D image corresponding to the current breathing phase.
- gated navigation may be provided by executing or providing a navigation function only for a subset of the breathing cycle phases, particularly only in one of the breathing cycle phases.
- 4D CBCT may be acquired using intrinsic (e.g. using an Amsterdam Shroud filter) and/or extrinsic (e.g. based on optical tracking markers using a tracking system mounted to the medical imaging device, e.g. a gantry-mounted tracking device) binning of projections, i.e. assignment of projections to specific breathing cycle phases (e.g., inhalation phase and exhalation phase), to reconstruct multiple CBCT datasets, each corresponding to a specific (and sensed) breathing cycle phase.
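A minimal sketch of extrinsic binning (hypothetical names): each projection index is assigned to a breathing cycle bin based on a 1D surrogate signal, such as the tracked chest marker height. For simplicity this bins by normalised amplitude; binning by cycle phase, as with an Amsterdam Shroud analysis, would proceed analogously.

```python
def bin_projections(surrogate, n_bins=2):
    """Assign each projection index to a breathing-cycle bin based on a
    1D surrogate signal (e.g. tracked chest-marker height in C2).
    Values are normalised to [0, 1] over the observed range and split
    into `n_bins` equal amplitude bins."""
    lo, hi = min(surrogate), max(surrogate)
    span = (hi - lo) or 1.0                       # avoid division by zero
    bins = {b: [] for b in range(n_bins)}
    for idx, s in enumerate(surrogate):
        level = (s - lo) / span                   # normalised amplitude
        b = min(int(level * n_bins), n_bins - 1)  # clamp the maximum value
        bins[b].append(idx)
    return bins

# Two bins: roughly "exhalation" (low marker) and "inhalation" (high marker).
bins = bin_projections([0.0, 1.0, 0.2, 0.9], n_bins=2)
```

Each bin's projection indices would then feed a separate CBCT reconstruction, yielding one dataset per breathing cycle phase.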
- the correlation may be based on the movement of, e.g., one or more markers attached to the patient's chest (patient reference marker(s)).
- the breathing phase is established / derived from the acquisition data, i.e., the X-ray projections and corresponding tracked marker positions.
- This allows deriving, at a later stage, which breathing phase the patient is currently in, for example just from observing the patient reference with the tracking device.
- the tool tracking may comprise that both the tool (e.g. a needle) and the patient reference are tracked in the imaging system’s coordinate system using the (e.g.) gantry-mounted tracking system, and based on this, the tool’s model (e.g. a CAD model) may be displayed on a screen on top of the acquired CBCT dataset (e.g. in a cross-sectional image view). Contrary to known methods, not only spatial information based on the sensed current patient and tool position is used for the display, but additionally the current breathing cycle phase of the patient may be obtained.
- the currently overlaid CBCT image on display may be replaced with the most appropriate CBCT image of the 4D CBCT set based on the breathing cycle phase. That is, for example, when sensing that the patient is currently in the inhalation phase, the CBCT (made of binned projections) of the inhalation phase may be used. As another example, when it is sensed that the patient is currently 30% after inhalation, but 70% before the exhalation phase, an accordingly interpolated CBCT (accordingly mixed by corresponding deformation vector interpolation between the phases) may be displayed as overlay image (one could compare this to a live “movie” being played, whose speed and current replay position is determined by the sensed breathing cycle surrogate).
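The "movie"-like replay can be illustrated as follows (hypothetical names). A pixel-wise linear blend stands in here for the deformation-vector interpolation described above, which warps rather than mixes the images; the blend only conveys how the sensed surrogate selects a position between two phase reconstructions.

```python
def interpolate_phase_images(img_a, img_b, t):
    """Blend two reconstructed phase images pixel-wise. `t` in [0, 1]
    is the sensed fractional position between phase A (t=0) and
    phase B (t=1), e.g. t=0.3 for a patient 30% of the way from the
    inhalation phase to the exhalation phase."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

# Halfway between the two phases, each pixel is the mean of the two images.
mid = interpolate_phase_images([[0.0, 2.0]], [[2.0, 4.0]], 0.5)
```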
- implicit pixel-based binning may be applied to the displayed CBCT.
- all phases within a window of a given size, e.g. all phases from 20% to 40%, may be binned (i.e. averaged) in the displayed CBCT.
- the window size may be user-selectable. Selection of the window size is an effective means to control the trade-off between visible motion artefacts and visible pixel intensity noise.
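The window-based trade-off can be sketched like this (hypothetical names): all phase images whose phase falls inside the user-selected window are averaged, so a wider window reduces pixel intensity noise at the cost of more motion blur.

```python
def window_average(phase_images, phases, lo, hi):
    """Average all phase images (2D lists) whose phase, in percent of
    the breathing cycle, lies within the window [lo, hi]."""
    selected = [img for img, p in zip(phase_images, phases) if lo <= p <= hi]
    n = len(selected)
    rows, cols = len(selected[0]), len(selected[0][0])
    return [[sum(img[r][c] for img in selected) / n for c in range(cols)]
            for r in range(rows)]

# Phases 20% and 40% fall inside the window [20, 40]; 80% is excluded.
avg = window_average([[[0.0]], [[2.0]], [[10.0]]], [20, 40, 80], 20, 40)
```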
- gated navigation may entail that navigation is only available or executed in specific breathing cycles of the patient (e.g. only in the inhalation phase, which could also be denoted as deep inspiration breath hold technique).
- the medical images I_i may be medical 2D projection images.
- the term “medical image” is therefore to be understood to particularly refer to medical 2D projection images.
- the reference position refers to a reference position in C2, that is, a tracking position.
- a medical image at said reference position may, in the present disclosure, be referred to as the corresponding medical image or as the reference medical image.
- a position in C2 may also be referred to as a tracking position or a position in tracking coordinates.
- the terms “tracking device” and “tracking system” are used interchangeably.
- a medical imaging system may, for example, be an X-ray-based imaging system, such as a CT (computed tomography) imaging system.
- the medical imaging system may comprise a CBCT (cone beam CT).
- a tracking device may be a camera, for example, infrared (IR) or in the visible spectrum, configured for tracking markers.
- the tracking device may be detachably attached to the medical imaging system.
- the tracking device may be attached to the medical imaging system in a fixed relative position (including orientation) relative to a portion of the medical imaging system.
- the tracking device may be configured to yield coordinates for the markers in the coordinate system C2 of the tracking device.
- the coordinate system C1 of the medical imaging system may, for example, have a fixed relation to C2, which may be determined in the course of the calibration.
- a transformation between coordinates in C1 and coordinates in C2 can be defined based on the calibration.
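As an illustration, a calibration between C1 and C2 can be represented as a homogeneous 4x4 matrix mapping tracking coordinates into imaging coordinates. The pure-translation matrix below (offsets in mm) is an invented example; a real calibration would also encode rotation.

```python
def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T to a 3D point p."""
    v = [p[0], p[1], p[2], 1.0]
    return tuple(sum(T[i][k] * v[k] for k in range(4)) for i in range(3))

# Hypothetical calibration: C2's origin sits at (100, 0, 50) mm in C1.
T_C1_FROM_C2 = [[1, 0, 0, 100.0],
                [0, 1, 0,   0.0],
                [0, 0, 1,  50.0],
                [0, 0, 0,   1.0]]

# A point tracked at (1, 2, 3) in C2, expressed in imaging coordinates C1.
p_c1 = transform_point(T_C1_FROM_C2, (1.0, 2.0, 3.0))
```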
- the medical imaging system may perform a scan.
- the medical imaging system, the patient couch, or the subject might (inadvertently) change position in an unplanned manner, which has the potential of leading to motion artifacts.
- a (residual) breathing motion of a patient, a sag of a patient couch (which particularly may happen gradually over time), coughing of the patient, or flatulence may lead to motion artifacts.
- the tracking device may record tracking coordinates. From the perspective of the tracking device, the markers have changed position in the coordinate system C2, i.e., changed tracking coordinates.
- the tracking coordinates are representative of the relative movement between the tracking device (and the medical imaging system) and the markers.
- a transformation of the medical images may be carried out, the transformation, for example, being such that the medical images taken can be combined to reconstruct a 3D image with respect to a reference position from the transformed images accounting for any (intentional and/or unintentional) motion.
- the method may comprise determining the reference position and determining the reference position may comprise selecting one or more images among the plurality of medical images and determining the reference position based on the corresponding tracking positions of the selected image(s).
- determining the reference position may comprise selecting an initial medical image I_1 as a reference image and using the corresponding tracking position as the reference position, or selecting the last medical image I_n as a reference image and using the corresponding tracking position as the reference position.
- the first image may be the one used as a reference image and the corresponding tracking position as the reference position. This may be particularly advantageous where a real-time reconstruction is desired, in which case it might be detrimental to delay until further medical images have been captured.
- the last image may be the one used as a reference image and the corresponding tracking position as the reference position.
- determining the reference position may comprise calculating a reference position based on a set of tracking positions corresponding to a set of medical images I_1, ..., I_x, particularly using statistical methods like determining a mean tracking position and using it as the reference position.
- determining the reference position may comprise calculating a reference position based on one or more tracking positions at a specific time or within a specific time frame.
- determining the reference position may comprise calculating a reference position based on a set of tracking positions corresponding to a certain timeframe, e.g. prior to medical imaging, particularly using statistical methods like determining a mean (tracking) position and using it as a reference position.
- the entire series or a sub-set thereof may be selected and used to obtain a reference position, for example taking into account the position of each of the selected medical images, or alternatively, independently of any medical images, a series of tracking positions from a certain time frame may be selected and used to obtain a reference position. For example, where an oscillating motion like breathing occurs during the scan, it may be possible to use the position around which the oscillation takes place as a reference position. Calculation of the mean may be limited to a subset of the possible degrees of freedom, e.g., only with respect to a specific translation direction.
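Determining a reference position from a series of tracking positions, optionally restricted to a subset of degrees of freedom, might look like this (hypothetical names; axes not included in the subset keep the value of the first sample):

```python
def mean_reference(positions, axes=(0, 1, 2)):
    """Per-axis mean of tracked positions, limited to the given axes.
    For an oscillating motion such as breathing, this approximates the
    position around which the oscillation takes place."""
    n = len(positions)
    ref = list(positions[0])
    for ax in axes:
        ref[ax] = sum(p[ax] for p in positions) / n
    return tuple(ref)

# Averaging only along the first translation direction.
ref = mean_reference([(0.0, 0.0, 5.0), (2.0, 4.0, 5.0)], axes=(0,))
```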
- image reconstruction may also be done (for example at a later time) with a different reference position.
- a 3D image reconstructed using the reference position may be (e.g. rigidly) transformed so as to obtain a 3D image from a different reference position, i.e. without requiring a reconstruction with the different reference position.
- reconstructing the 3D images may be performed in real time, in which case the initial imaging system coordinate may be used as the reference position.
- the method of the present disclosure may comprise storing, with the medical images, the respective imaging system coordinates and corresponding tracking coordinates of the one or more markers. This may allow reconstructing the 3D images at a later time and with respect to different reference positions.
- the method of the present disclosure may comprise storing, with the reconstructed 3D images, the respective reference position. Thus, subsequent analysis of the 3D images may be improved.
- the method of the present disclosure may further comprise reconstructing second 3D images from the transformed images using a second reference position that is different from the reference position.
- the second 3D images may be constructed at a later time, particularly, where the respective coordinates are stored with the medical images.
- the method of the present disclosure may comprise reconstructing the 3D images with respect to a reference position determined from a marker attached to a patient in proximity of the imaged region of interest (ROI) such that the tracked markers are a valid surrogate signal for any motion occurring in the ROI during acquiring the medical images (i.e., during imaging).
- the method of the present disclosure may comprise utilizing the tracking device for tracking a tool used in a medical procedure in the imaging coordinate system.
- a tool may also be tracked using the tracking device, e.g. via markers attached to the tool.
- a tool position may be obtained in the tracking device’s coordinate system and optionally, based on the calibration, it may then also be reflected in the coordinate system of the medical imaging device.
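For instance (all names, and the pure-translation calibration, are assumptions of this sketch), a tracked tool tip could be mapped from the tracker's coordinate system C2 into the imaging coordinate system C1 as follows:

```python
def tool_tip_in_c1(marker_pos_c2, tip_offset_c2, calib_translation):
    """The tracker reports the tool marker in C2; adding a known
    (tool-calibrated) tip offset yields the tip in C2, and the C1<-C2
    calibration - modelled here as a pure translation - maps it to C1."""
    tip_c2 = tuple(m + o for m, o in zip(marker_pos_c2, tip_offset_c2))
    return tuple(t + c for t, c in zip(tip_c2, calib_translation))

# Marker at the C2 origin, tip 10 mm along z, C1 offset 100 mm along x.
tip = tool_tip_in_c1((0.0, 0.0, 0.0), (0.0, 0.0, 10.0), (100.0, 0.0, 0.0))
```

A visualization (e.g. for navigation) would then render the tool model at this C1 position on top of the medical images.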
- a visualization of the tool with correct spatial relations e.g. for navigation purposes, may be enabled.
- the tool may be a pointer tool and the tracking of the tool may be used for visualizing the pointer tool with respect to medical images obtained prior to the medical procedure and/or during the medical procedure, particularly for use in pointer planning workflows.
- a pointer tool may be tracked and the medical imaging device may be arranged so as to adjust, based on the calibration between C1 and C2, the current projection geometry with respect to a direction indicated by the pointer tool.
- the method of the present disclosure may comprise using a shared reference marker for acquiring the tracking coordinates and for tracking by means of a tracking camera of a surgical navigation system, the surgical navigation system performing registration of image data obtained by the surgical navigation system during the medical procedure with medical image data obtained in advance of the medical procedure so as to allow for navigation during the medical procedure.
- Surgical navigation systems are often provided separately from medical imaging systems, as they can be used for a wide variety of applications. It is advantageous to allow for making use of the capabilities of these navigation systems together with the capabilities of the tracking device. The method proposed herein allows for doing so in a reliable and efficient manner.
- the shared reference marker may be selected in such a manner as to avoid artifacts in reconstructed images and to provide accurate registration of the medical image data obtained in advance of the medical procedure and/or image data obtained during the procedure.
- the method of the present disclosure may comprise the surgical navigation system providing a time of registration to the medical imaging system, e.g. as a signal, and the medical imaging system using a position associated with the time of registration as a reference position for motion compensation in reconstructing the 3D images or deriving a reference position from multiple tracking positions within a time frame of registration.
- the medical imaging system may be a wheeled system that is autonomously movable and the method may comprise using the tracking device for providing positional awareness of the medical imaging system.
- the medical imaging system may be configured to allow acquiring medical images for saddle trajectories.
- the medical imaging system may be configured such that several degrees of freedom are possible, including one or more of C-arm tilt, gantry tilt, C-arm yaw, gantry yaw, and longitudinal translational movements of the gantry or C-arm.
- a trajectory of the gantry or C-arm combining two or more of these motions of the gantry or C-arm may be referred to as saddle trajectories.
- the method according to the present disclosure allows for addressing such challenges, e.g., allows for efficient motion compensation when scanning using trajectories like the saddle trajectories.
- this allows for determining relative movements accurately, in the same manner as motion compensation for scans that involve no motions other than the source and detector rotation.
- tracking an autonomously movable system is important for precise image reconstruction. Since the method proposed herein is a high-accuracy method for determining a position relative to markers, by using suitably arranged markers the medical imaging system, via the tracking device, may obtain accurate positional awareness even in challenging arrangements and relative to a position that can simply be selected by respective marker placement.
- the medical imaging system may comprise a tiltable gantry or a tiltable C-arm, particularly a gantry configured such that its rotation plane is tiltable or a C-arm configured such that its rotation plane is tiltable.
- the gantry or C-arm may be configured and mounted such that the plane itself may be tilted.
- the gantry may rotate around an axis that is parallel to the longitudinal axis of a patient bed.
- the gantry in addition, may describe a translational movement in a direction parallel to the longitudinal axis.
- the gantry may also tilt around a horizontal axis that is perpendicular to the longitudinal axis, e.g., towards and against the translational movement direction.
- a tilt rotation of the C-arm or gantry may be a rotation around hinges used for mounting the C-arm or gantry, e.g., mounted to a fixed or movable support structure like one or more feet, particularly a support structure movable on wheels.
- the medical imaging system may comprise a gantry or a C-arm, particularly a tiltable gantry or a tiltable C-arm as described above, and the medical imaging system may be configured such that the gantry or the C-arm are movable to describe a yaw rotation.
- a yaw rotation may be achieved by a wheeled support structure to which the C-arm or gantry is mounted, wherein particularly the wheeled support structure may be driven via traction, e.g., using back wheels.
- all wheels might be set to a 45° angle so as to describe a circular yaw rotation.
- the tracking device may be rigidly attached to a component of the medical imaging system that is fixed within the imaging coordinate system or to a component of the imaging system that is movable within the imaging coordinate system.
- the medical imaging system may comprise components that are movable with respect to each other and the tracking device may be rigidly attached to one of said components.
- the imaging coordinate system C1 may be selected such that the tracking device coordinate system C2 remains in a fixed relation.
- the relation between the coordinate systems C1 and C2 may not be a constant fixed relation, in which case, in order to be able to reconstruct the 3D image, the respective relation between the coordinate systems C1 and C2 is tracked, e.g., based on the control of the movement.
- the medical imaging system may comprise a radiation source and a radiation detector that are movable relative to each other, wherein the tracking device is rigidly attached to the medical imaging system so as to have a fixed spatial relation with respect to one of the radiation source and the radiation detector.
- the relation between coordinate systems C1 and C2 is then a function of the current source or detector position (angle), which may be pre-calibrated.
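One way to model such an angle-dependent relation (an assumed parameterisation for illustration, not any product's actual calibration): the tracked point is rotated by the current gantry angle about the rotation axis and offset by a pre-calibrated mount translation.

```python
import math

def c1_from_c2(point_c2, gantry_angle_deg, mount_offset):
    """Map a point from the rotating tracker frame C2 into C1 when the
    tracking device rotates with the source: rotate by the current
    gantry angle about the z (rotation) axis, then add the
    pre-calibrated mount offset."""
    a = math.radians(gantry_angle_deg)
    x, y, z = point_c2
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    return (xr + mount_offset[0], yr + mount_offset[1], z + mount_offset[2])

# At a 90 degree gantry angle, the C2 x axis maps onto the C1 y axis.
p = c1_from_c2((1.0, 0.0, 0.0), 90.0, (0.0, 0.0, 0.0))
```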
- the medical imaging system may comprise a radiation source and a radiation detector that have a fixed relative spatial position, wherein the tracking device is rigidly attached to the medical imaging system so as to have a fixed spatial relation with respect to the radiation source and the radiation detector.
- the relation between coordinate systems C1 and C2 is then a function of either the current source or detector position (angle), which may be pre-calibrated.
- the tracking device may be configured for pose tracking and/or may comprise at least one of: an (e.g. near) infrared tracking system, particularly comprising one or more cameras, a video camera tracking system comprising one or more cameras, an electromagnetic tracking system.
- Said tracking systems may, for example, be monoscopic or stereoscopic tracking systems.
- the one or more markers may comprise markers attached to a patient couch and/or markers attached to the floor and/or markers attached to one or more portions of the patient, in particular a portion of the patient inside a region of interest and/or a portion of the patient outside the region of interest of the medical imaging device.
- the one or more markers may comprise a marker attached to a chest of a patient and/or a spine of a patient and/or a cranium of a patient and/or a marker attached to a limb of a patient.
- since the patient may move during a procedure, having markers attached to parts of the patient relevant for the imaging, e.g. near or in a region of interest, may improve reconstruction accuracy. That is, the perceived position of the imaging system relative to spatially fixed markers may then differ from the perceived position of the imaging system relative to markers in the region of interest. Taking both into account may allow for a proper perspective for reconstruction while also accounting for patient movement. This may be particularly relevant in the context of cooperation with a surgical navigation system, which might be configured to work with markers attached to the patient.
- the one or more markers may additionally comprise markers attached to a treatment device, e.g. a radiation treatment device, and/or markers attached to a tool for use in a medical procedure.
- the one or more markers may be attached in such positions as to allow for at least one of: accounting for a tilt of a gantry or a C-arm, accounting for a sag of a gantry or a C-arm, accounting for a device yaw, motion compensation during a longitudinal scan, accounting for slip of wheels of the medical imaging system on the floor, accounting for an uneven floor beneath the medical imaging system, particularly when the medical imaging system is a wheeled system.
- the tracking camera may, for example, be used for position control of the medical imaging device, like an autonomously movable device (e.g. where very accurate positional awareness of the device is of interest).
- the medical imaging system may, for example, based on the tracking camera, carry out a guided motion using markers that are static or even movable markers that allow for a follow-function.
- the method according to the present disclosure may comprise the medical imaging system repositioning itself automatically to a target position, e.g. a previous imaging position.
- repositioning comprises: acquiring, during the repositioning, images of one or more markers by means of the tracking device and determining current marker positions from said images; using previous marker positions determined from images of markers acquired by the tracking device before moving or being moved from the target position, the images of the markers and/or marker positions of the markers and/or marker IDs of the markers optionally stored with corresponding medical images; and based on the current marker positions and the previous marker positions, determining a trajectory for repositioning the medical imaging system.
- the method of the present disclosure may comprise the tracking device attached to the medical imaging system acquiring images of markers positioned at fixed positions in a room and using marker positions determined from said images for determining the medical imaging system’s position.
- the method may comprise a fixed-position tracking device, e.g. attached to a wall or a ceiling or a floor of a room or to equipment in the room having a known position in the room, being used to track the position of the medical imaging system in the room by acquiring images of one or more markers attached to the medical imaging system with the fixed-position tracking device and determining the position of the medical imaging system based on marker positions in the acquired images.
- the medical imaging system’s position may be determined relative to a room coordinate system and/or relative to a tool having a position calibrated relative to the room coordinate system.
- the medical imaging device, particularly the autonomously movable device (e.g. the Loop-X), may have to be moved away from the imaging position (e.g. after imaging, when clearing operating space for a clinician or surgeon) and back to it several times, e.g. for re-imaging.
- a (e.g. gantry-mounted) tracking system mounted to the medical imaging device may, particularly using the gantry-mounted tracking camera, continuously detect optical markers placed in its field of view (FOV) to obtain marker data, and store the marker data in the corresponding X-ray projections.
- a corrective trajectory for the repositioning of the medical imaging device can then be calculated from the stored optical marker data of the previously acquired image (i.e., acquired before moving away from the imaging position), for example by subtracting the currently sensed optical tracking marker position from the stored optical tracking marker position of the previously acquired image.
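The subtraction described above can be sketched as follows (hypothetical names), averaging the stored-minus-current displacement over all markers visible in both situations to obtain a per-axis corrective offset:

```python
def corrective_offset(stored_markers, current_markers):
    """Per-axis correction obtained by subtracting the currently sensed
    marker positions from the marker positions stored with the
    previously acquired image, averaged over all shared markers."""
    n = len(stored_markers)
    diffs = [tuple(s - c for s, c in zip(sm, cm))
             for sm, cm in zip(stored_markers, current_markers)]
    return tuple(sum(d[ax] for d in diffs) / n for ax in range(3))

# The device has drifted +1 along x, so the correction is -1 along x.
offset = corrective_offset([(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)],
                           [(1.0, 0.0, 0.0), (3.0, 0.0, 0.0)])
```

A trajectory planner could then drive the device along this offset back toward the previous imaging position.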
- the gantry mounted tracking system of the present disclosure allows determining tool positions and/or relative motion of the patient (motion compensation) in the coordinate system of the imaging device.
- since the medical imaging device (e.g. the Loop-X) may be a mobile imaging device, this does not necessarily give information about the tool or patient movement in an outer room coordinate system (i.e., if the medical imaging device is moved).
- Reference markers may be provided statically in the room (i.e. at a known fixed position in the room). Using these markers as reference allows for resolving this issue.
- the imaging device’s position in the room may be determined solely by a tracking system (for example using IR camera arrays) mounted statically to the room’s ceiling, walls, or floor, or to a (non-)movable piece of equipment located in the room at a known position (referred to as a room tracking system), for example in combination with markers attached to the medical imaging device, e.g. to the medical imaging device’s gantry, the markers being captured by the room tracking system.
- the imaging device’s position in the room may be determined by combining the information gained by the tracking system mounted to the medical imaging device (e.g. the gantry-mounted tracking system) with one or more statically room-mounted tracking systems.
- a dynamic marker can be placed in the room, which is captured by both the tracking system mounted to the medical imaging device (e.g. the gantry-mounted tracking system) and the room tracking system(s) during imaging. These two pieces of information allow exact localization of the medical imaging device in the room, and thereby also an exact relation to other components and devices with known positions in the room, e.g. a fixed particle therapy beam.
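Under a deliberately simplified, translation-only model (an assumption of this sketch; a real system must also resolve orientation), the shared dynamic marker lets the device be localised in the room by differencing the two observations:

```python
def device_position_in_room(marker_in_room, marker_in_device):
    """The room tracking system reports the shared marker in room
    coordinates; the gantry-mounted tracker reports the same marker in
    device coordinates. With aligned orientations, the device origin in
    the room is the difference of the two observations."""
    return tuple(r - d for r, d in zip(marker_in_room, marker_in_device))

# Marker seen at (5, 4, 0) in the room and (1, 1, 0) relative to the device.
pos = device_position_in_room((5.0, 4.0, 0.0), (1.0, 1.0, 0.0))
```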
- An advantage of a dynamic marker is that this marker’s position can be situationally adjusted, as needed, thereby allowing different challenges to be overcome.
- Using a room tracking system allows addressing challenges such as reliably capturing the medical imaging device’s position in the room throughout the entire movement range of the device (e.g. yaw, translations), which is generally very large.
- the present disclosure also provides a medical system, computer program product, and computer readable medium according to the independent claims.
- the present disclosure also provides a medical system comprising a medical imaging system and a tracking device attached, particularly rigidly attached, to the medical imaging system, the system configured to carry out the method of any of the preceding claims.
- the present disclosure also provides a medical system comprising a medical imaging system and a tracking device, particularly attached to the medical imaging system, e.g. rigidly or in a movable manner, the system configured to carry out the method of any of the preceding claims.
- the medical system may further comprise a medical imaging system configured to perform the scan and a tracking device configured to acquire the corresponding tracking coordinates.
- the medical imaging system may comprise an X-ray based system, e.g., a CT scanner.
- the medical imaging system may be a cone beam CT, CBCT.
- the medical imaging system may be an autonomously movable system.
- the medical imaging system may have a tiltable gantry or C-arm.
- the tracking device may comprise a camera, e.g., a monoscopic or stereoscopic camera. It may comprise an infrared camera and/or a camera using the visible spectrum or other EM spectra.
- the medical system may further comprise tracking markers for use by the tracking device, the tracking markers attached to one or more of a patient bed, one or more body parts of the patient, the floor, one or more tools for use in a medical procedure, one or more components of a treatment machine, e.g., a radiation treatment machine or histotripsy device. Potential configurations of markers will be described below in the terminology section.
- the medical system may further comprise a navigation system for medical procedures, the navigation system using one or more navigation markers, wherein at least a subset of the navigation markers correspond to at least a subset of the tracking markers, particularly to a subset of the tracking markers attached to the patient.
- the tracking device may comprise one or more cameras enclosed by a housing, particularly completely enclosed by a housing, wherein the housing is configured to seal off non-marker-related reflections from an image sensor of the one or more cameras, particularly reflections from integrated illumination sources.
- the present disclosure also provides a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to control and/or carry out any of the steps of the method according to the present disclosure, particularly of the method claims.
- the present disclosure also provides a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to control and/or carry out any of the steps of the method according to the present disclosure, particularly of the method claims.
- the present disclosure also relates to the use of the method and/or system of the present disclosure for medical imaging and/or performing a navigation, a positioning and/or alignment of a subject and various objects, e.g., pieces of equipment or tools.
- the invention does not involve or in particular comprise or encompass an invasive step which would represent a substantial physical interference with the body requiring professional medical expertise to be carried out and entailing a substantial health risk even when carried out with the required professional care and expertise.
- the invention does not comprise a step of positioning a medical implant in order to fasten it to an anatomical structure or a step of fastening the medical implant to the anatomical structure or a step of preparing the anatomical structure for having the medical implant fastened to it.
- the invention does not involve or in particular comprise or encompass any surgical or therapeutic activity.
- the invention is instead directed as applicable to medical imaging, navigation, positioning and/or alignment. For this reason alone, no surgical or therapeutic activity and in particular no surgical or therapeutic step is necessitated or implied by carrying out the invention.
- the method in accordance with the invention is for example a computer implemented method.
- all the steps or merely some of the steps (i.e. less than the total number of steps) of the method in accordance with the invention can be executed by a computer (for example, at least one computer).
- An embodiment of the computer implemented method is a use of the computer for performing a data processing method.
- An embodiment of the computer implemented method is a method concerning the operation of the computer such that the computer is operated to perform one, more or all steps of the method.
- the computer for example comprises at least one processor and for example at least one memory in order to (technically) process the data, for example electronically and/or optically.
- the processor being for example made of a substance or composition which is a semiconductor, for example at least partly n- and/or p-doped semiconductor, for example at least one of II-, III-, IV-, V-, VI-semiconductor material, for example (doped) silicon and/or gallium arsenide.
- the calculating or determining steps described are for example performed by a computer. Determining steps or calculating steps are for example steps of determining data within the framework of the technical method, for example within the framework of a program.
- a computer is for example any kind of data processing device, for example electronic data processing device.
- a computer can be a device which is generally thought of as such, for example desktop PCs, notebooks, netbooks, etc., but can also be any programmable apparatus, such as for example a mobile phone or an embedded processor.
- a computer can for example comprise a system (network) of "sub-computers", wherein each sub-computer represents a computer in its own right.
- the term "computer” includes a cloud computer, for example a cloud server.
- the term "cloud computer” includes a cloud computer system which for example comprises a system of at least one cloud computer and for example a plurality of operatively interconnected cloud computers such as a server farm.
- Such a cloud computer is preferably connected to a wide area network such as the world wide web (WWW) and located in a so-called cloud of computers which are all connected to the world wide web.
- Such an infrastructure is used for "cloud computing", which describes computation, software, data access and storage services which do not require the end user to know the physical location and/or configuration of the computer delivering a specific service.
- the term "cloud” is used in this respect as a metaphor for the Internet (world wide web).
- the cloud provides computing infrastructure as a service (IaaS).
- the cloud computer can function as a virtual host for an operating system and/or data processing application which is used to execute the method of the invention.
- the cloud computer is for example an elastic compute cloud (EC2) as provided by Amazon Web Services™.
- a computer for example comprises interfaces in order to receive or output data and/or perform an analogue-to-digital conversion.
- the data are for example data which represent physical properties and/or which are generated from technical signals.
- the technical signals are for example generated by means of (technical) detection devices (such as for example devices for detecting marker devices) and/or (technical) analytical devices (such as for example devices for performing (medical) imaging methods), wherein the technical signals are for example electrical or optical signals.
- the technical signals for example represent the data received or outputted by the computer.
- the computer is preferably operatively coupled to a display device which allows information outputted by the computer to be displayed, for example to a user.
- a display device is a virtual reality device or an augmented reality device (also referred to as virtual reality glasses or augmented reality glasses) which can be used as "goggles" for navigating.
- An example of augmented reality glasses is Google Glass (a trademark of Google, Inc.).
- An augmented reality device or a virtual reality device can be used both to input information into the computer by user interaction and to display information outputted by the computer.
- Another example of a display device would be a standard computer monitor comprising for example a liquid crystal display operatively coupled to the computer for receiving display control data from the computer for generating signals used to display image information content on the display device.
- a specific embodiment of such a computer monitor is a digital lightbox.
- An example of such a digital lightbox is Buzz®, a product of Brainlab AG.
- the monitor may also be the monitor of a portable, for example handheld, device such as a smart phone or personal digital assistant or digital media player.
- the invention also relates to a program which, when running on a computer, causes the computer to perform one or more or all of the method steps described herein and/or to a program storage medium on which the program is stored (in particular in a non-transitory form) and/or to a computer comprising said program storage medium and/or to a (physical, for example electrical, for example technically generated) signal wave, for example a digital signal wave, carrying information which represents the program, for example the aforementioned program, which for example comprises code means which are adapted to perform any or all of the method steps described herein.
- computer program elements can be embodied by hardware and/or software (this includes firmware, resident software, micro-code, etc.).
- computer program elements can take the form of a computer program product which can be embodied by a computer-usable, for example computer-readable data storage medium comprising computer-usable, for example computer-readable program instructions, "code" or a "computer program" embodied in said data storage medium for use on or in connection with the instruction-executing system.
- Such a system can be a computer; a computer can be a data processing device comprising means for executing the computer program elements and/or the program in accordance with the invention, for example a data processing device comprising a digital processor (central processing unit or CPU) which executes the computer program elements, and optionally a volatile memory (for example a random access memory or RAM) for storing data used for and/or produced by executing the computer program elements.
- a computer-usable, for example computer-readable data storage medium can be any data storage medium which can include, store, communicate, propagate or transport the program for use on or in connection with the instruction-executing system, apparatus or device.
- the computer-usable, for example computer-readable data storage medium can for example be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device or a medium of propagation such as for example the Internet.
- the computer-usable or computer-readable data storage medium could even for example be paper or another suitable medium onto which the program is printed, since the program could be electronically captured, for example by optically scanning the paper or other suitable medium, and then compiled, interpreted or otherwise processed in a suitable manner.
- the data storage medium is preferably a non-volatile data storage medium.
- the computer program product and any software and/or hardware described here form the various means for performing the functions of the invention in the example embodiments.
- the computer and/or data processing device can for example include a guidance information device which includes means for outputting guidance information.
- the guidance information can be outputted, for example to a user, visually by a visual indicating means (for example, a monitor and/or a lamp) and/or acoustically by an acoustic indicating means (for example, a loudspeaker and/or a digital speech output device) and/or tactilely by a tactile indicating means (for example, a vibrating element or a vibration element incorporated into an instrument).
- a computer is a technical computer which for example comprises technical, for example tangible components, for example mechanical and/or electronic components. Any device mentioned as such in this document is a technical and for example tangible device.
Acquiring data
- acquiring data for example encompasses (within the framework of a computer implemented method) the scenario in which the data are determined by the computer implemented method or program.
- Determining data for example encompasses measuring physical quantities and transforming the measured values into data, for example digital data, and/or computing (and e.g. outputting) the data by means of a computer and for example within the framework of the method in accordance with the invention.
- the meaning of "acquiring data” also for example encompasses the scenario in which the data are received or retrieved by (e.g. input to) the computer implemented method or program, for example from another program, a previous method step or a data storage medium, for example for further processing by the computer implemented method or program.
- the expression “acquiring data” can therefore also for example mean waiting to receive data and/or receiving the data.
- the received data can for example be inputted via an interface.
- the expression "acquiring data” can also mean that the computer implemented method or program performs steps in order to (actively) receive or retrieve the data from a data source, for instance a data storage medium (such as for example a ROM, RAM, database, hard drive, etc.), or via the interface (for instance, from another computer or a network).
- the data acquired by the disclosed method or device, respectively, may be acquired from a database located in a data storage device which is operably coupled to a computer for data transfer between the database and the computer, for example from the database to the computer.
- the computer acquires the data for use as an input for steps of determining data.
- the determined data can be output again to the same or another database to be stored for later use.
- the database or databases used for implementing the disclosed method can be located on a network data storage device or a network server (for example, a cloud data storage device or a cloud server) or a local data storage device (such as a mass storage device operably connected to at least one computer executing the disclosed method).
- the data can be made "ready for use” by performing an additional step before the acquiring step. In accordance with this additional step, the data are generated in order to be acquired.
- the data are for example detected or captured (for example by an analytical device). Alternatively or additionally, the data are inputted in accordance with the additional step, for instance via interfaces.
- the data generated can for example be inputted (for instance into the computer).
- the data can also be provided by performing the additional step of storing the data in a data storage medium (such as for example a ROM, RAM, CD and/or hard drive), such that they are ready for use within the framework of the method or program in accordance with the invention.
- the step of "acquiring data" can therefore also involve commanding a device to obtain and/or provide the data to be acquired.
- the acquiring step does not involve an invasive step which would represent a substantial physical interference with the body, requiring professional medical expertise to be carried out and entailing a substantial health risk even when carried out with the required professional care and expertise.
- the step of acquiring data does not involve a surgical step and in particular does not involve a step of treating a human or animal body using surgery or therapy.
- the data are denoted (i.e. referred to) as "XY data” and the like and are defined in terms of the information which they describe, which is then preferably referred to as "XY information" and the like.
- the n-dimensional image of a body is registered when the spatial location of each point of an actual object within a space, for example a body part in an operating theatre, is assigned an image data point of an image (CT, MR, etc.) stored in a navigation system.
- Image registration is the process of transforming different sets of data into one coordinate system.
- the data can be multiple photographs and/or data from different sensors, different times or different viewpoints. It is used in computer vision, medical imaging and in compiling and analysing images and data from satellites. Registration is necessary in order to be able to compare or integrate the data obtained from these different measurements.
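As a concrete illustration of bringing two measurements into one coordinate system, the following sketch estimates the rigid transform between two corresponding point sets using the Kabsch algorithm. It assumes point correspondences are already known, which real registration pipelines must first establish; it is an illustrative sketch, not the registration method of the disclosure:

```python
import numpy as np

def rigid_register(src, dst):
    """Estimate the rigid transform (R, t) mapping point set `src` onto
    `dst` in a least-squares sense (Kabsch algorithm), so that
    dst ≈ R @ src + t for each corresponding point pair."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Guard against an improper rotation (reflection)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

With noise-free, exactly corresponding points the true rotation and translation are recovered exactly.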
- a marker can be detected by a marker detection device (for example, a camera or an ultrasound receiver, or analytical devices such as CT or MRI devices) in such a way that its spatial position can be ascertained.
- the detection device is for example part of a navigation system.
- the markers can be active markers.
- An active marker can for example emit electromagnetic radiation and/or waves which can be in the infrared, visible and/or ultraviolet spectral range.
- a marker can also however be passive, i.e. can for example reflect electromagnetic radiation in the infrared, visible and/or ultraviolet spectral range or can block x-ray radiation.
- the marker can be provided with a surface which has corresponding reflective properties or can be made of metal in order to block the x-ray radiation. It is also possible for a marker to reflect and/or emit electromagnetic radiation and/or waves in the radio frequency range or at ultrasound wavelengths.
- a marker preferably has a spherical and/or spheroid shape and can therefore be referred to as a marker sphere; markers can however also exhibit a cornered, for example cubic, shape.
- a marker device can for example be a reference star or a pointer or a single marker or a plurality of (individual) markers which are then preferably in a predetermined spatial relationship.
- a marker device comprises one, two, three or more markers, wherein two or more such markers are in a predetermined spatial relationship. This predetermined spatial relationship is for example known to a navigation system and is for example stored in a computer of the navigation system.
- a marker device comprises an optical pattern, for example on a two-dimensional surface.
- the optical pattern might comprise a plurality of geometric shapes like circles, rectangles and/or triangles.
- the optical pattern can be identified in an image captured by a camera, and the position of the marker device relative to the camera can be determined from the size of the pattern in the image, the orientation of the pattern in the image and the distortion of the pattern in the image. This allows determining the relative position in up to three rotational dimensions and up to three translational dimensions from a single two-dimensional image.
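The size-to-distance relationship described above follows the pinhole camera model. The sketch below recovers depth and lateral offset from the pattern's apparent size and image position; it assumes an intrinsically calibrated camera (focal length in pixels, known principal point) and a fronto-parallel pattern, and omits the orientation recovery from pattern distortion:

```python
def marker_distance(pattern_width_mm, pixel_width, focal_px):
    """Pinhole-camera depth estimate from the marker's apparent size:
    Z = f * W / w, with focal length f in pixels, physical pattern
    width W and imaged width w in pixels."""
    return focal_px * pattern_width_mm / pixel_width

def marker_offset(u, v, cx, cy, focal_px, depth_mm):
    """Lateral (X, Y) offset of the pattern centre (u, v) from the
    optical axis (cx, cy), back-projected to depth Z."""
    return ((u - cx) * depth_mm / focal_px,
            (v - cy) * depth_mm / focal_px)
```

For example, a 100 mm wide pattern imaged 100 px wide by an 800 px focal-length camera lies roughly 800 mm from the camera.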
- the position of a marker device can be ascertained, for example by a medical navigation system.
- the position of the object can be determined from the position of the marker device and the relative position between the marker device and the object. Determining this relative position is also referred to as registering the marker device and the object.
- the marker device or the object can be tracked, which means that the position of the marker device or the object is ascertained twice or more over time.
- a marker holder is understood to mean an attaching device for an individual marker which serves to attach the marker to an instrument, a part of the body and/or a holding element of a reference star, wherein it can be attached such that it is stationary and advantageously such that it can be detached.
- a marker holder can for example be rod-shaped and/or cylindrical.
- a fastening device (such as for instance a latching mechanism) for the marker device can be provided at the end of the marker holder facing the marker and assists in placing the marker device on the marker holder in a force fit and/or positive fit.
- a pointer is a rod which comprises one or more - advantageously, two - markers fastened to it and which can be used to measure off individual co-ordinates, for example spatial co-ordinates (i.e. three-dimensional co-ordinates), on a part of the body, wherein a user guides the pointer (for example, a part of the pointer which has a defined and advantageously fixed position with respect to the at least one marker attached to the pointer) to the position corresponding to the co-ordinates, such that the position of the pointer can be determined by using a surgical navigation system to detect the marker on the pointer.
- the relative location between the markers of the pointer and the part of the pointer used to measure off co-ordinates is for example known.
- the surgical navigation system then enables the location (of the three-dimensional co-ordinates) to be assigned to a predetermined body structure, wherein the assignment can be made automatically or by user intervention.
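The relationship between the pointer's markers and its measuring tip can be sketched as follows, assuming two markers on a rigid rod and a device-specific, pre-calibrated tip offset (the names and the offset convention here are illustrative, not from the disclosure):

```python
import numpy as np

def pointer_tip(marker_a, marker_b, tip_offset_mm):
    """Given the 3D positions of two markers on a rigid pointer rod,
    the tip lies on the line through them, `tip_offset_mm` beyond
    marker_b (the marker closer to the tip). The offset is a fixed,
    device-specific calibration constant."""
    a = np.asarray(marker_a, float)
    b = np.asarray(marker_b, float)
    direction = (b - a) / np.linalg.norm(b - a)
    return b + tip_offset_mm * direction
```

Tracking both markers each frame thus yields the measured-off coordinates at the tip without a sensor at the tip itself.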
- a “reference star” refers to a device with a number of markers, advantageously three markers, attached to it, wherein the markers are (for example detachably) attached to the reference star such that they are stationary, thus providing a known (and advantageously fixed) position of the markers relative to each other.
- the position of the markers relative to each other can be individually different for each reference star used within the framework of a surgical navigation method, in order to enable a surgical navigation system to identify the corresponding reference star on the basis of the position of its markers relative to each other. It is therefore also then possible for the objects (for example, instruments and/or parts of a body) to which the reference star is attached to be identified and/or differentiated accordingly.
- the reference star serves to attach a plurality of markers to an object (for example, a bone or a medical instrument) in order to be able to detect the position of the object (i.e. its spatial location and/or alignment).
- Such a reference star for example features a way of being attached to the object (for example, a clamp and/or a thread) and/or a holding element which ensures a distance between the markers and the object (for example in order to assist the visibility of the markers to a marker detection device) and/or marker holders which are mechanically connected to the holding element and which the markers can be attached to.
- the present disclosure may be applied in the context of a navigation system for computer- assisted surgery.
- This navigation system preferably comprises the aforementioned computer for processing the data provided in accordance with the computer implemented method as described in any one of the embodiments described herein.
- the navigation system preferably comprises a detection device for detecting the position of detection points which represent the main points and auxiliary points, in order to generate detection signals and to supply the generated detection signals to the computer, such that the computer can determine the absolute main point data and absolute auxiliary point data on the basis of the detection signals received.
- a detection point is for example a point on the surface of the anatomical structure which is detected, for example by a pointer. In this way, the absolute point data can be provided to the computer.
- the navigation system also preferably comprises a user interface for receiving the calculation results from the computer (for example, the position of the main plane, the position of the auxiliary plane and/or the position of the standard plane).
- the user interface provides the received data to the user as information.
- Examples of a user interface include a display device such as a monitor, or a loudspeaker.
- the user interface can use any kind of indication signal (for example a visual signal, an audio signal and/or a vibration signal).
- a display device is an augmented reality device (also referred to as augmented reality glasses) which can be used as so-called "goggles" for navigating.
- An example of augmented reality glasses is Google Glass (a trademark of Google, Inc.).
- An augmented reality device can be used both to input information into the computer of the navigation system by user interaction and to display information outputted by the computer.
- the invention also relates to a navigation system for computer-assisted surgery, comprising: a computer for processing the absolute point data and the relative point data; a detection device for detecting the position of the main and auxiliary points in order to generate the absolute point data and to supply the absolute point data to the computer; a data interface for receiving the relative point data and for supplying the relative point data to the computer; and a user interface for receiving data from the computer in order to provide information to the user, wherein the received data are generated by the computer on the basis of the results of the processing performed by the computer.
- a navigation system such as a surgical navigation system, is understood to mean a system which can comprise: at least one marker device; a transmitter which emits electromagnetic waves and/or radiation and/or ultrasound waves; a receiver which receives electromagnetic waves and/or radiation and/or ultrasound waves; and an electronic data processing device which is connected to the receiver and/or the transmitter, wherein the data processing device (for example, a computer) for example comprises a processor (CPU) and a working memory and advantageously an indicating device for issuing an indication signal (for example, a visual indicating device such as a monitor and/or an audio indicating device such as a loudspeaker and/or a tactile indicating device such as a vibrator) and a permanent data memory, wherein the data processing device processes navigation data forwarded to it by the receiver and can advantageously output guidance information to a user via the indicating device.
- the navigation data can be stored in the permanent data memory and for example compared with data stored in said memory beforehand.
- a landmark is a defined element of an anatomical body part which is always identical or recurs with a high degree of similarity in the same anatomical body part of multiple patients.
- Typical landmarks are for example the epicondyles of a femoral bone or the tips of the transverse processes and/or dorsal process of a vertebra.
- the points (main points or auxiliary points) can represent such landmarks.
- a landmark which lies on (for example on the surface of) a characteristic anatomical structure of the body part can also represent said structure.
- the landmark can represent the anatomical structure as a whole or only a point or part of it.
- a landmark can also for example lie on the anatomical structure, which is for example a prominent structure.
- an example of such an anatomical structure is the posterior aspect of the iliac crest.
- Another example of a landmark is one defined by the rim of the acetabulum, for instance by the centre of said rim.
- a landmark represents the bottom or deepest point of an acetabulum, which is derived from a multitude of detection points.
- one landmark can for example represent a multitude of detection points.
- a landmark can represent an anatomical characteristic which is defined on the basis of a characteristic structure of the body part.
- a landmark can also represent an anatomical characteristic defined by a relative movement of two body parts, such as the rotational centre of the femur when moved relative to the acetabulum.
- Imaging geometry preferably comprises information which allows the analysis image (x-ray image) to be calculated, given a known relative position between the imaging geometry analysis apparatus and the analysis object (anatomical body part) to be analysed by x-ray radiation, if the analysis object which is to be analysed is known, wherein "known” means that the spatial geometry (size and shape) of the analysis object is known.
- "interaction” means for example that the analysis radiation is blocked or partially or completely allowed to pass by the analysis object.
- the location and in particular orientation of the imaging geometry is for example defined by the position of the x-ray device, for example by the position of the x-ray source and the x-ray detector and/or for example by the position of the multiplicity (manifold) of x-ray beams which pass through the analysis object and are detected by the x-ray detector.
- the imaging geometry for example describes the position (i.e. the location and in particular the orientation) and the shape (for example, a conical shape exhibiting a specific angle of inclination) of said multiplicity (manifold).
- the position can for example be represented by the position of an x-ray beam which passes through the centre of said multiplicity or by the position of a geometric object (such as a truncated cone) which represents the multiplicity (manifold) of x-ray beams.
- Information concerning the above-mentioned interaction is preferably known in three dimensions, for example from a three-dimensional CT, and describes the interaction in a spatially resolved way for points and/or regions of the analysis object, for example for all of the points and/or regions of the analysis object.
- Knowledge of the imaging geometry for example allows the location of a source of the radiation (for example, an x-ray source) to be calculated relative to an image plane (for example, the plane of an x-ray detector).
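One elementary consequence of a known cone-beam imaging geometry is the magnification of the projected image. The sketch below derives it from the source-object and source-detector distances by the intercept theorem, together with the cone half-angle subtended by the detector; it is illustrative only and not tied to any particular device:

```python
import math

def cone_beam_magnification(source_object_mm, source_detector_mm):
    """Projection magnification in a cone-beam geometry (intercept
    theorem): m = SDD / SOD."""
    return source_detector_mm / source_object_mm

def projected_size(object_size_mm, source_object_mm, source_detector_mm):
    """Size of an object's shadow on the detector plane."""
    return object_size_mm * cone_beam_magnification(
        source_object_mm, source_detector_mm)

def cone_half_angle_deg(detector_half_width_mm, source_detector_mm):
    """Half-angle of inclination of the x-ray cone subtended by the
    detector, as seen from the source."""
    return math.degrees(math.atan2(detector_half_width_mm,
                                   source_detector_mm))
```

For instance, an object halfway between source and detector appears twice its true size on the detector.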
- Shape representatives represent a characteristic aspect of the shape of an anatomical structure.
- Examples of shape representatives include straight lines, planes and geometric figures.
- Geometric figures can be one-dimensional such as for example axes or circular arcs, two-dimensional such as for example polygons and circles, or three-dimensional such as for example cuboids, cylinders and spheres.
- the relative position between the shape representatives can be described in reference systems, for example by co-ordinates or vectors, or can be described by geometric variables such as for example length, angle, area, volume and proportions.
- the characteristic aspects which are represented by the shape representatives are for example symmetry properties which are represented for example by a plane of symmetry.
- a characteristic aspect is the direction of extension of the anatomical structure, which is for example represented by a longitudinal axis.
- Another example of a characteristic aspect is the cross-sectional shape of an anatomical structure, which is for example represented by an ellipse.
- Another example of a characteristic aspect is the surface shape of a part of the anatomical structure, which is for example represented by a plane or a hemisphere.
- the characteristic aspect constitutes an abstraction of the actual shape or an abstraction of a property of the actual shape (such as for example its symmetry properties or longitudinal extension). The shape representative for example represents this abstraction.
Referencing
- Determining the position is referred to as referencing if it implies informing a navigation system of said position in a reference system of the navigation system.
- Atlas data is acquired which describes (for example defines, more particularly represents and/or is) a general three-dimensional shape of the anatomical body part.
- the atlas data therefore represents an atlas of the anatomical body part.
- An atlas typically consists of a plurality of generic models of objects, wherein the generic models of the objects together form a complex structure.
- the atlas constitutes a statistical model of a patient’s body (for example, a part of the body) which has been generated from anatomic information gathered from a plurality of human bodies, for example from medical image data containing images of such human bodies.
- the atlas data therefore represents the result of a statistical analysis of such medical image data for a plurality of human bodies.
- the atlas data comprises image information (for example, positional image information) which can be matched (for example by applying an elastic or rigid image fusion algorithm) for example to image information (for example, positional image information) contained in medical image data so as to for example compare the atlas data to the medical image data in order to determine the position of anatomical structures in the medical image data which correspond to anatomical structures defined by the atlas data.
- the human bodies whose anatomy serves as an input for generating the atlas data advantageously share a common feature such as at least one of gender, age, ethnicity, body measurements (e.g. size and/or mass) and pathologic state.
- the anatomic information describes for example the anatomy of the human bodies and is extracted for example from medical image information about the human bodies.
- the atlas of a femur for example, can comprise the head, the neck, the body, the greater trochanter, the lesser trochanter and the lower extremity as objects which together make up the complete structure.
- the atlas of a brain can comprise the telencephalon, the cerebellum, the diencephalon, the pons, the mesencephalon and the medulla as the objects which together make up the complex structure.
- One application of such an atlas is in the segmentation of medical images, in which the atlas is matched to medical image data, and the image data are compared with the matched atlas in order to assign a point (a pixel or voxel) of the image data to an object of the matched atlas, thereby segmenting the image data into objects.
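The assignment of image points to matched atlas objects described above can be illustrated with a deliberately simplified sketch. Real atlas matching uses elastic or rigid image fusion as stated in the text; here, purely for illustration, each voxel is assigned to the nearest already-matched atlas object centre. The function name, data layout and centroid-based assignment are illustrative assumptions, not the claimed method.

```python
import numpy as np

# Hypothetical sketch: assign each image voxel to the nearest matched
# atlas object centre, yielding a coarse segmentation into objects.
# Real atlas-based segmentation would rely on elastic/rigid fusion,
# not on centroids; this only illustrates the voxel-to-object mapping.

def segment_by_matched_atlas(voxel_coords, object_centres):
    """voxel_coords: (N, 3) positions; object_centres: dict name -> (3,)
    position in the already-matched (fused) coordinate system."""
    names = list(object_centres)
    centres = np.array([object_centres[n] for n in names])  # (K, 3)
    # Distance of every voxel to every object centre.
    d = np.linalg.norm(voxel_coords[:, None, :] - centres[None, :, :], axis=2)
    return [names[k] for k in d.argmin(axis=1)]

voxels = np.array([[0.0, 0.0, 0.0], [9.0, 1.0, 0.0]])
atlas = {"head": (0.0, 0.0, 1.0), "neck": (10.0, 0.0, 0.0)}
print(segment_by_matched_atlas(voxels, atlas))  # ['head', 'neck']
```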
- the movements of the treatment body parts are for example due to movements which are referred to in the following as "vital movements".
- Reference is also made in this respect to EP 2 189 943 A1 and EP 2 189 940 A1, also published as US 2010/0125195 A1 and US 2010/0160836 A1, respectively, which discuss these vital movements in detail.
- analytical devices such as x-ray devices, CT devices or MRT devices are used to generate analytical images (such as x-ray images or MRT images) of the body.
- analytical devices are constituted to perform medical imaging methods.
- Analytical devices for example use medical imaging methods and are for example devices for analysing a patient's body, for instance by using waves and/or radiation and/or energy beams, for example electromagnetic waves and/or radiation, ultrasound waves and/or particle beams.
- Analytical devices are for example devices which generate images (for example, two-dimensional or three-dimensional images) of the patient's body (and for example of internal structures and/or anatomical parts of the patient's body) by analysing the body.
- Analytical devices are for example used in medical diagnosis, for example in radiology.
- Tracking an indicator body part thus allows a movement of the treatment body part to be tracked on the basis of a known correlation between the changes in the position (for example the movements) of the indicator body part and the changes in the position (for example the movements) of the treatment body part.
- marker devices which can be used as an indicator and are thus referred to as "marker indicators" can be tracked using marker detection devices.
- the position of the marker indicators has a known (predetermined) correlation with (for example, a fixed relative position relative to) the position of indicator structures (such as the thoracic wall, for example true ribs or false ribs, or the diaphragm or intestinal walls, etc.) which for example change their position due to vital movements.
- the present invention may also be employed in the field of controlling a treatment beam.
- the treatment beam treats body parts which are to be treated and which are referred to in the following as "treatment body parts". These body parts are for example parts of a patient's body, i.e. anatomical body parts.
- the present invention relates to the field of medicine and for example to the use of beams, such as radiation beams, to treat parts of a patient's body, which are therefore also referred to as treatment beams.
- Ionising radiation is for example used for the purpose of treatment.
- the treatment beam comprises or consists of ionising radiation.
- the ionising radiation comprises or consists of particles (for example, sub-atomic particles or ions) or electromagnetic waves which are energetic enough to detach electrons from atoms or molecules and so ionise them.
- Examples of ionising radiation include x-rays, high-energy particles (high-energy particle beams) and/or ionising radiation emitted from a radioactive element.
- the treatment radiation for example the treatment beam, is for example used in radiation therapy or radiotherapy, such as in the field of oncology.
- parts of the body comprising a pathological structure or tissue such as a tumour are treated using ionising radiation.
- the tumour is then an example of a treatment body part.
- the treatment beam is preferably controlled such that it passes through the treatment body part.
- the treatment beam can have a negative effect on body parts outside the treatment body part. These body parts are referred to here as "outside body parts".
- a treatment beam has to pass through outside body parts in order to reach and so pass through the treatment body part.
- a treatment body part can be treated by one or more treatment beams issued from one or more directions at one or more times.
- the treatment by means of the at least one treatment beam thus follows a particular spatial and temporal pattern.
- the term "beam arrangement" is then used to cover the spatial and temporal features of the treatment by means of the at least one treatment beam.
- the beam arrangement is an arrangement of at least one treatment beam.
- the "beam positions” describe the positions of the treatment beams of the beam arrangement.
- the arrangement of beam positions is referred to as the positional arrangement.
- a beam position is preferably defined by the beam direction and additional information which allows a specific location, for example in three-dimensional space, to be assigned to the treatment beam, for example information about its co-ordinates in a defined co-ordinate system.
- the specific location is a point, preferably a point on a straight line. This line is then referred to as a "beam line" and extends in the beam direction, for example along the central axis of the treatment beam.
- the defined co-ordinate system is preferably defined relative to the treatment device or relative to at least a part of the patient's body.
- the positional arrangement comprises and for example consists of at least one beam position, for example a discrete set of beam positions (for example, two or more different beam positions), or a continuous multiplicity (manifold) of beam positions.
- one or more treatment beams adopt(s) the treatment beam position(s) defined by the positional arrangement simultaneously or sequentially during treatment (for example sequentially if there is only one beam source to emit a treatment beam). If there are several beam sources, it is also possible for at least a subset of the beam positions to be adopted simultaneously by treatment beams during the treatment.
- one or more subsets of the treatment beams can adopt the beam positions of the positional arrangement in accordance with a predefined sequence.
- a subset of treatment beams comprises one or more treatment beams.
- the complete set of treatment beams which comprises one or more treatment beams which adopt(s) all the beam positions defined by the positional arrangement is then the beam arrangement.
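The notion of a beam position as a point on the beam line plus a direction can be sketched in a few lines. The following is an illustrative model, not the claimed beam control: a discrete positional arrangement is a list of (point, direction) pairs, and a beam is taken to "pass through" a treatment body part if its beam line comes within the part's radius. All names and numeric values are assumptions for illustration.

```python
import numpy as np

# Hedged sketch: a beam position = a point on the beam line + a direction.
# A beam "hits" a spherical treatment body part if the perpendicular
# distance from the part's centre to the beam line is within its radius.

def beam_hits_target(beam_point, beam_dir, target_centre, target_radius):
    p = np.asarray(beam_point, float)
    d = np.asarray(beam_dir, float)
    d = d / np.linalg.norm(d)
    c = np.asarray(target_centre, float)
    # Perpendicular distance from the target centre to the beam line.
    dist = np.linalg.norm((c - p) - np.dot(c - p, d) * d)
    return bool(dist <= target_radius)

# Discrete positional arrangement: two beam positions from two directions.
arrangement = [((0, 0, -100), (0, 0, 1)), ((-100, 0, 0), (1, 0, 0))]
target = (0.0, 0.0, 0.0)
print(all(beam_hits_target(p, d, target, 5.0) for p, d in arrangement))  # True
```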
- imaging methods are used to generate image data (for example, two-dimensional or three-dimensional image data) of anatomical structures (such as soft tissues, bones, organs, etc.) of the human body.
- The term "medical imaging methods" is understood to mean (advantageously apparatus-based) imaging methods (for example so-called medical imaging modalities and/or radiological imaging methods) such as for instance computed tomography (CT) and cone beam computed tomography (CBCT, such as volumetric CBCT), x-ray tomography, magnetic resonance tomography (MRT or MRI), conventional x-ray, sonography and/or ultrasound examinations, and positron emission tomography.
- the medical imaging methods are performed by the analytical devices.
- Examples of medical imaging modalities applied by medical imaging methods are: X-ray, magnetic resonance imaging, medical ultrasonography or ultrasound, endoscopy, elastography, tactile imaging, thermography, medical photography and nuclear medicine functional imaging techniques such as positron emission tomography (PET) and single-photon emission computed tomography (SPECT), as mentioned by Wikipedia.
- the image data thus generated is also termed “medical imaging data”.
- Analytical devices for example are used to generate the image data in apparatus-based imaging methods.
- the imaging methods are for example used for medical diagnostics, to analyse the anatomical body in order to generate images which are described by the image data.
- the imaging methods are also for example used to detect pathological changes in the human body.
- some of the changes in the anatomical structure such as the pathological changes in the structures (tissue) may not be detectable and for example may not be visible in the images generated by the imaging methods.
- a tumour represents an example of a change in an anatomical structure. If the tumour grows, it may then be said to represent an expanded anatomical structure.
- This expanded anatomical structure may not be detectable; for example, only a part of the expanded anatomical structure may be detectable.
- Primary/high-grade brain tumours are for example usually visible on MRI scans when contrast agents are used to infiltrate the tumour.
- MRI scans represent an example of an imaging method.
- Due to the signal enhancement in the MRI images caused by the contrast agents infiltrating the tumour, the tumour is detectable and for example discernible in the image generated by the imaging method.
- In addition to these enhancing tumours, it is thought that approximately 10% of brain tumours are not discernible on a scan and are for example not visible to a user looking at the images generated by the imaging method.
- Mapping describes a transformation (for example, linear transformation) of an element (for example, a pixel or voxel), for example the position of an element, of a first data set in a first coordinate system to an element (for example, a pixel or voxel), for example the position of an element, of a second data set in a second coordinate system (which may have a basis which is different from the basis of the first coordinate system).
- the mapping is determined by comparing (for example, matching) the color values (for example grey values) of the respective elements by means of an elastic or rigid fusion algorithm.
- the mapping is embodied for example by a transformation matrix (such as a matrix defining an affine transformation).
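The mapping embodied by a transformation matrix can be sketched concretely: a rigid transformation as a 4x4 homogeneous matrix maps the position of an element from a first coordinate system into a second one. The rotation angle and translation below are illustrative values, not parameters from the disclosure.

```python
import numpy as np

# Sketch of the mapping described above: a rigid transformation embodied
# by a 4x4 homogeneous matrix that maps an element position from a first
# to a second coordinate system. The values are illustrative only.

def make_rigid(rotation_deg_z, translation):
    a = np.deg2rad(rotation_deg_z)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = translation
    return T

def map_point(T, p):
    ph = np.append(np.asarray(p, float), 1.0)  # homogeneous coordinates
    return (T @ ph)[:3]

# Rotate 90 degrees about z, then translate by (10, 0, 0):
T = make_rigid(90.0, (10.0, 0.0, 0.0))
print(map_point(T, (1.0, 0.0, 0.0)))  # approximately (10, 1, 0)
```

An affine (rather than rigid) mapping would simply allow the upper-left 3x3 block to include scaling and shearing terms as well.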
- Image fusion can be elastic image fusion or rigid image fusion.
- In the case of rigid image fusion, the relative position between the pixels of a 2D image and/or voxels of a 3D image is fixed, while in the case of elastic image fusion, the relative positions are allowed to change.
- The term "image morphing" is also used as an alternative to the term "elastic image fusion", but with the same meaning.
- Elastic fusion transformations are for example designed to enable a seamless transition from one dataset (for example a first dataset such as for example a first image) to another dataset (for example a second dataset such as for example a second image).
- the transformation is for example designed such that one of the first and second datasets (images) is deformed, for example in such a way that corresponding structures (for example, corresponding image elements) are arranged at the same position as in the other of the first and second images.
- the deformed (transformed) image which is transformed from one of the first and second images is for example as similar as possible to the other of the first and second images.
- (numerical) optimisation algorithms are applied in order to find the transformation which results in an optimum degree of similarity.
- the degree of similarity is preferably measured by way of a measure of similarity (also referred to in the following as a "similarity measure").
- the parameters of the optimisation algorithm are for example vectors of a deformation field. These vectors are determined by the optimisation algorithm in such a way as to result in an optimum degree of similarity.
- the optimum degree of similarity represents a condition, for example a constraint, for the optimisation algorithm.
- the bases of the vectors lie for example at voxel positions of one of the first and second images which is to be transformed, and the tips of the vectors lie at the corresponding voxel positions in the transformed image.
- a plurality of these vectors is preferably provided, for instance more than twenty or a hundred or a thousand or ten thousand, etc.
- constraints include for example the constraint that the transformation is regular, which for example means that a Jacobian determinant calculated from a matrix of the deformation field (for example, the vector field) is larger than zero, and also the constraint that the transformed (deformed) image is not self-intersecting and for example that the transformed (deformed) image does not comprise faults and/or ruptures.
- the constraints include for example the constraint that if a regular grid is transformed simultaneously with the image and in a corresponding manner, the grid is not allowed to interfold at any of its locations.
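The regularity constraint above (positive Jacobian determinant, no folding of a co-transformed grid) can be checked numerically. The sketch below, a 2D illustration under assumed conventions (displacement stored per pixel, finite-difference gradients), tests whether the mapping x -> x + u(x) has a positive Jacobian determinant at every grid point; it is not the optimisation itself.

```python
import numpy as np

# Sketch of the regularity constraint: for a 2D deformation field u on a
# grid, the mapping x -> x + u(x) is regular where the Jacobian
# determinant of the mapping is positive at every grid point.

def jacobian_determinant(u):
    """u: (H, W, 2) displacement field; u[...,0] along rows, u[...,1] along
    columns. Gradients are taken by finite differences."""
    d0_dr, d0_dc = np.gradient(u[..., 0])
    d1_dr, d1_dc = np.gradient(u[..., 1])
    # Jacobian of x + u(x) is I + grad(u); 2x2 determinant per grid point.
    return (1 + d0_dr) * (1 + d1_dc) - d0_dc * d1_dr

def is_regular(u):
    return bool(np.all(jacobian_determinant(u) > 0))

# A uniform translation does not fold the grid: regular.
u = np.full((8, 8, 2), 0.3)
print(is_regular(u))  # True

# A field that reverses row order folds the grid: not regular.
rows = np.arange(8, dtype=float)[:, None] * np.ones((1, 8))
u_fold = np.stack([-2.0 * rows, np.zeros((8, 8))], axis=-1)
print(is_regular(u_fold))  # False
```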
- the optimising problem is for example solved iteratively, for example by means of an optimisation algorithm which is for example a first-order optimisation algorithm, such as a gradient descent algorithm.
- Other examples of optimisation algorithms include optimisation algorithms which do not use derivations, such as the downhill simplex algorithm, or algorithms which use higher-order derivatives such as Newton-like algorithms.
- the optimisation algorithm preferably performs a local optimisation. If there is a plurality of local optima, global algorithms such as simulated annealing or genetic algorithms can be used. In the case of linear optimisation problems, the simplex method can for instance be used.
- the voxels are for example shifted by a magnitude in a direction such that the degree of similarity is increased.
- This magnitude is preferably less than a predefined limit, for instance less than one tenth or one hundredth or one thousandth of the diameter of the image, and for example about equal to or less than the distance between neighbouring voxels.
- Large deformations can be implemented, for example due to a high number of (iteration) steps.
- the determined elastic fusion transformation can for example be used to determine a degree of similarity (or similarity measure, see above) between the first and second datasets (first and second images).
- the deviation between the elastic fusion transformation and an identity transformation is determined.
- the degree of deviation can for instance be calculated by determining the difference between the determinant of the elastic fusion transformation and the identity transformation. The higher the deviation, the lower the similarity, hence the degree of deviation can be used to determine a measure of similarity.
- a measure of similarity can for example be determined on the basis of a determined correlation between the first and second datasets.
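A correlation-based similarity measure, as mentioned above, can be sketched as normalised cross-correlation between two datasets. This is one common choice among many similarity measures; nothing in the disclosure fixes this particular formula, so treat it as an illustrative example.

```python
import numpy as np

# Sketch: a similarity measure between two datasets based on their
# correlation (normalised cross-correlation). Illustrative choice only.

def ncc(a, b):
    a = np.asarray(a, float).ravel()
    b = np.asarray(b, float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

img = np.random.default_rng(0).random((16, 16))
print(round(ncc(img, img), 6))          # 1.0 for identical images
print(round(ncc(img, 2 * img + 3), 6))  # 1.0 -- invariant to linear rescaling
```

The invariance to linear intensity rescaling is one reason such measures are popular for comparing images from differing acquisitions.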
- A fixed position, which is also referred to as a fixed relative position, in this document means that two objects which are in a fixed position have a relative position which does not change unless this change is explicitly and intentionally initiated.
- a fixed position is in particular given if a force or torque above a predetermined threshold has to be applied in order to change the position. This threshold might be 10 N or 10 Nm.
- the position of a sensor device remains fixed relative to a target while the target is registered or two targets are moved relative to each other.
- a fixed position can for example be achieved by rigidly attaching one object to another.
- the spatial location which is a part of the position, can in particular be described just by a distance (between two objects) or just by the direction of a vector (which links two objects).
- the alignment which is another part of the position, can in particular be described by just the relative angle of orientation (between the two objects).
- a medical workflow comprises a plurality of workflow steps performed during a medical treatment and/or a medical diagnosis.
- the workflow steps are typically, but not necessarily performed in a predetermined order.
- Each workflow step for example means a particular task, which might be a single action or a set of actions.
- Examples of workflow steps are capturing a medical image, positioning a patient, attaching a marker, performing a resection, moving a joint, placing an implant and the like.
- Fig. 1 illustrates a method according to the present disclosure
- Fig. 2 shows parts of a method according to the present disclosure
- Fig. 3 is a schematic illustration of the system according to the present disclosure.
- Figure 1 illustrates the basic steps of an image reconstruction method for a medical system comprising a medical imaging system and a tracking device attached, particularly rigidly attached, to the medical imaging system according to the present disclosure.
- the tracking device may not be attached to the medical imaging system.
- Step S11 encompasses calibrating a coordinate system C2 of the tracking device for marker-based tracking to a coordinate system C1 of the medical imaging system.
- Step S13 encompasses, for each of the plurality of medical images l_i, by means of the tracking device, acquiring corresponding tracking coordinates in C2 using one or more markers.
- the method may optionally comprise storing (S13a), with the medical images, the respective imaging system coordinates and corresponding tracking coordinates of the one or more markers.
- the method may comprise determining, in optional step S10, the reference position.
- the reference position may, for example, be determined by selecting a medical image, for example the first medical image l_1, and using the corresponding tracking position as a reference position.
- the reference position may be determined by selecting some or all of the medical images and determining a corresponding mean tracking position and using the mean tracking position as a reference position.
- the reference position may be based on a tracking position at a given time or a timeframe of tracking positions independently of medical image acquisition.
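The reference-position options of optional step S10 (take the tracking position of a selected image such as l_1, or the mean tracking position over some or all images) can be sketched as follows. The function name and the data layout (one 3D tracking coordinate per image) are assumptions for illustration.

```python
import numpy as np

# Hedged sketch of step S10: the reference tracking position is taken
# either from a selected image (e.g. the first image, l_1) or as the
# mean over some or all images. Data layout is an assumption.

def reference_position(tracking_positions, mode="first"):
    """tracking_positions: (N, 3) tracking coordinates, one per image l_i."""
    t = np.asarray(tracking_positions, float)
    if mode == "first":
        return t[0]
    if mode == "mean":
        return t.mean(axis=0)
    raise ValueError(f"unknown mode: {mode}")

track = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [4.0, 0.0, 0.0]])
print(reference_position(track, "first"))  # first image's tracking position
print(reference_position(track, "mean"))   # mean tracking position
```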
- Step S14 encompasses transforming the medical images based on the tracking coordinates and the calibration between C1 and C2.
- Step S15 encompasses reconstructing 3D images based on the transformed images with respect to a/the reference position. Reconstruction may be performed at a first time in step S15a, for example in real time and, for example, using the first image l_1 as a reference image. Alternatively or in addition, reconstruction may be carried out at a later time in step S15b, optionally with respect to a different reference position.
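Steps S14 and S15 can be sketched in simplified form: each image's geometry is corrected by the relative motion between its tracking coordinates and the reference tracking coordinates, mapped from C2 into C1 via the calibration, before the images enter 3D reconstruction. The sketch below assumes pure translations and a rotation-only calibration matrix R; the actual reconstruction step is omitted, and all names and values are illustrative.

```python
import numpy as np

# Minimal sketch of steps S14/S15: each image's geometry is corrected by
# the relative motion trackingCoordinates(l_i) - trackingCoordinates(l_ref),
# mapped from C2 into C1 by the calibration R (assumed rotation-only here),
# before feeding the images into 3D reconstruction.

def correct_geometries(geometries_c1, tracking_c2, ref_c2, R_c1_from_c2):
    corrected = []
    for g, t in zip(geometries_c1, tracking_c2):
        motion_c1 = R_c1_from_c2 @ (np.asarray(t, float) - np.asarray(ref_c2, float))
        corrected.append(np.asarray(g, float) - motion_c1)  # undo the motion
    return corrected

R = np.eye(3)                               # assume C1 and C2 axes aligned
geoms = [np.array([0.0, 0.0, 100.0])] * 3   # e.g. source positions in C1
track = [[0, 0, 0], [1, 0, 0], [2, 0, 0]]   # marker drifts along x in C2
out = correct_geometries(geoms, track, ref_c2=track[0], R_c1_from_c2=R)
print(out[2])  # geometry shifted opposite the tracked drift
```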
- the method may comprise storing, in optional step S16, with the reconstructed 3D images, the respective reference position.
- the method may also comprise optional step S17 of tracking a tool, for example a pointer tool, by means of the tracking device, e.g. for visualization and/or selecting a reference position.
- Optional step S18 may entail that a surgical navigation system carries out marker-based tracking concurrently with the tracking system attached to the medical imaging device carrying out the tracking.
- the method may comprise step S18a of the surgical navigation system using a shared reference marker that is also used by the tracking device attached to the medical imaging system and the step S18b of performing registration of image data obtained during the medical procedure and pre-procedural image data.
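The benefit of a shared reference marker (step S18a) is that the registration of intra-procedural image data to pre-procedural data reduces to composing two known transforms: the navigation system's registration of the pre-procedural data to the marker, and the attached tracking device's localisation of the marker in C1. The 4x4 matrices and their values below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: with a shared reference marker, registering intra-
# procedural image data (in imaging coordinates C1) to pre-procedural
# data composes two known transforms. Pure translations for clarity.

def translation(t):
    T = np.eye(4)
    T[:3, 3] = t
    return T

T_pre_from_marker = translation((0.0, 5.0, 0.0))  # from navigation registration
T_marker_from_c1 = translation((3.0, 0.0, 0.0))   # from the attached tracking device
T_pre_from_c1 = T_pre_from_marker @ T_marker_from_c1  # composed registration

p_c1 = np.array([1.0, 1.0, 0.0, 1.0])  # homogeneous point in C1
print((T_pre_from_c1 @ p_c1)[:3])      # the point expressed in pre-procedural coords
```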
- In step S19, pointer planning may be carried out. That is, the medical imaging device may be positioned/aligned based on a pointer being tracked by means of the tracking device, and then, optionally, a planar X-ray image is acquired (step S19a) or CBCT imaging is carried out (step S19b) from said direction/in said position.
- step S19b may entail reverting to step S11.
- Figure 2 illustrates in detail one example of transformation and 3D reconstruction. This may be employed in the method described in the context of Figure 1.
- the coordinate system C1 of the imaging system is defined. For example, it may be defined with respect to the imaging geometry.
- the coordinate system C2 of the rigidly attached tracking device/system is defined.
- the coordinate system may be defined with respect to the viewing direction of tracking camera(s) of the tracking device.
- the coordinate systems C1 and C2 may be calibrated. This may entail determining the transformation that transforms a position in one of the coordinate systems to a position in the other of the coordinate systems. Specifically, as the tracking coordinates are obtained in C2, the transformation for representing a position or motion (described by the tracking coordinates) in C1 may be determined.
- In step S24, medical imaging data having their spatial information in coordinate system C1 are obtained.
- The tracking device tracks the position of markers in coordinate system C2, thereby determining relative motion.
- a synchronization mechanism between the tracking device and the imaging device ensures that the tracked position of markers can be unambiguously associated with the correct medical (projection) imaging data frames. If for one medical (projection) image data frame no association with a tracked position of markers is possible, the tracked position of markers of the previous frame shall be used.
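The synchronisation rule above (associate each projection frame with a tracked marker position; if no association is possible, reuse the previous frame's position) can be sketched directly. Timestamp matching within a tolerance is one plausible mechanism; the timestamps, tolerance and names below are illustrative assumptions.

```python
# Sketch of the synchronisation rule described above: each projection
# frame gets the tracked marker position whose timestamp matches within
# a tolerance; if no match exists, the previous frame's position is
# reused (fallback rule). Timestamps/tolerance are illustrative.

def associate(frame_ts, track_samples, tol=0.02):
    """track_samples: list of (timestamp, position); returns one position
    per frame timestamp. `last` stays None if the very first frame has
    no match -- a real system would need to handle that case too."""
    out, last = [], None
    for ft in frame_ts:
        match = min(track_samples, key=lambda s: abs(s[0] - ft))
        if abs(match[0] - ft) <= tol:
            last = match[1]
        # else: keep `last` from the previous frame (fallback rule)
        out.append(last)
    return out

frames = [0.00, 0.10, 0.20]
tracking = [(0.001, "p0"), (0.101, "p1")]  # no sample near t = 0.20
print(associate(frames, tracking))  # ['p0', 'p1', 'p1']
```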
- In step S26, the medical (projection) imaging data, along with the associated position of markers (transformed into C1 through the calibration established in S21), are used for 3D image reconstruction.
- For each medical (projection) image, its spatial information (S24) is modified by transforming its projection geometry prior to 3D reconstruction.
- the transformation is determined by subtracting the reference position of markers from the sensed tracked position of markers of the medical (projection) image.
- Alternatively, for each medical (projection) image, its original spatial information (S24) is used for 3D reconstruction.
- the position (including orientation) of the 3D reconstruction is modified by the inverse of the abovementioned transformation prior to each reconstruction step.
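The equivalence of the two options (modify each projection geometry by the per-image motion, or modify the position of the 3D reconstruction by the inverse transform before each reconstruction step) can be checked for the simple case of pure translations: the relative setup of geometry and reconstruction volume is identical either way. The values are illustrative.

```python
import numpy as np

# Sketch of the two equivalent options above, using pure translations:
# either the projection geometry is shifted by the per-image motion, or
# the position of the 3D reconstruction is shifted by the inverse of
# that motion before the reconstruction step. Same relative setup.

motion = np.array([2.0, -1.0, 0.0])       # per-image motion in C1
geometry = np.array([0.0, 0.0, 100.0])    # projection geometry position
volume = np.array([0.0, 0.0, 0.0])        # position of the reconstruction

# Option 1: modify the projection geometry prior to reconstruction.
rel_1 = (geometry - motion) - volume
# Option 2: modify the reconstruction position by the inverse transform.
rel_2 = geometry - (volume + motion)

print(np.allclose(rel_1, rel_2))  # True -- both yield the same relative setup
```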
- Fig. 3 discloses a medical system 1 according to the present disclosure.
- the medical system may be configured to carry out the methods of the present disclosure, particularly as outlined in the context of Figs. 1 and 2.
- the medical system comprises a medical imaging system 2, e.g. a CT imaging system.
- a Loop-X imaging system may be employed.
- An exemplary source 2a and exemplary detector 2b arranged on a gantry 2c are shown.
- the gantry may have a loop shape.
- the medical system further comprises a tracking device 3, in this example an infrared tracking camera.
- the tracking device is attached, particularly rigidly attached, to the medical imaging system 2, optionally in a detachable manner. Alternatively, the tracking device may not be attached to the medical imaging system.
- the medical system may optionally comprise a surgical navigation system 4.
- Markers 5a to 5e are also illustrated, respectively arranged on the floor, attached to a patient bed 6, attached to a portion of the patient 7 inside the region of interest 8, attached to a portion of the patient outside the region of interest, or attached to a medical tool 9. It is to be understood that any single one of and any combination of at least some of the markers 5a to 5e may be employed.
- a tracking device of the surgical navigation system is denoted by reference sign 4a.
- the medical imaging system is illustrated as a wheeled system and as a system having a tiltable gantry 2c, particularly the gantry having a tiltable rotation plane (which is in this illustration viewed from the side) indicated by dashed line 10.
- the tracking device 3 and the medical imaging system 2 may communicate via data connection 11 a.
- the surgical navigation system and the medical imaging system may communicate via a data connection 11 b.
- the respective data connection may be wired or wireless.
- the floor is shown as having some unevenness. This is merely for illustrating challenges in terms of position tracking of the medical imaging device. Other difficulties may arise from slip of the wheels on a potentially wet floor. These challenges can be overcome by the method and system of the present disclosure.
- an imaging system, such as a CT system, with a coordinate system (C1) and a tracking device, such as a tracking camera, rigidly attached to the imaging system (C2), particularly to a component thereof that is a fixed component relative to C1 or a mobile component relative to C1.
- the tracking device may comprise a camera, such as an optical stereoscopic camera, a monoscopic camera or an infrared camera, or an EM tracking device.
- the tracking device may be attached, for example to a gantry or a C-arm to which source and/or detector of the imaging system are attached.
- the tracking device may be attached to the imaging system in such a manner that the relation between C1 and C2 is well-defined. Even more specifically, the tracking device may be attached in such a manner that if the gantry or C-arm tilts, the tracking camera tilts together with the gantry or C-arm, such that the relation between C1 and C2 remains constant.
- C2 is calibrated to C1, optionally via another coordinate system like C3, for example an intrinsic calibration of detector or source relative to C1.
- optical coordinate system C2 and radiological coordinate system C1 are cross-calibrated, particularly, for example, using a static cross-calibration.
- Markers to be used by the tracking device may be positioned statically (e.g. on a patient couch or on the floor) or on a patient, particularly close to or removed from moving body parts. Markers may be positioned on the patient inside or outside the ROI of the imaging device.
- Static mounting of markers may be used to address the challenge that for some trajectory types or device axes, accurate intrinsic calibration of an imaging device is difficult or not possible.
- challenges arise in view of gantry tilt (e.g., due to residual varying gravity-induced sag), device yaw, or longitudinal / lateral scans and motion (due to slip of wheels and unpredictable floor surface).
- the imaging device may thus be tracked during a scan to improve image quality via more accurate “encoding” of the respective device axes.
- marker(s) is(are) positioned on a tool or treatment device, such as a radiotherapy treatment device.
- (medical) images (l_i) may be acquired and stored together with imaging system coordinates.
- Tracking coordinates of (patient or static) marker(s) are acquired and stored with l_1.
- Tracking coordinates of (patient or static) marker(s) are acquired and stored with l_2, and so on.
- Image l_1 or another l_n (that may also be virtual and derived by statistical means such as the mean of a set of images l_x) is selected as image l_ref at a reference position.
- All images l_i are transformed, for example using the transformation of trackingCoordinates(l_i) - trackingCoordinates(l_ref), giving l_i_transf.
- each medical (projection) image its original spatial information may be used for 3D reconstruction.
- the position (including orientation) of the 3D reconstruction is modified by the inverse of the abovementioned transformation prior to each reconstruction step.
- a 3D image may be reconstructed based on the transformed images l_i_transf.
- The reconstructed 3D image may be stored with respect to the selected / computed reference position.
- a live reconstruction may be carried out based on l_1 as l_ref.
- a reference position can later be changed, e.g. average position, first, last, based on tracking of patient marker and similarity of current position to imaging position.
- the surgical navigation system may have its own tracking camera and may perform independent registration of a pre-procedural image data set.
- intra-procedural image data may be blurry and/or misaligned to the pre-procedural data. Therefore, tracking of the imaging system relative to the patient with the same (a shared) reference marker (or with a marker rigidly attached to the reference) will provide intra-procedural image data that is correctly registered to the pre-procedural data and/or will not be blurry.
- “Blurry” in this context may generally denote a reconstructed image that shows image artefacts. This is consequently not ideal for navigation purposes and potentially introduces uncertainty and consequently inaccuracy into the navigation process.
- a registration point in time may be provided as a signal from the surgical navigation system to the imaging system so that the imaging system can store and use this to determine the corresponding tracking position as a reference position for the motion correction, i.e., for the transforming of the medical images and performing 3D image reconstruction.
- the tracking device attached to the medical imaging system may substitute or complement an external navigation system.
- the tracking device may be used to track tools in the imaging coordinate system C1 . Markers attached to the tools may be used to that end.
- the medical imaging system may adjust the projection geometry based on a tracked pointer tool.
- a treatment device such as a LINAC may have a marker attached to it, and the patient motion correction marker may act as reference for the substitute navigation system.
- the tracking device may also be used for position control of the imaging system independent of imaging, e.g. where very accurate positional awareness of the imaging system is of interest.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2025501720A JP2025523304A (ja) | 2023-03-21 | 2024-03-20 | 医療システムのための画像再構成方法 |
| AU2024240570A AU2024240570A1 (en) | 2023-03-21 | 2024-03-20 | Image reconstruction method for a medical system |
| EP24711925.8A EP4515491A1 (fr) | 2023-03-21 | 2024-03-20 | Procédé de reconstruction d'image pour un système médical |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/EP2023/057220 WO2024193816A1 (fr) | 2023-03-21 | 2023-03-21 | Image reconstruction method for a medical system |
| EPPCT/EP2023/057220 | 2023-03-21 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024194367A1 (fr) | 2024-09-26 |
Family
ID=85778860
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2023/057220 Pending WO2024193816A1 (fr) | Image reconstruction method for a medical system | 2023-03-21 | 2023-03-21 |
| PCT/EP2024/057476 Pending WO2024194367A1 (fr) | Image reconstruction method for a medical system | 2023-03-21 | 2024-03-20 |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2023/057220 Pending WO2024193816A1 (fr) | Image reconstruction method for a medical system | 2023-03-21 | 2023-03-21 |
Country Status (4)
| Country | Link |
|---|---|
| EP (1) | EP4515491A1 (fr) |
| JP (1) | JP2025523304A (fr) |
| AU (1) | AU2024240570A1 (fr) |
| WO (2) | WO2024193816A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2007014470A2 (fr) * | 2005-08-01 | 2007-02-08 | Resonant Medical Inc. | System and method for detecting drifts in calibrated tracking systems |
| US20100125195A1 (en) | 2008-11-19 | 2010-05-20 | Kajetan Berlinger | Determination of regions of an analytical image that are subject to vital movement |
| WO2012080973A2 (fr) * | 2010-12-16 | 2012-06-21 | Koninklijke Philips Electronics N.V. | Hybrid nuclear/CT imaging apparatus, cross-calibration and performance assessment |
| WO2017191207A1 (fr) * | 2016-05-04 | 2017-11-09 | Brainlab Ag | Monitoring a patient's position using a planning image and subsequent thermal imaging |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH10216118A (ja) * | 1997-02-10 | 1998-08-18 | Shimadzu Corp | X-ray diagnostic apparatus |
| JP5592694B2 (ja) * | 2010-05-21 | 2014-09-17 | Hitachi Medical Corp | X-ray CT apparatus |
| US9480440B2 (en) * | 2011-09-28 | 2016-11-01 | Qr Srl | System and method for cone beam computed tomography |
| IN2014DN05824A (fr) * | 2012-04-24 | 2015-05-15 | Hitachi Medical Corp | |
| JP6373558B2 (ja) * | 2013-03-29 | 2018-08-15 | Fujita Gakuen | X-ray CT apparatus |
| CN104997528B (zh) * | 2014-04-21 | 2018-03-27 | Toshiba Medical Systems Corp | X-ray computed tomography apparatus and imaging condition setting support apparatus |
| EP3574836A1 (fr) * | 2018-05-30 | 2019-12-04 | Koninklijke Philips N.V. | Three-dimensional imaging with periodic temporal gating |
| EP3783379A1 (fr) * | 2019-08-22 | 2021-02-24 | Koninklijke Philips N.V. | Tomographic imaging system with a motion detection system |
| WO2021160294A1 (fr) * | 2020-02-14 | 2021-08-19 | Brainlab Ag | Compensation of tracking inaccuracies |
| FI129905B (fi) * | 2020-07-08 | 2022-10-31 | Palodex Group Oy | X-ray imaging system and method for dental X-ray imaging |
| CN113081013B (zh) * | 2021-03-19 | 2024-04-23 | Neusoft Medical Systems Co., Ltd. | Scout scan method, apparatus and system |
- 2023
  - 2023-03-21 WO PCT/EP2023/057220 patent/WO2024193816A1/fr active Pending
- 2024
  - 2024-03-20 JP JP2025501720A patent/JP2025523304A/ja active Pending
  - 2024-03-20 WO PCT/EP2024/057476 patent/WO2024194367A1/fr active Pending
  - 2024-03-20 AU AU2024240570A patent/AU2024240570A1/en active Pending
  - 2024-03-20 EP EP24711925.8A patent/EP4515491A1/fr active Pending
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2007014470A2 (fr) * | 2005-08-01 | 2007-02-08 | Resonant Medical Inc. | System and method for detecting drifts in calibrated tracking systems |
| US20100125195A1 (en) | 2008-11-19 | 2010-05-20 | Kajetan Berlinger | Determination of regions of an analytical image that are subject to vital movement |
| EP2189943A1 (fr) | 2008-11-19 | 2010-05-26 | Brainlab Ag | Determination of regions of an analytical image that are subject to vital movement |
| EP2189940A1 (fr) | 2008-11-19 | 2010-05-26 | Brainlab Ag | Determination of indicator body parts and pre-indicator trajectories |
| US20100160836A1 (en) | 2008-11-19 | 2010-06-24 | Kajetan Berlinger | Determination of indicator body parts and pre-indicator trajectories |
| WO2012080973A2 (fr) * | 2010-12-16 | 2012-06-21 | Koninklijke Philips Electronics N.V. | Hybrid nuclear/CT imaging apparatus, cross-calibration and performance assessment |
| WO2017191207A1 (fr) * | 2016-05-04 | 2017-11-09 | Brainlab Ag | Monitoring a patient's position using a planning image and subsequent thermal imaging |
Non-Patent Citations (5)
| Title |
|---|
| ANDRE Z KYME ET AL: "Motion estimation and correction in SPECT, PET and CT", PHYSICS IN MEDICINE AND BIOLOGY, INSTITUTE OF PHYSICS PUBLISHING, BRISTOL GB, vol. 66, no. 18, 15 September 2021 (2021-09-15), XP020368998, ISSN: 0031-9155, [retrieved on 20210915], DOI: 10.1088/1361-6560/AC093B * |
| J-H KIM ET AL: "A rigid motion correction method for helical computed tomography (CT)", PHYSICS IN MEDICINE AND BIOLOGY, INSTITUTE OF PHYSICS PUBLISHING, BRISTOL GB, vol. 60, no. 5, 12 February 2015 (2015-02-12), pages 2047 - 2073, XP020281395, ISSN: 0031-9155, [retrieved on 20150212], DOI: 10.1088/0031-9155/60/5/2047 * |
| ROGER Y. TSAI: "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses", IEEE JOURNAL OF ROBOTICS AND AUTOMATION, vol. RA, no. 4, August 1987 (1987-08-01), pages 323 - 344 |
| ROGER Y. TSAI: "An Efficient and Accurate Camera Calibration Technique for 3D Machine Vision", PROCEEDINGS OF THE IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, 1986, pages 364 - 374, XP001004843 |
| ZIV YANIV, FLUOROSCOPIC X-RAY IMAGE PROCESSING AND REGISTRATION FOR COMPUTER-AIDED ORTHOPEDIC SURGERY |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4515491A1 (fr) | 2025-03-05 |
| JP2025523304A (ja) | 2025-07-18 |
| AU2024240570A1 (en) | 2024-12-19 |
| WO2024193816A1 (fr) | 2024-09-26 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11645768B2 (en) | Multi image fusion based positioning verification | |
| US11628012B2 (en) | Patient positioning using a skeleton model | |
| EP3209380A1 (fr) | Use of a CT scanner for radiotherapy procedures | |
| WO2017093034A1 (fr) | Method and apparatus for determining or predicting the position of a target | |
| EP3694438B1 (fr) | Determining a target position of an X-ray device | |
| WO2017191207A1 (fr) | Monitoring a patient's position using a planning image and subsequent thermal imaging | |
| US20210343396A1 (en) | Automatic setting of imaging parameters | |
| US20230141234A1 (en) | Radiation treatment parameters for target region tumour | |
| US20190111279A1 (en) | Patient pre-positioning in frameless cranial radiosurgery using thermal imaging | |
| EP3622479B1 (fr) | Partitioning a medical image | |
| JP2023036805A (ja) | Method for imaging a body part, computer, computer-readable storage medium, computer program, and medical system | |
| EP3408832A1 (fr) | Image-guided patient setup for radiotherapy | |
| US20210145372A1 (en) | Image acquisition based on treatment device position | |
| EP4515491A1 (fr) | Image reconstruction method for a medical system | |
| US11266857B2 (en) | Long-exposure-time-imaging for determination of periodically moving structures | |
| EP4500847A1 (fr) | Method for registering a virtual image in an augmented reality system | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24711925; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024711925; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 2024711925; Country of ref document: EP; Effective date: 20241128 |
| | WWE | Wipo information: entry into national phase | Ref document number: AU2024240570; Country of ref document: AU |
| | ENP | Entry into the national phase | Ref document number: 2024240570; Country of ref document: AU; Date of ref document: 20240320; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 2025501720; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |