WO2024117129A1 - Medical image processing device, treatment system, medical image processing method, and program - Google Patents
- Publication number: WO2024117129A1
- Application: PCT/JP2023/042555
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- interest
- trajectory
- marker
- tracking
- image processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE (all codes below fall under this class)
- A61N5/1049 — Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
- A61B6/487 — Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
- A61B34/20 — Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B6/04 — Positioning of patients; Tiltable beds or the like
- A61B6/12 — Arrangements for detecting or locating foreign bodies
- A61B6/461 — Displaying means of special interest
- A61B6/547 — Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
- A61N5/10 — X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1037 — Treatment planning systems taking into account the movement of the target, e.g. 4D-image based planning
- A61N5/1065 — Beam adjustment
- A61N5/1069 — Target adjustment, e.g. moving the patient support
- A61B2034/2068 — Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
- A61N2005/1051 — Verifying the position of the patient with respect to the radiation beam using an active marker
Definitions
- Embodiments of the present invention relate to a medical image processing device, a treatment system, a medical image processing method, and a program.
- Radiation therapy is a treatment method that destroys tumors (lesions) inside the patient's body by irradiating them with radiation.
- it is necessary to accurately aim the radiation (treatment beam) at the tumor in order to reduce the effect of the radiation on the patient's normal tissue.
- for tumors that move with the patient's breathing, respiratory-synchronized irradiation is used, in which the treatment beam is irradiated in sync with the patient's breathing.
- One technique for respiratory-synchronized irradiation is, for example, the marker tracking technique.
- in the marker tracking technique, a metal marker placed inside the patient's body is detected in fluoroscopic images (e.g., X-ray fluoroscopic images) captured during treatment.
- irradiation of the treatment beam is controlled only when the marker projected on the fluoroscopic image is within a predetermined area (irradiation spot) set based on the position of the marker.
- the marker tracking technique allows the tumor to be irradiated with the treatment beam at the appropriate timing, enabling appropriate radiation therapy to be performed.
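The gating logic described above, in which the beam fires only while the marker stays inside the irradiation spot, can be sketched as follows. This is a minimal illustration; the function name, coordinates, and pixel threshold are invented for the example and are not part of the patented system.

```python
import math

def beam_on(marker_xy, gate_center, gate_radius):
    """Return True while the tracked marker lies inside the gating window
    (the "irradiation spot"); the treatment beam may only fire then."""
    dx = marker_xy[0] - gate_center[0]
    dy = marker_xy[1] - gate_center[1]
    return math.hypot(dx, dy) <= gate_radius

# Beam fires only when the marker is within 5 px of the spot center (120, 88).
print(beam_on((121.0, 90.0), (120.0, 88.0), 5.0))  # True
print(beam_on((140.0, 90.0), (120.0, 88.0), 5.0))  # False
```

In a real system this check would run once per fluoroscopic frame, with the result forwarded to the irradiation controller.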
- a marker tracking technique uses template matching to detect and track metal markers in fluoroscopic images captured immediately before treatment over a period equal to or longer than one breath of the patient (see, for example, Patent Document 1).
- fluoroscopic images are captured from two directions, and the position of the marker is estimated in three dimensions using a triangulation technique.
- this conventional technology can grasp, in three dimensions, the position of the marker moving in sync with the patient's breathing, and irradiate the tumor with the treatment beam at the optimal timing.
- in conventional technology, the user who performs radiation therapy, such as a doctor or technician, visually checks the markers projected on the fluoroscopic image and selects the method for tracking them. The user must therefore manually select the optimal template, from among multiple marker templates, for the marker they have visually checked.
- markers may have low visibility due to, for example, the influence of organs and bones, or noise contained in the fluoroscopic image.
- for example, small markers, such as needle-shaped markers 0.25 mm in diameter, are used to reduce the burden on the patient when the markers are placed inside the body during surgery, and such markers may be imaged overlapping large organs.
- because the markers projected on the fluoroscopic image move periodically in sync with the patient's breathing, the entire marker is not necessarily visible throughout one full breath of the imaged patient; visibility may be reduced during some periods or for parts of the marker. In such cases, it becomes even more difficult for the user to determine whether or not to perform respiratory-synchronized irradiation.
- the problem to be solved by the present invention is to provide a medical image processing device, a treatment system, a medical image processing method, and a program that can determine, based on a fluoroscopic image, whether or not a treatment beam can be irradiated in a respiratory-synchronized manner.
- a medical image processing device has an image acquisition unit, a trajectory generation unit, and a selection unit.
- the image acquisition unit acquires a plurality of fluoroscopic images of a patient.
- the trajectory generation unit recognizes the position of a region of interest captured in each of the plurality of fluoroscopic images, and generates a trajectory of the region of interest as it moves based on the recognized position of the region of interest.
- the selection unit selects a tracking method for tracking the region of interest based on the trajectory of the region of interest.
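The interplay of the trajectory generation unit and the selection unit can be illustrated with a minimal sketch. The selection criterion shown here (breathing-motion amplitude along the dominant axis) and all names are hypothetical stand-ins, not the embodiment's actual rules.

```python
import numpy as np

def generate_trajectory(positions):
    """Stack per-frame (x, y) region-of-interest positions into an
    (N, 2) trajectory array, as the trajectory generation unit would."""
    return np.asarray(positions, dtype=float)

def select_tracking_method(trajectory, min_amplitude=2.0):
    """Hypothetical selection rule: if the motion amplitude along the
    dominant axis is large enough to resolve, pick marker tracking;
    otherwise fall back to markerless tracking."""
    amplitude = (trajectory.max(axis=0) - trajectory.min(axis=0)).max()
    return "marker" if amplitude >= min_amplitude else "markerless"

# A marker bobbing up and down over one simulated breath.
traj = generate_trajectory([(100, 80), (101, 84), (102, 88), (101, 84), (100, 80)])
print(select_tracking_method(traj))  # marker
```

An amplitude below the threshold would return "markerless" instead, modeling the fallback the description mentions later.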
- according to the embodiments, it is possible to provide a medical image processing device, a treatment system, a medical image processing method, and a program that can determine, based on a fluoroscopic image, whether or not a treatment beam can be irradiated in a respiratory-synchronized manner.
- FIG. 1 is a block diagram showing an example of the configuration of a treatment system including a medical image processing apparatus according to a first embodiment.
- FIG. 2 is a block diagram showing an example of the configuration of the medical image processing apparatus according to the first embodiment.
- FIGS. 3 to 6 are diagrams showing examples of templates used for recognizing a marker in the medical image processing apparatus of the first embodiment.
- FIG. 7 is a diagram illustrating an example of a process for recognizing the position of a marker in the medical image processing apparatus according to the first embodiment.
- FIGS. 8 and 9 are diagrams showing examples of trajectories of a marker whose position is recognized in the medical image processing apparatus of the first embodiment.
- FIG. 10 is a flowchart showing the flow of operations in the medical image processing apparatus of the first embodiment.
- FIG. 11 is a flowchart showing the flow of an operation of determining whether tracking is possible in a selection unit included in the medical image processing apparatus of the first embodiment.
- FIG. 12 is a flowchart showing the flow of another operation of determining whether tracking is possible in the selection unit included in the medical image processing apparatus of the first embodiment.
- FIG. 13 is a block diagram showing an example of the configuration of a medical image processing apparatus according to a second embodiment.
- FIGS. 14 to 17 are diagrams showing examples (parts 1 to 4) of display images for presenting information in the medical image processing apparatus of the embodiments.
- First Embodiment: FIG. 1 is a block diagram showing an example of the configuration of a treatment system including a medical image processing device according to the first embodiment.
- the treatment system 1 includes, for example, a treatment table 10, a bed controller 11, two radiation sources 20 (radiation source 20-1 and radiation source 20-2), two radiation detectors 30 (radiation detector 30-1 and radiation detector 30-2), a treatment beam irradiation gate 40, an irradiation controller 41, a display controller 50, a display device 51, and a medical image processing device 100.
- the treatment table 10 is a bed on which a subject (patient) P is fixed to receive radiation therapy.
- the bed control unit 11 controls the translation mechanism and rotation mechanism provided on the treatment table 10 to change the direction in which the treatment beam B is irradiated onto the patient P fixed to the treatment table 10.
- the bed control unit 11 controls each of the translation mechanism and the rotation mechanism of the treatment table 10 in three axial directions, giving six degrees of freedom in total.
- Radiation source 20-1 irradiates radiation r-1 from a predetermined angle to visualize the inside of the patient P's body.
- Radiation source 20-2 irradiates radiation r-2 from a predetermined angle different from that of radiation source 20-1 to visualize the inside of the patient P's body.
- Radiation r-1 and radiation r-2 are, for example, X-rays.
- Figure 1 shows a case where X-ray imaging is performed from two directions on patient P who is fixed on treatment table 10.
- a control unit that controls the irradiation of radiation r by radiation source 20 is not shown in Figure 1.
- the radiation detector 30-1 detects the radiation r-1 irradiated from the radiation source 20-1 that has passed through the body of the patient P and reached the detector, and generates a two-dimensional X-ray fluoroscopic image FI-1 that visualizes the state inside the patient P according to the magnitude of the energy of the detected radiation r-1.
- the radiation detector 30-2 detects the radiation r-2 irradiated from the radiation source 20-2 that has passed through the body of the patient P and reached the detector, and generates a two-dimensional X-ray fluoroscopic image FI-2 that visualizes the state inside the patient P according to the magnitude of the energy of the detected radiation r-2.
- the radiation detectors 30-1 and 30-2 generate the X-ray fluoroscopic image FI-1 and the X-ray fluoroscopic image FI-2 simultaneously, that is, at the same timing.
- the radiation detector 30 has X-ray detectors arranged, for example, in a two-dimensional array, and generates a digital image, as the X-ray fluoroscopic image FI, which is an image of the magnitude of the energy of the radiation r that has reached each X-ray detector and is expressed as a digital value.
- the radiation detector 30 is, for example, a flat panel detector (FPD), an image intensifier, or a color image intensifier. In the following description, it is assumed that each radiation detector 30 is an FPD.
- the radiation detector 30 (FPD) outputs each generated X-ray fluoroscopic image FI to the medical image processing device 100. In FIG. 1, the control unit that controls the generation of the X-ray fluoroscopic image FI by the radiation detector 30 is not shown.
- the medical image processing device 100 and the radiation detector 30 may be connected, for example, via a LAN (Local Area Network) or a WAN (Wide Area Network).
- FIG. 1 shows an imaging device that captures X-ray fluoroscopic images FI of a patient P from two different directions.
- the X-ray fluoroscopic images FI are an example of a "fluoroscopic image."
- the fluoroscopic image may be any image, such as a CT image or a DRR image, in which a part of interest inside the patient P is projected.
- the pair of the radiation source 20 and the radiation detector 30 is configured as one imaging device.
- the positions of the radiation source 20 and the radiation detector 30 are fixed, so the direction in which the imaging device configured by the pair of the radiation source 20 and the radiation detector 30 captures images (relative direction to the fixed coordinate system of the treatment room) is fixed. Therefore, when three-dimensional coordinates are defined in the three-dimensional space in which the treatment system 1 is installed, the positions of the radiation source 20 and the radiation detector 30 can be expressed by coordinate values of three axes.
- the information on the coordinate values of the three axes is called the geometry information of the imaging device configured by the pair of the radiation source 20 and the radiation detector 30.
- the position at which a tumor (lesion) in the body of the patient P, located at any position in a specified three-dimensional coordinate system, is projected can be obtained from the position at which the radiation irradiated from the radiation source 20 passes through the body of the patient P and reaches the radiation detector 30.
- this correspondence between positions in the specified three-dimensional coordinate system and positions on the X-ray fluoroscopic image FI can be expressed as a projection matrix.
- the geometry information can be obtained from the installation positions of the radiation source 20 and radiation detector 30 designed when the treatment system 1 was installed, and the inclination of the radiation source 20 and radiation detector 30 relative to a reference direction in the treatment room.
- the geometry information can also be obtained from the installation positions of the radiation source 20 and radiation detector 30 measured using a three-dimensional measuring device or the like. By determining the projection matrix from the geometry information, the medical image processing device 100 can calculate at what position in the captured X-ray fluoroscopic image FI a tumor inside the body of patient P in three-dimensional space will be captured.
- a projection matrix can be obtained for each pair of radiation source 20 and radiation detector 30. This makes it possible to calculate coordinate values in a predetermined three-dimensional coordinate system that represent the position of the target area from the position (position in two-dimensional coordinate system) of the image of the target area captured in the two fluoroscopic images, in a manner similar to the principle of triangulation.
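The triangulation-style reconstruction described above, recovering a 3-D position from the target's positions in two fluoroscopic images via the two projection matrices, can be sketched with standard linear (DLT) triangulation. The projection matrices below are illustrative placeholders, not the system's actual geometry information.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Estimate a 3-D point from its projections in two fluoroscopic images.

    P1, P2   : 3x4 projection matrices (the "geometry information" of the
               two source/detector pairs)
    uv1, uv2 : (u, v) image coordinates of the same target in each image
    Uses standard linear (DLT) triangulation via the SVD null space.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                  # homogeneous solution of A X = 0
    return X[:3] / X[3]

# Two orthogonal viewing directions observing the point (1, 2, 3)
# (illustrative matrices, not a real treatment-room geometry).
P1 = np.array([[1., 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1]])  # projects (x, y)
P2 = np.array([[0., 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]])  # projects (z, y)
print(np.round(triangulate(P1, P2, (1.0, 2.0), (3.0, 2.0)), 6))  # [1. 2. 3.]
```

With more than two imaging devices, extra rows are simply appended to `A`, which is why the description notes that three or more source/detector pairs can be used.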
- the target area is a tumor (lesion), organ, bone, etc. in the body of the patient P.
- the target area may be a marker that has been placed in advance near a tumor in the body of the patient P.
- the marker is made of a material, such as metal, whose image is projected onto the X-ray fluoroscopic image FI by radiation r. Markers come in various shapes, such as a sphere, a rod, a wedge, etc. In the following description, the target area is assumed to be a marker.
- the treatment system 1 shown in FIG. 1 is configured to include two sets of radiation sources 20 and radiation detectors 30, in other words, two imaging devices, but the number of imaging devices included in the treatment system 1 is not limited to two.
- the treatment system 1 may include three or more imaging devices (three or more sets of radiation sources 20 and radiation detectors 30).
- the treatment beam irradiation gate 40 irradiates radiation as a treatment beam B to destroy a tumor (lesion), which is a site to be treated inside the body of the patient P.
- the treatment beam B is, for example, a heavy particle beam, an X-ray, an electron beam, a gamma ray, a proton beam, or a neutron beam.
- the treatment beam B is irradiated linearly from the treatment beam irradiation gate 40 to the patient P (for example, a tumor inside the body of the patient P).
- the irradiation control unit 41 controls the irradiation of the treatment beam B from the treatment beam irradiation gate 40 to the patient P.
- the irradiation control unit 41 causes the treatment beam irradiation gate 40 to irradiate the treatment beam B in response to a signal indicating the irradiation timing of the treatment beam B output by the medical image processing device 100.
- the treatment beam irradiation gate 40 is an example of an "irradiation unit."
- the irradiation control unit 41 is an example of an "irradiation control unit."
- the treatment system 1 shows a configuration including one fixed treatment beam irradiation gate 40, but the present invention is not limited to this, and the treatment system 1 may include multiple treatment beam irradiation gates.
- the treatment system 1 may further include a treatment beam irradiation gate that irradiates the treatment beam to the patient P from a horizontal direction.
- the treatment system 1 may be configured such that one treatment beam irradiation gate rotates around the patient P to irradiate the treatment beam to the patient P from various directions.
- the treatment beam irradiation gate 40 shown in FIG. 1 may be configured to be able to rotate 360 degrees around the horizontal Y rotation axis shown in FIG. 1.
- the treatment system 1 configured in this way is called a rotating gantry type treatment system.
- the radiation source 20 and the radiation detector 30 also rotate 360 degrees simultaneously around the same axis as the rotation axis of the treatment beam irradiation gate 40.
- the pair of the radiation source 20-1 and the radiation detector 30-1, and the pair of the radiation source 20-2 and the radiation detector 30-2, are arranged so that the radiation r-1 and the radiation r-2 intersect the treatment beam B at angles of ±45 degrees, that is, so that the radiation r-1 and the radiation r-2 are perpendicular to each other.
- the positions at which the treatment beam irradiation gate 40, the pair of the radiation source 20-1 and the radiation detector 30-1, and the pair of the radiation source 20-2 and the radiation detector 30-2 are arranged are not limited to the example shown in FIG. 1.
- the treatment system 1 may arrange the treatment beam irradiation gate 40 not only at a position where the treatment beam B is irradiated to the patient P from the vertical direction Z or the horizontal direction Y, but also at a position, or in a configuration, where the treatment beam B can be irradiated to the patient P from any angle, including at least 30, 60, 120, and 150 degrees, or where the irradiation angle or irradiation direction of the treatment beam B can be adjusted.
- the treatment system 1 may be configured not only to arrange a pair of the radiation source 20 and the radiation detector 30 at a position where the respective rays r are perpendicular to each other, but also to arrange a pair of the radiation source 20 and the radiation detector 30 at a position where the respective rays r form different angles with respect to the treatment beam B.
- the treatment system 1 may be configured such that, for example, a pair of radiation source 20-1 and radiation detector 30-1 is positioned so that radiation r-1 forms an angle of +30 degrees with respect to treatment beam B (the angle at which radiation r-1 is emitted from the 5 o'clock direction and incident in the 11 o'clock direction), and a pair of radiation source 20-2 and radiation detector 30-2 is positioned so that radiation r-2 forms an angle of -60 degrees with respect to treatment beam B (the angle at which radiation r-2 is emitted from the 8 o'clock direction and incident in the 2 o'clock direction).
- the medical image processing device 100 determines whether or not it is possible to track the position of a tumor that moves in sync with the breathing of the patient P, for example, based on the X-ray fluoroscopic image FI captured in the preparation stage for radiation therapy, and selects a method for tracking the tumor (tracking method).
- the medical image processing device 100 tracks the tumor moving inside the body of the patient P using the selected tracking method.
- the medical image processing device 100 tracks the tumor indirectly, for example, by detecting the position of a marker that has been placed in the body of the patient P beforehand.
- the medical image processing device 100 recognizes an image of a marker inside the body of the patient P (hereinafter referred to as a "marker image") projected onto the X-ray fluoroscopic image FI captured in the preparation stage for radiation therapy, and tracks the tumor moving inside the body of the patient P based on the position of this marker image in the X-ray fluoroscopic image FI.
- Tumor tracking in the medical image processing device 100 is not limited to a method that is performed indirectly by detecting the position of a marker (marker tracking method).
- Tumor tracking in the medical image processing device 100 may be, for example, a markerless tracking method that indirectly recognizes the position of the tumor based on the shape and movement of an organ near the tumor, that is, a tracking method that tracks the tumor without using a marker, or a method that recognizes the position of the tumor and tracks it directly.
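As a rough illustration of the template matching that the marker tracking method relies on, the following sketch locates a marker template in a frame by normalized cross-correlation. The data, marker shape, and function name are invented for the example; a production system would use an optimized implementation rather than this brute-force scan.

```python
import numpy as np

def match_template(image, template):
    """Find the top-left (row, col) at which the template best matches the
    image under normalized cross-correlation."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    tn = np.linalg.norm(t)
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * tn
            score = float((p * t).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# A cross-shaped "marker" embedded at row 3, col 4 of an otherwise empty frame.
marker = np.array([[0., 1., 0.], [1., 2., 1.], [0., 1., 0.]])
frame = np.zeros((8, 8))
frame[3:6, 4:7] = marker
print(match_template(frame, marker))  # (3, 4)
```

Running this per frame and stacking the returned positions yields exactly the marker trajectory the trajectory generation unit works from.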
- the medical image processing device 100 outputs a signal to the irradiation control unit 41 to indicate the irradiation timing for irradiating the tracked tumor with a predetermined treatment beam B.
- the medical image processing device 100 outputs information representing the current status to the display control unit 50 in order to present the status of detecting the position of the marker or tumor and the status of tracking the marker or tumor to the practitioner who performs radiation therapy using the treatment system 1, i.e., the user of the treatment system 1, such as a doctor or technician.
- treatment planning is performed, for example, several days to several weeks in advance.
- a three-dimensional computed tomography (CT) image is taken, and a digitally reconstructed radiograph (DRR) image is generated by virtually reconstructing an X-ray fluoroscopic image FI from this CT image.
- a region of interest (ROI) is determined for tracking the position of the treatment site (tumor) to which the treatment beam B is to be irradiated.
- the direction (irradiation direction) in which the treatment beam B is irradiated to the treatment site and the intensity (irradiation intensity) of the irradiated treatment beam B are also determined.
- the medical image processing device 100 performs various image processing when performing radiation therapy in the treatment system 1, in addition to processing to track the tumor and indicate the irradiation timing of the treatment beam B. For example, the medical image processing device 100 performs image processing for alignment to align the current position of the patient P so that the treatment beam B is irradiated with a direction and intensity of irradiation that have been determined in advance, such as at the stage of treatment planning. The medical image processing device 100 outputs images that have undergone each image processing and information obtained by each image processing to the corresponding components.
- the image processing for aligning the patient P in the treatment system 1 is the same as that in conventional treatment systems. Therefore, a detailed description of the configuration and processing of the image processing performed by the medical image processing device 100 to align the patient P will be omitted.
- the display control unit 50 causes the display device 51 to display images for presenting various information in the treatment system 1 to the user, including the state in which the tumor in the patient P is being tracked in the medical image processing device 100.
- the display control unit 50 causes the display device 51 to display, for example, information output by the medical image processing device 100 indicating the state in which the position of the marker or tumor is being detected, or the state in which the marker or tumor is being tracked.
- the display control unit 50 causes the display device 51 to display, for example, various images such as the captured X-ray fluoroscopic image FI, or images in which various information is superimposed on these images.
- the display device 51 is, for example, a display device such as a liquid crystal display (LCD).
- the user of the treatment system 1 can obtain information for performing radiation therapy using the treatment system 1 by visually checking the image displayed on the display device 51.
- the treatment system 1 may be configured to include a user interface such as an operation unit (not shown) operated by the user of the treatment system 1, and to manually operate various functions executed by the treatment system 1.
- the medical image processing device 100, the above-mentioned "imaging device" consisting of a set of the radiation source 20 and the radiation detector 30, the irradiation control unit 41, and the display control unit 50 may be combined to form a "medical device."
- the "medical device" may be configured to include a user interface such as the above-mentioned operation unit (not shown) in addition to the medical image processing device 100, the "imaging device," the irradiation control unit 41, and the display control unit 50.
- the "medical device" may further be configured to be integrated with the display device 51.
- FIG. 2 is a block diagram showing an example of the configuration of the medical image processing device 100 of the first embodiment.
- the medical image processing device 100 includes an image acquisition unit 101, a trajectory generation unit 102, a selection unit 103, and a tracking unit 104.
- FIG. 2 also shows an irradiation control unit 41 and a display control unit 50 (including a display device 51) connected to the medical image processing device 100.
- Some or all of the components of the medical image processing device 100 are realized, for example, by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), ASIC (Application Specific Integrated Circuit), FPGA (Field-Programmable Gate Array), or GPU (Graphics Processing Unit), or may be realized by a combination of software and hardware. Some or all of the functions of these components may be realized by a dedicated LSI.
- the program (software) may be stored in advance in a storage device (a storage device with a non-transitory storage medium) such as a ROM (Read Only Memory), a RAM (Random Access Memory), a semiconductor memory element such as a flash memory, or an HDD (Hard Disk Drive) provided in the medical image processing device 100. Alternatively, it may be stored in a removable storage medium (non-transitory storage medium) such as a DVD or CD-ROM, and installed in the storage device of the medical image processing device 100 by attaching the storage medium to a drive device provided in the medical image processing device 100.
- the program (software) may be downloaded from another computer device via a network and installed in the storage device provided in the medical image processing device 100.
- the image acquisition unit 101 acquires X-ray fluoroscopic images FI of the inside of the body of the patient P currently fixed to the treatment table 10 in the treatment room to which the treatment system 1 is applied. At this time, the image acquisition unit 101 continuously acquires multiple frames of X-ray fluoroscopic images FI output by the radiation detector 30. In other words, the image acquisition unit 101 acquires two moving images simultaneously captured from different directions by the imaging device. The image acquisition unit 101 continuously acquires X-ray fluoroscopic images FI for at least a period corresponding to one respiratory cycle of the patient P.
- the image acquisition unit 101 when acquiring the X-ray fluoroscopic images FI of the patient P, the image acquisition unit 101 also acquires geometry information of each X-ray fluoroscopic image FI, and associates each acquired X-ray fluoroscopic image FI with the geometry information.
- the image acquisition unit 101 may acquire each X-ray fluoroscopic image FI and geometry information from each radiation detector 30 via a LAN or WAN.
- the image acquisition unit 101 outputs each X-ray fluoroscopic image FI (multiple frames of X-ray fluoroscopic images FI) associated with geometry information to the trajectory generation unit 102 and the tracking unit 104.
- the trajectory generating unit 102 recognizes the marker images inside the body of the patient P projected within a range designated by the user of the treatment system 1 in each X-ray fluoroscopic image FI output by the image acquiring unit 101.
- the range in which the trajectory generating unit 102 recognizes the marker images may be, for example, the range of the ROI determined at the stage of treatment planning, or may be a predetermined range including the marker images in the DRR image generated at the time of treatment planning.
- the trajectory generating unit 102 recognizes the marker images within the designated range, for example, by template matching using multiple templates of markers prepared in advance.
- the trajectory generating unit 102 generates a trajectory of the marker that moves in synchronization with the breathing of the patient P based on the position of the marker image recognized in each X-ray fluoroscopic image FI. More specifically, the trajectory generating unit 102 sequentially connects the positions of the marker images recognized in each X-ray fluoroscopic image FI to generate a trajectory of the marker for a period of at least the length of one breathing cycle of the patient P.
- FIGS. 3A to 3D are diagrams showing an example of a template used for recognizing a marker in the medical image processing apparatus 100 (more specifically, the trajectory generating unit 102) of the first embodiment.
- FIGS. 3A to 3D show an example of a template for recognizing a bar-shaped marker.
- the template is a two-dimensional image in which a marker image that is expected to be projected on an X-ray fluoroscopic image FI is rotated at multiple angles on the image plane.
- the template shown in FIG. 3A is an example of a template for recognizing a marker projected in the horizontal direction (this angle is set to 0 [deg]) in the X-ray fluoroscopic image FI.
- the template shown in FIG. 3B is an example of a template for recognizing a marker that is rotated and projected in the right direction (i.e., the right end is projected with an upward tilt of 45 [deg]) in the X-ray fluoroscopic image FI.
- the template shown in FIG. 3C is an example of a template for recognizing a marker that is further rotated to the right and projected vertically in the X-ray fluoroscopic image FI (i.e., the right end is projected with an upward inclination of 90 [deg]).
- the template shown in FIG. 3D is an example of a template for recognizing a marker that is further rotated to the right and projected in the X-ray fluoroscopic image FI (i.e., the right end is projected with an upward inclination of 135 [deg]).
- the templates shown in FIG. 3A to FIG. 3D are merely examples, and there are many more templates for tracking markers. For example, some templates corresponding to rod-shaped markers recognize markers that are rotated and projected in the depth direction. Similarly, there are multiple templates corresponding to markers of different shapes (other than spherical markers) that represent states rotated in various directions.
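As a rough illustration of such a template bank, the sketch below renders a bar-shaped marker template at the four in-plane rotation angles of FIGS. 3A to 3D. The bar dimensions, template size, and rendering approach are assumptions for illustration, not taken from the embodiment.

```python
import numpy as np

def make_rotated_bar(angle_deg, length=15, thickness=3, size=21):
    """Render a bar-shaped marker template rotated by angle_deg in the image
    plane (hypothetical sizes: a 15x3-pixel bar in a 21x21 template)."""
    c = (size - 1) / 2.0
    yy, xx = np.mgrid[0:size, 0:size]
    t = np.deg2rad(angle_deg)
    # Rotate each pixel's coordinates back into the bar's own frame.
    u = (xx - c) * np.cos(t) + (yy - c) * np.sin(t)
    v = -(xx - c) * np.sin(t) + (yy - c) * np.cos(t)
    return ((np.abs(u) <= length / 2) & (np.abs(v) <= thickness / 2)).astype(float)

# One template per in-plane angle, as in FIGS. 3A (0 [deg]) to 3D (135 [deg]).
templates = {a: make_rotated_bar(a) for a in (0, 45, 90, 135)}
```

A fuller bank would also add templates for rotation in the depth direction (which shortens the projected bar) and for other marker shapes, as the text notes.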
- the trajectory generating unit 102 recognizes the marker image by comparing the image projected within the range specified for each X-ray fluoroscopic image FI with each template as shown in Figures 3A to 3D (template matching). Then, when a marker is projected on the X-ray fluoroscopic image FI, the trajectory generating unit 102 can obtain the position of the marker on the image plane of the X-ray fluoroscopic image FI. In other words, the trajectory generating unit 102 can obtain the position of the marker in a two-dimensional coordinate system.
- the position of the marker on the X-ray fluoroscopic image FI can be represented, for example, by the position of the pixels constituting the X-ray fluoroscopic image FI.
- the pixel representing the position of the marker on the X-ray fluoroscopic image FI is, for example, the pixel at the center position of the marker image recognized by the trajectory generating unit 102 on the X-ray fluoroscopic image FI.
- there are also cases where the entire marker is not projected as a marker image on the X-ray fluoroscopic image FI, that is, where only a part of the marker is projected as a marker image.
- the pixel representing the position of the marker may be a pixel at the center or edge of a portion of the marker image projected onto the X-ray fluoroscopic image FI, or it may be a pixel corresponding to the center of a template used by the trajectory generating unit 102 when recognizing a marker image in the X-ray fluoroscopic image FI.
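A brute-force sketch of the template matching described above: every placement of a template inside the designated range is scored against the image, and the center pixel of the best-scoring placement is taken as the marker position. Normalized cross-correlation is used here as the similarity score; the embodiment does not specify which score is actually used.

```python
import numpy as np

def match_template(image, template):
    """Brute-force normalized cross-correlation. Returns (score, (row, col)),
    where (row, col) is the center pixel of the best-matching patch, i.e. the
    marker position on the image plane."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t * t).sum()) + 1e-12
    best, best_pos = -1.0, (0, 0)
    H, W = image.shape
    for r in range(H - th + 1):
        for c in range(W - tw + 1):
            p = image[r:r + th, c:c + tw]
            p = p - p.mean()
            pn = np.sqrt((p * p).sum()) + 1e-12
            score = float((p * t).sum() / (pn * tn))
            if score > best:
                best, best_pos = score, (r + th // 2, c + tw // 2)
    return best, best_pos
```

In practice this would be run once per template in the bank, keeping the template with the highest score, which is how a template identity can be attached to each recognized position.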
- FIG. 4 is a diagram showing an example of a process for recognizing the position of a marker in the medical image processing device 100 (more specifically, the trajectory generating unit 102) of the first embodiment.
- FIG. 4 shows an example of a case where the position of a marker in the three-dimensional coordinate system defined in the treatment room is determined using projection matrices based on the geometry information.
- a symbol representing a coordinate in the three-dimensional coordinate system with "~" attached denotes the vector representation of the position of that coordinate.
- the constants λ1, λ2, P~1, and P~2 are matrices of fixed values.
- each of λ1 and λ2 is a 1×3 matrix
- each of P~1 and P~2 is a 3×4 projection matrix representing the three-dimensional coordinates of the treatment room.
- the trajectory generating unit 102 obtains the two-dimensional marker position (coordinate m1 and coordinate m2) on each X-ray fluoroscopic image FI, and performs calculations to obtain the three-dimensional marker position O from the two simultaneously captured X-ray fluoroscopic images FI, for one respiratory cycle of the patient P.
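The calculation of the three-dimensional position O from the two 2-D marker positions can be sketched as linear triangulation, assuming the standard pinhole relation in which each view's 3×4 projection matrix (from the geometry information) maps the homogeneous 3-D position to the homogeneous image point up to scale. The embodiment does not spell out its computation; this is one common realization.

```python
import numpy as np

def triangulate(m1, m2, P1, P2):
    """Linear (DLT) triangulation: recover the 3-D position O from its 2-D
    positions m1, m2 in two simultaneously captured fluoroscopic images,
    given each view's 3x4 projection matrix P1, P2."""
    A = np.vstack([
        m1[0] * P1[2] - P1[0],   # x-constraint from view 1
        m1[1] * P1[2] - P1[1],   # y-constraint from view 1
        m2[0] * P2[2] - P2[0],   # x-constraint from view 2
        m2[1] * P2[2] - P2[1],   # y-constraint from view 2
    ])
    _, _, vt = np.linalg.svd(A)
    O_h = vt[-1]                 # homogeneous solution, defined up to scale
    return O_h[:3] / O_h[3]
```

With noisy detections the two rays do not intersect exactly, and the SVD solution is the least-squares compromise between them.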
- the trajectory generating unit 102 then sequentially connects the two-dimensional and three-dimensional marker positions obtained from each of the continuously captured X-ray fluoroscopic images FI to generate a marker trajectory having a length equivalent to one respiratory cycle of the patient P.
- there may be a case where the trajectory generating unit 102 cannot obtain the two-dimensional marker position in, for example, one of the frames of the continuously captured X-ray fluoroscopic images FI.
- in this case, the trajectory generating unit 102 may complement the position of the marker in a frame in which it was not obtained, using the positions of the marker obtained in other frames, but only when the period during which the continuity is interrupted is short.
- the trajectory generating unit 102 may use the position of the marker obtained in the previous frame as the position of the marker to be obtained in a frame in which the position of the marker was not obtained, or may use the intermediate position between the positions of the marker obtained in the previous frame and the frame after that as the position of the marker to be obtained in a frame in which the position of the marker was not obtained.
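The complementation options just described (carrying the previous frame's position forward, or taking the intermediate position between the surrounding frames) might look like the following sketch. The gap-length limit and the tuple representation of positions are assumptions.

```python
import numpy as np

def fill_missing_positions(positions, max_gap=2):
    """Complement marker positions in frames where recognition failed (None),
    but only when the run of missing frames is short (<= max_gap).
    Missing frames between two detections are interpolated linearly (a single
    missing frame thus gets the intermediate position); a trailing gap reuses
    the previous frame's position."""
    filled = list(positions)
    n = len(filled)
    i = 0
    while i < n:
        if filled[i] is None and i > 0:
            j = i
            while j < n and filled[j] is None:
                j += 1
            gap = j - i
            if gap <= max_gap:
                prev = np.asarray(filled[i - 1], dtype=float)
                nxt = np.asarray(filled[j], dtype=float) if j < n else prev
                for k in range(i, j):
                    t = (k - i + 1) / (gap + 1)
                    filled[k] = tuple(prev + t * (nxt - prev))
            i = j
        else:
            i += 1
    return filled
```

Longer gaps are deliberately left unfilled, matching the text's condition that complementation is applied only when the interruption is short.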
- FIGS. 5A and 5B are diagrams showing an example of the trajectory of a marker whose position is recognized by the medical image processing device 100 (more specifically, the trajectory generating unit 102) of the first embodiment.
- FIG. 5A and FIG. 5B each show an example of a trajectory generated by connecting, in frame order, that is, in chronological order, the positions of the markers recognized in the X-ray fluoroscopic images FI captured from one direction.
- FIG. 5A and FIG. 5B also correspond to an example of a trajectory generated by connecting, in frame order (chronological order), the marker positions in one direction of the three-dimensional position O, that is, in one of the X-axis, Y-axis, or Z-axis directions, when the trajectory generating unit 102 performs a calculation to obtain the three-dimensional position O of the marker from two simultaneously captured X-ray fluoroscopic images FI.
- in the trajectory shown in FIG. 5A, the position of the marker (the position or coordinates of a pixel) changes periodically. From this trajectory, it can be considered that the patient P is breathing in a calm state.
- in this case, the treatment system 1 can suitably irradiate the tumor (lesion) with the treatment beam B at a timing based on the trajectory of the marker.
- on the other hand, the trajectory shown in FIG. 5B is one in which the marker position (pixel position or coordinates) changes drastically.
- in this case, it cannot be expected that the treatment system 1 would be able to suitably irradiate the tumor (lesion) with the treatment beam B.
- for this reason, it may be necessary to wait until the trajectory of the marker position changes periodically like the trajectory shown in FIG. 5A before starting radiation therapy, or to switch to a method of tracking the tumor without using a marker, such as a markerless tracking method, in order to suitably irradiate the tumor with the treatment beam B.
- the trajectory generation unit 102 outputs information representing the generated trajectories of each marker to the selection unit 103. More specifically, the trajectory generation unit 102 outputs to the selection unit 103 information representing the trajectory (change in pixel position) of the marker for each template obtained in a two-dimensional coordinate system for each X-ray fluoroscopic image FI, and information representing the trajectory (change in position O) of the marker for each template obtained in a three-dimensional coordinate system defined in the treatment room for two simultaneously captured X-ray fluoroscopic images FI.
- the information representing the marker trajectories output by the trajectory generation unit 102 also includes information representing the template used in template matching.
- the information representing the template used in template matching is, for example, identification information uniquely assigned to each template.
- the selection unit 103 determines whether or not a method of tracking a marker in radiation therapy, that is, a method of indirectly tracking a tumor, can be adopted based on the information representing the trajectory of each marker output by the trajectory generation unit 102, and selects a tracking method for tracking the marker or tumor based on the determination result.
- the selection unit 103 determines whether or not a method of tracking a marker or tumor can be adopted in radiation therapy, for example, by a classifier using machine learning.
- as the classifier using machine learning, for example, a classification model such as a random forest, a decision tree, a support vector machine (SVM), a k-nearest neighbors algorithm (k-NN), or logistic regression is used.
- the classification model is a trained model that is trained in advance using, for example, an AI (Artificial Intelligence) function, by preparing multiple trajectories in a two-dimensional coordinate system or a three-dimensional coordinate system, for which it is known whether or not the marker can be tracked.
- the classifier corresponding to the trajectory in the two-dimensional coordinate system and the classifier corresponding to the trajectory in the three-dimensional coordinate system may use the same classification model or different classification models.
- the classifier corresponding to the trajectory obtained from the X-ray fluoroscopic image FI-1 and the classifier corresponding to the trajectory obtained from the X-ray fluoroscopic image FI-2 may use the same classification model or different classification models.
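As a concrete stand-in for such a trained classification model, the sketch below fits a k-nearest-neighbor classifier on trajectories whose trackability is known, using the peak of the normalized autocorrelation as a hand-made periodicity feature. The feature, the synthetic training data, and the choice of k-NN are all illustrative assumptions; any of the models listed above could take its place.

```python
import numpy as np

def periodicity_feature(traj, min_lag=5, max_lag=40):
    """Peak of the normalized autocorrelation at a nonzero lag: close to 1
    for a periodic, breathing-synchronized trajectory, low for an erratic one."""
    x = np.asarray(traj, dtype=float)
    x = x - x.mean()
    denom = float(np.dot(x, x)) + 1e-12
    return max(float(np.dot(x[:-k], x[k:])) / denom for k in range(min_lag, max_lag))

class KNNTrackabilityClassifier:
    """Minimal k-NN classifier standing in for the trained classification
    model (random forest, SVM, k-NN, logistic regression, ...)."""
    def __init__(self, k=3):
        self.k = k

    def fit(self, trajectories, labels):      # label 1 = trackable, 0 = not
        self.feats = np.array([periodicity_feature(t) for t in trajectories])
        self.labels = np.asarray(labels)
        return self

    def predict(self, traj):
        d = np.abs(self.feats - periodicity_feature(traj))
        nearest = self.labels[np.argsort(d)[:self.k]]
        return int(round(nearest.mean()))
```

Separate instances of such a classifier could be trained for two-dimensional and three-dimensional trajectories, matching the text's note that the same or different classification models may be used.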
- the selection unit 103 executes a process of determining whether tracking is possible in three dimensions based on the trajectory of the marker in the three-dimensional coordinate system output by the trajectory generation unit 102, and a process of determining whether tracking is possible in two dimensions based on the trajectory of the marker in the two-dimensional coordinate system output by the trajectory generation unit 102.
- the process of determining whether tracking is possible in three dimensions and the process of determining whether tracking is possible in two dimensions in the selection unit 103 may be performed simultaneously, or one of the processes of determining whether tracking is possible may be performed first and the other process of determining whether tracking is possible later.
- if the marker trajectory information obtained in the three-dimensional coordinate system output by the trajectory generation unit 102 indicates that the trajectory changes periodically, the selection unit 103 selects a tracking method for tracking the marker in the three-dimensional coordinate system (hereinafter referred to as "three-dimensional tracking"). If that information indicates that the trajectory changes drastically in any direction of the three-dimensional position O, for example as shown in FIG. 5B, the selection unit 103 checks the marker trajectory information obtained in the two-dimensional coordinate system output by the trajectory generation unit 102.
- if the marker trajectory obtained in the two-dimensional coordinate system changes periodically in either X-ray fluoroscopic image FI, the selection unit 103 selects a tracking method that tracks the marker in the two-dimensional coordinate system in this X-ray fluoroscopic image FI (hereinafter referred to as "two-dimensional tracking").
- if neither marker trajectory changes periodically, the selection unit 103 selects a tracking method that does not use a marker (hereinafter referred to as "external respiratory synchronization tracking").
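The selection order described above (three-dimensional tracking first, then two-dimensional tracking in whichever image shows a periodic trajectory, then the markerless fallback) might be expressed as the following sketch; the string names are hypothetical.

```python
def select_tracking_method(periodic_3d, periodic_2d_fi1, periodic_2d_fi2):
    """Return the tracking method in the order of preference described above."""
    if periodic_3d:
        return "three-dimensional tracking"
    if periodic_2d_fi1:
        return "two-dimensional tracking (FI-1)"
    if periodic_2d_fi2:
        return "two-dimensional tracking (FI-2)"
    return "external respiratory synchronization tracking"
```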
- the selection unit 103 outputs information representing the selected tracking method to the tracking unit 104. If the selection unit 103 selects three-dimensional tracking or two-dimensional tracking, it also outputs information about the template used by the trajectory generation unit 102 for template matching to the tracking unit 104.
- the tracking unit 104 tracks the marker images inside the body of the patient P that are projected within a range specified by the user of the treatment system 1 in each X-ray fluoroscopic image FI output by the image acquisition unit 101, using the tracking method output by the selection unit 103. At this time, the tracking unit 104 tracks the marker images by template matching using the template represented by the template information output by the selection unit 103.
- the tracking unit 104 outputs information representing the current state in which the marker image is being tracked to the display control unit 50.
- the display control unit 50 generates a display image for presenting the current state in which the marker position is detected and tracked, and displays the generated display image on the display device 51, thereby presenting the current state of the treatment system 1 to the user.
- the tracking unit 104 generates a signal for instructing the irradiation timing for irradiating the tumor with the treatment beam B based on the tracked marker image, and outputs the generated signal to the irradiation control unit 41.
- the irradiation control unit 41 controls the irradiation of the treatment beam B at the treatment beam irradiation gate 40 so that the treatment beam B is irradiated at the irradiation timing output by the tracking unit 104.
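Respiratory-gated irradiation of this kind is often implemented by permitting the beam only while the tracked position lies inside a gating window around the planned position; the spherical window and its parameters below are hypothetical, not taken from the embodiment.

```python
import math

def irradiation_timing(tracked_positions, gate_center, gate_radius):
    """For each frame's tracked marker position, decide whether the treatment
    beam may be emitted: True only while the marker lies inside a spherical
    gating window around the planned (gate center) position."""
    return [math.dist(p, gate_center) <= gate_radius for p in tracked_positions]
```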
- the trajectory generating unit 102 generates a trajectory of the part of interest (the marker in the above example) based on the X-ray fluoroscopic image FI captured in the preparation stage for radiation therapy. Then, in the medical image processing device 100, the selection unit 103 selects a tracking method for tracking the part of interest based on the trajectory of the part of interest generated by the trajectory generating unit 102. Furthermore, in the medical image processing device 100, the tracking unit 104 tracks the part of interest using the tracking method selected by the selection unit 103, and outputs information for presenting to the user the state in which the part of interest is being tracked, and a signal indicating the irradiation timing of the treatment beam B.
- with the treatment system 1 equipped with the medical image processing device 100, it is possible to present to the user whether or not the marker can be tracked (that is, whether the tumor can be indirectly tracked) using the marker tracking method, and to irradiate the tumor with the treatment beam B at an appropriate timing.
- FIG. 6 is a flowchart showing the flow of operation in the medical image processing device 100 of the first embodiment.
- the image acquisition unit 101 acquires each of the first frame X-ray fluoroscopic images FI output by the radiation detector 30 (step S100).
- the image acquisition unit 101 associates each acquired X-ray fluoroscopic image FI with the corresponding geometry information and outputs them to each of the trajectory generation unit 102 and the tracking unit 104.
- the trajectory generating unit 102 recognizes the marker images inside the body of the patient P that are projected within the range specified by the user in each X-ray fluoroscopic image FI of the first frame output by the image acquiring unit 101 (step S110). Then, the trajectory generating unit 102 generates a marker trajectory with the position of the marker image recognized in each X-ray fluoroscopic image FI set as the initial position (step S120).
- the trajectory generating unit 102 checks whether the generation of the marker trajectories for one breathing cycle of the patient P has been completed (step S130). If it is confirmed in step S130 that the generation of the marker trajectories for one breathing cycle of the patient P has not been completed, the trajectory generating unit 102 returns the process to step S100.
- the image acquiring unit 101 acquires the next frame of the X-ray fluoroscopic image FI in step S100, and the trajectory generating unit 102 recognizes the marker image projected on the next frame in step S110, and generates a marker trajectory with the position of the recognized marker image as the next position in step S120. In this way, in the medical image processing device 100, the image acquiring unit 101 and the trajectory generating unit 102 each repeat the process to generate the marker trajectory for one breathing cycle of the patient P.
- on the other hand, if it is confirmed in step S130 that the generation of the marker trajectories for one respiratory cycle of patient P has been completed, the trajectory generation unit 102 outputs each piece of information representing the positions of the marker images projected onto each X-ray fluoroscopic image FI as marker trajectories to the selection unit 103, and the process proceeds to step S200.
- the selection unit 103 then executes a process of determining whether or not it is possible to employ a method of tracking a marker in radiation therapy based on the information representing the trajectory of each marker output by the trajectory generation unit 102 (step S200).
- here, the operation of the process in step S200 of determining whether or not it is possible to track a marker in radiation therapy will be described in more detail.
- FIG. 7 is a flowchart showing the flow of the operation of determining whether or not it is possible to track a marker in the selection unit 103 provided in the medical image processing device 100 of the first embodiment.
- FIG. 7 shows an example of the flow of the operation when performing a process of determining whether or not it is possible to track a marker in three dimensions (a process of determining whether or not it is possible to track a marker in three dimensions) based on the trajectory of the marker in the three-dimensional coordinate system output by the trajectory generation unit 102.
- the selection unit 103 uses a corresponding classification model for the information on the marker trajectory calculated in a three-dimensional coordinate system output by the trajectory generation unit 102, and determines whether three-dimensional tracking is possible.
- first, the selection unit 103 checks whether the trajectory of the marker changes periodically in all directions of the three-dimensional position O, that is, the X-axis direction, the Y-axis direction, and the Z-axis direction (step S210). This check in the process of step S210 is performed for each trajectory generated by the trajectory generation unit 102, in other words, for each template used by the trajectory generation unit 102 in template matching to generate the marker trajectory.
- if it is confirmed in step S210 that any one of the marker trajectories changes periodically in all directions, the selection unit 103 determines that marker tracking can be performed using the marker trajectory in the three-dimensional coordinate system in which all directions change periodically, that is, that three-dimensional tracking is possible (step S212). The selection unit 103 then returns to the previous process.
- on the other hand, if it is confirmed in step S210 that none of the marker trajectories changes periodically in all directions, the selection unit 103 determines that marker tracking cannot be performed using the marker trajectories in the three-dimensional coordinate system, that is, that three-dimensional tracking is not possible (step S214). The selection unit 103 then returns to the previous process.
- the selection unit 103 performs the process of determining whether or not three-dimensional tracking is possible.
- the selection unit 103 performs the process of determining whether or not tracking is possible in three dimensions based on the trajectory of the marker in the three-dimensional coordinate system output by the trajectory generation unit 102, and the process of determining whether or not tracking is possible in two dimensions based on the trajectory of the marker in the two-dimensional coordinate system output by the trajectory generation unit 102.
- the selection unit 103 may perform the process of determining whether or not tracking is possible in three dimensions and the process of determining whether or not tracking is possible in two dimensions at the same time.
- similarly, the process of determining whether or not two-dimensional tracking is possible for the X-ray fluoroscopic image FI-1 and the process of determining whether or not two-dimensional tracking is possible for the X-ray fluoroscopic image FI-2 may be performed at the same time.
- the process of determining whether or not tracking is possible in each case can be performed by replacing three dimensions with two dimensions in the process of determining whether or not three-dimensional tracking is possible shown in FIG. 7.
- FIG. 8 is a flowchart showing the flow of another tracking feasibility determination operation in the selection unit 103 provided in the medical image processing device 100 of the first embodiment.
- FIG. 8 shows an example of the flow of the operation when the two-dimensional tracking feasibility determination process (two-dimensional tracking feasibility determination process) is performed sequentially for each X-ray fluoroscopic image FI based on the trajectory of the marker in the two-dimensional coordinate system output by the trajectory generation unit 102.
- the selection unit 103 uses a corresponding classification model for the information on the marker trajectory calculated in a two-dimensional coordinate system for each X-ray fluoroscopic image FI output by the trajectory generation unit 102, and determines whether two-dimensional tracking is possible. First, the selection unit 103 checks whether the marker trajectory based on the two-dimensional pixel positions calculated for one of the X-ray fluoroscopic images FI (here, X-ray fluoroscopic image FI-1) includes a trajectory that changes periodically (step S220).
- the check in the process of step S220 is performed for each trajectory in the X-ray fluoroscopic image FI-1 generated by the trajectory generation unit 102, in other words, for each template used by the trajectory generation unit 102 in template matching to generate the trajectory of the marker projected onto the X-ray fluoroscopic image FI-1.
- if it is confirmed in step S220 that the trajectory of any one of the markers changes periodically, the selection unit 103 determines that marker tracking can be performed using the trajectory of the marker in the two-dimensional coordinate system that changes periodically in this X-ray fluoroscopic image FI-1, that is, that two-dimensional tracking is possible (step S222). The selection unit 103 then returns to the previous process.
- on the other hand, if no periodically changing trajectory is confirmed in step S220, the selection unit 103 checks whether the marker trajectories based on the two-dimensional pixel positions determined for the X-ray fluoroscopic image FI in the other direction (here, X-ray fluoroscopic image FI-2) include a trajectory that changes periodically (step S224).
- This check in the processing of step S224 is also performed for each trajectory in the X-ray fluoroscopic image FI-2 generated by the trajectory generation unit 102, in other words, for each template used in template matching by the trajectory generation unit 102 to generate the trajectory of the marker projected onto the X-ray fluoroscopic image FI-2.
- if it is confirmed in step S224 that the trajectory of any one of the markers changes periodically, the selection unit 103 determines that marker tracking can be performed using the trajectory of the marker in the two-dimensional coordinate system that changes periodically in this X-ray fluoroscopic image FI-2, that is, that two-dimensional tracking is possible (step S222). The selection unit 103 then returns to the previous process.
- on the other hand, if no periodically changing trajectory is confirmed in step S224, the selection unit 103 determines that marker tracking cannot be performed using the marker trajectories in the two-dimensional coordinate system in either the X-ray fluoroscopic image FI-1 or the X-ray fluoroscopic image FI-2, that is, that two-dimensional tracking is not possible (step S226).
- the selection unit 103 then returns to the previous process.
- the selection unit 103 performs the process of determining whether or not two-dimensional tracking is possible.
- the selection unit 103 performs the process of determining whether or not tracking is possible in three dimensions based on the trajectory of the marker in the three-dimensional coordinate system output by the trajectory generation unit 102, and the process of determining whether or not tracking is possible in two dimensions based on the trajectory of the marker in the two-dimensional coordinate system output by the trajectory generation unit 102.
- the selection unit 103 may perform the process of determining whether or not three-dimensional tracking is possible first as shown in FIG. 7 and perform the process of determining whether or not two-dimensional tracking is possible later as shown in FIG. 8, or may perform the process of determining whether or not two-dimensional tracking is possible first as shown in FIG. 8 and perform the process of determining whether or not three-dimensional tracking is possible later as shown in FIG. 7.
- when the selection unit 103 performs the process of determining whether three-dimensional tracking is possible first and the process of determining whether two-dimensional tracking is possible later, if it is determined that three-dimensional tracking can be performed, the process of determining whether two-dimensional tracking is possible need not be performed. In other words, the selection unit 103 may return the process at that point.
- conversely, when the process of determining whether two-dimensional tracking is possible is performed first, the process of determining whether three-dimensional tracking is possible may still be performed.
- the selection unit 103 selects a marker tracking method based on the results of the process of determining whether tracking is possible in three dimensions and the process of determining whether tracking is possible in two dimensions (step S300).
- for example, if the result of either the process of determining whether tracking is possible in three dimensions or the process of determining whether tracking is possible in two dimensions indicates that tracking is possible, the selection unit 103 selects that tracking method as the tracking method of the marker. On the other hand, if the results of both determination processes indicate that tracking is not possible, the selection unit 103 selects, as the tracking method, a markerless tracking method (external respiratory synchronization tracking) that indirectly or directly determines and tracks the position of the tumor without using a marker. The selection unit 103 outputs information representing the selected tracking method to the tracking unit 104.
- the tracking unit 104 then tracks the marker image and tumor projected on each X-ray fluoroscopic image FI output by the image acquisition unit 101 using the tracking method output by the selection unit 103 (step S400). Then, when radiation therapy is started, the tracking unit 104 outputs a signal to the irradiation control unit 41 to instruct the irradiation timing for irradiating the tumor with the treatment beam B, and causes the treatment beam irradiation gate 40 to irradiate the treatment beam B (step S410). Thereafter, the tracking unit 104 repeats tracking the marker image and tumor in the process of step S400 and controlling the irradiation of the treatment beam B in step S410 until the radiation therapy is completed (until it is confirmed in step S420 that the radiation therapy is completed).
- the processing and operation (control) in the radiation therapy stage in the treatment system 1 are the same as those in the conventional treatment system. Therefore, a detailed description of the processing and operation (control) performed by the medical image processing device 100 in the radiation therapy stage will be omitted.
- the trajectory generation unit 102 generates a trajectory of the part of interest (the marker in the above example) based on the X-ray fluoroscopic image FI captured in the preparation stage for radiation therapy. Then, in the medical image processing device 100, the selection unit 103 determines whether the part of interest can be tracked for each template used in template matching to generate the trajectory of the part of interest based on the trajectory of the part of interest generated by the trajectory generation unit 102, and selects a tracking method for tracking the part of interest. After that, in the medical image processing device 100, the tracking unit 104 tracks the part of interest using the tracking method selected by the selection unit 103, and controls the irradiation of the treatment beam B when radiation therapy is started.
- the trajectory generating unit 102 recognizes a portion of interest projected on the X-ray fluoroscopic image FI and generates a trajectory for tracking the portion of interest, but the portion of interest recognized to generate a trajectory is not limited to the portion of interest projected on the X-ray fluoroscopic image FI.
- the trajectory generating unit 102 may generate a trajectory of the portion of interest based on any image, such as a CT image or a DRR image, as long as the portion of interest inside the body of the patient P can be recognized.
- the trajectory generating unit 102 recognizes a portion of interest by template matching, but the portion of interest may be recognized by a method other than template matching.
- the trajectory generating unit 102 may recognize the area of interest by detecting a contour using the gradient of brightness in the area of interest in the X-ray fluoroscopic image FI or the DRR image as a feature, or may recognize the area of interest using a deep learning model that has learned the features of the area of interest using deep learning, which is a type of machine learning.
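As one concrete illustration of recognizing the area of interest by template matching, a brute-force zero-mean normalized cross-correlation search can be written as below. This is a sketch for intuition only, not the patent's implementation; a real system would typically use an optimized matching routine:

```python
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray):
    """Slide the template over the image and return the (row, col) of the
    best zero-mean normalized cross-correlation score, plus the score."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.linalg.norm(wz) * t_norm
            # Flat windows (denom == 0) carry no signal; score them 0.
            score = float((wz * t).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```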
- the configuration is shown in which the trajectory of the part of interest generated by the trajectory generation unit 102 is input to the classifier to determine whether or not the marker or tumor can be tracked; however, the trackability determination by the classifier is not limited to a process based on the trajectory of the part of interest.
- the trajectory generation unit 102 may output the position of the part of interest (position of the pixel) obtained in a two-dimensional coordinate system or the position of the part of interest obtained in a three-dimensional coordinate system to the selection unit 103 without generating the trajectory of the part of interest, and the selection unit 103 may input the displacement of the position of the part of interest in each of the two-dimensional coordinate system and the three-dimensional coordinate system (the difference in position between two consecutive frames) to the classifier to perform the process of determining whether or not the marker or tumor can be tracked.
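The displacement input described here (the difference in position between two consecutive frames) is a simple first difference over the position sequence; a sketch with an illustrative function name:

```python
import numpy as np

def displacement_features(positions) -> np.ndarray:
    """Frame-to-frame displacements of the part of interest: the
    difference between each pair of consecutive positions, suitable
    as input features for a trackability classifier."""
    return np.diff(np.asarray(positions, dtype=float), axis=0)
```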
- the process and operation (control) of the trajectory generation unit 102 and the selection unit 103, as well as the process and operation (control) of each of the components of the medical image processing device 100 may be equivalent to the process and operation (control) in the first embodiment described above.
- the trajectory generation unit 102 and the selection unit 103 terminate their respective processes and operations (control), but they may be configured to continue their processes and operations (control). In this case, even if it becomes difficult to track the area of interest using the selected tracking method during radiation therapy, it becomes possible to switch to a different tracking method more quickly.
- the configuration of the treatment system equipped with the medical image processing device of the second embodiment is a configuration in which the medical image processing device 100 of the treatment system 1 equipped with the medical image processing device 100 of the first embodiment shown in Fig. 1 is replaced with a medical image processing device of the second embodiment (hereinafter referred to as "medical image processing device 200").
- the treatment system equipped with the medical image processing device 200 will be referred to as "treatment system 2".
- components of the treatment system 2 equipped with the medical image processing device 200 that are similar to the components of the treatment system 1 equipped with the medical image processing device 100 of the first embodiment are given the same reference numerals, and detailed descriptions of each component are omitted.
- the medical image processing device 200 determines whether or not it is possible to track the position of a tumor that moves in sync with the breathing of the patient P, for example, based on the X-ray fluoroscopic image FI captured in the preparation stage for radiation therapy, and selects a method for tracking the tumor (tracking method). Like the medical image processing device 100, the medical image processing device 200 also tracks the tumor that moves inside the body of the patient P using the selected tracking method, and controls the irradiation of the treatment beam B to the tracked tumor.
- Figure 9 is a block diagram showing an example of the configuration of the medical image processing device 200 of the second embodiment.
- the medical image processing device 200 includes an image acquisition unit 101, a trajectory generation unit 102, a likelihood calculation unit 202, a selection unit 203, and a tracking unit 104.
- Figure 9 also shows an irradiation control unit 41 and a display control unit 50 (including a display device 51) connected to the medical image processing device 200.
- the medical image processing device 200 is configured by adding a likelihood calculation unit 202 to the medical image processing device 100, and accordingly, the selection unit 103 is replaced with a selection unit 203.
- the other components of the medical image processing device 200 are the same as those of the medical image processing device 100. Therefore, in the following description, the components of the medical image processing device 200 that are similar to those of the medical image processing device 100 are given the same reference numerals, and detailed description of each component is omitted. In the following description, only the components that differ from the medical image processing device 100 are described.
- the likelihood calculation unit 202, like the trajectory generation unit 102 provided in the medical image processing device 100, recognizes the marker image inside the body of the patient P projected within the range specified by the user of the treatment system 2 in each X-ray fluoroscopic image FI output by the image acquisition unit 101.
- the likelihood calculation unit 202 may acquire the recognition result of the marker image from the trajectory generation unit 102. In this case, the likelihood calculation unit 202 can reduce the processing load required for recognizing the marker image.
- the likelihood calculation unit 202 calculates the likelihood of the recognized marker image for each X-ray fluoroscopic image FI.
- the likelihood of the marker image calculated by the likelihood calculation unit 202 is a value representing the certainty (similarity) of the marker image within the specified range.
- the likelihood calculation unit 202 calculates the likelihood between the marker of the template used in template matching and the marker projected on the X-ray fluoroscopic image FI for each X-ray fluoroscopic image FI.
- the likelihood is a high value (highest value when the image is the template marker) if the similarity to the template marker is high, and is a value that decreases as the similarity to the template marker decreases.
- the likelihood calculation unit 202 calculates the likelihood for each frame of the X-ray fluoroscopic image FI for which the trajectory generation unit 102 generates a trajectory at the same time that the trajectory generation unit 102 generates the trajectory of the marker image.
- the calculation of the likelihood in the likelihood calculation unit 202 and the generation of the trajectory of the marker image in the trajectory generation unit 102 are not limited to be performed at the same time.
- the calculation of the likelihood in the likelihood calculation unit 202 may be performed first, and the generation of the trajectory of the marker image in the trajectory generation unit 102 may be performed later. In this case, if the trajectory generation unit 102 generates the trajectory of the marker image using a predetermined number of templates starting from the one with the highest likelihood, the processing load required for generating the trajectory of the marker image can be reduced.
- the likelihood calculation unit 202 outputs information representing the likelihood of the marker image calculated for the X-ray fluoroscopic image FI of each frame to the selection unit 203.
- the likelihood calculation unit 202 may treat a number of frames for a period of time corresponding to the length of one respiratory cycle of the patient P as one unit, calculate the average value (average likelihood) of the likelihood of each marker image calculated for the X-ray fluoroscopic images FI of all frames included in this unit, and output information representing the average likelihood to the selection unit 203.
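Treating the frames of one respiratory cycle as a unit and averaging the per-frame likelihoods within each unit can be sketched as follows; the function name and the policy of dropping trailing frames that do not fill a complete cycle are assumptions:

```python
import numpy as np

def average_likelihood_per_cycle(likelihoods, frames_per_cycle: int) -> np.ndarray:
    """Group per-frame likelihoods into units of one respiratory cycle
    and return the average likelihood of each unit; trailing frames that
    do not fill a complete cycle are dropped."""
    a = np.asarray(likelihoods, dtype=float)
    n_full = (len(a) // frames_per_cycle) * frames_per_cycle
    return a[:n_full].reshape(-1, frames_per_cycle).mean(axis=1)
```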
- the selection unit 203 selects a tracking method for tracking a marker or a tumor in radiation therapy based on information representing the trajectory of each marker output by the trajectory generation unit 102 and information representing the likelihood of the marker image projected onto the X-ray fluoroscopic image FI output by the likelihood calculation unit 202.
- the selection of the tracking method in the selection unit 203 is similar to that in the selection unit 103 provided in the medical image processing device 100, except that information representing the likelihood of the marker image is also input to the selection unit 203. Therefore, the selection unit 203 can select a tracking method for tracking a marker or a tumor from the X-ray fluoroscopic image FI in which the likelihood of the marker image is highest, that is, the image to which the template optimal for radiation therapy corresponds.
- the selection unit 203 may select a tracking method for tracking a marker or a tumor from the X-ray fluoroscopic images FI in which a predetermined number of templates correspond starting from the one with the highest likelihood.
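Selecting a predetermined number of templates in descending order of likelihood is a simple ranking; a sketch with illustrative names:

```python
def top_k_templates(likelihood_by_template: dict, k: int) -> list:
    """Return the identifiers of the k templates with the highest
    marker-image likelihood, best first."""
    ranked = sorted(likelihood_by_template,
                    key=likelihood_by_template.get, reverse=True)
    return ranked[:k]
```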
- the selection process of the tracking method in the selection unit 203 is essentially the same as the selection process in the selection unit 103 described with reference to Figures 6 to 8, except that the likelihood of the marker image is additionally taken into consideration, and can be readily understood from the selection process in the selection unit 103. A detailed description of the selection process of the tracking method in the selection unit 203 is therefore omitted.
- similar to the selection unit 103, the selection unit 203 outputs information indicating the selected tracking method to the tracking unit 104. Also similar to the selection unit 103, when the selection unit 203 selects three-dimensional tracking or two-dimensional tracking, it outputs to the tracking unit 104 information on the template used by the trajectory generation unit 102 for template matching (the template with the highest likelihood of the marker image, or a predetermined number of templates in descending order of likelihood).
- the trajectory generation unit 102 generates a trajectory of the part of interest (the marker in the above example) based on the X-ray fluoroscopic image FI captured at the stage of preparation for radiation therapy. Furthermore, in the medical image processing device 200, the likelihood calculation unit 202 calculates the likelihood of the marker image for each X-ray fluoroscopic image FI. Then, in the medical image processing device 200, the selection unit 203 selects a tracking method for tracking the part of interest based on the trajectory of the part of interest generated by the trajectory generation unit 102 and the likelihood of the marker image calculated by the likelihood calculation unit 202.
- the tracking unit 104 tracks the part of interest using the tracking method selected by the selection unit 203, and outputs information for presenting to the user the state in which the part of interest is being tracked, together with a signal indicating the irradiation timing of the treatment beam B.
- the user can be informed of whether or not the marker can be tracked (indirectly tracked) using the marker tracking method, and the treatment beam B can be irradiated to the tumor at an appropriate timing.
- FIGS. 10 and 11 are diagrams showing an example of a display image for presenting information in the medical image processing device of an embodiment (here, the medical image processing device 200).
- Figure 10 shows an example of a GUI (Graphical User Interface) image IM1 that shows the state in which the position of a marker is being detected in a three-dimensional coordinate system.
- in the GUI image IM1, the X-ray fluoroscopic images FI-1 and FI-2 in which the marker is being detected, and the trajectory of the marker in each direction (X-axis direction, Y-axis direction, and Z-axis direction) determined in the three-dimensional coordinate system, are displayed in a fluoroscopic image display area FA as line graphs.
- information about the detected marker is displayed in an information display area IA.
- information indicating that a marker for three-dimensional tracking is being detected, information indicating the three-dimensional position O of the currently detected marker, information indicating the likelihood of the currently detected marker image, information about the template used for detection, and other information are displayed in the information display area IA.
- FIG. 11 shows an example of a GUI image IM2 showing a state in which the position of a marker is being detected in a two-dimensional coordinate system.
- in the GUI image IM2, X-ray fluoroscopic images FI-1 and FI-2 in which the marker is being detected are presented in the fluoroscopic image display area FA, and the trajectories of the marker in each direction (u-axis direction and v-axis direction), representing the position of the pixel determined in the two-dimensional coordinate system on the X-ray fluoroscopic image FI, are presented as line graphs in association with each X-ray fluoroscopic image FI.
- the trajectories of the marker in the u1-axis direction and v1-axis direction, which represent the position of the pixel, are presented as line graphs in association with the X-ray fluoroscopic image FI-1,
- and the trajectories of the marker in the u2-axis direction and v2-axis direction, which represent the position of the pixel, are presented as line graphs in association with the X-ray fluoroscopic image FI-2.
- information about the detected marker is presented in the information display area IA.
- information is presented indicating that a marker for two-dimensional tracking has been detected, information indicating the two-dimensional pixel position of the marker currently detected on each X-ray fluoroscopic image FI, information indicating the likelihood of the currently detected marker image, information regarding the template used for detection, and other information.
- the user can check the information presented by GUI image IM1 and GUI image IM2 and control the tracking of the marker by operating a user interface such as an operation unit (not shown). More specifically, the user can switch the marker tracking method and the template used for tracking with "Switch tracking method" and "Switch filter" in the information display area IA, and can start tracking of the marker with "Start tracking" in the information display area IA.
- FIGS. 12 and 13 are diagrams showing another example of a display image for presenting information in a medical image processing device of an embodiment (here, medical image processing device 200).
- FIG. 12 shows an example of a GUI image IM3 showing the state in which a marker is being tracked in three dimensions.
- the X-ray fluoroscopic images FI-1 and FI-2 in which the marker is being tracked and the trajectory of the marker in each direction (X-axis direction, Y-axis direction, and Z-axis direction) calculated in the three-dimensional coordinate system are presented in a line graph in the fluoroscopic image display area FA.
- an "X mark" indicating the position of the tracked marker is also presented in each of the X-ray fluoroscopic images FI-1 and FI-2.
- the mark indicating the position of the tracked marker is not limited to an "X mark" and may be any mark, such as an "O mark" or a different color, as long as it can highlight the position of the marker.
- information about the tracked marker is presented in the information display area IA. More specifically, in the GUI image IM3, within the information display area IA, information is presented that represents the template being used for tracking, information that represents the three-dimensional position O of the marker being tracked in three dimensions, information that represents the likelihood of the marker image, and so on.
- FIG. 13 shows an example of a GUI image IM4 showing a state in which a marker is being tracked in two dimensions.
- X-ray fluoroscopic images FI-1 and FI-2 in which the marker is being tracked are presented in the fluoroscopic image display area FA, and the trajectory of the marker in each direction (u-axis direction and v-axis direction) representing the position of the pixel determined in the two-dimensional coordinate system on the X-ray fluoroscopic image FI is presented as a line graph in association with each X-ray fluoroscopic image FI.
- the trajectory of the marker in the u1-axis direction and v1-axis direction representing the position of the pixel is presented as a line graph in association with the X-ray fluoroscopic image FI-1
- the trajectory of the marker in the u2-axis direction and v2-axis direction representing the position of the pixel is presented as a line graph in association with the X-ray fluoroscopic image FI-2.
- an "X" representing the position of the marker being tracked is also presented only in the X-ray fluoroscopic image FI-1.
- the trajectory of the pixel positions of the tracked marker image in the v2 axis direction is a trajectory that changes rapidly, making it impossible to track the position of the marker.
- information about the tracked marker is presented in the information display area IA. More specifically, in the GUI image IM4, information such as information representing the template used for tracking, information representing the two-dimensional pixel positions of the marker being tracked in two dimensions, and information representing the likelihood of the marker image is presented in the information display area IA.
- the user can check the information presented by GUI image IM3 and GUI image IM4, and control the irradiation of the treatment beam B to the tumor by operating a user interface such as an operation unit (not shown). More specifically, the user can check the details of the radiation therapy to be performed with "Get treatment method" or "Change treatment method" in the information display area IA, and can start the irradiation of the treatment beam B to the tumor with "Start treatment" in the information display area IA.
- the trajectory generating unit generates a trajectory of the target area (e.g., a marker) based on a two-dimensional fluoroscopic image (e.g., an X-ray fluoroscopic image) captured in the preparation stage of radiation therapy.
- the selection unit selects a tracking method for tracking the target area. That is, in the medical image processing device of each embodiment, the selection unit automatically determines whether the target area can be tracked and selects a template to be used for tracking. Then, in the medical image processing device of each embodiment, the medical image processing device tracks the target area using the selected tracking method.
- with the treatment system equipped with the medical image processing device of each embodiment, it is possible to determine, based on the fluoroscopic image, whether radiation (treatment beam) can be irradiated in synchronization with the respiration of the subject (patient) (respiratory synchronization irradiation). Furthermore, in the treatment system equipped with the medical image processing device of each embodiment, the state in which the position of the target area is detected and the state in which the target area is tracked are presented to the user, and radiation can be irradiated to a tumor (lesion) in the subject's body at an appropriate timing.
- by including an image acquisition unit (101) that acquires multiple fluoroscopic images (e.g., X-ray fluoroscopic images FI) of a patient (P),
- a trajectory generation unit (102) that recognizes the position of a focus area (e.g., a marker) captured in each of the multiple fluoroscopic images and generates a trajectory of the focus area as it moves based on the recognized position of the focus area,
- and a selection unit (103) that selects a tracking method for tracking the focus area when treating the patient based on the trajectory of the focus area, it is possible to determine whether or not a treatment beam (B) can be irradiated in a respiratory-synchronized manner based on the fluoroscopic images.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- High Energy & Nuclear Physics (AREA)
- Biophysics (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Radiation-Therapy Devices (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
Description
Figure 1 is a block diagram showing an example of the configuration of a treatment system equipped with the medical image processing device of the first embodiment. The treatment system 1 includes, for example, a treatment table 10, a couch control unit 11, two radiation sources 20 (radiation source 20-1 and radiation source 20-2), two radiation detectors 30 (radiation detector 30-1 and radiation detector 30-2), a treatment beam irradiation gate 40, an irradiation control unit 41, a display control unit 50, a display device 51, and a medical image processing device 100.
The second embodiment will now be described. The configuration of the treatment system equipped with the medical image processing device of the second embodiment is obtained by replacing the medical image processing device 100 in the treatment system 1 equipped with the medical image processing device 100 of the first embodiment shown in Fig. 1 with the medical image processing device of the second embodiment (hereinafter referred to as the "medical image processing device 200"). In the following description, the treatment system equipped with the medical image processing device 200 is referred to as the "treatment system 2".
Claims (12)
- A medical image processing device comprising: an image acquisition unit that acquires a plurality of fluoroscopic images obtained by imaging a patient; a trajectory generation unit that recognizes a position of a portion of interest captured in each of the plurality of fluoroscopic images and generates, based on the recognized position of the portion of interest, a trajectory of the portion of interest as it moves; and a selection unit that selects, based on the trajectory of the portion of interest, a tracking method for tracking the portion of interest.
- The medical image processing device according to claim 1, wherein the image acquisition unit acquires the fluoroscopic images obtained by imaging the patient from a plurality of different directions.
- The medical image processing device according to claim 2, wherein the image acquisition unit acquires a plurality of the fluoroscopic images captured continuously over a period at least as long as one respiratory cycle of the patient.
- The medical image processing device according to claim 3, wherein the trajectory generation unit recognizes the position of the portion of interest by template matching using a plurality of templates for recognizing a state of the portion of interest.
- The medical image processing device according to claim 4, wherein the trajectory generation unit generates a trajectory of the portion of interest for each of the plurality of templates in each of two-dimensional tracking, in which the portion of interest is tracked in a two-dimensional coordinate system of the fluoroscopic image obtained by imaging the patient from one direction, and three-dimensional tracking, in which the portion of interest is tracked in a three-dimensional coordinate system based on geometry information of imaging devices that simultaneously image the patient from a plurality of different directions.
- The medical image processing device according to claim 5, wherein the selection unit determines whether or not the portion of interest can be tracked by inputting the trajectories of the portion of interest generated in each of the two-dimensional tracking and the three-dimensional tracking into a classifier, and selects the tracking method based on a trajectory of the portion of interest for which it is determined that tracking is possible.
- The medical image processing device according to claim 6, further comprising a tracking unit that tracks the portion of interest using the tracking method.
- The medical image processing device according to any one of claims 1 to 7, further comprising a likelihood calculation unit that calculates a likelihood of the portion of interest captured in the fluoroscopic image, wherein the selection unit selects the tracking method based on the trajectory of the portion of interest and the likelihood of the portion of interest.
- The medical image processing device according to claim 8, wherein the portion of interest is a marker placed in the body of the patient.
- A treatment system comprising: a medical image processing device including an image acquisition unit that acquires a plurality of fluoroscopic images obtained by imaging a patient, a trajectory generation unit that recognizes a position of a portion of interest captured in each of the plurality of fluoroscopic images and generates, based on the recognized position of the portion of interest, a trajectory of the portion of interest as it moves, and a selection unit that selects, based on the trajectory of the portion of interest, a tracking method for tracking the portion of interest; a display control unit that causes a display device to display a display image representing the trajectory of the portion of interest; an irradiation unit that irradiates a treatment beam onto a site of the patient to be treated, the site being indicated by the tracked portion of interest; an irradiation control unit that controls irradiation of the treatment beam; and a couch control unit that moves a position of a couch to which the patient is fixed.
- A medical image processing method in which a computer: acquires a plurality of fluoroscopic images obtained by imaging a patient; recognizes a position of a portion of interest captured in each of the plurality of fluoroscopic images and generates, based on the recognized position of the portion of interest, a trajectory of the portion of interest as it moves; and selects, based on the trajectory of the portion of interest, a tracking method for tracking the portion of interest.
- A program that causes a computer to: acquire a plurality of fluoroscopic images obtained by imaging a patient; recognize a position of a portion of interest captured in each of the plurality of fluoroscopic images and generate, based on the recognized position of the portion of interest, a trajectory of the portion of interest as it moves; and select, based on the trajectory of the portion of interest, a tracking method for tracking the portion of interest.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202380058889.6A CN119677560A (zh) | 2022-11-29 | 2023-11-28 | 医用图像处理装置、治疗系统、医用图像处理方法及程序 |
| JP2024561510A JPWO2024117129A1 (ja) | 2022-11-29 | 2023-11-28 | |
| KR1020257004286A KR20250034994A (ko) | 2022-11-29 | 2023-11-28 | 의용 화상 처리 장치, 치료 시스템, 의용 화상 처리 방법, 프로그램, 및 기억 매체 |
| US19/052,024 US20250186009A1 (en) | 2022-11-29 | 2025-02-12 | Medical image processing device, treatment system, medical image processing method, and storage medium |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022190393 | 2022-11-29 | ||
| JP2022-190393 | 2022-11-29 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US19/052,024 Continuation US20250186009A1 (en) | 2022-11-29 | 2025-02-12 | Medical image processing device, treatment system, medical image processing method, and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024117129A1 true WO2024117129A1 (ja) | 2024-06-06 |
Family
ID=91323794
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/042555 Ceased WO2024117129A1 (ja) | 2022-11-29 | 2023-11-28 | 医用画像処理装置、治療システム、医用画像処理方法、およびプログラム |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20250186009A1 (ja) |
| JP (1) | JPWO2024117129A1 (ja) |
| KR (1) | KR20250034994A (ja) |
| CN (1) | CN119677560A (ja) |
| WO (1) | WO2024117129A1 (ja) |
Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060074299A1 (en) * | 2004-10-02 | 2006-04-06 | Sohail Sayeh | Non-linear correlation models for internal target movement |
| WO2006057911A2 (en) * | 2004-11-22 | 2006-06-01 | Civco Medical Instruments Co., Inc. | Real time ultrasound monitoring of the motion of internal structures during respiration for control of therapy delivery |
| US20110054293A1 (en) * | 2009-08-31 | 2011-03-03 | Medtronic, Inc. | Combination Localization System |
| JP2013544137A (ja) * | 2010-10-29 | 2013-12-12 | アキュレイ インコーポレイテッド | 標的の部分的移動範囲を治療するための方法および装置 |
| JP2014521370A (ja) * | 2011-03-31 | 2014-08-28 | リフレクション メディカル, インコーポレイテッド | 放射誘導型放射線療法における使用のためのシステムおよび方法 |
| JP2016511027A (ja) * | 2013-02-08 | 2016-04-14 | コビディエン エルピー | 肺の除神経システムおよび方法 |
| JP2016131737A (ja) * | 2015-01-20 | 2016-07-25 | 国立大学法人北海道大学 | 放射線治療システムおよび放射線治療プログラム |
| JP2018082767A (ja) * | 2016-11-21 | 2018-05-31 | 東芝エネルギーシステムズ株式会社 | 医用画像処理装置、医用画像処理方法、医用画像処理プログラム、動体追跡装置および放射線治療システム |
| JP2019017867A (ja) * | 2017-07-20 | 2019-02-07 | 株式会社東芝 | 情報処理装置、情報処理システム、およびプログラム |
| JP2019107392A (ja) * | 2017-12-20 | 2019-07-04 | 国立研究開発法人量子科学技術研究開発機構 | 医用装置、医用装置の制御方法およびプログラム |
| JP2019528149A (ja) * | 2016-08-29 | 2019-10-10 | アキュレイ インコーポレイテッド | 回転撮像及び追跡システムにおけるオンライン角度選択 |
| US20200129783A1 (en) * | 2017-06-19 | 2020-04-30 | Our New Medical Technologies | Target tracking and irradiation method and device using radiotherapy apparatus and radiotherapy apparatus |
-
2023
- 2023-11-28 KR KR1020257004286A patent/KR20250034994A/ko active Pending
- 2023-11-28 WO PCT/JP2023/042555 patent/WO2024117129A1/ja not_active Ceased
- 2023-11-28 CN CN202380058889.6A patent/CN119677560A/zh active Pending
- 2023-11-28 JP JP2024561510A patent/JPWO2024117129A1/ja active Pending
-
2025
- 2025-02-12 US US19/052,024 patent/US20250186009A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| CN119677560A (zh) | 2025-03-21 |
| KR20250034994A (ko) | 2025-03-11 |
| US20250186009A1 (en) | 2025-06-12 |
| JPWO2024117129A1 (ja) | 2024-06-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6181459B2 (ja) | 放射線治療システム | |
| JP7178009B2 (ja) | 医用装置、医用装置の制御方法、およびプログラム | |
| JP6208535B2 (ja) | 放射線治療装置およびシステムおよび方法 | |
| US10143431B2 (en) | Medical image processing apparatus and method, and radiotherapeutic apparatus | |
| JP7113447B2 (ja) | 医用画像処理装置、治療システム、および医用画像処理プログラム | |
| CN103269752A (zh) | 放射线治疗装置控制装置及其处理方法以及程序 | |
| JP7513980B2 (ja) | 医用画像処理装置、治療システム、医用画像処理方法、およびプログラム | |
| US12364877B2 (en) | Medical image processing device, storage medium, medical device, and treatment system | |
| JP7444387B2 (ja) | 医用画像処理装置、医用画像処理プログラム、医用装置、および治療システム | |
| JP5010740B2 (ja) | 放射線治療装置制御方法および放射線治療装置制御装置 | |
| JP6310118B2 (ja) | 画像処理装置、治療システム及び画像処理方法 | |
| US20170296843A1 (en) | Processing device for a radiation therapy system | |
| JP7264389B2 (ja) | 医用装置、医用装置の制御方法およびプログラム | |
| WO2024117129A1 (ja) | 医用画像処理装置、治療システム、医用画像処理方法、およびプログラム | |
| JP7279336B2 (ja) | X線撮影装置 | |
| US20240362782A1 (en) | Medical image processing device, treatment system, medical image processing method, and storage medium | |
| US20240298979A1 (en) | Collision avoidance when positioning a medical imaging device and a patient positioning apparatus | |
| US20240369917A1 (en) | Medical image processing device, treatment system, medical image processing method, and storage medium | |
| US20230347180A1 (en) | Medical image processing device, medical image processing method, medical image processing program, and radiation therapy device | |
| JP7125703B2 (ja) | 医用装置、医用装置の制御方法およびプログラム | |
| WO2025131290A1 (en) | Computer-implemented method for medical imaging | |
| JP2024048137A (ja) | 照射位置確認支援装置、照射位置確認支援方法、および照射位置確認支援プログラム | |
| CN119212626A (zh) | 医用图像处理装置、治疗系统、医用图像处理方法、程序及存储介质 | |
| EP3231481A1 (en) | Processing device for a radiation therapy system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23897777 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 20257004286 Country of ref document: KR Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 1020257004286 Country of ref document: KR Ref document number: 202380058889.6 Country of ref document: CN |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024561510 Country of ref document: JP |
|
| WWP | Wipo information: published in national office |
Ref document number: 1020257004286 Country of ref document: KR |
|
| WWP | Wipo information: published in national office |
Ref document number: 202380058889.6 Country of ref document: CN |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 23897777 Country of ref document: EP Kind code of ref document: A1 |