WO2025004569A1 - Image construction device, image construction method, image construction program, and storage medium - Google Patents
Image construction device, image construction method, image construction program, and storage medium
- Publication number
- WO2025004569A1 (PCT/JP2024/017771)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- image
- body movement
- unit
- unit time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01T—MEASUREMENT OF NUCLEAR OR X-RADIATION
- G01T1/00—Measuring X-radiation, gamma radiation, corpuscular radiation, or cosmic radiation
- G01T1/16—Measuring radiation intensity
- G01T1/161—Applications in the field of nuclear medicine, e.g. in vivo counting
Definitions
- This invention relates to an image construction device, an image construction method, an image construction program, and a storage medium for non-invasively imaging conditions within a subject body, for example.
- PET scans have been used as a medical imaging technique to identify lesions, examine the affected area, and determine diagnosis and treatment plans without harming the patient's (subject's) body.
- PET images are used as diagnostic images (medical images).
- a PET examination requires an imaging time of several minutes to several tens of minutes. Any movement of the subject (body movement) during the imaging time causes blurring of the image, and the accuracy of the PET image deteriorates due to body movement.
- body movement is suppressed by fixing the part of the subject to be imaged to an examination bed or the like.
- fixing the head imposes a burden on the subject. Furthermore, there are some subjects for whom it is difficult to fix the head.
- one known approach is a data-driven frame method, which estimates the amount of body movement at each time by analyzing measurement data obtained from a PET examination and extracts frame images with minimal body movement.
- This method obtains data indicating body movement from the measurement data, registers a predetermined continuous period of time during which no body movement occurs as a static period, and generates a PET image using only the captured images (frame images) from the static period.
- since the data-driven frame method uses only frame images that are minimally affected by body movement, it is possible to obtain an image in which the effects of body movement have been eliminated.
- the data-driven frame method generates frame images using only imaging data from static sections, so frame images from sections with high frequency of body movement (dynamic sections) are discarded, resulting in a problem that the data statistics are small and the PET images contain a lot of noise. In other words, there is room for improvement in terms of the accuracy of the PET images.
- this invention aims to provide an image construction device, an image construction method, an image construction program, and a storage medium that can obtain high-precision PET images in a PET examination without immobilizing the subject.
- the present invention is characterized by an image construction device, an image construction method, an image construction program used therefor, and a storage medium having the program recorded thereon, the device comprising: an image acquisition unit that acquires measurement data measured by a PET device in time series; a measurement value acquisition unit that divides body movement data, obtained in time series from the PET device and relating to the subject's body movement, into predetermined unit times and acquires them as unit time measurement values; a division point setting unit that sets a frame division point when the amount of fluctuation in the unit time measurement value exceeds a predetermined threshold; a correction data generation unit that generates, in accordance with the amount of fluctuation in the unit time measurement value, correction data for correcting the measurement data so as to cancel out the subject's body movement in the frame units divided at the division points; and a reconstruction unit that reconstructs a PET image from the measurement data based on the correction data.
- This invention provides an imaging device, an imaging method, an imaging program, and a storage medium that can obtain highly accurate PET images in a PET examination without immobilizing the subject.
- FIG. 1 is a block diagram showing an example of the configuration of an image construction device of the present invention.
- FIG. 2 is a flowchart showing the operation of the image construction process.
- FIG. 3 is a flowchart showing the operation of the frame division process.
- FIG. 5 is a graph showing the relationship between measurement data and frame division points.
- FIG. 6 is a graph showing an example of the frame division process.
- FIG. 7 is a flowchart showing the registration operation.
- FIG. 8 is a flowchart showing the operation of the reconstruction process including body motion correction.
- FIG. 9A is a graph showing the operation of a frame division process of the prior art.
- FIG. 9B is a graph showing the operation of the frame division process according to the present invention.
- FIG. 10 shows SUVR images for comparing the present invention with the prior art.
- FIG. 1 is a block diagram showing the configuration of an image construction device 1 of the present invention.
- the image construction device 1 is communicably connected to an imaging device (PET device) 2 for positron emission tomography (PET) examinations, and has a function of constructing (reconstructing) a PET image based on measurement data output from the PET device 2.
- PET is a technology in which a radiopharmaceutical labeled with a positron-emitting nuclide is administered to the subject and the radiation emitted from within the body is measured by the PET device 2, and the acquired data is reconstructed into a tomographic image (PET image) after various corrections.
- PET images are used as medical images, and analysis of the PET images is used for evaluation of physiological and pathological functions and image diagnosis. In other words, examinations using the PET device 2 (PET examinations) can identify lesions and examine the relevant areas without causing any harm to the patient's (subject's) body.
- a test agent containing a radioactive nuclide that emits positrons is introduced into the subject's body by injection, inhalation, etc.
- the test agent introduced into the subject's body accumulates in a specific part of the subject's body that has a function according to the characteristics of the test agent. For example, when a sugar test agent is used, it is selectively accumulated in a part of the body with active metabolism such as cancer cells.
- a positron is emitted from the radioactive nuclide contained in the test agent, and when the emitted positron combines with a surrounding electron and annihilates, two gamma rays (so-called annihilation gamma rays) are emitted in directions approximately 180 degrees apart. These two gamma rays are detected by radiation detectors arranged around the subject, and the resulting detection data (measurement data) are reconstructed to obtain image data of the subject.
- the PET device 2 transmits the measurement data to the image construction device 1, and the measurement data is reconstructed in the image construction device 1.
- the image construction device 1 includes a control unit 11, an input unit 12, a display unit 13, a communication unit 14, and an auxiliary memory unit 15. Each of the input unit 12, the display unit 13, the communication unit 14, and the auxiliary memory unit 15 is connected to the control unit 11.
- the control unit 11 includes a calculation unit 16 and a main memory unit 17, and executes various calculations and control operations in the image construction device 1.
- the calculation unit 16 is an arithmetic processing unit having a CPU or MPU, etc.
- the main memory unit 17 is a so-called primary memory having a RAM (DRAM) and ROM, etc.
- the RAM is used as a work area and buffer area for the calculation unit 16.
- Program data stored in the auxiliary memory unit 15 and data required to execute programs are appropriately expanded in the RAM.
- the ROM stores the startup program of the image construction device 1 and default values for various information, etc.
- the input unit 12 has input components that accept operational inputs from a user of the image construction device 1 (e.g., a medical professional), and an input detection circuit that is interposed between the input components and the control unit 11.
- the input components are, for example, a touch panel (touch input means) and/or hardware operation keys.
- the input detection circuit outputs an operation signal or operation data to the control unit 11 in response to the operation (operation input) of each input component.
- the display unit 13 has a display and a display control circuit interposed between the display and the calculation unit 16.
- the display may be, for example, an LCD (liquid crystal display) or an organic EL display.
- the display shows various medical images such as PET images.
- the touch panel may be provided so as to overlap the display surface of the display of the display unit 13.
- the touch panel cooperates with the display of the display unit 13 to form a display with a touch panel (touch panel display).
- the display of the display unit 13 may also be configured to display a GUI (Graphical User Interface) having software keys (operation keys or operation buttons). In this case, operation input can be accepted via the GUI (operation screen).
- the communication unit 14 has a communication circuit for connecting to a communication line.
- the communication circuit is a wired communication circuit or a wireless communication circuit, and is connected to other electronic devices (e.g., the PET device 2) so as to be able to communicate with each other via the communication line according to instructions from the control unit 11.
- the auxiliary memory unit 15 is composed of non-volatile memory such as an HDD, SSD, flash memory, or EEPROM, and stores programs and various data used by the calculation unit 16 to control the operation of the image construction device 1.
- the auxiliary memory unit 15 is a so-called secondary memory.
- the auxiliary storage unit 15 of the image construction device 1 stores (registers) various data used by the image construction device 1, such as data for executing image reconstruction processing.
- the auxiliary storage unit 15 also stores an image construction processing program 18 for automatically executing image reconstruction processing in response to user operation input or when measurement data is input, and image construction processing data 19 required for executing the image reconstruction processing.
- the image construction processing program 18 and image construction processing data 19 are read out from the auxiliary storage unit 15 as necessary and stored (expanded) in the main memory unit 17 (RAM).
- the operation of the image construction device 1 is realized by the calculation unit 16 executing the image construction processing program 18 expanded in the main memory unit 17 (RAM).
- the image construction processing program 18 includes at least programs for executing the processing of each step of the image construction process described below.
- the image construction processing program 18 has a main processing program for selecting and executing the various functions of the image construction device 1, a measurement data acquisition program for acquiring measurement data from the PET device 2, a body movement data acquisition program for acquiring body movement data relating to the subject's body movement from the measurement data, a division point setting program for setting frame division points, a correction data generation program for generating correction data for correcting the measurement data to cancel out the subject's body movement, and a reconstruction program for reconstructing a PET image from the measurement data based on the correction data.
- the image construction processing data 19 includes measurement data acquired from the PET device 2, body movement data relating to the body movement of the subject, data for one or more frame division points (frame division point data) constructed based on the measurement data, correction data for correcting the measurement data so as to cancel out the body movement of the subject on a frame-by-frame basis, and data for a PET image reconstructed from the measurement data based on the correction data (PET image data).
- the configuration of the image construction device 1 shown in FIG. 1 is merely an example, and the configuration does not need to be limited to this.
- the image construction device 1 is formed as a single integrated device, but it may also be configured as a group of multiple computers.
- FIG. 2 is a flowchart showing the operation of the image construction process executed by the image construction device 1.
- when the image construction device 1 starts the image construction process, it acquires the measurement data measured by and transmitted from the PET device 2 in chronological order (step S1), analyzes the measurement data and performs frame division processing according to the degree of the subject's body movement (step S2), performs image (frame image) reconstruction processing for each divided frame (step S3), performs registration (step S4), performs reconstruction processing including body movement correction to correct the measurement data so as to cancel out the subject's body movement (step S5), generates a PET image including body movement correction (step S6), and ends the image construction process.
- FIG. 3 is a flowchart showing the operation of the frame division process.
- the flow of the frame division process shown in FIG. 3 is a subroutine executed in step S2 of the image construction process described above.
- when the image construction device 1 starts the frame division process, it first converts the measurement data into body movement data that indicates the subject's body movement in a time series (step S21). Note that the measurement data converted into body movement data in step S21 may be a copy of the measurement data acquired in step S1.
- the type of body movement data is not particularly limited; for example, a CoD (Center of Distribution) trace can be used as body movement data.
- a CoD trace is dynamic data obtained by, for example, acquiring the central coordinates (central positions) of the LoR of coincidence counting events with a specified time resolution and averaging them for each specified unit time or for each specified number of events.
- the central coordinates (Xt, Yt, Zt) of the LoR in the CoD trace can also be obtained using the difference in detection time between the two crystals.
- continuous body movement data in one measurement by the PET device 2 can be obtained by the following [Equation 4].
- Figure 5 is a graph showing an example of a CoD trace as body movement data. As shown in Figure 5, the CoD trace varies in time series along with the measurement data: when body movement is large, the change in the CoD trace value becomes large, and when body movement is small, the change becomes small.
- for example, the unit time is 0.1 sec to 1 sec, and the predetermined number of events is 10,000.
- Each unit time measurement value Cn has data on the elapsed time from the start of measurement (imaging). The following description will be given taking the case where the unit time is 1 sec as an example.
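As an illustration of how the unit time measurement values Cn can be obtained, the sketch below averages the central coordinates of LoRs over each unit time to form a CoD trace; the function name, array layout, and the use of a plain arithmetic mean are assumptions for illustration, not taken from the publication.

```python
import numpy as np

def cod_trace(event_times, lor_centers, unit_time=1.0):
    """Average the central coordinates (X, Y, Z) of LoRs over each unit
    time to obtain unit time measurement values C_n (a CoD trace).

    event_times: (N,) elapsed seconds from the start of measurement
    lor_centers: (N, 3) central coordinates of each coincidence LoR
    """
    centers = np.asarray(lor_centers, dtype=float)
    bins = np.floor(np.asarray(event_times) / unit_time).astype(int)
    trace = np.zeros((bins.max() + 1, 3))
    for b in range(trace.shape[0]):
        mask = bins == b
        if mask.any():  # average all LoR centers falling in this unit time
            trace[b] = centers[mask].mean(axis=0)
    return trace
```

Averaging per fixed event count (e.g. 10,000 events) instead of per unit time would follow the same pattern with the events split into equal-sized groups.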
- a reference time i0 is set (step S23).
- the reference time i0 is a time that specifies the start of a certain frame.
- the "time” may be expressed as an elapsed time from the start of measurement (imaging).
- a start time n is set (step S24).
- the reference time i0 is set as the start time n.
- the processing time i is updated (step S25).
- a judgment time m is set (step S26).
- the processing time i is set as the judgment time m .
- the threshold value ε is set to be equal to or less than the resolution (the size of a single pixel) of the captured image (the image reconstructed from the measurement data) in the PET device 2.
- preferably, the threshold value ε is set to be equal to or less than half (1/2) the size of a single pixel of the captured image in the PET device 2, and more preferably equal to or less than 1/4 of that size.
- if it is determined in step S27 that the fluctuation amount does not exceed the threshold (step S27: NO), it is determined whether the next judgment time m (the next processing time i when the unit time advances) does not exceed the end time of the body movement data (measurement data) (i+1 ≤ N) (step S30). If it is determined that the next processing time i does not exceed the end time of the body movement data (step S30: YES), the processing time i is updated to the next time, and the process returns to step S26 (setting of the next judgment time m).
- FIG. 6 is a graph showing an example of the frame division process (particularly, a series of processes from step S24 to S31).
- when the amount of fluctuation of the unit time measurement value exceeds the threshold value ε relative to the unit time measurement value at the start of the frame, a frame division point is set.
- the frame division point is the end (end time) of the frame from the viewpoint of the current frame, and the start (start time) of the frame from the viewpoint of the subsequent (next) frame.
- the measurement value at the frame division point becomes the reference unit time measurement value Ci0 of the next frame. Then, in the next frame, a frame division point is set when the unit time measurement value Ci exceeds the threshold value ε compared to the reference unit time measurement value Ci0. This process is repeated to set the frame division points.
- step S30 if it is determined (step S30: NO) that the next processing time i has exceeded the end time of the entire body movement data (measurement data) (the current unit time is the last unit time, or there is no next unit time), the frame division processing flow ends and the process returns to the image construction processing (step S2 ends and step S3 proceeds).
- step S3 reconstruction processing is performed for each generated frame image.
- the frame division process divides the body movement data of the subject obtained from the PET device 2 in a time series into predetermined unit times, obtains them as unit time measurement values, and sets the frame division point when the amount of fluctuation in the unit time measurement value exceeds a predetermined threshold.
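The frame division of steps S23 to S31 can be paraphrased in a few lines: hold the unit time measurement value at the frame start as the reference, and set a division point whenever the fluctuation from that reference exceeds the threshold. The Euclidean norm as the fluctuation measure and the variable names are assumptions for illustration.

```python
import numpy as np

def frame_division_points(trace, epsilon):
    """Return indices of frame division points for a CoD trace.

    A division point is set when the unit time measurement value C_i
    fluctuates from the reference value C_i0 (frame start) by more than
    the threshold epsilon; that point then becomes the reference of the
    next frame.
    """
    points = []
    i0 = 0  # reference time: start of the current frame
    for i in range(1, len(trace)):
        if np.linalg.norm(trace[i] - trace[i0]) > epsilon:
            points.append(i)  # end of this frame / start of the next
            i0 = i            # the division point becomes the new reference
    return points
```

With this rule, intervals with large movement yield short frames and quiet intervals yield long frames, which is the adaptive behavior described for FIG. 9B.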
- FIG. 7 is a flow chart showing the registration operation.
- the registration flow shown in FIG. 7 is a subroutine executed in step S4 of the image construction process described above.
- which of the frame images If is set as the reference frame Ir is not particularly limited; for example, the first frame I1 in chronological order can be set as the reference frame Ir.
- the target frame I t set in step S44 can be set in chronological order from the oldest to the newest.
- the parameter R (for example, R t ) in a certain frame (frame t) is expressed, for example, by a 4 ⁇ 4 homogeneous transformation matrix.
- step S46 the similarity (sim( Ir , In )) between the reference frame Ir and the correction frame In is calculated (step S46), the parameter R is updated by an optimization process using the similarity (sim( Ir , In )) (step S47), and it is determined whether or not to end the registration (step S48).
- step S46 the similarity (sim( Ir , In )) is calculated using, for example, mutual information.
- a known technique such as the Nelder-Mead algorithm can be used for the optimization process in step S47.
- in step S48, the registration can be ended either when there is no unprocessed target frame (the target frame is the last target frame) or when the difference (variation) in the absolute values of the parameter R between consecutive frames exceeds a predetermined threshold b.
- if it is determined that the registration is not to be ended (the registration end condition is not satisfied) (step S48: NO), the process returns to step S44 (setting of the next target frame It). On the other hand, if it is determined that the registration is to be ended (the registration end condition is satisfied) (step S48: YES), the motion correction parameter M is derived, the registration flow is ended, and the process returns to the image construction process (step S4 is ended, and the process proceeds to step S5).
- the motion correction parameter M t in the target frame I t can be derived, for example, in the form of the following [Equation 5].
- the registration of the reference frame Ir itself can be omitted, since it would amount to comparing the reference frame Ir with itself.
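A minimal sketch of the similarity computation of step S46 using histogram-based mutual information, with an exhaustive integer-shift search standing in for the Nelder-Mead optimization over a 4x4 homogeneous transform described in step S47; the function names and the 2-D, translation-only model are simplifying assumptions.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Histogram-based mutual information between two images,
    used as the similarity sim(Ir, In)."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()              # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)  # marginal distributions
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def register_shift(reference, moving, max_shift=5):
    """Find the integer (dy, dx) shift of `moving` that maximizes the
    similarity to `reference` (translation-only stand-in for the rigid
    transform parameter R)."""
    best, best_sim = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            cand = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            s = mutual_information(reference, cand)
            if s > best_sim:
                best_sim, best = s, (dy, dx)
    return best
```

In a full implementation the search over (dy, dx) would be replaced by a continuous optimization of the rigid-transform parameters, e.g. with the Nelder-Mead algorithm.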
- FIG. 8 is a flowchart showing the operation of the reconstruction process with motion correction.
- the flow of the reconstruction process with motion correction shown in FIG. 8 is a subroutine executed in step S5 of the image construction process described above.
- when the image construction device 1 starts the reconstruction process including body motion correction, it generates an initial image from the measurement data (step S51) and reads out the body motion correction parameter Mf (step S52). Thereafter, using the body motion correction parameter Mf, a series of processes including the forward projection process (step S53), the back projection process (step S54), the sensitivity correction process (step S55), the absorption correction process (step S56), and the scattering correction process (step S57) is executed on the initial image, the image is updated (step S58), the flow of the reconstruction process including body motion correction ends, and the process returns to the image construction process (step S5 ends and the process proceeds to step S6).
- the body motion correction parameter Mf is applied, and the image is updated so as to cancel the body motion of the subject.
- the body motion correction parameter Mf is a value that moves and corrects the image in the direction opposite to the body motion, and the subject's body motion is canceled by applying the body motion correction parameter Mf.
- the updated image is then generated as a PET image with motion correction (step S6). Note that the series of processing steps from the forward projection processing (step S53) to the scatter correction processing (step S57) can use known techniques in the field of PET image correction processing, and therefore detailed description thereof will be omitted.
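For orientation, the update loop of steps S53, S54, and S58 can be reduced to a plain MLEM iteration of forward projection, back projection, and multiplicative image update; the sensitivity, absorption, and scatter corrections (steps S55 to S57) and the application of the motion parameter Mf are deliberately omitted here, and the explicit system matrix is an assumption for illustration.

```python
import numpy as np

def mlem_update(initial_image, sinogram, system_matrix, n_iter=10):
    """Minimal MLEM loop: forward projection (S53), back projection
    (S54), and image update (S58), without the correction steps."""
    A = np.asarray(system_matrix, dtype=float)
    sens = A.sum(axis=0)  # back projection of ones (sensitivity image)
    x = np.asarray(initial_image, dtype=float).copy()
    for _ in range(n_iter):
        proj = A @ x  # forward projection of the current image
        ratio = np.divide(sinogram, proj,
                          out=np.ones_like(proj), where=proj > 0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # back project + update
    return x
```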
- a frame division point is set, correction data is generated for each divided frame image, and reconstruction processing is performed based on the correction data, so that a PET image with body movement correction can be generated that includes data from sections where no body movement occurs (static sections) and data from sections where body movement occurs frequently (dynamic sections), and high-precision PET images can be obtained in PET examinations without immobilizing the subject.
- a body movement corrected PET image can be generated using frame images from all sections, including the dynamic sections (see FIG. 9B). This prevents a decrease in data statistics and reduces noise contained in the PET image.
- frames are set continuously, with the period from the frame start position until a certain amount of fluctuation occurs being treated as one frame.
- This allows frames to be taken in small intervals with large movement (the interval from 10 sec to 20 sec on the horizontal axis of Figure 9B) and frames to be taken in large intervals with little movement (the interval from 20 sec to 50 sec on the horizontal axis of Figure 9B), making adaptive frame division possible.
- FIG. 10 shows SUVR (Standardized Uptake Value Ratio) images generated from PET images to compare the present invention with the conventional technology.
- the present invention can reconstruct an image that is more suitable as a medical image for diagnosis than the conventional technology.
- the section from 10 sec to 20 sec on the horizontal axis of Figure 9B, where intermittent movement occurs, is divided into small frame units of about 0.5 sec, so that the data can be used for estimating the subject's body movement instead of being discarded entirely.
- the only parameter required is the difference from the starting position that serves as the basis for frame division. Furthermore, this parameter does not depend on the PET data, but is determined according to the resolution of the PET system. For this reason, it is possible to easily derive a general-purpose optimal value by trying several values in advance.
- the frame-divided data is imaged and then aligned, and corrections are made to cancel out body movement during image reconstruction. This makes it possible to obtain PET images that eliminate the effects of body movement.
- the present invention it is possible to perform highly accurate body motion correction on PET data captured using general clinical protocols without any complicated pre-processing, sensor installation, parameter adjustment, or other factors that make it difficult to handle. In particular, even in cases where intermittent movement occurs, which was difficult to handle with conventional methods, it is possible to estimate body motion after applying appropriate frame division according to the movement. If the body motion correction according to the present invention is implemented, it will be possible to perform PET examinations that reduce the burden on elderly patients and patients with severe illnesses whose heads are difficult to fix, without the need for additional protocols such as external sensors or prior measurements.
- the unit time measurement value is the average value of the center positions of the LORs for each unit time, so that the frame division points can be set appropriately and the frame images can be appropriately corrected.
- the threshold value ε is set to a value that is equal to or less than the resolution of the captured image in the PET device 2 (the size of a single pixel), so that body movements accompanied by rotational movements, which tend to produce small amounts of fluctuation, can be properly detected.
- the PET image construction device in this invention corresponds to image construction device 1, and similarly below, the PET device corresponds to PET device 2, the image acquisition unit corresponds to the measurement data acquisition program and the control unit 11 that operates in accordance with the program, the measurement value acquisition unit corresponds to the body movement data acquisition program and the control unit 11 that operates in accordance with the program, the division point setting unit corresponds to the division point setting program and the control unit 11 that operates in accordance with the program, the correction data generation unit corresponds to the correction data generation program and the control unit 11 that operates in accordance with the program, and the reconstruction unit corresponds to the reconstruction program and the control unit 11 that operates in accordance with the program, but this invention is not limited to this embodiment and can be embodied in various other ways. Also, the specific configurations given in the above-mentioned embodiments are merely examples and can be modified as appropriate depending on the actual product.
- as a method of reconstructing a PET image including motion correction to which the frame division of the present invention can be applied, there is, for example, a method called multiple acquisition frames (MAF), in which registered frame images are superimposed.
- another applicable method is MOLAR (Motion-compensation OSEM List-mode Algorithm for Resolution-recovery reconstruction), in which the crystal coordinates at both ends of the LoR included in each list-mode event are coordinate-transformed using motion correction data at the corresponding time and used for reconstruction.
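The list-mode coordinate conversion described above can be sketched by applying a 4x4 homogeneous motion-correction matrix to both crystal end points of an LoR; the function name and the homogeneous-coordinate convention are assumptions for illustration.

```python
import numpy as np

def correct_lor(motion_matrix, p1, p2):
    """Transform both end points of an LoR with a 4x4 homogeneous
    motion-correction matrix (MOLAR-style list-mode correction)."""
    M = np.asarray(motion_matrix, dtype=float)

    def apply(p):
        v = M @ np.append(np.asarray(p, dtype=float), 1.0)
        return v[:3] / v[3]  # back from homogeneous coordinates

    return apply(p1), apply(p2)
```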
- the invention of claim 1 can be an image construction device comprising: an image acquisition unit that acquires measurement data measured by a PET device in time series; a measurement value acquisition unit that divides body movement data relating to the subject's body movement, obtained in time series from the PET device, into predetermined unit times and acquires them as unit time measurement values; a division point setting unit that sets a frame division point when the amount of fluctuation of the unit time measurement value exceeds a predetermined threshold; a correction data generation unit that generates, in accordance with the amount of fluctuation of the unit time measurement value, correction data for correcting the measurement data so as to cancel out the subject's body movement in frame units divided by the division points; and a reconstruction unit that reconstructs a PET image from the measurement data based on the correction data.
- the invention of claim 2 can be the image constructing apparatus of claim 1, wherein the unit time measurement value is an average value of the center position of the LOR for each unit time.
- the invention of claim 3 can be an image construction device as described in claim 1 or 2, in which the division point setting unit sets a frame division point when the amount of fluctuation of the unit time measurement value relative to the unit time measurement value at the start of the frame exceeds a predetermined threshold value.
- the invention of claim 4 can be the image construction device of claim 1, 2 or 3, wherein the threshold value is set to be equal to or smaller than the size of a single pixel of an image reconstructed from the measurement data.
- the invention of claim 5 can be the image construction device of claim 4, wherein the threshold value is set to a value equal to or smaller than half the size of the single pixel.
- This invention can be used in the industry related to image construction devices for non-invasively imaging the activity state within the subject body.
Landscapes
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Physics & Mathematics (AREA)
- High Energy & Nuclear Physics (AREA)
- Molecular Biology (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Nuclear Medicine (AREA)
Abstract
An image construction device 1 acquires measurement data captured by a PET device 2 in time series, divides body movement data relating to the body movement of a subject, obtained from the PET device 2 in time series, into prescribed unit times and acquires them as unit time measurement values, sets a frame division point when the amount of variation of the unit time measurement value exceeds a prescribed threshold value, generates correction data in which the measurement data are corrected so as to cancel out the body movement of the subject in each frame unit divided at the division point, and reconstructs a medical image for each frame on the basis of the correction data. This makes it possible to obtain a highly accurate PET image without immobilizing the subject during a PET examination.
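The per-frame correction described in the abstract could be illustrated, under simplifying assumptions, as follows. A rigid one-dimensional shift per frame stands in for the body movement correction (whose exact model the abstract does not specify), and all names are invented for this example.

```python
# Illustrative sketch of the per-frame correction in the abstract.
# A rigid 1-D shift per frame stands in for the (unspecified) body
# movement correction; all names here are invented for the example.

def correct_frames(events, division_points, frame_shifts):
    """Shift each event's position to cancel its frame's body movement.

    events: list of (time_index, position) tuples in time order.
    division_points: frame start indices from the division point setting unit.
    frame_shifts: estimated displacement per frame (same length as
        division_points), e.g. derived from the unit time measurement values.
    """
    corrected = []
    for t, pos in events:
        frame = 0
        for f, start in enumerate(division_points):
            if t >= start:                # the last start <= t wins
                frame = f
        corrected.append((t, pos - frame_shifts[frame]))
    return corrected

# Example: three events, one per frame; after correction all positions
# coincide, i.e. the body movement is cancelled out.
print(correct_frames([(0, 1.0), (3, 2.5), (5, 4.0)], [0, 3, 5], [0.0, 1.5, 3.0]))
# [(0, 1.0), (3, 1.0), (5, 1.0)]
```

After this correction, events from all frames can be pooled and reconstructed into a single image, which is the effect the abstract attributes to reconstructing on the basis of the correction data for each frame.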
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023108744 | 2023-06-30 | ||
| JP2023-108744 | 2023-06-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025004569A1 (fr) | 2025-01-02 |
Family
ID=93938111
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/017771 Pending WO2025004569A1 (fr) | Image construction device, image construction method, image construction program, and storage medium | 2023-06-30 | 2024-05-14 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025004569A1 (fr) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009528139A (ja) * | 2006-02-28 | 2009-08-06 | Koninklijke Philips Electronics N.V. | Local motion compensation based on list mode data |
| JP2019510969A (ja) * | 2016-02-29 | 2019-04-18 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for reconstructing ECT images |
2024
- 2024-05-14 WO PCT/JP2024/017771 patent/WO2025004569A1/fr active Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8098916B2 (en) | System and method for image-based attenuation correction of PET/SPECT images | |
| JP6243121B2 | Method and apparatus for motion detection and correction in imaging scans using time-of-flight information | |
| US7221728B2 (en) | Method and apparatus for correcting motion in image reconstruction | |
| US8600132B2 (en) | Method and apparatus for motion correcting medical images | |
| US8787643B2 (en) | Functional imaging | |
| US20220047227A1 (en) | Methods and systems for motion detection in positron emission tomography | |
| JP7159359B2 | System for improving a radiation image of a moving volume | |
| US7465929B2 (en) | Tracking region-of-interest in nuclear medical imaging and automatic detector head position adjustment based thereon | |
| CN101238391A (zh) | 功能成像中的运动补偿 | |
| US9466132B2 (en) | Systems and methods for motion mitigation determinations | |
| US20180289349A1 (en) | Data-driven surrogate respiratory signal generation for medical imaging | |
| US10852449B2 (en) | System and method for self-time alignment calibration for a positron emission tomography system | |
| CN102047143B (zh) | 保留列表模式格式的几何变换 | |
| US10043268B2 (en) | Medical image processing apparatus and method to generate and display third parameters based on first and second images | |
| WO2009101759A1 | Cerebral blood flow quantification device, cerebral blood flow quantification method, and cerebral blood flow quantification program | |
| Thies et al. | A gradient-based approach to fast and accurate head motion compensation in cone-beam CT | |
| KR20140042461A | Method and apparatus for generating an image | |
| US10039512B2 (en) | Image quality in computed tomography using redundant information in production data sets | |
| CN110811665A | PET image attenuation correction method and apparatus, computer device, and storage medium | |
| JP4298297B2 | Diagnostic imaging apparatus and image processing method | |
| CN113520432B | Gating method suitable for tomography systems | |
| WO2025004569A1 (fr) | Image construction device, image construction method, image construction program, and storage medium | |
| CN115192052A | Medical image processing apparatus and medical image processing method | |
| Mohammadi et al. | Motion in nuclear cardiology imaging: types, artifacts, detection and correction techniques | |
| US8867810B2 (en) | Automatic identification of disruptive events in imaging scans |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24831442; Country of ref document: EP; Kind code of ref document: A1 |