WO2025173755A1 - Information processing system, information processing method, and information processing program - Google Patents
Information processing system, information processing method, and information processing program
- Publication number
- WO2025173755A1 (application PCT/JP2025/004820)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target
- images
- lesion
- pelvic organs
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Instruments for taking body samples for diagnostic purposes; Other methods or instruments for diagnosis, e.g. for vaccination diagnosis, sex determination or ovulation-period determination; Throat striking implements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- One aspect of the present disclosure relates to an information processing system, an information processing method, and an information processing program.
- Endometriosis, an example of a lesion, is diagnosed by laparoscopy or by macroscopic examination during laparotomy.
- a definitive diagnosis of endometriosis is made by demonstrating "glandular structures and stroma similar to endometrium" in a pathological tissue specimen.
- Diagnosis is not performed under direct vision in all cases, and in daily clinical practice, cases diagnosed comprehensively based on subjective symptoms, physical examination, and test findings are treated as clinical endometriosis.
- the accuracy rate of clinical endometriosis diagnosed by obstetrician-gynecologists is said to be approximately 80%.
- Ovarian chocolate cysts and deep endometriosis can be diagnosed through physical examination and imaging, but the diagnosis of subtle peritoneal lesions and mild adhesions is generally made under direct vision.
- Patent Document 1 describes a system for assessing the severity of vascular obstruction.
- the system segments at least a portion of a volumetric image dataset into data segments corresponding to wall regions of a target organ, analyzes the data segments to extract features indicative of the amount of perfusion experienced by the wall regions of the target organ, obtains a feature-perfusion classification (FPC) model derived from a training set of perfused organs, classifies the data segments based on the extracted features and the FPC model, and provides as output a prediction indicative of the severity of vascular obstruction based on the feature classification.
- An information processing program causes a computer to execute the steps of acquiring one or more target images showing one or more target pelvic organs of a subject, inputting each of the one or more target images into an image analysis model trained to identify the pelvic organs and at least one lesion associated with the pelvic organs from the input images, and generating one or more analysis result images showing the processing results of the image analysis model in a format that distinguishes each of the one or more target pelvic organs from a target lesion that is a lesion in the subject.
- the subject's pelvic organs and lesions are identified, and an analysis result image showing the processing results is generated. Because the analysis result image shows the pelvic organs and lesions separately from each other, using this analysis result image makes it possible to accurately diagnose the pelvic organs.
- One aspect of the present disclosure makes it possible to accurately diagnose pelvic organs.
- FIG. 1 illustrates an example of a functional configuration of an information processing system.
- FIG. 2 is a diagram illustrating an example of a hardware configuration of a computer that functions as an information processing system.
- FIG. 3 is a flowchart illustrating an example of processing of a target image executed by the information processing system.
- FIG. 4 is a diagram illustrating an example of generating an analysis result image.
- FIG. 5 is a diagram showing an example of determining adhesion.
- FIG. 6 is a diagram illustrating an example of displaying a processing result.
- FIG. 7 is a flowchart illustrating an example of a comparison of analysis result images executed by the information processing system.
- the information processing system is a computer system that supports the diagnosis of a subject's pelvic organs.
- the information processing system is also referred to as a diagnosis support system.
- Pelvic organs refer to organs located in the pelvic cavity.
- a subject refers to a person who receives a diagnosis of their pelvic organs.
- the subject's pelvic organs are also referred to as "target pelvic organs."
- the information processing system inputs a target image showing one or more target pelvic organs of a subject into an image analysis model and generates an analysis result image showing the processing results of the image analysis model.
- the image analysis model is a trained model trained by machine learning to identify pelvic organs and at least one lesion associated with the pelvic organs from the input image.
- the image analysis model is a trained model trained by machine learning to output segmentation information for each organ in an input image including multiple organs and to identify the lesion, if present.
- the image analysis model may be a trained model trained by machine learning to simultaneously output segmentation information and the lesion for each organ in an input image including multiple organs.
- the image analysis model may include a first trained model that simultaneously outputs segmentation information for each organ and, if present, a first lesion in an input image including multiple organs, and a second trained model that identifies the second lesion, if present, based on the segmentation information for each organ.
- Machine learning is a method of autonomously finding laws or rules through repeated learning based on given information.
- the analysis result image shows the processing results of the image analysis model in a format that distinguishes between one or more target pelvic organs and the target lesion, which is the affected area in the subject. Because the analysis result image shows the pelvic organs and the lesion separately, accurate diagnosis of the pelvic organs can be made using this analysis result image.
- the information processing system performs further processing based on the analysis result image. For example, the information processing system may process the analysis result image to determine whether or not adhesions exist between the target pelvic organs. Alternatively, the information processing system may estimate a shape physical quantity, which is a physical quantity related to the shape of the target lesion, based on the analysis result image. Alternatively, the information processing system may estimate the severity of disease in the target lesion. Alternatively, the information processing system may compare analysis result images from two different points in time and generate the comparison results. The information processing system may use a predetermined trained model learned by machine learning for at least part of these processes.
- the pelvic organ to be diagnosed may be at least one of the uterus, ovaries, rectum, and bladder.
- the lesion identified by the image analysis model may be at least one of adhesions between organs, fibrous plaques, endometriotic nodules on the posterior surface of the uterus, endometriotic nodules on the surface of the uterus other than the posterior surface of the uterus, rectovaginal septum lesions, rectal lesions, bladder lesions, ovarian cysts, endometriotic cysts, and adenomyosis lesions.
- the information processing system may process adhesions between organs as lesions, or may process them as a symptom different from lesions.
- the image analysis model is trained to identify, from an input image, pelvic organs and at least one lesion associated with the pelvic organs, including fibrous plaques, ovarian cysts, rectovaginal septum lesions, rectal lesions, bladder lesions, endometriotic lesions, and adenomyosis lesions.
- the image analysis model is trained to identify, from an input image, pelvic organs including the uterus and ovaries and at least one lesion, including endometriotic nodules on the posterior surface of the uterus, endometriotic nodules on surfaces of the uterus other than the posterior surface, and endometriotic cysts.
- the information processing system generates one or more analysis result images showing the processing results of the image analysis model in a format that distinguishes between each of the one or more target pelvic organs and target lesions, which are lesions in the subject.
- the information processing system then processes the one or more analysis result images using the adhesion determination model to determine whether or not adhesions exist between the target pelvic organs.
- the diagnostic support system disclosed herein may be used to support diagnoses related to the uterus or ovaries.
- the diagnostic support system may be used to support the diagnosis of endometriosis, particularly deep endometriosis.
- a device corresponding to the information processing system according to the present disclosure can be referred to as an information processing device, and in that case, the information processing system and information processing device can be referred to as a diagnosis support device or a pathological condition evaluation device.
- the diagnosis support device may further include an imaging device that generates medical images. A doctor, medical professional, or other worker can operate the information processing device and refer to the generated evaluation information to diagnose a patient's pathological condition.
- [System configuration] FIG. 1 is a diagram showing the functional configuration of an information processing system 10 according to an example.
- the information processing system 10 accesses a database 20 and a user terminal 30 via a communication network.
- the database 20 and the user terminal 30 may both be provided in a computer system separate from the information processing system 10, or may be components of the information processing system 10.
- the communication network may be configured by at least one of the Internet and an intranet.
- the database 20 is a device that non-temporarily stores various data related to processing in the information processing system 10.
- the database 20 may store target images, or may store processing results such as analysis result images, adhesion determination results, estimated shape physical quantities, and disease severity at the target lesion.
- the user terminal 30 is a computer operated by a user such as an evaluator.
- the user terminal 30 can be a variety of computers, such as a personal computer, workstation, tablet terminal, smartphone, or wearable terminal.
- the information processing system 10 comprises the following functional modules: an image acquisition unit 11, an image analysis unit 12, an adhesion determination unit 13, a lesion estimation unit 14, a result output unit 15, and a comparison unit 16.
- the image acquisition unit 11 is a functional module that acquires one or more target images showing one or more target pelvic organs of a subject.
- the image analysis unit 12 is a functional module that inputs each of the one or more target images into an image analysis model and generates one or more analysis result images that show the processing results of the image analysis model.
- the adhesion determination unit 13 is a functional module that processes one or more analysis result images and determines whether or not adhesions exist between target pelvic organs.
- the lesion estimation unit 14 is a functional module that performs estimation regarding the target lesion.
- the result output unit 15 is a functional module that outputs processing results that include at least the analysis result image.
- the comparison unit 16 is a functional module that compares analysis result images from two different points in time and generates and outputs the comparison results.
- Each functional module of the information processing system 10 is realized by an information processing program 110 pre-stored in the auxiliary storage unit 103.
- the information processing program 110 can also be considered a diagnostic assistance program or computer program code.
- each functional module is realized by loading the information processing program 110 onto the processor 101 or the main storage unit 102 and having the processor 101 execute the information processing program 110.
- the processor 101 operates the communication control unit 104, input device 105, or output device 106 in accordance with the information processing program 110, and reads and writes data from and to the main storage unit 102 or the auxiliary storage unit 103. Data, trained models, or databases required for processing may be stored in the main storage unit 102 or the auxiliary storage unit 103.
- the information processing program 110 may be provided after being recorded on a non-transitory computer-readable storage medium such as a CD-ROM, DVD-ROM, or semiconductor memory. Alternatively, the information processing program 110 may be provided via a communications network as a data signal superimposed on a carrier wave. The provided information processing program 110 is stored in memory such as the auxiliary storage unit 103.
- the information processing system 10 may be composed of one computer 100, or multiple computers 100. When multiple computers 100 are used, these computers 100 are connected via a communications network such as the Internet or an intranet, thereby logically constructing a single information processing system 10.
- FIG. 3 is a flowchart showing this example as a processing flow S1.
- the image acquisition unit 11 acquires one or more target images showing one or more target pelvic organs of the subject.
- the target images are slice images that show cross sections of the target pelvic organs.
- the image acquisition unit 11 acquires one or more target images showing at least one of the subject's uterus, ovaries, rectum, and bladder.
- the target images are medical images generated by any imaging device.
- the target images are obtained by photographing or measuring the human body and performing imaging.
- the image acquisition unit 11 acquires one or more MRI images obtained by magnetic resonance imaging (MRI) as the one or more target images.
- the MRI images may be sagittal images, coronal images, or axial images.
- the image acquisition unit 11 may read one or more target images from the database 20 in response to an instruction signal from the user terminal 30. Alternatively, the image acquisition unit 11 may receive one or more target images from a predetermined imaging device such as an MRI device in response to the instruction signal. Alternatively, the image acquisition unit 11 may receive one or more target images transmitted from the user terminal 30.
- In step S12, the image analysis unit 12 selects one of the one or more target images. For example, the image analysis unit 12 selects one target image according to the order of the one or more target images, arranged according to the location on the subject's body that was photographed.
- In step S13, the image analysis unit 12 inputs the selected target image into an image analysis model to generate an analysis result image.
- the image analysis unit 12 uses one or more image analysis models.
- each image analysis model is a trained model that has been trained to identify pelvic organs and at least one lesion associated with the pelvic organs from the input image.
- each image analysis model is trained to identify at least one lesion among multiple lesions associated with pelvic organs.
- the multiple image analysis models are trained independently of each other. That is, the machine learning of each of the multiple image analysis models is performed separately from the other machine learning, for example, without being influenced by the other machine learning.
- the image analysis unit 12 inputs the selected target image into that image analysis model and generates an analysis result image that shows the processing results of the image analysis model.
- the analysis result image shows the processing results in a format that distinguishes, for example, one or more target pelvic organs from the target lesion.
- the image analysis model associates a label with each pixel of the target image, and the image analysis unit 12 generates the analysis result image by referring to each label.
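As a minimal sketch of this labeling step, assuming the model emits an integer label per pixel (the label ids and colors below are illustrative assumptions, not values from the publication), the analysis result image can be rendered by mapping each label to a distinguishable color:

```python
import numpy as np

# Hypothetical label encoding and palette; the publication does not specify one.
PALETTE = {
    0: (0, 0, 0),        # background
    1: (255, 255, 0),    # bladder
    2: (255, 0, 0),      # uterus
    3: (0, 255, 0),      # ovary
    4: (0, 0, 255),      # rectum
    5: (255, 0, 255),    # lesion
}

def labels_to_result_image(label_map):
    """Render a per-pixel label map as an RGB image in which each organ
    and the lesion are shown in a distinguishable color."""
    rgb = np.zeros(label_map.shape + (3,), dtype=np.uint8)
    for label, color in PALETTE.items():
        rgb[label_map == label] = color
    return rgb

labels = np.array([[0, 1], [2, 5]])
img = labels_to_result_image(labels)
# img[1, 1] is (255, 0, 255): the lesion pixel is rendered in its own color
```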
- When using multiple image analysis models that have been trained independently of each other, the image analysis unit 12 inputs the selected target image into each of the multiple image analysis models to generate multiple provisional images.
- Each of the multiple provisional images shows the processing results of the corresponding image analysis model, for example, in a format that distinguishes one or more target pelvic organs from the target lesion.
- the image analysis model associates a label with each pixel of the target image, and the image analysis unit 12 generates the provisional image by referring to each label.
- the image analysis unit 12 generates an analysis result image based on the multiple provisional images.
- FIG. 4 is a diagram showing an example of the generation process.
- the image analysis unit 12 uses multiple image analysis models including image analysis models 121, 122, and 123.
- the image analysis unit 12 inputs the selected target image 200 into each of these image analysis models to generate multiple provisional images 210 including provisional images 211, 212, and 213.
- the relationship between the image analysis models and the provisional images is one-to-one.
- the image analysis unit 12 performs statistical processing on the multiple provisional images 210 to generate an analysis result image 220.
- the analysis result image 220 shows the bladder 221, uterus 222, ovaries 223, and rectum 224.
- the image analysis unit 12 calculates the average pixel value for each of multiple voxels in the set of multiple provisional images 210, and generates the analysis result image 220 using the individual average pixel values.
- the set of multiple provisional images 210 from which the multiple voxels are obtained is formed by virtually stacking the multiple provisional images 210.
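The stacking-and-averaging step can be sketched as follows, under the assumption that each provisional image is a per-pixel score map of the same shape:

```python
import numpy as np

def ensemble_mean(provisional_images):
    """Virtually stack the provisional images and average each voxel
    across the image analysis models that produced them."""
    stack = np.stack(provisional_images, axis=0)  # shape: (n_models, H, W)
    return stack.mean(axis=0)

# Three provisional images, one per image analysis model.
p1 = np.array([[0.9, 0.1], [0.8, 0.2]])
p2 = np.array([[0.7, 0.3], [0.6, 0.4]])
p3 = np.array([[0.8, 0.2], [0.7, 0.3]])
analysis_result = ensemble_mean([p1, p2, p3])
# analysis_result[0, 0] is the mean of 0.9, 0.7 and 0.8, i.e. 0.8
```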
- the image analysis unit 12 may calculate the uncertainty of at least a portion of the analysis result image based on the multiple provisional images and the analysis result image. Uncertainty is an index that quantitatively indicates the degree of variation in multiple processing results obtained by multiple image analysis models.
- the image analysis unit 12 may calculate the uncertainty for a region of interest in diagnosing pelvic organs. In the example of Figure 4, the image analysis unit 12 calculates, for a portion of region 231, the uncertainty for each of multiple voxels obtained by assembling the multiple provisional images 210 as voxel uncertainty based on the multiple provisional images 210 and the analysis result image 220.
- the voxel uncertainty is the variation in multiple pixel values in the multiple provisional images 210 at that voxel, when the pixel value (average value) of the analysis result image 220 at that voxel is used as a reference.
- the image analysis unit 12 may generate a reference image 230 that shows the distribution of voxel uncertainty in region 231.
- the image analysis unit 12 calculates the statistical value (e.g., average value) of the voxel uncertainty in region 231 as the uncertainty Us in region 231. In the example of Figure 4, the uncertainty Us is 0.002.
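One plausible reading of this computation is sketched below. The publication says only "variation" around the averaged value; using the standard deviation as that measure, and the mean as the statistical value over the region, are assumptions here:

```python
import numpy as np

def voxel_uncertainty(provisional_images):
    """Per-voxel spread of the model outputs around the ensemble mean;
    standard deviation is used as one concrete measure of variation."""
    stack = np.stack(provisional_images, axis=0)
    return stack.std(axis=0)

def region_uncertainty(voxel_unc, region_mask):
    """Statistical value (here: the mean) of voxel uncertainty inside a
    region of interest, analogous to the uncertainty Us of region 231."""
    return float(voxel_unc[region_mask].mean())

p1 = np.array([[0.0, 1.0]])
p2 = np.array([[0.0, 0.0]])
unc = voxel_uncertainty([p1, p2])                        # [[0.0, 0.5]]
us = region_uncertainty(unc, np.array([[True, True]]))   # 0.25
```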
- the region of interest may be set arbitrarily depending on the target organ or disease. Calculating uncertainty specific to the lesion rather than evaluating the entire pelvic cavity will further improve the interpretability of the prediction results.
- the image analysis unit 12 may calculate the uncertainty for detecting lesions in the area around the contour of the posterior surface of the uterus.
- the image analysis unit 12 may also have a function to output or display an alert when the uncertainty exceeds a threshold.
- In step S14, the image analysis unit 12 checks whether all target images have been processed. If unprocessed target images remain (NO in step S14), the process returns to step S12.
- the image analysis unit 12 selects the next target image.
- the image analysis unit 12 inputs the target image into one or more image analysis models, respectively, to generate an analysis result image. In this way, the image analysis unit 12 inputs one or more target images into one or more image analysis models, respectively, to generate one or more analysis result images.
- In step S15, the adhesion determination unit 13 processes one or more analysis result images to determine adhesions between the target pelvic organs.
- the adhesion determination unit 13 determines whether or not adhesions exist between the target pelvic organs, i.e., whether or not two or more adjacent target pelvic organs are adhered to each other.
- the adhesion determination unit 13 may determine at least one of adhesions between the uterus and rectum, adhesions between the left ovary and rectum, adhesions between the right ovary and rectum, adhesions between the left ovary and uterus, adhesions between the right ovary and uterus, adhesions between the left ovary and right ovary, and adhesions between the uterus and bladder.
- the adhesion determination unit 13 may also determine adhesions when adhesions exist between the target organs via plaques (nodules).
- the adhesion determination unit 13 may further determine the degree of each adhesion.
- the degree of adhesion may be expressed as no adhesion (None), mild adhesion (Mild), severe adhesion (Severe), etc.
- the adhesion determination unit 13 may use the size of the adhesion area to determine the degree of adhesion.
- the adhesion determination unit 13 may determine that there is mild adhesion when the target pelvic organs are in contact with each other over a long area without intervening fat tissue or ascites, or when a thin, cord-like or tent-like low-signal area is observed between the target pelvic organs.
- the adhesion determination unit 13 may also determine that there is severe adhesion when a thick, plate-like low-signal area is observed between the target pelvic organs, or when there is clear deformation of the target pelvic organs due to adhesion.
- the adhesion determination unit 13 analyzes one or more analysis result images to generate target shape data, and determines whether or not adhesions exist based on this target shape data.
- the target shape data is data indicating a set of one or more parameter values of the three-dimensional shapes of multiple target pelvic organs.
- the target shape data may also be referred to as feature values related to the shape of each organ.
- the adhesion determination unit 13 calculates multiple parameter values for each of multiple target pelvic organs based on one or more analysis result images, such as surface area, volume, sphericity, flatness, length, major/minor axis lengths, axis length in the direction of the largest principal component, maximum diameters in the height/width/depth directions, maximum diameter in 3D, and surface-to-volume ratio. At least some of these parameter values may correspond to radiomics features.
- the adhesion determination unit 13 analyzes individual analysis result images for each target pelvic organ to calculate one or more parameter values for the target pelvic organ.
- the adhesion determination unit 13 then generates target shape data indicating a set of the calculated parameter values.
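A few of the listed parameters can be sketched from a binary 3-D segmentation mask of one organ. The voxel-counting approach and the parameter names below are illustrative assumptions, not the patent's exact definitions:

```python
import numpy as np

def shape_parameters(organ_mask, voxel_volume=1.0):
    """Compute a subset of the listed shape parameters from a binary 3-D
    organ mask: volume and the maximum diameter along each axis."""
    coords = np.argwhere(organ_mask)
    extents = coords.max(axis=0) - coords.min(axis=0) + 1
    return {
        "volume": float(organ_mask.sum() * voxel_volume),
        "max_diameter_depth": int(extents[0]),
        "max_diameter_height": int(extents[1]),
        "max_diameter_width": int(extents[2]),
    }

organ = np.zeros((4, 4, 4), dtype=bool)
organ[1:3, 1:4, 2] = True          # a 2 x 3 x 1 voxel block
params = shape_parameters(organ)
# params["volume"] == 6.0; the extents along the axes are 2, 3 and 1
```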
- the adhesion determination unit 13 inputs the target shape data into the adhesion determination model to determine whether or not adhesions exist.
- the adhesion determination model is a trained model that has been trained to identify adhesions between pelvic organs from shape data, which is a set of parameter values for the three-dimensional shapes of each of multiple pelvic organs.
- one adhesion determination model determines whether or not adhesion exists between two specific pelvic organs. Therefore, an adhesion determination model is prepared for each combination of two pelvic organs to be determined. In other words, the adhesion determination model determines adhesion between a first organ and a second organ among multiple pelvic organs.
- the adhesion determination unit 13 inputs target shape data corresponding to each combination of first and second organs into multiple adhesion determination models that have different combinations of first and second organs, and determines whether or not adhesion exists for each of the multiple combinations of first and second organs.
- FIG. 5 is a diagram showing an example of the determination process.
- the adhesion determination unit 13 analyzes multiple analysis result images to generate target shape data for the uterus, left ovary, right ovary, rectum, and bladder.
- the adhesion determination unit 13 uses seven adhesion determination models 131 to 137 to determine adhesions for each of seven combinations of the first organ and the second organ.
- Adhesion determination model 131 accepts target shape data corresponding to the uterus and rectum and determines adhesions between these organs.
- Adhesion determination model 132 accepts target shape data corresponding to the left ovary and rectum and determines adhesions between these organs.
- Adhesion determination model 133 accepts target shape data corresponding to the right ovary and rectum and determines adhesions between these organs.
- Adhesion determination model 134 accepts target shape data corresponding to the left ovary and uterus and determines adhesions between these organs.
- Adhesion determination model 135 accepts target shape data corresponding to the right ovary and uterus and determines adhesions between these organs.
- Adhesion determination model 136 accepts target shape data corresponding to the left ovary and right ovary and determines adhesions between these organs.
- Adhesion determination model 137 accepts target shape data corresponding to the uterus and bladder and determines adhesions between these organs.
- Adhesion determination models 131 to 137 all output adhesion data indicating the determination results.
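The per-pair arrangement above can be sketched as a dictionary of models keyed by organ pair. The stand-in "model" below (a simple sphericity-sum threshold) is purely illustrative and is not the trained classifier described here:

```python
def make_threshold_model(threshold):
    """Toy stand-in for one trained adhesion determination model: it flags
    adhesion when the two organs' combined sphericity falls below a
    threshold (adhesion tends to deform the organs)."""
    def model(shape_a, shape_b):
        return (shape_a["sphericity"] + shape_b["sphericity"]) < threshold
    return model

# One model per combination of first and second organ, as in models 131-137.
ADHESION_MODELS = {
    ("uterus", "rectum"): make_threshold_model(1.2),
    ("left_ovary", "rectum"): make_threshold_model(1.2),
    ("uterus", "bladder"): make_threshold_model(1.2),
    # one entry for each remaining combination listed in the text
}

def determine_adhesions(target_shape_data):
    """Feed each pair's target shape data to the corresponding model and
    collect the adhesion data for every combination."""
    return {(a, b): model(target_shape_data[a], target_shape_data[b])
            for (a, b), model in ADHESION_MODELS.items()}
```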
- the lesion estimation unit 14 analyzes one or more analysis result images to estimate the shape physical quantity of the target lesion.
- Shape physical quantity refers to a physical quantity related to the shape of the target lesion.
- the lesion estimation unit 14 may estimate at least one of the dimensions and volume of the target lesion as the shape physical quantity.
- the lesion estimation unit 14 may estimate at least one of the major axis and minor axis of the target lesion as the dimension.
- the major axis refers to the longest length among the lengths of imaginary axes crossing the target lesion.
- the minor axis refers to the shortest length among the lengths of imaginary axes perpendicular to the imaginary axis indicating the major axis.
- the major axis may be referred to as length, and the minor axis may be referred to as width or thickness.
- the lesion estimation unit 14 may estimate the thickness of the fibrous plaque as the shape physical quantity. If the target lesion is an ovarian cyst, the lesion estimation unit 14 may estimate at least one of the volume, major axis, and minor axis of the ovarian cyst as the shape physical quantity.
- the lesion estimation unit 14 measures the shape physical quantities of the target lesion for each of one or more analysis result images showing the target lesion using a predetermined measurement algorithm.
- the lesion estimation unit 14 may estimate the shape physical quantities for each of one or more analysis result images showing the target lesion by inputting the analysis result image into a physical quantity estimation model that has been trained to estimate physical quantities related to the shape of the lesion from an image showing the lesion.
- a physical quantity estimation model may be prepared for each lesion. After determining the shape physical quantities for each analysis result image, the lesion estimation unit 14 estimates the statistical values of these shape physical quantities as the final shape physical quantities. Examples of such statistical values include the maximum value, average value, and median.
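The reduction of per-image measurements to one final value can be sketched with the listed statistics (the function name is an illustrative assumption):

```python
from statistics import mean, median

def final_shape_quantity(per_image_values, stat="max"):
    """Reduce the shape quantities measured on each analysis result image
    to one final shape physical quantity using the chosen statistic."""
    reducers = {"max": max, "mean": mean, "median": median}
    return float(reducers[stat](per_image_values))

thicknesses = [2.1, 3.4, 2.8]   # e.g. fibrous-plaque thickness per image, in mm
# final_shape_quantity(thicknesses, "max") == 3.4
# final_shape_quantity(thicknesses, "median") == 2.8
```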
- the lesion estimation unit 14 estimates the severity of the disease at the target lesion site.
- the severity of the disease can be expressed as mild, severe, etc. This enables doctors, who are users of the system, to easily make comprehensive judgments about the disease.
- the lesion estimation unit 14 estimates the severity based on the estimated shape physical quantity.
- the lesion estimation unit 14 may refer to a predetermined correspondence table showing the correspondence between shape physical quantities and the severity of the disease, and obtain the severity corresponding to the estimated shape physical quantity as the estimation result.
- the lesion estimation unit 14 may estimate the severity of the disease at the target lesion site in response to determining that adhesions exist between the target pelvic organs.
- the lesion estimation unit 14 may refer to a predetermined correspondence table showing the correspondence between the degree of adhesion and the severity of the disease, and obtain the severity corresponding to the determined degree of adhesion as the estimation result.
- the lesion estimation unit 14 may refer to a predetermined correspondence table showing the correspondence between shape physical quantities, the degree of adhesion, and the severity of the disease, and obtain the severity corresponding to both the estimated shape physical quantities and the determined degree of adhesion as an estimation result.
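Such a correspondence table over both inputs can be as simple as a lookup. The cut-off values and labels below are illustrative assumptions, not values from the publication:

```python
def severity_from_table(thickness_mm, adhesion_degree):
    """Look up the disease severity from an assumed correspondence table
    over a shape physical quantity and the determined adhesion degree."""
    if adhesion_degree == "Severe" or thickness_mm >= 10.0:
        return "severe"
    if adhesion_degree == "Mild" or thickness_mm >= 5.0:
        return "mild"
    return "none"

# severity_from_table(12.0, "None") -> "severe"
# severity_from_table(3.0, "Mild")  -> "mild"
# severity_from_table(3.0, "None")  -> "none"
```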
- the lesion estimation unit 14 may estimate the severity using a severity estimation model trained to estimate the severity of the disease at the lesion site from at least one of the shape physical quantities and the degree of adhesion.
- a severity estimation model may be prepared for each disease.
- the result output unit 15 generates and outputs the processing results.
- the result output unit 15 generates and outputs the processing results including one or more analysis result images, a determination result regarding adhesions, a shape physical quantity, and/or the severity of disease at the lesion.
- the result output unit 15 may edit the analysis result images to highlight the target lesion and not highlight at least one of the one or more target pelvic organs, and generate the processing results including the edited analysis result images.
- the result output unit 15 may generate the processing results including the uncertainty in at least a portion of the analysis result images.
- the result output unit 15 may generate the processing results including at least one of a reference image showing the distribution of voxel uncertainty (e.g., reference image 230 shown in FIG. 4) and an uncertainty value (e.g., uncertainty Us shown in FIG. 4) for each of the one or more analysis result images.
- the result output unit 15 may generate the processing results including an alert regarding the uncertainty in response to the uncertainty for at least one analysis result image exceeding a predetermined threshold. This alert is intended to inform users that there is a relatively high degree of uncertainty in the analysis results image.
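The per-image uncertainty value and threshold alert described above might be derived along the following lines. This sketch assumes the per-voxel uncertainties have already been computed; the mean as the summary statistic and the 0.3 threshold are assumptions for illustration, not values from the disclosure.

```python
def summarize_uncertainty(voxel_uncertainties, threshold=0.3):
    """Aggregate per-voxel uncertainties into one value and decide on an alert.

    voxel_uncertainties: non-empty iterable of values in [0, 1].
    threshold: hypothetical alert threshold (not from the disclosure).
    Returns (summary_uncertainty, alert_flag).
    """
    values = list(voxel_uncertainties)
    u = sum(values) / len(values)  # mean as the statistical summary
    alert = u > threshold          # alert when uncertainty is relatively high
    return u, alert
```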
- the result output unit 15 transmits the generated processing results to the user terminal 30.
- the user terminal 30 receives and displays the processing results. Therefore, the transmission of the processing results by the result output unit 15 is an example of outputting the processing results, an example of displaying the processing results on a display device, and an example of outputting an alert.
- FIG. 6 is a diagram showing an example of displaying processing results.
- the screen 300 shown in this example includes an analysis result image 310 edited to highlight the target lesion 311 and not the target pelvic organ, a slide bar 320 that is a user interface for switching between multiple analysis result images 310 to be displayed, and a determination result 330 regarding adhesions between the target pelvic organs.
- the user can operate the slide bar 320 to manually or automatically switch between the multiple analysis result images 310 displayed.
- the result output unit 15 stores the generated processing results in the database 20.
- the result output unit 15 generates a data record in which a subject ID, which is an identifier that uniquely identifies the subject, a shooting date and time indicating when one or more subject images were taken, and the generated processing results are associated with each other, and stores this data record in the database 20.
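The data record described above could be modeled, for illustration only, as a small structure; the field names below are assumptions, not names from the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ProcessingRecord:
    """One database record associating a subject with one processing result."""
    subject_id: str    # identifier that uniquely identifies the subject
    shot_at: datetime  # shooting date and time of the one or more target images
    results: dict = field(default_factory=dict)  # generated processing results
```

Storing one such record per execution of processing flow S1 naturally yields the per-subject history of processing results that the database 20 holds.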
- the information processing system 10 can execute processing flow S1 for one or more target images for each of multiple subjects.
- the database 20 can store processing results for multiple subjects. If one or more target images are acquired for a certain subject at each of multiple time points, the information processing system 10 can execute processing flow S1 for one or more target images at each time point. For example, the information processing system 10 executes processing flow S1 for one or more target images at a time point before a therapeutic drug is administered to the subject, and one or more target images at a time point after the therapeutic drug is administered to the subject.
- the information processing system 10 may also execute processing flow S1 for one or more target images at each of multiple time points after the therapeutic drug is administered to the subject.
- the database 20 can store a history of processing results for a certain subject.
- FIG. 7 is a flowchart showing this example as a processing flow S2.
- the comparison unit 16 receives a comparison instruction from the user terminal 30.
- the comparison instruction is a data signal that requests the information processing system 10 to compare analysis result images at two different points in time. Based on user operation, the user terminal 30 generates a comparison instruction that indicates the subject ID, a first point in time, and a second point in time that is different from the first point in time, and transmits this comparison instruction to the information processing system 10.
- the comparison unit 16 receives the comparison instruction.
- In step S22, the comparison unit 16 acquires one or more first analysis result images generated based on one or more first target images at a first time point.
- the comparison unit 16 reads out from the database 20 one or more analysis result images corresponding to the subject ID and first time point indicated in the comparison instruction as one or more first analysis result images.
- the first analysis result images are images obtained by inputting the first target image into an image analysis model.
- In step S23, the comparison unit 16 acquires one or more second analysis result images generated based on one or more second target images at a second time point.
- the comparison unit 16 reads out from the database 20 one or more analysis result images corresponding to the subject ID and second time point indicated in the comparison instruction as one or more second analysis result images.
- the second analysis result images are images obtained by inputting the second target image into an image analysis model.
- In step S24, the comparison unit 16 compares the one or more first analysis result images with the one or more second analysis result images.
- the comparison unit 16 generates one or more pairs of first analysis result images and second analysis result images to be compared, and performs the following processing on the one or more pairs. That is, the comparison unit 16 aligns the positions of the two images so that the positions of one or more target pelvic organs match between the first analysis result image and the second analysis result image.
- the comparison unit 16 then compares the two images to determine whether the target lesion has changed. For example, the comparison unit 16 determines whether the dimensions or position of the target lesion have changed.
- the comparison unit 16 generates and outputs the comparison results.
- the comparison unit 16 may generate the comparison results including at least one of a comparison image, which is an image showing the location of changes in the target lesion, and a numerical value indicating the amount of change in the dimensions or position of the target lesion.
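The change measurement described above might look like the following sketch. It assumes the two analysis result images have already been aligned (registration itself is omitted) and that the target lesion is represented as a set of voxel coordinates; both are simplifying assumptions for illustration.

```python
def lesion_change(mask_a, mask_b):
    """Compare a lesion between two pre-aligned analysis result images.

    mask_a, mask_b: sets of (x, y, z) voxel coordinates belonging to the
    target lesion at the first and second time points, respectively.
    Returns the volume change (in voxels) and the centroid displacement.
    """
    def centroid(mask):
        n = len(mask)
        return tuple(sum(c[i] for c in mask) / n for i in range(3))

    volume_change = len(mask_b) - len(mask_a)          # change in dimensions
    ca, cb = centroid(mask_a), centroid(mask_b)
    displacement = sum((a - b) ** 2 for a, b in zip(ca, cb)) ** 0.5  # change in position
    return volume_change, displacement
```

Running this pairwise over chronologically ordered time points would give the change-over-time comparison described for three or more time points.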
- the comparison unit 16 transmits the generated comparison results to the user terminal 30.
- the user terminal 30 receives and displays the comparison results. Therefore, the transmission of the comparison results by the comparison unit 16 is an example of a process for displaying the comparison results on a display device.
- the comparison unit 16 may compare analysis result images at three or more time points in chronological order to generate comparison results that indicate changes in the target lesion over the three or more time points.
- the comparison unit 16 treats two of the three or more time points as the first and second time points and generates comparison results corresponding to those two time points.
- the comparison unit 16 compares the analysis result images in chronological order while changing the combination of the first and second time points.
- the information processing system 10 can perform processing using various trained models.
- Each trained model may be generated by the information processing system 10.
- the information processing system 10 further includes a learning unit that generates the various trained models.
- each trained model may be generated by a computer system separate from the information processing system 10.
- a trained model generated by another computer system can be ported to the information processing system 10.
- the generation of a trained model corresponds to the learning phase of machine learning.
- the processing flow S1 executed using the generated trained model corresponds to the operation phase or estimation phase of machine learning.
- Each of the one or more image analysis models is trained to identify pelvic organs and at least one lesion associated with the pelvic organs from an input image.
- the multiple image analysis models are trained independently of each other.
- training data is used, including multiple data records representing pairs of sample images showing at least one or more pelvic organs and annotations corresponding to the sample images.
- the sample images may show at least one lesion in addition to one or more pelvic organs.
- the annotations indicate the correct identification of one or more pelvic organs shown in the sample images.
- the annotations may also indicate the correct identification of at least one lesion shown in the sample images.
- the image analysis model is realized by a machine learning model that performs semantic segmentation, such as 3D U-Net or SwinUNETR. SwinUNETR is a Transformer-based segmentation model that can be pre-trained by self-supervised learning, which reduces annotation costs.
- Each of the one or more adhesion detection models is trained to identify adhesions between pelvic organs from shape data, which is a collection of parameter values for the three-dimensional shapes of each of the multiple pelvic organs.
- the adhesion detection model may be configured using a first detection model that determines the presence or absence of adhesions and a second detection model that determines the degree of adhesions.
- training data is used that includes multiple data records indicating pairs of shape data for two pelvic organs and adhesion data regarding adhesions between the two pelvic organs.
- the adhesion data indicates at least a correct answer regarding the presence or absence of adhesions, and this correct answer can be used in machine learning for the first detection model.
- the adhesion data may further indicate a correct answer regarding the degree of adhesions, and this correct answer can be used in machine learning for the second detection model.
- the adhesion detection model may be implemented, for example, using a decision tree such as LightGBM, a neural network, deep learning such as DenseNet, or multi-task learning.
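The adhesion detection step might look like the sketch below. The shape-data keys ("centroid", "volume") and the distance threshold are hypothetical stand-ins; a real system would feed such a feature vector to a trained classifier such as LightGBM rather than the fixed rule shown here.

```python
def adhesion_features(shape_a, shape_b):
    """Build a feature vector from the shape data of two pelvic organs.

    shape_a, shape_b: dicts with hypothetical keys "centroid" (an (x, y, z)
    tuple) and "volume"; these parameter names are illustrative only.
    """
    gap = sum((p - q) ** 2
              for p, q in zip(shape_a["centroid"], shape_b["centroid"])) ** 0.5
    return [gap, shape_a["volume"], shape_b["volume"]]

def detect_adhesion(features, gap_threshold=2.0):
    """Stand-in for a trained adhesion detection model: organs whose
    centroids sit closer than the threshold are flagged as adherent."""
    return features[0] < gap_threshold
```

Preparing one such model per combination of two adjacent organs, as described above, amounts to fitting a separate classifier on the feature vectors of each organ pair.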
- Each of the one or more physical quantity estimation models is trained to estimate physical quantities related to the shape of a lesion from an image showing the lesion.
- the multiple physical quantity estimation models are trained independently of each other.
- Machine learning uses training data that includes multiple data records showing pairs of sample images showing the lesion and physical quantities related to the shape of the lesion.
- Physical quantity estimation models can be realized using techniques such as deep learning.
- the information processing system 10 includes an adhesion determination unit 13, a lesion estimation unit 14, and a comparison unit 16; however, an information processing system does not necessarily need to include all of these functional modules.
- the expression "at least one processor executes a first process, executes a second process, ... executes an nth process,” or an expression corresponding thereto, refers to a concept that includes cases where the processor that executes the n processes from the first process to the nth process changes midway.
- this expression refers to a concept that includes both cases where all n processes are executed by the same processor, and cases where the processor changes among the n processes according to an arbitrary policy.
- the symbol "to" used to indicate a range includes both ends of the range.
- "A to B" means a range that is greater than or equal to A and less than or equal to B.
- the term "about,” when used in conjunction with a numerical value, means a range of +10% and -10% of that numerical value.
- (Appendix 2) The information processing system according to claim 1, wherein the image analysis model is trained to identify the at least one lesion among a plurality of lesions associated with the pelvic organs.
- (Appendix 3) The information processing system according to claim 1 or 2, wherein the at least one processor acquires the one or more target images showing at least one of a uterus, an ovary, a rectum, and a bladder as the one or more target pelvic organs.
- (Appendix 4) The one or more target pelvic organs are a plurality of target pelvic organs; the at least one processor processes the one or more analysis result images to determine whether adhesions exist between the target pelvic organs.
- (Appendix 5) The information processing system according to claim 4, wherein, as the processing of the one or more analysis result images, the at least one processor generates, as target shape data, a set of parameter values of the three-dimensional shapes of each of the plurality of target pelvic organs based on the one or more analysis result images, and determines whether or not adhesion exists between the target pelvic organs based on the target shape data.
- (Appendix 6) The one or more target images are a plurality of target images; the at least one processor inputs each of the plurality of target images into the image analysis model to generate a plurality of analysis result images, and generates the target shape data based on the plurality of analysis result images.
- The information processing system according to any one of appendices 4 to 10, wherein the at least one processor determines whether or not at least one of the following adhesions is present: adhesion between the uterus and the rectum, adhesion between the left ovary and the rectum, adhesion between the right ovary and the rectum, adhesion between the left ovary and the uterus, adhesion between the right ovary and the uterus, adhesion between the left ovary and the right ovary, and adhesion between the uterus and the bladder.
- (Appendix 12) The at least one processor determines whether or not the adhesions include adhesions between the uterus and the rectum, adhesions between the left ovary and the rectum, adhesions between the right ovary and the rectum, adhesions between the left ovary and the uterus, adhesions between the right ovary and the uterus, adhesions between the left ovary and the right ovary, and adhesions between the uterus and the bladder.
- (Appendix 21) The information processing system according to any one of appendices 1 to 20, wherein, for each of the one or more target images, the at least one processor inputs the target image into each of a plurality of image analysis models that have been trained independently of one another to generate a plurality of provisional images, each of the provisional images representing a processing result of the corresponding image analysis model in a format that distinguishes each of the one or more target pelvic organs from the target lesion, and generates the analysis result image based on the plurality of provisional images.
- (Appendix 22) The at least one processor performs statistical processing on the plurality of provisional images to generate the analysis result image.
- (Appendix 25) The information processing system according to claim 23 or 24, wherein the at least one processor calculates the uncertainty for a region around a contour of the posterior uterus for each of the one or more target images.
- (Appendix 26) The information processing system according to any one of appendices 23 to 25, wherein the at least one processor outputs an alert in response to the calculated uncertainty exceeding a predetermined threshold.
- (Appendix 27) The information processing system according to any one of appendices 1 to 26, wherein the at least one processor acquires one or more MRI images obtained by nuclear magnetic resonance imaging as the one or more target images.
- the at least one lesion includes at least one of an interorgan adhesion, an endometriotic nodule on the posterior surface of the uterus, an endometriotic nodule on a surface of the uterus other than the posterior surface of the uterus, a rectovaginal septum lesion, a rectal lesion, a bladder lesion, an endometriotic cyst, a fibrous plaque, an ovarian cyst, and an adenomyosis lesion.
- An information processing system according to any one of appendices 1 to 27.
- (Appendix 29) The one or more target images include one or more first target images acquired at a first time point and one or more second target images acquired at a second time point different from the first time point; the one or more analysis result images include one or more first analysis result images generated by inputting each of the one or more first target images into the image analysis model, and one or more second analysis result images generated by inputting each of the one or more second target images into the image analysis model; and the at least one processor generates a comparison result between the one or more first analysis result images and the one or more second analysis result images.
- the one or more analysis result images are a plurality of analysis result images corresponding to a plurality of the target images, the at least one processor displays, on a display device, a screen including a user interface for switching the analysis result image to be displayed among the plurality of analysis result images.
- (Appendix 32) An information processing system comprising at least one processor, the at least one processor: acquiring one or more target images showing a plurality of target pelvic organs of a subject; inputting each of the one or more target images into an image analysis model trained to identify pelvic organs and at least one lesion associated with the pelvic organs from an input image, and generating one or more analysis result images showing the processing results of the image analysis model in a format that distinguishes each of the plurality of target pelvic organs from a target lesion that is the lesion in the subject; and processing the one or more analysis result images to determine whether adhesions exist between the target pelvic organs.
- A diagnostic support system for supporting a diagnosis related to the uterus or ovaries, comprising at least one processor, the at least one processor: acquiring one or more target images showing one or more target pelvic organs of a subject; and inputting each of the one or more target images into an image analysis model that has been trained to identify pelvic organs and at least one lesion associated with the pelvic organs from the input images, and generating one or more analysis result images that show the processing results of the image analysis model in a format that distinguishes each of the one or more target pelvic organs from a target lesion that is the lesion in the subject.
- a diagnostic support system for supporting the diagnosis of endometriosis comprising: at least one processor; the at least one processor: acquiring one or more target images showing a plurality of target pelvic organs of the subject, including a uterus or ovaries; inputting each of the one or more target images into an image analysis model trained to identify pelvic organs and at least one lesion associated with the pelvic organs from an input image, and generating one or more analysis result images showing the processing results of the image analysis model in a format that distinguishes each of the plurality of target pelvic organs from a target lesion that is the lesion in the subject; Evaluating the pathology of endometriosis based on the one or more analysis result images;
- the evaluation of the pathological condition of endometriosis includes: (i) processing the one or more analysis result images to determine whether adhesions exist between the target pelvic organs, and estimating and outputting the severity of disease at the target lesion in response to the determination that adhesions exist;
- a pathology evaluation device for evaluating a pathology related to the uterus or ovaries, comprising: at least one processor; the at least one processor: acquiring one or more target images showing one or more target pelvic organs of the subject, including a uterus or an ovary; inputting each of the one or more target images into an image analysis model trained to identify pelvic organs and at least one lesion associated with the pelvic organs from an input image, and generating one or more analysis result images showing the processing results of the image analysis model in a format that distinguishes each of the one or more target pelvic organs from a target lesion that is the lesion in the subject; Evaluating the pathology of endometriosis based on the one or more analysis result images;
- the evaluation of the pathological condition of endometriosis includes: processing the one or more analysis result images to determine whether adhesions exist between the target pelvic organs; and estimating a shape physical quantity, which is a physical quantity related to the shape of the target lesion.
- An information processing device comprising: at least one processor; and at least one memory storing computer program code; the at least one memory and the computer program code being configured to, together with the at least one processor, cause the information processing device to: acquire one or more target images showing one or more target pelvic organs of a subject; and input each of the one or more target images into an image analysis model that has been trained to identify pelvic organs and at least one lesion associated with the pelvic organs from an input image, and generate one or more analysis result images that show the processing results of the image analysis model in a format that distinguishes each of the one or more target pelvic organs from a target lesion that is the lesion in the subject. (Appendix 44)
- An information processing method executed by an information processing system including at least one processor comprising: inputting one or more target images showing one or more target pelvic organs of a subject into an image analysis model trained to identify pelvic organs and at least one lesion associated with the pelvic organs from input images, and generating one or more analysis result images showing the processing results of the image analysis model in a format that distinguishes each of the one or more target pelvic organs from the target lesion that is the lesion in the subject; processing the one or more analysis result images to determine whether adhesions exist between the target pelvic organs;
- (Appendix 45)
- An information processing method executed by an information processing system including at least one processor comprising: inputting one or more target images showing one or more target pelvic organs of a subject into an image analysis model trained to identify pelvic organs and at least one lesion associated with the pelvic organs from input images, and generating one or more analysis result images showing the processing results of the image analysis model in a format that distinguishes each of the one or more target pelvic organs from the target lesion that is the lesion in the subject; in response to the target lesion being identified in at least one of the one or more target images by the image analysis model, estimating a shape physical quantity, which is a physical quantity related to the shape of the target lesion, based on the one or more analysis result images;
- (Appendix 46)
- According to Appendix 2, it is possible to obtain an analysis result image that shows a specific lesion separately from pelvic organs and other lesions.
- an analysis result image is obtained that clearly shows at least one of the uterus, ovaries, rectum, and bladder, allowing for accurate diagnosis of that particular pelvic organ.
- Adhesion can be an important factor in diagnosing pelvic organs. Therefore, obtaining a determination regarding adhesions allows for a more detailed diagnosis of pelvic organs.
- adhesion determination is performed based on the three-dimensional shape of the target pelvic organ, making it possible to perform the determination with greater accuracy.
- adhesion determination is performed by inputting target shape data into an adhesion determination model obtained through learning.
- adhesion determination model it is possible to more accurately determine adhesions, which may occur in a variety of cases.
- an adhesion detection model is prepared for each combination of two adjacent target internal organs, allowing for more accurate adhesion detection depending on that combination.
- According to Appendix 9, the degree of adhesion is also assessed, providing more detailed information about adhesions.
- information regarding at least one of adhesions between the uterus and rectum, adhesions between the left ovary and rectum, adhesions between the right ovary and rectum, adhesions between the left ovary and uterus, adhesions between the right ovary and uterus, adhesions between the left ovary and right ovary, and adhesions between the uterus and bladder can be automatically obtained.
- Appendix 12 allows for automatic acquisition of information about major adhesions related to pelvic organs.
- At least one of endometriotic nodules on the posterior surface of the uterus, endometriotic nodules on surfaces of the uterus other than the posterior surface of the uterus, and endometriotic cysts are distinguished and displayed in the analysis result image. Furthermore, a determination regarding adhesions is made based on this analysis result image. As a result, an accurate diagnosis of the lesion is possible.
- the shape physical quantities of the target lesion are estimated based on one or more analysis result images.
- the physical quantities related to the shape of the lesion can be important factors for diagnosing the pelvic organs. Therefore, by obtaining these shape physical quantities, the pelvic organs can be diagnosed in more detail.
- shape physical quantities are estimated by inputting the analysis result image into a physical quantity estimation model obtained through learning.
- physical quantities related to the shape of the lesion can be estimated with greater accuracy.
- the major and minor diameters, which are physical quantities that directly indicate the condition of the affected area, can be obtained, allowing for a more detailed diagnosis of the pelvic organs.
- Appendix 18 allows for the estimation of the thickness of the fibrous plaque, enabling a more detailed diagnosis of this lesion.
- At least one of the volume, major axis, and minor axis of an ovarian cyst can be estimated, enabling a more detailed diagnosis of the lesion.
- the severity of disease at the target lesion is estimated based on the shape physical quantities, allowing for a more detailed diagnosis of the pelvic organs.
- uncertainty, which is information indicating how reliable at least a portion of the analysis result image is, can be presented to users such as evaluators. This uncertainty can also be useful in diagnosing pelvic organs.
- voxel uncertainty is calculated for each of the multiple voxels in a set of multiple provisional images, and the statistical value of these voxel uncertainties is obtained as the final uncertainty. This calculation makes it possible to accurately determine how reliable at least a portion of the analysis result image is.
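One hedged way to realize the voxel-level calculation described above is to measure, at each voxel, how strongly the independently trained models disagree. The disagreement-with-majority measure below is an assumption for illustration; the disclosure only specifies that a statistical value is derived from per-voxel uncertainties.

```python
from collections import Counter

def voxel_uncertainty(provisional_labels):
    """Per-voxel uncertainty from multiple provisional segmentations.

    provisional_labels: list of equal-length flat label sequences, one per
    independently trained image analysis model. The uncertainty of a voxel
    is the fraction of models that disagree with the majority label there.
    """
    n_models = len(provisional_labels)
    uncertainties = []
    for labels in zip(*provisional_labels):           # iterate voxel by voxel
        majority_count = Counter(labels).most_common(1)[0][1]
        uncertainties.append(1 - majority_count / n_models)
    return uncertainties
```

A summary statistic (e.g., the mean) over the returned list would then serve as the final uncertainty for the analysis result image.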
- Appendix 25 allows the user to be informed of uncertainty regarding the area surrounding the posterior uterine contour.
- an alert can be sent to the user to inform them that the reliability of the analysis result image falls below a predetermined standard. This alert can be useful in interpreting the analysis result image.
- analysis result images can be generated from MRI images, which are often used to diagnose pelvic organs.
- the analysis result image distinguishes and shows at least one of the following: adhesions between organs, endometriotic nodules on the posterior surface of the uterus, endometriotic nodules on the surface of the uterus other than the posterior surface of the uterus, rectovaginal septum lesions, rectal lesions, bladder lesions, endometriotic cysts, fibrous plaques, ovarian cysts, and adenomyosis lesions.
- Using this analysis result image enables accurate diagnosis of these lesions.
- results are generated by comparing analysis result images taken at two different points in time, making it possible to accurately diagnose changes in pelvic organs or lesions over time.
- the target lesion is displayed prominently, allowing the location or condition of the lesion to be clearly presented to users such as evaluators.
- a user interface is provided for switching the analysis result image to be displayed, making it easy to handle multiple analysis result images on the display device.
- According to Appendices 32 and 44, by inputting a target image into the image analysis model obtained through learning, the subject's pelvic organs and lesions are identified, and an analysis result image showing the processing results is generated. Because the analysis result image shows the pelvic organs and lesions separately from each other, accurate diagnosis of the pelvic organs is possible by using this analysis result image. In addition, information regarding adhesions between the target pelvic organs can be automatically obtained from the analysis result image. Adhesion can be an important factor in diagnosing pelvic organs, so obtaining a determination regarding adhesions allows for a more detailed diagnosis of the pelvic organs.
- According to Appendices 33 and 45, by inputting a target image into the image analysis model obtained through learning, the subject's pelvic organs and lesions are identified, and an analysis result image showing the processing results is generated. Because the analysis result image shows the pelvic organs and lesions separately from each other, accurate diagnosis of the pelvic organs is possible by using this analysis result image.
- In addition, the shape physical quantities of the target lesion are estimated based on one or more analysis result images. The physical quantities related to the shape of the lesion can be important factors for diagnosing the pelvic organs. Therefore, by obtaining these shape physical quantities, the pelvic organs can be diagnosed in more detail.
- According to Appendices 34 and 46, by inputting a target image into the image analysis model obtained through learning, the subject's pelvic organs and lesions are identified, and an analysis result image showing the processing results is generated. Because the analysis result image shows the pelvic organs and lesions separately from each other, accurate diagnosis of the pelvic organs is possible by using this analysis result image. Furthermore, multiple image analysis models are used for each target image, and a final analysis result image is generated based on the results (provisional images) of each image analysis model. This allows for more accurate generation of analysis result images, and therefore more accurate diagnosis of the pelvic organs.
- According to Appendix 35, by inputting a target image into the image analysis model obtained through learning, the subject's pelvic organs and lesions are identified, and an analysis result image showing the processing results is generated.
- the analysis result image shows the pelvic organs and lesions separately from each other, so using this analysis result image enables accurate diagnosis of the pelvic organs.
- a comparison image is generated showing the results of comparing analysis result images taken at two different points in time, making it possible to accurately diagnose changes in the pelvic organs or lesions over time.
- According to Supplementary Notes 39 to 41, inputting a target image into the trained image analysis model identifies the subject's pelvic organs and lesions and generates an analysis result image showing the processing result. Because the analysis result image distinguishes the pelvic organs from the lesions, it enables accurate diagnosis of the pelvic organs. In one example, visualizing the predicted positions of the pelvic organs makes it easier to assess their pathology, while visualizing the predicted positions of lesions prevents the physician using the system from overlooking them and reduces variability in diagnosis.
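Visualizing predicted positions can be done, for example, by alpha-blending colored label masks onto the target image. The palette and blend factor below are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

# Hypothetical palette: organ shown in green, lesion in red.
PALETTE = {1: (0, 255, 0), 2: (255, 0, 0)}

def overlay(gray: np.ndarray, labels: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Blend predicted organ/lesion labels onto a grayscale target image."""
    # Expand grayscale to RGB so colors can be mixed in.
    rgb = np.repeat(gray[..., None], 3, axis=2).astype(float)
    for label, color in PALETTE.items():
        sel = labels == label
        rgb[sel] = (1 - alpha) * rgb[sel] + alpha * np.array(color, dtype=float)
    return rgb.astype(np.uint8)

gray = np.full((2, 2), 100, dtype=np.uint8)
labels = np.array([[0, 1], [2, 0]])
out = overlay(gray, labels)
```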
- In particular, machine learning technology can automatically detect lesions related to uterine or ovarian conditions, such as ovarian cysts and deep-seated endometriotic lesions (e.g., plaques and adhesions), present in the pelvic cavity, which contains mobile organs.
- According to Supplementary Note 42, inputting a target image into the trained image analysis model identifies the subject's pelvic organs and lesions and generates an analysis result image showing the processing result. Because the analysis result image distinguishes the pelvic organs from the lesions, it enables accurate diagnosis of the pelvic organs. Furthermore, information on adhesions between the target pelvic organs is obtained automatically from the analysis result image; since adhesion can be an important diagnostic factor, this determination allows a more detailed diagnosis. In addition, the shape of the target lesion and physical quantities related to that shape are estimated from one or more analysis result images, which likewise supports a more detailed diagnosis of the pelvic organs.
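The disclosure assigns adhesion determination to learned models (the adhesion determination models 131-137); as a deliberately crude, assumed heuristic offered only for intuition, one can at least test whether two segmented organ masks are in contact:

```python
import numpy as np

def touching(organ_a: np.ndarray, organ_b: np.ndarray) -> bool:
    """Crude adhesion cue: do two boolean organ masks share a boundary?

    Dilates mask A by one pixel (4-neighbourhood) and tests overlap with B.
    """
    pad = np.pad(organ_a, 1)  # zero (False) border so shifts stay in bounds
    dilated = (pad[1:-1, 1:-1] | pad[:-2, 1:-1] | pad[2:, 1:-1]
               | pad[1:-1, :-2] | pad[1:-1, 2:])
    return bool((dilated & organ_b).any())

# Two hypothetical organ masks: adjacent columns touch, a gap does not.
a = np.zeros((5, 5), dtype=bool); a[:, :2] = True
b = np.zeros((5, 5), dtype=bool); b[:, 2] = True   # adjacent to a
c = np.zeros((5, 5), dtype=bool); c[:, 4] = True   # separated by a gap
```

Mere contact is of course not adhesion; the claimed system uses trained models precisely because the distinction requires learned image features.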
- According to Supplementary Note 47, inputting a target image into the trained image analysis model identifies the subject's pelvic organs and lesions and generates an analysis result image showing the processing result. Because the analysis result image distinguishes the pelvic organs from the lesions, it makes it possible to accurately evaluate the pathology of the pelvic organs. Furthermore, automatically obtaining, from the analysis result image, information on adhesions between the target pelvic organs and physical quantities describing the shape of the target lesion allows the pathology of the pelvic organs to be evaluated in more detail.
- 10... Information processing system, 11... Image acquisition unit, 12... Image analysis unit, 13... Adhesion determination unit, 14... Lesion estimation unit, 15... Output unit, 16... Comparison unit, 20... Database, 30... User terminal, 110... Information processing program, 121-123... Image analysis models, 131-137... Adhesion determination models, 200... Target image, 210... Provisional image, 220... Analysis result image, 230... Reference image, 300... Screen, 310... Analysis result image, 320... Slide bar, 330... Adhesion determination result.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- High Energy & Nuclear Physics (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
This information processing system comprises at least one processor. The at least one processor acquires one or more target images showing one or more target pelvic organs of a subject, and inputs each of the one or more target images into an image analysis model trained to identify, from an input image, the pelvic organ and at least one lesion associated with the pelvic organ, thereby generating one or more analysis result images that show the processing result of the image analysis model in a form in which each of the one or more target pelvic organs and the target lesion, which is the subject's lesion, are distinguished from one another.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024-022264 | 2024-02-16 | ||
| JP2024022264 | 2024-02-16 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025173755A1 (fr) | 2025-08-21 |
Family
ID=96773787
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2025/004820 Pending WO2025173755A1 (fr) | 2024-02-16 | 2025-02-13 | Système de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025173755A1 (fr) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014507231A (ja) * | 2011-02-24 | 2014-03-27 | Dog Microsystems Inc. | Method and apparatus for identifying potential abnormalities in imaging data, and its application to medical images |
| JP2021523449A (ja) * | 2018-10-25 | 2021-09-02 | Tencent Technology (Shenzhen) Company Limited | Detection model training method and apparatus, and terminal device and program |
| JP2023511300A (ja) * | 2020-01-16 | 2023-03-17 | Koninklijke Philips N.V. | Method and system for automatically discovering anatomical structures in medical images |
| WO2023048267A1 (fr) * | 2021-09-27 | 2023-03-30 | FUJIFILM Corporation | Information processing device, method, and program |
| JP2023521738A (ja) * | 2020-04-07 | 2023-05-25 | Verathon Inc. | Automated prostate analysis system |
| WO2024048509A1 (fr) * | 2022-08-30 | 2024-03-07 | Preferred Networks, Inc. | Pathological condition evaluation device |
- 2025-02-13: WO application PCT/JP2025/004820, published as WO2025173755A1 (fr), status: Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN109069014B (zh) | System and method for estimating healthy lumen diameter and quantifying stenosis in coronary arteries | |
| US8380013B2 | Case image search apparatus, method and computer-readable recording medium | |
| US20210151171A1 | Apparatus and method for medical image reading assistant providing representative image based on medical use artificial neural network | |
| US12458232B2 | Medical image visualization apparatus and method for diagnosis of aorta | |
| US20110075900A1 | Diagnosis assisting system, computer readable recording medium having diagnosis assisting program recorded thereon, and diagnosis assisting method | |
| JP2009082441A (ja) | Medical diagnosis support system | |
| JP2013542046A (ja) | System and method for ultrasound image processing | |
| Jannin et al. | Validation in medical image processing. | |
| EP1542589B1 (fr) | Display of image data | |
| CN118762849B (zh) | Intelligent processing method and system for urological diagnosis and treatment data | |
| CN119139020A (zh) | Puncture path planning system and device employing the puncture path planning system | |
| CN118039125A (zh) | Artificial-intelligence-based auxiliary diagnosis method and system for pathology images | |
| CN113658701B (zh) | Postoperative evaluation method, computer device, and storage medium | |
| JP5348998B2 (ja) | Image retrieval device and method | |
| US20230238148A1 | Information processing device, information processing method, and computer program | |
| CN120580526B (zh) | Liver disease image recognition and processing method, medium, and device based on multimodal fusion | |
| CN120376111B (zh) | Urinary stone examination image recognition, data visualization analysis, and early-warning reminder system | |
| CN120451394B (зh) | Method for reconstructing a coronary artery model from coronary angiography | |
| WO2025173755A1 (fr) | Information processing system, information processing method, and information processing program | |
| US20230274424A1 | Appartus and method for quantifying lesion in biometric image | |
| JP7723922B2 (ja) | X-ray age estimation learning device, X-ray age estimation device, image age estimation learning device, image age estimation device, X-ray imaging system, evaluation device, X-ray age estimation learning method, X-ray age estimation method, and program | |
| JP2021175454A (ja) | Medical image processing apparatus, method, and program | |
| CN110910980A (zh) | Sepsis early-warning device, equipment, and storage medium | |
| JP7679025B2 (ja) | Information processing system | |
| CN121191677A (зh) | System and computer program product for acquiring current medical images of a patient and generating a current final report based on the acquired current medical images, and method for using the system | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25754626; Country of ref document: EP; Kind code of ref document: A1 |