WO2023126999A1 - Image processing device, image processing method, and storage medium - Google Patents
Image processing device, image processing method, and storage medium
- Publication number
- WO2023126999A1 (PCT/JP2021/048499; JP2021048499W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- polyp
- image
- size
- endoscopic
- estimation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
- G06T2207/30032—Colon polyp
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30092—Stomach; Gastric
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Definitions
- the present disclosure relates to image processing related to endoscopy.
- Patent Literature 1 proposes a method of recognizing a region of interest with high accuracy from an image captured by an endoscope imaging device.
- However, even with the method of Patent Literature 1, it is not always possible to determine the size of a polyp with high accuracy.
- One object of the present disclosure is to provide an image processing device capable of estimating the size of a polyp found by endoscopy with high accuracy.
- an image processing device includes: endoscopic image acquisition means for acquiring an endoscopic image; polyp detection means for detecting a polyp region from the endoscopic image; first estimation means for estimating the size of the polyp based on the image of the detected polyp region; and output means for outputting the result of estimating the size of the polyp.
- an image processing method includes: acquiring an endoscopic image; detecting a polyp region from the endoscopic image; estimating the size of the polyp based on the image of the detected polyp region; and outputting the result of estimating the size of the polyp.
- a storage medium records a program that causes a computer to execute a process of: acquiring an endoscopic image; detecting a polyp region from the endoscopic image; estimating the size of the polyp based on the image of the detected polyp region; and outputting the result of estimating the size of the polyp.
- FIG. 1 is a block diagram showing a schematic configuration of an endoscopy system according to a first embodiment
- FIG. 2 is a block diagram showing the hardware configuration of the image processing apparatus according to the first embodiment
- FIG. 2 is a block diagram showing the functional configuration of the image processing apparatus of the first embodiment
- FIG. 4 is a flowchart of image display processing by the image processing apparatus of the first embodiment
- FIG. 5 shows a display example of a polyp size estimation result.
- FIG. 6 is a block diagram showing the functional configuration of Modification 1 of the first embodiment.
- FIG. 7 is a block diagram showing the functional configuration of Modification 2 of the first embodiment.
- FIG. 8 is a block diagram showing the functional configuration of an image processing apparatus according to a second embodiment
- FIG. 9 is a flowchart of image display processing by the image processing apparatus of the second embodiment
- FIG. 10 is a block diagram showing the functional configuration of an image processing apparatus according to a third embodiment
- FIG. 11 is a flow chart of processing by the image processing apparatus of the third embodiment
- FIG. 1 shows a schematic configuration of an endoscopy system 100.
- When a polyp is detected during an examination (including treatment) using an endoscope, the endoscopy system 100 estimates whether the detected polyp is equal to or larger than a predetermined size and displays the result. This allows the doctor to determine a treatment method according to the size of the polyp.
- the endoscopy system 100 mainly includes an image processing device 1, a display device 2, and an endoscope 3 connected to the image processing device 1.
- the image processing apparatus 1 acquires, from the endoscope 3, an image (a moving image, hereinafter also referred to as the "endoscopic image Ic") captured by the endoscope 3 during an endoscopy, and causes the display device 2 to display data for the examiner to review during the examination.
- the image processing apparatus 1 acquires, as an endoscopic image Ic, a video of the interior of an organ captured by the endoscope 3 during an endoscopy.
- the image processing apparatus 1 extracts a still image (frame image) from the endoscope image Ic, detects a polyp, and estimates whether the polyp is larger than a predetermined size using AI. Then, the image processing apparatus 1 generates a display image including an endoscopic image, a polyp size estimation result, and the like.
- the display device 2 is a display or the like that displays an image based on a display signal supplied from the image processing device 1 .
- the endoscope 3 mainly includes an operation unit 36 with which the examiner inputs air supply, water supply, angle adjustment, imaging instructions, and the like; a flexible shaft 37 that is inserted into the organ of the subject to be examined; a distal end portion 38 containing an imaging unit such as an ultra-compact imaging device; and a connecting portion 39 for connecting to the image processing apparatus 1.
- the examination target is not limited to the large intestine; it may be another gastrointestinal tract (digestive organ) such as the stomach, esophagus, small intestine, or duodenum.
- FIG. 2 shows the hardware configuration of the image processing apparatus 1.
- the image processing apparatus 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, a sound output unit 16, and a database (hereinafter referred to as "DB") 17. These elements are connected to one another via a data bus 19.
- the processor 11 executes a predetermined process by executing a program or the like stored in the memory 12.
- the processor 11 is a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit). Note that the processor 11 may be composed of a plurality of processors. Processor 11 is an example of a computer.
- the memory 12 is composed of various memories used as working memory, such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and a non-volatile memory that stores information necessary for the processing of the image processing apparatus 1.
- the memory 12 may include an external storage device such as a hard disk connected to or built into the image processing apparatus 1, or may include a storage medium such as a detachable flash memory or disk medium.
- the memory 12 stores a program for the image processing apparatus 1 to execute each process in this embodiment.
- the memory 12 temporarily stores a series of endoscope images Ic captured by the endoscope 3 during endoscopy.
- the memory 12 temporarily stores a still image acquired from the endoscopic image Ic during the endoscopic examination.
- These images are stored in the memory 12 in association with, for example, subject identification information (for example, patient ID), time stamp information, and the like.
- the interface 13 performs an interface operation between the image processing device 1 and an external device.
- the interface 13 supplies the display data Id generated by the processor 11 to the display device 2 .
- the interface 13 supplies illumination light generated by the light source unit 15 to the endoscope 3 .
- the interface 13 also supplies the processor 11 with an electrical signal indicating the endoscopic image Ic supplied from the endoscope 3 .
- the interface 13 may be a communication interface, such as a network adapter, for performing wired or wireless communication with an external device, or a hardware interface conforming to USB (Universal Serial Bus), SATA (Serial AT Attachment), or the like.
- the input unit 14 generates an input signal based on the operation of the inspector.
- the input unit 14 is, for example, a button, touch panel, remote controller, voice input device, or the like.
- the light source unit 15 generates light to be supplied to the distal end portion 38 of the endoscope 3 .
- the light source unit 15 may also incorporate a pump or the like for sending out water or air to be supplied to the endoscope 3 .
- the sound output unit 16 outputs sound under the control of the processor 11 .
- the DB 17 stores endoscopic images and lesion information acquired by the subject's past endoscopic examinations.
- the lesion information includes lesion images and related information. Lesions include polyps (elevated lesions).
- the DB 17 may include an external storage device such as a hard disk connected to or built into the image processing apparatus 1, or may include a storage medium such as a detachable flash memory. Instead of providing the DB 17 in the endoscopy system 100, the DB 17 may be provided in an external server or the like, and the lesion information may be acquired from the server through communication.
- FIG. 3 is a block diagram showing the functional configuration of the image processing apparatus 1. As shown in FIG.
- the image processing apparatus 1 functionally includes an image capture section 21 , an image area recognition section 22 , a polyp detection section 23 and a first estimation section 24 .
- An endoscope image Ic is input from the endoscope 3 to the image processing device 1 .
- the endoscope image Ic is input to the image capture unit 21 .
- the image capture unit 21 extracts a still image (frame image) for each frame from the endoscope image Ic.
- the extracted frame image includes an area for displaying information about the subject and an area for displaying the image of the endoscope camera.
- the video capture unit 21 outputs the extracted frame images to the video region recognition unit 22 .
- the image area recognition unit 22 recognizes the area where the image of the endoscope camera is displayed from the frame image generated by the image capture unit 21, and cuts out only that area.
- the video region recognition unit 22 outputs an image of the clipped region displaying the endoscope camera video (hereinafter referred to as “endoscopic image”) to the polyp detection unit 23 and the first estimation unit 24 .
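- As a rough illustration of this capture-and-crop step, the following Python sketch (using OpenCV) extracts frame images from a video and keeps only a camera display area; the fixed rectangle and the video path are illustrative assumptions, since in the described apparatus the camera area is recognized from each frame rather than fixed.

```python
import cv2

# Hypothetical camera display area within each frame: (x, y, width, height).
# In the apparatus described above this region is recognized per frame, not fixed.
CAMERA_AREA = (320, 40, 1280, 1000)

def extract_endoscopic_images(video_path):
    """Yield the cropped endoscope-camera area of every frame in the video."""
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        x, y, w, h = CAMERA_AREA
        # Keep only the area in which the endoscope camera image is displayed.
        yield frame[y:y + h, x:x + w]
    cap.release()
```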
- the polyp detection unit 23 detects a polyp from the endoscopic image generated by the video area recognition unit 22 and estimates the position of the polyp. Specifically, the polyp detection unit 23 resizes the endoscopic image generated by the video region recognition unit 22 to a size that allows image analysis by AI. Then, the polyp detection unit 23 detects polyps included in the resized image using an image recognition model prepared in advance.
- This image recognition model is a model trained in advance so as to estimate the position of a polyp contained in an endoscopic image, and is hereinafter also referred to as a "polyp detection model".
- When the polyp detection unit 23 detects a polyp, it generates coordinate information of a rectangle surrounding the polyp region and outputs the coordinate information to the first estimation unit 24.
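- A minimal sketch of this detection step is given below; the `polyp_detection_model` object, its `predict` interface returning boxes as (x, y, w, h) tuples, and the 512x512 input size are illustrative assumptions rather than details taken from the disclosure.

```python
import cv2

DETECTOR_INPUT_SIZE = (512, 512)  # illustrative input size for the polyp detection model

def detect_polyp(endoscopic_image, polyp_detection_model):
    """Resize the image for the detector and return one bounding box (x, y, w, h), or None."""
    resized = cv2.resize(endoscopic_image, DETECTOR_INPUT_SIZE)
    # The detector is assumed to return a (possibly empty) list of rectangles
    # in the coordinate system of the resized image.
    boxes = polyp_detection_model.predict(resized)
    return boxes[0] if boxes else None
```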
- the first estimation unit 24 estimates whether the polyp detected by the polyp detection unit 23 is equal to or larger than a predetermined size. Specifically, the first estimation unit 24 uses the coordinate information generated by the polyp detection unit 23 to cut out only the polyp region from the endoscopic image generated by the video region recognition unit 22. Next, the first estimation unit 24 resizes the clipped image of the polyp region to a size that enables image analysis by AI. Then, the first estimation unit 24 estimates whether or not the polyp is equal to or larger than the predetermined size using an image recognition model prepared in advance. This image recognition model is a model trained in advance to estimate, from an image of a polyp region, whether or not the polyp contained in the image is equal to or larger than the predetermined size, and is hereinafter also referred to as the "first size estimation model".
- the first estimation unit 24 estimates whether the polyp is equal to or larger than a predetermined size using the first size estimation model.
- In the following description, the predetermined size is "X mm"; a polyp of X mm or more is referred to as a large polyp, and a polyp of less than X mm as a small polyp.
- the predetermined size "X mm” here is a value determined in advance based on guidelines for endoscopy.
- the first estimation unit 24 estimates the size of the polyp contained in the image of the polyp region, and calculates a score indicating the probability that the polyp is a large polyp (referred to as the "large polyp score") and a score indicating the probability that the polyp is a small polyp (referred to as the "small polyp score"). For example, the first estimation unit 24 calculates each score so that the sum of the large polyp score and the small polyp score is 1.
- the first estimation unit 24 compares the large polyp score and the small polyp score with a predetermined threshold TH, and adopts the score larger than the threshold TH as the estimation result. For example, if the threshold TH is 0.5, the first estimation unit 24 estimates that the polyp is a large polyp (X mm or more) when the large polyp score exceeds the threshold TH, and that the polyp is a small polyp (less than X mm) when the small polyp score exceeds the threshold TH. The first estimation unit 24 then generates display data Id based on the endoscopic image and the polyp size estimation result, and outputs the display data Id to the display device 2.
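- The following sketch illustrates, under similar assumptions, how the first estimation unit's processing could look: the polyp region is cropped using the rectangle coordinates, resized, classified, and the scores are compared with the threshold TH. The `first_size_estimation_model` and its output (the large-polyp probability) are hypothetical.

```python
import cv2

CLASSIFIER_INPUT_SIZE = (224, 224)  # illustrative input size for the first size estimation model
TH = 0.5  # threshold used to adopt a score as the estimation result

def estimate_polyp_size(endoscopic_image, box, first_size_estimation_model):
    """Crop the polyp region, classify it, and return 'large' (>= X mm) or 'small' (< X mm)."""
    x, y, w, h = box
    polyp_region = endoscopic_image[y:y + h, x:x + w]
    resized = cv2.resize(polyp_region, CLASSIFIER_INPUT_SIZE)
    # Assumed to return the probability that the polyp is X mm or larger.
    large_score = first_size_estimation_model.predict(resized)
    small_score = 1.0 - large_score  # the two scores are defined to sum to 1
    return "large" if large_score > TH else "small"
```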
- In this way, the first estimation unit 24 can estimate the size of the polyp accurately by using an image focused on the polyp region. In the above example, the first estimation unit 24 estimates the size of the polyp after cutting out the polyp region from the endoscopic image. Instead, the first estimation unit 24 may draw a rectangle surrounding the polyp region on the endoscopic image using the coordinate information of the polyp region detected by the polyp detection unit 23, and estimate the size of the polyp based on the image of the area surrounded by the rectangle.
- the first estimation unit 24 may draw a rectangle surrounding the polyp region on the endoscopic image, and output the image data to the display device 2 . This allows the doctor to easily grasp the position of the polyp by looking at the displayed image.
- the image capture unit 21 and the image area recognition unit 22 are an example of endoscope image acquisition means
- the polyp detection unit 23 is an example of polyp detection means
- the first estimation unit 24 is an example of first estimation means, output means, and display control means.
- FIG. 4 is a flow chart of image display processing by the image processing device 1. This processing is realized by the processor 11 shown in FIG. 2 executing a program prepared in advance and operating as each element shown in FIG. 3.
- the image capture unit 21 acquires the endoscopic image Ic through the input unit 14, and acquires the frame image 41 from the endoscopic image Ic (step S11).
- the frame image 41 includes an area for displaying information about the subject and an area for displaying the image of the endoscope camera (endoscopic image).
- the image region recognition unit 22 acquires the endoscope image 42 from the frame image 41 acquired by the image capture unit 21 (step S12).
- the polyp detection unit 23 resizes the endoscope image 42 to a size suitable for polyp detection using the polyp detection model, and generates a resized image 43 (step S13).
- the polyp detection unit 23 detects polyps from the resized image 43 using the polyp detection model (step S14).
- When the polyp detection unit 23 detects a polyp, it generates coordinate information of a rectangle 43x indicating the polyp region in the resized image 43.
- the coordinate information of the rectangle 43x indicating the polyp region can be represented, for example, by the coordinates (x, y) of the upper left point of the rectangle and the width w and height h of the rectangle when that point is taken as the origin.
- the polyp detection unit 23 performs coordinate transformation of the rectangle 43x indicating the polyp region (step S15). This coordinate transformation converts the coordinates of the rectangle 43x into the coordinate system before the resizing in step S13. As a result, the rectangle 43x after the coordinate transformation has a size in the coordinate system of the endoscopic image 42. The polyp detection unit 23 outputs the coordinate information of the polyp region after the coordinate transformation to the first estimation unit 24.
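- The coordinate conversion of step S15 amounts to scaling the rectangle from the resized-image coordinate system back to that of the endoscopic image 42, as in the sketch below (function and variable names are illustrative).

```python
def to_original_coordinates(box, resized_shape, original_shape):
    """Map a rectangle (x, y, w, h) from resized-image coordinates back to the original image."""
    resized_h, resized_w = resized_shape[:2]      # size used for polyp detection
    original_h, original_w = original_shape[:2]   # size of the endoscopic image 42
    scale_x = original_w / resized_w
    scale_y = original_h / resized_h
    x, y, w, h = box
    return (int(x * scale_x), int(y * scale_y), int(w * scale_x), int(h * scale_y))
```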
- the first estimation unit 24 crops the polyp area from the endoscopic image 42 based on the coordinate information of the rectangle 43x indicating the polyp area (step S16). Note that cropping means cutting out part of an image.
- the first estimation unit 24 resizes the cropped polyp region 44 to generate a resized image 45 (step S17).
- the resizing here corrects the polyp region 44 to a size suitable for the first size estimation model used by the first estimation unit 24 .
- the first estimation unit 24 estimates the size of the polyp based on the resized image 45 using the first size estimation model (step S18). Specifically, the first estimation unit 24 calculates a score indicating the probability that the polyp is X mm or larger (large polyp score) and a score indicating the probability that the polyp is smaller than X mm (small polyp score).
- the first estimator 24 compares each calculated score with a threshold value TH, and displays an estimation result indicating whether the polyp is X mm or more or less than X mm (step S22). Then, the image display processing ends.
- FIG. 5 shows a display example of the polyp size estimation result.
- An endoscopic image 61, a polyp detection image 62, and a polyp size estimation result 63 are displayed in the display area 60.
- the endoscopic image 61 is an endoscopic image of the organ site where the polyp was detected.
- the polyp detection image 62 is an image displaying the polyp detected by the polyp detection unit 23, and shows the polyp in the endoscopic image surrounded by a rectangle 62x. In this example, the polyp is surrounded by a rectangle, but it may be surrounded by other figures such as an ellipse.
- the polyp size estimation result 63 is the result of the polyp size estimation by the first estimation unit 24. In this example, "Non-diminutive" is displayed when the polyp size is equal to or larger than the predetermined size (X mm), and "Diminutive" is displayed when the polyp size is less than the predetermined size.
- the polyp size estimation result 63 may display other information related to the polyp, such as the degree of malignancy of the polyp, in addition to whether the size of the polyp is equal to or greater than a predetermined size.
- the first estimator 24 estimates the size of the polyp based on the image of the polyp region.
- the first estimation unit 24 may estimate the size of the polyp using optical flow information in addition to the image of the polyp region.
- FIG. 6 shows the functional configuration of an image display device 1a of Modification 1.
- an optical flow calculator 28 is provided in the image display device 1a.
- the optical flow calculation unit 28 acquires the endoscopic image from the video region recognition unit 22, and calculates optical flow information based on the endoscopic image one frame before and the latest endoscopic image.
- the optical flow calculator 28 then outputs the calculated optical flow information to the first estimator 24 .
- the optical flow calculation unit 28 is an example of calculation means.
- the first size estimation model used by the first estimation unit 24 is a learned model that has been pre-trained to estimate the size of the polyp based on the image of the polyp region and the optical flow information.
- the first estimation unit 24 estimates the size of the polyp based on the image of the polyp region input from the polyp detection unit 23 and the optical flow information calculated by the optical flow calculation unit 28 . Since the optical flow indicates the direction and amount of movement of each point in the image, the accuracy of polyp size estimation can be improved by using the optical flow.
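- One plausible way to obtain such optical flow information is to compute dense flow between consecutive frames, for example with OpenCV's Farneback method as sketched below; how the flow is then fed into the first size estimation model (for example, as extra input channels) is an assumption about the trained model, not something specified here.

```python
import cv2

def compute_optical_flow(previous_image, latest_image):
    """Return a dense (H, W, 2) flow field from the previous frame to the latest frame."""
    prev_gray = cv2.cvtColor(previous_image, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(latest_image, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    # flow[..., 0] is the horizontal motion and flow[..., 1] the vertical motion of each point.
    return flow
```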
- FIG. 7 shows the functional configuration of an image display device 1b of Modification 2.
- the image display device 1b is provided with an image characteristic acquisition section 29 .
- the image characteristic acquisition unit 29 acquires image characteristic information such as the manufacturer of the endoscope, the type of light source used by the endoscope, and whether or not the endoscope is performing enlarged display. Specifically, specification information of an endoscope may be used as the image characteristic information.
- the image characteristic acquiring unit 29 may read the specification information from a storage medium storing the specification information of the endoscope.
- the image characteristic acquisition unit 29 may analyze the endoscopic image generated by the video region recognition unit 22 to estimate image quality such as brightness and color tone of the image, and use it as image characteristic information.
- the image property acquisition unit 29 is an example of image property acquisition means.
- the image property acquisition unit 29 outputs the extracted image property information to the first estimation unit 24 .
- the first size estimation model used by the first estimation unit 24 is a trained model that has been trained in advance to estimate the size of the polyp based on the image of the polyp region and the image characteristic information.
- the first estimation unit 24 estimates the size of the polyp based on the image of the polyp region input from the polyp detection unit 23 and the image characteristic information input from the image characteristic acquisition unit 29 . This can improve the accuracy of polyp size estimation.
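- A simple version of the image-quality analysis mentioned above could compute brightness and color-tone statistics directly from the endoscopic image, as in this sketch; which statistics the actual model consumes, and in what form, is an assumption.

```python
import cv2
import numpy as np

def image_characteristics(endoscopic_image):
    """Return simple image-quality statistics usable as image characteristic information."""
    hsv = cv2.cvtColor(endoscopic_image, cv2.COLOR_BGR2HSV)
    hue, saturation, value = cv2.split(hsv)
    return {
        "brightness": float(np.mean(value)),       # overall brightness of the image
        "saturation": float(np.mean(saturation)),  # vividness of the colors
        "hue_mean": float(np.mean(hue)),           # rough color tone
    }
```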
- FIG. 8 is a block diagram showing the functional configuration of the image processing device 1x of the second embodiment.
- the image processing apparatus 1x of the second embodiment is obtained by adding a second estimating section 25 and an estimation result integrating section 26 to the image processing apparatus 1 of the first embodiment.
- the second estimation unit 25 estimates the size of the polyp from the endoscopic image generated by the video region recognition unit 22 using the second size estimation model. That is, the first size estimation model estimates the size of the polyp from the image of the polyp region, while the second size estimation model estimates the size of the polyp from the entire endoscopic image.
- the second size estimation model is a model pre-trained to estimate the size of polyps contained in endoscopic images.
- the second estimation unit 25 first resizes the endoscopic image generated by the video region recognition unit 22 to a size suitable for the second size estimation model.
- the second estimation unit 25 detects a polyp using a second size estimation model prepared in advance, and estimates whether the polyp is equal to or larger than a predetermined size.
- the second estimation unit 25 calculates a score indicating the probability that the polyp contained in the endoscopic image is X mm or larger (large polyp score) and a score indicating the probability that the polyp is smaller than X mm (small polyp score), and outputs these scores to the estimation result integration unit 26.
- the estimation result integration unit 26 integrates the polyp size estimation result obtained by the first estimation unit 24 and the polyp size estimation result obtained by the second estimation unit 25. For example, assume that for a certain polyp the first estimation unit 24 outputs a large polyp score of 0.6 and a small polyp score of 0.4, and the second estimation unit 25 outputs a large polyp score of 0.8 and a small polyp score of 0.2.
- the estimation result integration unit 26 first calculates the average value of the estimation results of the first estimation unit 24 and the second estimation unit 25 . In this example, the estimation result integration unit 26 calculates the average value of the large polyp score as "0.7" and the average value of the small polyp score as "0.3".
- the estimation result integration unit 26 compares the obtained average value with a predetermined threshold value TH, and outputs the polyp size estimation result.
- As described above, when the threshold TH is set to 0.5, the estimation result integration unit 26 estimates in this example that the polyp is a large polyp, that is, X mm or larger. Then, the estimation result integration unit 26 generates display data Id based on the polyp size estimation result and the endoscopic image generated by the video region recognition unit 22, and outputs the display data Id to the display device 2.
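- The integration described here reduces to averaging the two estimators' score pairs and applying the same threshold test; a minimal sketch under that reading, reproducing the worked example above:

```python
TH = 0.5  # same threshold as in the first embodiment

def integrate_estimates(first_scores, second_scores):
    """Average (large, small) score pairs from the two estimators and apply the threshold TH."""
    large_avg = (first_scores[0] + second_scores[0]) / 2.0
    small_avg = (first_scores[1] + second_scores[1]) / 2.0
    if large_avg > TH:
        return "large", large_avg   # estimated to be X mm or larger
    return "small", small_avg       # estimated to be less than X mm

# Example from the text: (0.6, 0.4) and (0.8, 0.2) average to (0.7, 0.3) -> "large".
print(integrate_estimates((0.6, 0.4), (0.8, 0.2)))
```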
- As described above, in the second embodiment, the second estimation unit 25, which estimates the size of the polyp from the entire endoscopic image, is added to the configuration of the first embodiment. Since the second estimation unit 25 detects polyps and estimates their size from the entire endoscopic image, it can compensate even when the polyp detection unit 23 misses a polyp.
- the second estimation unit 25 is an example of second estimation means
- the estimation result integration unit 26 is an example of estimation result integration means
- FIG. 9 is a flowchart of image display processing by the image display device 1x of the second embodiment.
- the image display processing flowchart of the second embodiment is obtained by adding steps S19 to S21 to the image display processing flowchart of the first embodiment shown in FIG. Since the processes of steps S11 to S18 are the same as those of the first embodiment, description thereof will be omitted.
- The second estimation unit 25 acquires the endoscopic image 42 generated by the image area recognition unit 22 in step S12. The second estimation unit 25 then resizes the endoscopic image 42 to a size suitable for the second size estimation model to generate a resized image 46 (step S19). Next, the second estimation unit 25 detects polyps based on the resized image 46 using the second size estimation model, and outputs the polyp size estimation result to the estimation result integration unit 26 (step S20). The estimation result integration unit 26 calculates the average of the size estimation result of the first estimation unit 24 and the size estimation result of the second estimation unit 25 (step S21), compares the average with the threshold TH to determine the polyp size estimation result, and displays the result (step S22). Then the process ends.
- the first estimator 24 estimates the size of the polyp based on the image of the polyp region
- the second estimator 25 estimates the size of the polyp based on the endoscopic image.
- the second estimator 25 may be configured to estimate the size of the polyp using optical flow information.
- In this case, as in Modification 1 described above, the optical flow calculator 28 is provided in the image display device 1x, and the optical flow information calculated by the optical flow calculator 28 is input to the second estimator 25.
- the second size estimation model used by the second estimation unit 25 is a learned model that has been pre-trained to estimate the size of the polyp based on the endoscopic image and the optical flow information.
- the first estimation unit 24 may be configured to estimate the polyp size using optical flow information.
- both the first estimating section 24 and the second estimating section 25 may be configured to estimate the polyp size using the optical flow information.
- an estimating section for estimating the size of the polyp based on the endoscopic image and the optical flow information may be provided. This makes it possible to improve the accuracy of polyp size estimation using optical flow information, as in the first modification.
- the image display device 1x is provided with the image property acquisition unit 29, and the image property information acquired by the image property acquisition unit 29 is input to the second estimation unit 25, as in the second modification described above.
- the second size estimation model used by the second estimation unit 25 is a pre-trained model for estimating the size of the polyp based on the endoscopic image and the image characteristic information. This makes it possible to improve the accuracy of polyp size estimation using the image characteristic information, as in the second modification.
- the first estimating section 24 may be configured to estimate the polyp size using the image characteristic information.
- both the first estimation unit 24 and the second estimation unit 25 may be configured to estimate the polyp size using the image characteristic information.
- FIG. 10 is a block diagram showing the functional configuration of the image processing apparatus of the third embodiment.
- the image processing device 70 includes endoscope image acquisition means 71 , polyp detection means 72 , first estimation means 73 , and output means 74 .
- FIG. 11 is a flowchart of processing by the image processing apparatus of the third embodiment.
- the endoscopic image acquisition means 71 acquires an endoscopic image captured by an endoscope (step S71).
- the polyp detection means 72 detects a polyp region from the endoscopic image (step S72).
- the first estimation means 73 estimates the size of the polyp based on the image of the detected polyp region (step S73).
- the output means 74 outputs the estimation result of the polyp size (step S74).
- polyps can be detected during endoscopy, and the size of the polyp can be estimated based on the image of the polyp region.
- endoscopic image acquisition means for acquiring an endoscopic image
- polyp detection means for detecting a polyp region from the endoscopic image
- a first estimation means for estimating the size of the polyp based on the image of the detected polyp region
- an output means for outputting the result of estimating the size of the polyp
- (Appendix 3) The image processing device according to Appendix 1, wherein, when the polyp detection means detects a polyp, the first estimation means draws on the endoscopic image so as to surround the polyp region and estimates the size of the polyp based on the surrounded polyp region.
- (Appendix 4) The image processing apparatus according to any one of Appendices 1 to 3, further comprising display control means for displaying the endoscopic image and the result of estimating the size of the polyp on a display device, wherein the display control means superimposes a figure surrounding the detected polyp region on the endoscopic image.
- (Appendix 6) The image processing apparatus according to any one of Appendices 1 to 5, wherein the first estimation means estimates the size of the polyp based on the image of the polyp region and the image characteristic information.
- (Appendix 7) The image processing apparatus according to any one of Appendices 1 to 6, further comprising: second estimation means for estimating the size of the polyp based on the endoscopic image; and estimation result integration means for integrating the polyp size estimation result of the first estimation means and the polyp size estimation result of the second estimation means, wherein the output means outputs the result of integration by the estimation result integration means.
- (Appendix 11) A storage medium recording a program that causes a computer to execute a process of: acquiring an endoscopic image; detecting a polyp region from the endoscopic image; estimating the size of the polyp based on the image of the detected polyp region; and outputting the result of estimating the size of the polyp.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Radiology & Medical Imaging (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Pathology (AREA)
- Optics & Photonics (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Quality & Reliability (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Endoscopes (AREA)
- Image Processing (AREA)
Abstract
Description
an endoscopic image acquisition means for acquiring an endoscopic image,
a polyp detection means for detecting a polyp region from the endoscopic image,
a first estimation means for estimating the size of the polyp based on the image of the detected polyp region, and
an output means for outputting the result of estimating the size of the polyp
are provided.
An endoscopic image is acquired,
a polyp region is detected from the endoscopic image,
the size of the polyp is estimated based on the image of the detected polyp region,
and the result of estimating the size of the polyp is output.
A program is recorded that causes a computer to execute a process of:
acquiring an endoscopic image,
detecting a polyp region from the endoscopic image,
estimating the size of the polyp based on the image of the detected polyp region,
and outputting the result of estimating the size of the polyp.
<First Embodiment>
[System Configuration]
FIG. 1 shows a schematic configuration of an endoscopy system 100. When a polyp is detected during an examination (including treatment) using an endoscope, the endoscopy system 100 estimates whether the detected polyp is equal to or larger than a predetermined size and displays the result. This allows the doctor to determine a treatment method according to the size of the polyp.
FIG. 2 shows the hardware configuration of the image processing apparatus 1. The image processing apparatus 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, a sound output unit 16, and a database (hereinafter referred to as "DB") 17. These elements are connected to one another via a data bus 19.
FIG. 3 is a block diagram showing the functional configuration of the image processing apparatus 1. The image processing apparatus 1 functionally includes a video capture unit 21, a video region recognition unit 22, a polyp detection unit 23, and a first estimation unit 24.
Next, the image display processing that produces the above display will be described. FIG. 4 is a flowchart of the image display processing by the image processing apparatus 1. This processing is realized by the processor 11 shown in FIG. 2 executing a program prepared in advance and operating as each element shown in FIG. 3.
Next, a display example on the display device 2 will be described. FIG. 5 shows a display example of a polyp size estimation result. When a polyp is detected during an endoscopic examination of a patient, the endoscopic image, the polyp detection result, and the polyp size estimation result are displayed on the display device 2.
Next, modifications of the first embodiment will be described. The following modifications can be applied to the first embodiment in appropriate combination.
(Modification 1)
In the first embodiment, the first estimation unit 24 estimates the size of the polyp based on the image of the polyp region. Instead, the first estimation unit 24 may estimate the size of the polyp using optical flow information in addition to the image of the polyp region. FIG. 6 shows the functional configuration of an image display device 1a of Modification 1. As illustrated, in Modification 1, an optical flow calculation unit 28 is provided in the image display device 1a. The optical flow calculation unit 28 acquires the endoscopic image from the video region recognition unit 22 and calculates optical flow information based on the endoscopic image of one frame before and the latest endoscopic image. The optical flow calculation unit 28 then outputs the calculated optical flow information to the first estimation unit 24. The optical flow calculation unit is an example of calculation means.
(Modification 2)
In estimating the size of a polyp, the accuracy of the estimation can be improved by taking into account the manufacturer of the endoscope, the type of light source used by the endoscope, whether the endoscope is performing magnified display, and the like. FIG. 7 shows the functional configuration of an image display device 1b of Modification 2. As illustrated, in Modification 2, an image characteristic acquisition unit 29 is provided in the image display device 1b. The image characteristic acquisition unit 29 acquires image characteristic information such as the manufacturer of the endoscope, the type of light source used by the endoscope, and whether the endoscope is performing magnified display. Specifically, specification information of the endoscope may be used as the image characteristic information. In this case, the image characteristic acquisition unit 29 may read the specification information from a storage medium storing the specification information of the endoscope. Instead, the image characteristic acquisition unit 29 may analyze the endoscopic image generated by the video region recognition unit 22 to estimate image quality such as brightness and color tone of the image, and use the estimate as the image characteristic information. The image characteristic acquisition unit 29 is an example of image characteristic acquisition means.
Next, the second embodiment will be described. The configuration of the endoscopy system 100 and the hardware configuration of the image processing apparatus 1x in the second embodiment are the same as those of the first embodiment described above, and therefore their description is omitted.
FIG. 8 is a block diagram showing the functional configuration of the image processing apparatus 1x of the second embodiment. The image processing apparatus 1x of the second embodiment is obtained by adding a second estimation unit 25 and an estimation result integration unit 26 to the image processing apparatus 1 of the first embodiment.
FIG. 9 is a flowchart of the image display processing by the image display device 1x of the second embodiment. The image display processing flowchart of the second embodiment is obtained by adding steps S19 to S21 to the image display processing flowchart of the first embodiment shown in FIG. 4. Since the processing of steps S11 to S18 is the same as in the first embodiment, its description is omitted.
Next, modifications of the second embodiment will be described. The following modifications can be applied to the second embodiment in appropriate combination.
(Modification 3)
In the second embodiment, the first estimation unit 24 estimates the size of the polyp based on the image of the polyp region, and the second estimation unit 25 estimates the size of the polyp based on the endoscopic image. Here, the second estimation unit 25 may be configured to estimate the size of the polyp using optical flow information. In this case, as in Modification 1 described above, the optical flow calculation unit 28 is provided in the image display device 1x, and the optical flow information calculated by the optical flow calculation unit 28 is input to the second estimation unit 25. The second size estimation model used by the second estimation unit 25 is a trained model that has been trained in advance to estimate the size of the polyp based on the endoscopic image and the optical flow information.
Also in the second embodiment, the accuracy of polyp size estimation can be improved by taking into account the manufacturer of the endoscope, the type of light source used by the endoscope, whether the endoscope is performing magnified display, and the like. In this case, as in Modification 2 described above, the image characteristic acquisition unit 29 is provided in the image display device 1x, and the image characteristic information acquired by the image characteristic acquisition unit 29 is input to the second estimation unit 25. The second size estimation model used by the second estimation unit 25 is a model trained in advance to estimate the size of the polyp based on the endoscopic image and the image characteristic information. As in Modification 2, this makes it possible to improve the accuracy of polyp size estimation using the image characteristic information. Note that, instead of the second estimation unit 25, the first estimation unit 24 may be configured to estimate the polyp size using the image characteristic information. Alternatively, both the first estimation unit 24 and the second estimation unit 25 may be configured to estimate the polyp size using the image characteristic information.
FIG. 10 is a block diagram showing the functional configuration of the image processing apparatus of the third embodiment. The image processing apparatus 70 includes an endoscopic image acquisition means 71, a polyp detection means 72, a first estimation means 73, and an output means 74.
An image processing device comprising:
endoscopic image acquisition means for acquiring an endoscopic image;
polyp detection means for detecting a polyp region from the endoscopic image;
first estimation means for estimating the size of the polyp based on the image of the detected polyp region; and
output means for outputting the result of estimating the size of the polyp.
The image processing device according to Appendix 1, wherein, when the polyp detection means detects a polyp, the first estimation means cuts out an image of the polyp region from the endoscopic image and estimates the size of the polyp based on the cut-out image of the polyp region.
The image processing device according to Appendix 1, wherein, when the polyp detection means detects a polyp, the first estimation means draws on the endoscopic image so as to surround the polyp region and estimates the size of the polyp based on the surrounded polyp region.
The image processing device according to any one of Appendices 1 to 3, further comprising display control means for displaying the endoscopic image and the result of estimating the size of the polyp on a display device, wherein the display control means superimposes a figure surrounding the detected polyp region on the endoscopic image.
The image processing device according to any one of Appendices 1 to 4, further comprising calculation means for calculating optical flow information based on the endoscopic image, wherein the first estimation means estimates the size of the polyp based on the image of the polyp region and the optical flow information.
The image processing device according to any one of Appendices 1 to 5, further comprising image characteristic acquisition means for acquiring image characteristic information of the endoscopic image, wherein the first estimation means estimates the size of the polyp based on the image of the polyp region and the image characteristic information.
The image processing device according to any one of Appendices 1 to 6, further comprising: second estimation means for estimating the size of the polyp based on the endoscopic image; and estimation result integration means for integrating the polyp size estimation result of the first estimation means and the polyp size estimation result of the second estimation means, wherein the output means outputs the result of integration by the estimation result integration means.
The image processing device according to Appendix 7, further comprising calculation means for calculating optical flow information based on the endoscopic image, wherein the second estimation means estimates the size of the polyp based on the endoscopic image and the optical flow information.
The image processing device according to Appendix 7 or 8, further comprising image characteristic acquisition means for acquiring image characteristic information of the endoscopic image, wherein the second estimation means estimates the size of the polyp based on the endoscopic image and the image characteristic information.
An image processing method comprising: acquiring an endoscopic image; detecting a polyp region from the endoscopic image; estimating the size of the polyp based on the image of the detected polyp region; and outputting the result of estimating the size of the polyp.
A storage medium recording a program that causes a computer to execute a process of: acquiring an endoscopic image; detecting a polyp region from the endoscopic image; estimating the size of the polyp based on the image of the detected polyp region; and outputting the result of estimating the size of the polyp.
2 Display device
3 Endoscope
11 Processor
12 Memory
17 Database (DB)
21 Video capture unit
22 Video region recognition unit
23 Polyp detection unit
24 First estimation unit
25 Second estimation unit
26 Estimation result integration unit
100 Endoscopy system
Claims (11)
- An image processing device comprising: endoscopic image acquisition means for acquiring an endoscopic image; polyp detection means for detecting a polyp region from the endoscopic image; first estimation means for estimating the size of the polyp based on the image of the detected polyp region; and output means for outputting the result of estimating the size of the polyp.
- The image processing device according to claim 1, wherein, when the polyp detection means detects a polyp, the first estimation means cuts out an image of the polyp region from the endoscopic image and estimates the size of the polyp based on the cut-out image of the polyp region.
- The image processing device according to claim 1, wherein, when the polyp detection means detects a polyp, the first estimation means draws on the endoscopic image so as to surround the polyp region and estimates the size of the polyp based on the surrounded polyp region.
- The image processing device according to any one of claims 1 to 3, further comprising display control means for displaying the endoscopic image and the result of estimating the size of the polyp on a display device, wherein the display control means superimposes a figure surrounding the detected polyp region on the endoscopic image.
- The image processing device according to any one of claims 1 to 4, further comprising calculation means for calculating optical flow information based on the endoscopic image, wherein the first estimation means estimates the size of the polyp based on the image of the polyp region and the optical flow information.
- The image processing device according to any one of claims 1 to 5, further comprising image characteristic acquisition means for acquiring image characteristic information of the endoscopic image, wherein the first estimation means estimates the size of the polyp based on the image of the polyp region and the image characteristic information.
- The image processing device according to any one of claims 1 to 6, further comprising: second estimation means for estimating the size of the polyp based on the endoscopic image; and estimation result integration means for integrating the polyp size estimation result of the first estimation means and the polyp size estimation result of the second estimation means, wherein the output means outputs the result of integration by the estimation result integration means.
- The image processing device according to claim 7, further comprising calculation means for calculating optical flow information based on the endoscopic image, wherein the second estimation means estimates the size of the polyp based on the endoscopic image and the optical flow information.
- The image processing device according to claim 7 or 8, further comprising image characteristic acquisition means for acquiring image characteristic information of the endoscopic image, wherein the second estimation means estimates the size of the polyp based on the endoscopic image and the image characteristic information.
- An image processing method comprising: acquiring an endoscopic image; detecting a polyp region from the endoscopic image; estimating the size of the polyp based on the image of the detected polyp region; and outputting the result of estimating the size of the polyp.
- A storage medium recording a program that causes a computer to execute a process of: acquiring an endoscopic image; detecting a polyp region from the endoscopic image; estimating the size of the polyp based on the image of the detected polyp region; and outputting the result of estimating the size of the polyp.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023570496A JPWO2023126999A5 (ja) | 2021-12-27 | | Image processing device, image processing method, and program |
| PCT/JP2021/048499 WO2023126999A1 (ja) | 2021-12-27 | 2021-12-27 | Image processing device, image processing method, and storage medium |
| EP21969883.4A EP4458249A4 (en) | 2021-12-27 | 2021-12-27 | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD AND STORAGE MEDIUM |
| US18/722,891 US20250061569A1 (en) | 2021-12-27 | 2021-12-27 | Image processing device, image processing method, and recording medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/048499 WO2023126999A1 (ja) | 2021-12-27 | 2021-12-27 | 画像処理装置、画像処理方法、及び、記憶媒体 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023126999A1 true WO2023126999A1 (ja) | 2023-07-06 |
Family
ID=86998297
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/048499 Ceased WO2023126999A1 (ja) | 2021-12-27 | 2021-12-27 | 画像処理装置、画像処理方法、及び、記憶媒体 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20250061569A1 (ja) |
| EP (1) | EP4458249A4 (ja) |
| WO (1) | WO2023126999A1 (ja) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117796745A (zh) * | 2024-02-29 | 2024-04-02 | 四川大学 | 一种估计消化内镜镜头进退距离的方法 |
| WO2025150034A3 (en) * | 2024-01-09 | 2025-08-28 | Palliare Limited | An insufflator, an insufflating apparatus and a method for insufflating a vessel in the body of a human or animal subject |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018105063A1 (ja) * | 2016-12-07 | 2018-06-14 | オリンパス株式会社 | 画像処理装置 |
| US20180253839A1 (en) * | 2015-09-10 | 2018-09-06 | Magentiq Eye Ltd. | A system and method for detection of suspicious tissue regions in an endoscopic procedure |
| WO2020008651A1 (ja) * | 2018-07-06 | 2020-01-09 | オリンパス株式会社 | 内視鏡用画像処理装置、及び、内視鏡用画像処理方法、並びに、内視鏡用画像処理プログラム |
| US20200279373A1 (en) * | 2019-02-28 | 2020-09-03 | EndoSoft LLC | Ai systems for detecting and sizing lesions |
| WO2021140602A1 (ja) | 2020-01-09 | 2021-07-15 | オリンパス株式会社 | 画像処理システム、学習装置及び学習方法 |
| WO2021157487A1 (ja) * | 2020-02-06 | 2021-08-12 | 富士フイルム株式会社 | 医用画像処理装置、内視鏡システム、医用画像処理方法、及びプログラム |
| WO2021181564A1 (ja) * | 2020-03-11 | 2021-09-16 | オリンパス株式会社 | 処理システム、画像処理方法及び学習方法 |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111383214B (zh) * | 2020-03-10 | 2021-02-19 | 长沙慧维智能医疗科技有限公司 | 实时内窥镜肠镜息肉检测系统 |
-
2021
- 2021-12-27 US US18/722,891 patent/US20250061569A1/en active Pending
- 2021-12-27 EP EP21969883.4A patent/EP4458249A4/en active Pending
- 2021-12-27 WO PCT/JP2021/048499 patent/WO2023126999A1/ja not_active Ceased
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180253839A1 (en) * | 2015-09-10 | 2018-09-06 | Magentiq Eye Ltd. | A system and method for detection of suspicious tissue regions in an endoscopic procedure |
| WO2018105063A1 (ja) * | 2016-12-07 | 2018-06-14 | オリンパス株式会社 | 画像処理装置 |
| WO2020008651A1 (ja) * | 2018-07-06 | 2020-01-09 | オリンパス株式会社 | 内視鏡用画像処理装置、及び、内視鏡用画像処理方法、並びに、内視鏡用画像処理プログラム |
| US20200279373A1 (en) * | 2019-02-28 | 2020-09-03 | EndoSoft LLC | Ai systems for detecting and sizing lesions |
| WO2021140602A1 (ja) | 2020-01-09 | 2021-07-15 | オリンパス株式会社 | 画像処理システム、学習装置及び学習方法 |
| WO2021157487A1 (ja) * | 2020-02-06 | 2021-08-12 | 富士フイルム株式会社 | 医用画像処理装置、内視鏡システム、医用画像処理方法、及びプログラム |
| WO2021181564A1 (ja) * | 2020-03-11 | 2021-09-16 | オリンパス株式会社 | 処理システム、画像処理方法及び学習方法 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP4458249A4 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025150034A3 (en) * | 2024-01-09 | 2025-08-28 | Palliare Limited | An insufflator, an insufflating apparatus and a method for insufflating a vessel in the body of a human or animal subject |
| CN117796745A (zh) * | 2024-02-29 | 2024-04-02 | 四川大学 | 一种估计消化内镜镜头进退距离的方法 |
| CN117796745B (zh) * | 2024-02-29 | 2024-05-03 | 四川大学 | 一种估计消化内镜镜头进退距离的方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2023126999A1 (ja) | 2023-07-06 |
| US20250061569A1 (en) | 2025-02-20 |
| EP4458249A4 (en) | 2025-02-26 |
| EP4458249A1 (en) | 2024-11-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2023126999A1 (ja) | 画像処理装置、画像処理方法、及び、記憶媒体 | |
| US20250000329A1 (en) | Information processing device, information processing method, and recording medium | |
| EP4302681A1 (en) | Medical image processing device, medical image processing method, and program | |
| US20250281022A1 (en) | Endoscopy support device, endoscopy support method, and recording medium | |
| JP7750418B2 (ja) | 内視鏡検査支援装置、内視鏡検査支援方法、及び、プログラム | |
| JP7647873B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
| US20240180395A1 (en) | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium | |
| US20250185884A1 (en) | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium | |
| EP4434434A1 (en) | Information processing device, information processing method, and recording medium | |
| JP7448923B2 (ja) | 情報処理装置、情報処理装置の作動方法、及びプログラム | |
| EP4470448A1 (en) | Image-determining device, image-determining method, and recording medium | |
| US20250182882A1 (en) | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium | |
| JP7264407B2 (ja) | 訓練用の大腸内視鏡観察支援装置、作動方法、及びプログラム | |
| US20250241514A1 (en) | Image display device, image display method, and recording medium | |
| US20250166297A1 (en) | Image processing apparatus, image processing method, and storage medium | |
| US20240335093A1 (en) | Medical support device, endoscope system, medical support method, and program | |
| US20250134348A1 (en) | Image processing device, image processing method, and storage medium | |
| US20250378556A1 (en) | Endoscopic examination assistance device, endoscopic examination system, processing method, and storage medium | |
| WO2025104800A1 (ja) | 内視鏡検査支援装置、内視鏡検査支援方法、及び、記録媒体 | |
| WO2024185357A1 (ja) | 医療支援装置、内視鏡システム、医療支援方法、及びプログラム | |
| WO2024185468A1 (ja) | 医療支援装置、内視鏡システム、医療支援方法、及びプログラム | |
| WO2023089716A1 (ja) | 情報表示装置、情報表示方法、及び、記録媒体 | |
| WO2024190272A1 (ja) | 医療支援装置、内視鏡システム、医療支援方法、及びプログラム | |
| WO2024166731A1 (ja) | 画像処理装置、内視鏡、画像処理方法、及びプログラム | |
| WO2023187886A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21969883 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2023570496 Country of ref document: JP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 18722891 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2021969883 Country of ref document: EP Effective date: 20240729 |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 2021969883 Country of ref document: EP |