WO2019087790A1 - Inspection support device, endoscope device, inspection support method, and inspection support program - Google Patents
Inspection support device, endoscope device, inspection support method, and inspection support program
- Publication number
- WO2019087790A1 (PCT/JP2018/038754)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- captured image
- processing
- area
- sight
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00039—Operational features of endoscopes provided with input arrangements for the user
- A61B1/00042—Operational features of endoscopes provided with input arrangements for the user for mechanical operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
Definitions
- the present invention relates to an examination support apparatus, an endoscope apparatus, an examination support method, and an examination support program.
- In recent years, apparatuses such as CT (computed tomography), MRI (magnetic resonance imaging), virtual slide scanners that capture pathological specimens, and endoscopic devices have made it possible to acquire large numbers of digitized high-definition medical images.
- Patent Document 1 describes a system in which, when a user designates a region of interest on a medical image, detailed analysis is performed on the image in the region of interest and the analysis result is displayed.
- Patent Document 2 describes a system for extracting a lesion candidate region from a medical image and identifying the lesion with respect to the extracted lesion candidate region using a recognition model generated by machine learning.
- Patent Document 3 describes a system that analyzes a captured image captured by an endoscope, detects a lesion site, and superimposes and displays the detected lesion site on the captured image. In this system, it is possible to perform the above analysis limited to only the area on the captured image that the observer is focusing on.
- In an examination using an endoscope apparatus, the operator must judge the presence or absence of a lesion site, and excise any lesion site, while checking the captured image displayed on the display device with the insertion portion of the endoscope inserted into the body of the subject. For this reason, in an examination using the endoscope apparatus, computer-aided detection or identification of lesions must be performed in real time during the examination.
- Although Patent Documents 1 and 2 describe technologies for performing analysis limited to a specific area of a medical image, they assume post-examination analysis of image data saved during an examination or the like; they do not assume real-time detection and identification of lesions.
- Patent Document 3 describes limiting the analysis to the area being focused on in the captured image during an examination using an endoscope. However, how this area should be selected is not considered.
- The endoscope must be operated with both hands, so during an examination both of the operator's hands are occupied. For this reason, how to determine the area of the captured image displayed in real time in which computer-based detection or identification of a lesion should be performed is important for ensuring both the accuracy and the efficiency of the examination.
- The present invention has been made in view of the above circumstances, and aims to provide an examination support apparatus, an endoscope apparatus, an examination support method, and an examination support program that can achieve both accuracy and efficiency in examinations using an endoscope.
- The examination support apparatus of the present invention comprises: a captured image data acquisition unit that acquires captured image data obtained by imaging the inside of a subject with an endoscope; a line-of-sight detection unit that detects the line of sight directed at a display device that displays a captured image based on the captured image data; a processing unit that performs, on the captured image data, processing comprising at least the detection among detection of a lesion site from the captured image data and identification of the detected lesion site; and a display control unit that causes the display device to display the result of the processing by the processing unit, wherein the processing unit controls the content of the processing on the captured image data based on the line of sight detected by the line-of-sight detection unit.
- An endoscope apparatus includes the above-described examination support apparatus and the above-described endoscope.
- The examination support method of the present invention comprises: a captured image data acquisition step of acquiring captured image data obtained by imaging the inside of a subject with an endoscope; a line-of-sight detection step of detecting the line of sight directed at a display device that displays a captured image based on the captured image data; a processing step of performing, on the captured image data, processing comprising at least the detection among detection of a lesion site from the captured image data and identification of the detected lesion site; and a display control step of causing the display device to display the result of the processing, wherein the processing step controls the content of the processing on the captured image data based on the line of sight detected in the line-of-sight detection step.
- The examination support program of the present invention causes a computer to execute: a captured image data acquisition step of acquiring captured image data obtained by imaging the inside of a subject with an endoscope; a line-of-sight detection step of detecting the line of sight directed at a display device that displays a captured image based on the captured image data; a processing step of performing, on the captured image data, processing comprising at least the detection among detection of a lesion site from the captured image data and identification of the detected lesion site; and a display control step of causing the display device to display the result of the processing, wherein the processing step controls the content of the processing on the captured image data based on the line of sight detected in the line-of-sight detection step.
- According to the present invention, it is possible to provide an examination support apparatus, an endoscope apparatus, an examination support method, and an examination support program that can achieve both the accuracy and the efficiency of an examination using an endoscope.
- FIG. 1 is a diagram showing the schematic configuration of an endoscope apparatus 100 according to one embodiment of the present invention. FIG. 2 is a schematic diagram showing the internal configuration of the endoscope apparatus 100 shown in FIG. 1. FIG. 3 is a diagram showing the functional blocks of the system control unit 44 of the control device 4 shown in FIG. 2. FIG. 4 is a diagram for explaining the operation of the processing unit 44C in the system control unit 44, and FIG. 5 shows an example of the resulting display. FIG. 6 is a flowchart for explaining the operation of the system control unit 44; FIGS. 7, 9, 11, 12, and 13 are flowcharts for explaining the first to fifth modifications of that operation, and FIGS. 8 and 10 show examples of the images displayed in those modifications.
- FIG. 1 is a view showing a schematic configuration of an endoscope apparatus 100 according to an embodiment of the present invention.
- The endoscope apparatus 100 includes an endoscope 1 and a main body unit 2 comprising a control device 4, to which the endoscope 1 is connected, and a light source device 5.
- A display device 7 for displaying captured images and the like obtained by imaging the inside of the subject with the endoscope 1, and an imaging device 8 installed in the vicinity of the display device 7, are connected to the control device 4.
- the control device 4 controls the endoscope 1, the light source device 5, the display device 7, and the imaging device 8.
- The display device 7 has a display surface in which display pixels are two-dimensionally arranged; an image based on image data is displayed by drawing the pixel data constituting the image data onto the corresponding display pixels of the display surface.
- the imaging device 8 is provided to detect the line of sight of the observer who observes the image displayed on the display surface of the display device 7.
- the imaging device 8 is arranged to be capable of imaging in front of the display surface.
- the gaze detection image data obtained by imaging the subject by the imaging device 8 is transmitted to the control device 4.
- The endoscope 1 comprises: an insertion unit 10, a tubular member extending in one direction, which is inserted into the subject; an operation unit 11 provided at the proximal end of the insertion unit 10 and equipped with operation members for observation mode switching, image capture and recording, forceps operation, air/water supply, suction, and the like; an angle knob 12 provided adjacent to the operation unit 11; and a universal cord 13 including connector portions 13A and 13B that detachably connect the endoscope 1 to the light source device 5 and the control device 4, respectively.
- Although not shown, the operation unit 11 and the insertion unit 10 are provided with various channels, such as a forceps hole for inserting forceps for collecting living tissue such as cells or polyps, an air/water supply channel, and a suction channel.
- The insertion unit 10 is composed of a flexible portion 10A having flexibility, a bending portion 10B provided at the distal end of the flexible portion 10A, and a distal end portion 10C provided at the tip of the bending portion 10B.
- The bending portion 10B is configured to be bendable by turning the angle knob 12.
- The bending portion 10B can be bent in any direction and at any angle according to the region of the subject in which the endoscope 1 is used, so that the distal end portion 10C can be oriented in a desired direction.
- FIG. 2 is a schematic view showing an internal configuration of the endoscope apparatus 100 shown in FIG.
- the light source device 5 includes a light source control unit 51 and a light source unit 52.
- the light source unit 52 generates illumination light for irradiating the subject.
- The illumination light emitted from the light source unit 52 enters the light guide 53 contained in the universal cord 13 and is irradiated onto the subject through the illumination lens 50 provided at the distal end portion 10C of the insertion unit 10.
- As the light source unit 52, a white light source that emits white light, or a plurality of light sources including a white light source and light sources that emit light of other colors (for example, a blue light source that emits blue light), is used.
- a plurality of illumination lenses 50 may be provided on the front end surface of the front end portion 10C in accordance with the type of light emitted from the light source unit 52.
- the light source control unit 51 is connected to the system control unit 44 of the control device 4.
- the light source control unit 51 controls the light source unit 52 based on an instruction from the system control unit 44.
- The distal end portion 10C of the endoscope 1 is provided with: an imaging optical system including an objective lens 21 and a lens group 22; an imaging element 23 for imaging a subject through the imaging optical system; an analog-to-digital converter (ADC) 24; a memory 25 such as a RAM (Random Access Memory); a communication interface (I/F) 26; an imaging control unit 27; and a light guide 53 for guiding the illumination light emitted from the light source unit 52 to the illumination lens 50.
- The light guide 53 extends from the distal end portion 10C to the connector portion 13A of the universal cord 13. With the connector portion 13A connected to the light source device 5, the illumination light emitted from the light source unit 52 of the light source device 5 can enter the light guide 53.
- As the imaging element 23, a CMOS (complementary metal oxide semiconductor) image sensor, for example, is used.
- The imaging element 23 has a light receiving surface in which a plurality of pixels are two-dimensionally arranged, converts the optical image formed on the light receiving surface by the above imaging optical system into an electrical signal (imaging signal) at each pixel, and outputs it to the ADC 24.
- As the imaging element 23, one equipped with color filters, for example primary-color or complementary-color filters, is used.
- a set of imaging signals output from each pixel of the light receiving surface of the imaging device 23 is referred to as a captured image signal.
- When the light source unit 52 generates illumination light by splitting white light emitted from a white light source in time division through color filters of a plurality of colors, an imaging element 23 without color filters may be used.
- The imaging element 23 may be disposed at the distal end portion 10C with its light receiving surface perpendicular to the optical axis Ax of the objective lens 21, or with its light receiving surface parallel to the optical axis Ax.
- The imaging optical system provided in the endoscope 1 includes the objective lens 21 and optical members such as lenses and prisms (including the above-described lens group 22) located on the optical path of the light from the subject, between the objective lens 21 and the imaging element 23.
- the imaging optical system may be configured of only the objective lens 21.
- the ADC 24 converts the imaging signal output from the imaging device 23 into a digital signal of a predetermined number of bits.
- the memory 25 temporarily records an imaging signal that has been digitally converted by the ADC 24.
- the communication I / F 26 is connected to the communication interface (I / F) 41 of the control device 4.
- The communication I/F 26 transmits the imaging signal recorded in the memory 25 to the control device 4 through a signal line in the universal cord 13.
- the imaging control unit 27 is connected to the system control unit 44 of the control device 4 via the communication I / F 26.
- the imaging control unit 27 controls the imaging device 23, the ADC 24, and the memory 25 based on the command from the system control unit 44 received by the communication I / F 26.
- The control device 4 includes a communication I/F 41 connected by the universal cord 13 to the communication I/F 26 of the endoscope 1, a signal processing unit 42, a display controller 43, a system control unit 44, and a recording medium 45.
- the communication I / F 41 receives an imaging signal transmitted from the communication I / F 26 of the endoscope 1 and transmits the imaging signal to the signal processing unit 42.
- The signal processing unit 42 incorporates a memory for temporarily recording the imaging signals received from the communication I/F 41, and performs image processing (demosaicing, gamma correction, and the like) on the captured image signal, which is the set of imaging signals recorded in this memory, to generate captured image data in a format on which the recognition processing described later can be performed.
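- As a rough illustration of this kind of image processing, the following sketch develops a raw Bayer-mosaic frame into gamma-corrected RGB with OpenCV; it is a minimal example rather than the patent's implementation, and the Bayer layout and gamma value are assumptions.

```python
import cv2
import numpy as np

def develop_frame(raw_bayer: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Turn a single-channel Bayer mosaic into gamma-corrected RGB.

    raw_bayer: HxW uint8 array as it might come from the sensor ADC.
    """
    # Demosaicing: interpolate the two missing color samples at each pixel.
    rgb = cv2.cvtColor(raw_bayer, cv2.COLOR_BayerBG2RGB)
    # Gamma correction via a 256-entry lookup table.
    lut = np.array([(i / 255.0) ** (1.0 / gamma) * 255 for i in range(256)],
                   dtype=np.uint8)
    return cv2.LUT(rgb, lut)

# Develop a synthetic 480x640 mosaic.
frame = develop_frame(np.random.randint(0, 256, (480, 640), dtype=np.uint8))
print(frame.shape)  # (480, 640, 3)
```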
- the captured image data generated by the signal processing unit 42 is recorded on a recording medium 45 such as a hard disk or a flash memory.
- the display controller 43 causes the display device 7 to display a captured image based on the captured image data generated by the signal processing unit 42.
- The coordinates of each piece of pixel data making up the captured image data generated by the signal processing unit 42 are managed in association with the coordinates of the corresponding display pixel on the display surface of the display device 7.
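- This association is what lets gaze coordinates measured on the display surface be translated into captured-image coordinates. A minimal sketch of such a mapping, assuming the captured image is simply stretched over the whole display surface (the patent does not specify the mapping):

```python
def display_to_image(dx: int, dy: int,
                     display_size=(1920, 1080),
                     image_size=(1280, 1024)):
    """Map a display-surface pixel (dx, dy) to captured-image coordinates."""
    ix = dx * image_size[0] // display_size[0]
    iy = dy * image_size[1] // display_size[1]
    # Clamp so a gaze sample on the display edge stays inside the image.
    return min(ix, image_size[0] - 1), min(iy, image_size[1] - 1)

print(display_to_image(960, 540))  # display center -> (640, 512), image center
```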
- The system control unit 44 controls each part of the control device 4 and sends commands to the imaging control unit 27 of the endoscope 1 and the light source control unit 51 of the light source device 5, thereby controlling the entire endoscope apparatus 100 in an integrated manner.
- the system control unit 44 controls the imaging element 23 via the imaging control unit 27. In addition, the system control unit 44 controls the light source unit 52 via the light source control unit 51.
- The system control unit 44 includes various processors that execute programs to perform processing, a RAM (Random Access Memory), and a ROM (Read Only Memory).
- The various processors include: a CPU (central processing unit), which is a general-purpose processor that executes programs to perform various kinds of processing; a programmable logic device (PLD), which is a processor whose circuit configuration can be changed after manufacture, such as an FPGA (field programmable gate array); and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing specific processing, such as an ASIC (Application Specific Integrated Circuit).
- More specifically, the structure of each of these various processors is an electric circuit combining circuit elements such as semiconductor elements.
- The system control unit 44 may be configured by one of these various processors, or by a combination of two or more processors of the same or different types (for example, a combination of a plurality of FPGAs, or a combination of a CPU and an FPGA).
- FIG. 3 is a diagram showing functional blocks of the system control unit 44 of the control device 4 shown in FIG.
- The processor of the system control unit 44 executes the examination support program stored in the ROM built into the system control unit 44, thereby functioning as an examination support apparatus comprising a captured image data acquisition unit 44A, a line-of-sight detection unit 44B, a processing unit 44C, and a display control unit 44D.
- The captured image data acquisition unit 44A sequentially acquires the captured image data that the signal processing unit 42 generates by processing the imaging signals obtained by the imaging element 23 imaging the inside of the subject.
- The line-of-sight detection unit 44B acquires the gaze detection image data transmitted from the imaging device 8 and, based on the image of the human eye included in that data, detects the line of sight of the observer (the operator of the endoscope 1) directed at the display device 7.
- The line-of-sight detection unit 44B outputs, as the line-of-sight detection result, information on the coordinates at which the line of sight intersects the display surface of the display device 7.
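- The patent does not commit to a particular gaze-estimation algorithm. As one hedged illustration of how such intersection coordinates can be produced, the sketch below fits an affine map from pupil-center positions (assumed to be extracted upstream from the imaging device 8's images) to display-surface coordinates using a short calibration; all names and numbers are illustrative.

```python
import numpy as np

def fit_gaze_map(pupil_xy: np.ndarray, screen_xy: np.ndarray) -> np.ndarray:
    """Fit an affine map pupil (x, y) -> screen (x, y) by least squares.

    pupil_xy:  Nx2 pupil-center positions observed while the user fixates
    screen_xy: Nx2 known on-screen calibration targets
    """
    ones = np.ones((pupil_xy.shape[0], 1))
    design = np.hstack([pupil_xy, ones])                         # N x 3
    coeffs, *_ = np.linalg.lstsq(design, screen_xy, rcond=None)  # 3 x 2
    return coeffs

def gaze_point(pupil: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    """Apply the fitted map to a new pupil observation."""
    return np.append(pupil, 1.0) @ coeffs

# Four-point calibration grid (toy numbers).
pupil = np.array([[10, 8], [30, 8], [10, 24], [30, 24]], dtype=float)
screen = np.array([[0, 0], [1920, 0], [0, 1080], [1920, 1080]], dtype=float)
coeffs = fit_gaze_map(pupil, screen)
print(gaze_point(np.array([20.0, 16.0]), coeffs))  # ~[960. 540.], screen center
```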
- The processing unit 44C performs, on the captured image data acquired by the captured image data acquisition unit 44A, processing for detecting a lesion site from the captured image data and for identifying the detected lesion site.
- Hereinafter, the processing for detecting a lesion site is called detection processing, and the processing for identifying a lesion site is called identification processing.
- The detection of a lesion site refers to finding, in the captured image data, a site suspected of being a lesion such as a malignant or benign tumor (a lesion candidate region).
- The identification of a lesion site means discriminating the type or nature of the detected lesion site: whether it is malignant or benign, what kind of disease it is if malignant, how far the disease has progressed, and so on.
- The detection processing and the identification processing are each performed by an image recognition model (for example, a neural network or a support vector machine) whose hierarchical structure and feature-extraction parameters are determined by machine learning such as deep learning.
- the processing unit 44C controls the content of the above-described recognition processing performed on the captured image data based on the line of sight detected by the line of sight detection unit 44B.
- The display control unit 44D instructs the display controller 43 to cause the display device 7 to display, as a live view image, the captured image based on the captured image data recorded in the recording medium 45, and performs control for causing the display device 7 to display the result of the above recognition processing by the processing unit 44C.
- FIG. 4 is a diagram for explaining the operation of the processing unit 44C in the system control unit 44 shown in FIG.
- FIG. 4 shows captured image data IM acquired by the captured image data acquisition unit 44A.
- The captured image data IM shown in FIG. 4 includes lesion sites P1 to P3.
- From the coordinate information output by the line-of-sight detection unit 44B on the display pixel where the operator's line of sight intersects the display surface of the display device 7, the processing unit 44C determines which region of the captured image data IM acquired by the captured image data acquisition unit 44A the operator is focusing on (hereinafter referred to as the attention area).
- the processing unit 44C divides the captured image data IM into a total of four areas: a divided area AR1, a divided area AR2, a divided area AR3, and a divided area AR4.
- The processing unit 44C repeatedly performs, over a fixed period, a process of identifying the pixel data of the captured image data IM that corresponds to the coordinates of the display pixel output from the line-of-sight detection unit 44B. The processing unit 44C then determines, among the divided areas AR1 to AR4, the divided area containing the largest number of pixel data identified during this fixed period to be the attention area.
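- A minimal sketch of this majority-vote rule, assuming the frame is split into four quadrants AR1 to AR4 and that gaze samples have already been converted to captured-image coordinates; the quadrant layout and window length are illustrative assumptions:

```python
from collections import Counter

def quadrant(x: int, y: int, width: int, height: int) -> str:
    """Return the divided area a captured-image pixel falls in, assuming
    AR1 top-left, AR2 top-right, AR3 bottom-left, AR4 bottom-right."""
    left, top = x < width // 2, y < height // 2
    return {(True, True): "AR1", (False, True): "AR2",
            (True, False): "AR3", (False, False): "AR4"}[(left, top)]

def attention_area(gaze_samples, width: int, height: int) -> str:
    """Pick the divided area containing the most gaze samples in the window."""
    votes = Counter(quadrant(x, y, width, height) for x, y in gaze_samples)
    return votes.most_common(1)[0][0]

# Gaze samples collected over a fixed period (e.g. one second at 30 Hz).
samples = [(900, 700), (950, 720), (940, 710), (100, 80)]
print(attention_area(samples, 1280, 1024))  # -> "AR4"
```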
- The processing unit 44C executes the recognition processing using the above-described image recognition model only on the attention area of the captured image data IM determined in this way, and does not execute the recognition processing on the area of the captured image data IM excluding the attention area (hereinafter referred to as the non-attention area).
- Here, the case where the divided area AR4 is determined as the attention area is taken as an example. In this case, the pixel data of the divided area AR4 is input to the image recognition model to execute the recognition processing, and no recognition processing is performed on the pixel data of the divided areas AR1 to AR3. By this recognition processing, the lesion site P1 included in the divided area AR4 is detected, and identification of the lesion site P1 is further performed.
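- The gaze-limited recognition step can be sketched as cropping the attention area and running detection and identification on that crop only. Here detect_lesions and classify_lesion are hypothetical stand-ins for the trained image recognition models, and the area bounds assume the quadrant layout above:

```python
import numpy as np

AREA_BOUNDS = {  # (x0, y0, x1, y1) of each divided area in a 1280x1024 frame
    "AR1": (0, 0, 640, 512),    "AR2": (640, 0, 1280, 512),
    "AR3": (0, 512, 640, 1024), "AR4": (640, 512, 1280, 1024),
}

def recognize_attention_area(frame: np.ndarray, area: str,
                             detect_lesions, classify_lesion):
    """Run detection, then identification, on the attention area only."""
    x0, y0, x1, y1 = AREA_BOUNDS[area]
    crop = frame[y0:y1, x0:x1]
    results = []
    for box in detect_lesions(crop):        # boxes in crop coordinates
        label = classify_lesion(crop, box)  # e.g. "cancer, stage 2"
        bx0, by0, bx1, by1 = box
        # Shift the box back to full-frame coordinates for the display overlay.
        results.append(((bx0 + x0, by0 + y0, bx1 + x0, by1 + y0), label))
    return results
```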
- As shown in FIG. 5, the display control unit 44D highlights the lesion site P1 in the captured image im based on the captured image data IM, and displays the identification result (the characters "cancer, stage 2" in the example of FIG. 5) together with the captured image im. On the other hand, since no recognition processing is performed on the divided areas AR1 to AR3, the lesion sites P2 and P3 are neither highlighted nor given identification results, as shown in FIG. 5.
- In this way, the processing unit 44C determines, based on the line of sight detected by the line-of-sight detection unit 44B, which region of the captured image data the recognition processing should be performed on.
- FIG. 6 is a flowchart for explaining the operation of the system control unit 44 shown in FIG.
- When an examination starts, captured image signals are output from the imaging element 23, and the signal processing unit 42 processes them to sequentially generate captured image data, each constituting one frame of a moving image, which is recorded on the recording medium 45.
- captured images based on the generated captured image data are sequentially displayed on the display device 7 as live view images.
- the gaze detection unit 44B starts the process of detecting the gaze of the operator.
- the captured image data acquisition unit 44A acquires captured image data generated by the signal processing unit 42 (step S1).
- The processing unit 44C determines the attention area in the captured image data acquired in step S1, based on the line of sight of the operator detected by the line-of-sight detection unit 44B (step S2).
- the processing unit 44C performs the above-described recognition processing (detection processing and identification processing) only on the determined attention area (step S3).
- After the recognition processing of step S3 is completed, under the control of the display control unit 44D, the detection result of the detection processing and the identification result of the identification processing are displayed on the display device 7 together with the captured image, as illustrated in FIG. 5 (step S4).
- After step S4, when an instruction to end imaging by the imaging element 23 is issued and the examination ends (step S5: YES), the system control unit 44 ends the process. If the examination continues (step S5: NO), the process returns to step S1 and the subsequent steps are repeated.
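- Putting steps S1 to S5 together, the examination loop has roughly the following shape; each function is a placeholder for the corresponding unit described above, and the synchronous single-threaded loop is a simplification:

```python
def examination_loop(acquire_frame, detect_attention_area, recognize,
                     display, examination_ended):
    """One pass per frame, following the flowchart in FIG. 6."""
    while True:
        frame = acquire_frame()            # S1: acquire captured image data
        area = detect_attention_area()     # S2: attention area from the gaze
        results = recognize(frame, area)   # S3: detection + identification
        display(frame, results)            # S4: overlay results on live view
        if examination_ended():            # S5: imaging end instruction?
            break
```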
- As described above, in the endoscope apparatus 100, the recognition processing is performed only on the attention area where the line of sight of the operator operating the endoscope 1 is concentrated. Even in a situation where both hands are occupied, the operator can obtain detection results and identification results for the region of interest simply by changing where on the captured image the gaze is directed, so the examination can be performed efficiently and accurately.
- In addition, since the range over which the recognition processing is performed can be narrowed, the processing load on the system control unit 44 can be reduced when detection and identification of lesion sites are performed in real time during endoscopy. And because the processing load is reduced, an image recognition model with higher detection or identification accuracy can be adopted for the recognition processing, which improves the accuracy of the endoscopic examination.
- FIG. 7 is a flowchart for explaining a first modification of the operation of the system control unit 44 shown in FIG.
- The flowchart shown in FIG. 7 is the same as that of FIG. 6 except that step S3 is changed to steps S3a and S3b, and step S4 is changed to step S4a.
- After step S2, the processing unit 44C performs detection processing using the same image recognition model on both the attention area of the captured image data and the non-attention area excluding it; in other words, detection of lesion sites is performed on the entire captured image data (step S3a).
- After a lesion site is detected by the processing of step S3a, the processing unit 44C performs the identification processing only on the attention area determined in step S2 (step S3b).
- The display control unit 44D then causes the display device 7 to display the result of the detection processing of step S3a and the result of the identification processing of step S3b together with the captured image (step S4a). After step S4a, the process proceeds to step S5.
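- Relative to the loop sketch above, the first modification changes only the recognition step: detection runs on the whole frame (S3a) while identification is gated by gaze (S3b). A hedged rendition, reusing the hypothetical detect_lesions and classify_lesion models:

```python
def recognize_modification1(frame, area, detect_lesions, classify_lesion,
                            area_bounds):
    """S3a: detect everywhere; S3b: identify only inside the attention area."""
    x0, y0, x1, y1 = area_bounds[area]
    results = []
    for box in detect_lesions(frame):      # full-frame detection (S3a)
        bx0, by0, bx1, by1 = box
        cx, cy = (bx0 + bx1) // 2, (by0 + by1) // 2
        inside = x0 <= cx < x1 and y0 <= cy < y1
        # Identification (S3b) only where the operator is looking;
        # a None label means "highlight the site without an identification".
        label = classify_lesion(frame, box) if inside else None
        results.append((box, label))
    return results
```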
- FIG. 8 is a view showing an example of the image displayed on the display device 7 in step S4a of FIG. 7. As shown in FIG. 8, the lesion sites P1 to P3 on the captured image im are highlighted as the detection results. The identification result is displayed only for the lesion site P1, to which the operator's line of sight is directed.
- As described above, in the first modification, the processing unit 44C of the system control unit 44 performs the detection processing on the entire captured image data and the identification processing only on the attention area.
- Since the identification processing is performed only on the attention area, the processing load on the system control unit 44 can be reduced, or a highly accurate identification process can be adopted to improve examination accuracy.
- In addition, in the state shown in FIG. 8, if the operator wants to know the identification result of the lesion site P2, the identification result can be displayed simply by moving the line of sight from the lesion site P1 to the lesion site P2. By displaying identification results only when needed in this way, the examination can proceed efficiently.
- FIG. 9 is a flowchart for explaining a second modification of the operation of the system control unit 44 shown in FIG.
- The flowchart shown in FIG. 9 is the same as that of FIG. 6 except that step S3 is changed to step S3c and step S4 to step S4b.
- After determining the attention area in step S2, the processing unit 44C performs recognition processing of different performance on the attention area of the captured image data and on the non-attention area excluding it (step S3c).
- Specifically, in step S3c, the processing unit 44C makes the content of the recognition processing (at least one of the configuration of the image recognition model and its parameters) differ between the attention area and the non-attention area of the captured image data.
- In the second modification, the processing unit 44C controls the content of the recognition processing for the attention area so that its performance is higher than the performance of the recognition processing for the non-attention area (one way to realize this is sketched below). Here, for example, the higher the detection accuracy for a lesion site, the higher the performance of the detection processing. Likewise, the more types of information the image recognition model can output as identification results, the larger the number of layers of the image recognition model, or the higher the resolution of the image data the model can analyze, the higher the performance of the identification processing.
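- One way to realize such a performance split is simply to keep two model variants and route each region to one of them; the sketch below assumes two pre-built models whose accuracy/latency trade-offs embody the "performance" difference.

```python
def recognize_tiered(frame, area, heavy_model, light_model, area_bounds):
    """Second modification: high-performance recognition in the attention
    area, lower-performance recognition everywhere else."""
    results = []
    for name, (x0, y0, x1, y1) in area_bounds.items():
        crop = frame[y0:y1, x0:x1]
        # The heavier model (e.g. deeper, higher input resolution) for the
        # gazed region, the lighter one for the rest of the frame.
        model = heavy_model if name == area else light_model
        for box, label in model(crop):
            bx0, by0, bx1, by1 = box
            results.append(((bx0 + x0, by0 + y0, bx1 + x0, by1 + y0), label))
    return results
```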
- The display control unit 44D then causes the display device 7 to display the result of the recognition processing of step S3c together with the captured image (step S4b). After step S4b, the process proceeds to step S5.
- FIG. 10 is a view showing an example of an image displayed on the display device 7 in step S4b shown in FIG.
- In FIG. 10, the lesion sites P1 to P3 on the captured image im are highlighted as the detection results. For the lesion site P1, to which the operator's line of sight is directed, information on the cancer stage is displayed as the identification result in addition to the type of tumor (cancer). On the other hand, only the type of tumor is displayed for the lesion sites P2 and P3, to which the operator's line of sight is not directed.
- According to the second modification, recognition processing of relatively high performance is performed on the area of the captured image to which the operator's line of sight is directed, so the examination can be advanced efficiently.
- In addition, recognition processing of relatively low performance is still performed on the areas of the captured image to which the operator's line of sight is not directed, so missed lesion sites can be prevented and examination efficiency improved.
- FIG. 11 is a flowchart for explaining a third modification of the operation of the system control unit 44 shown in FIG.
- The flowchart shown in FIG. 11 is the same as that of FIG. 6 except that step S3 is changed to step S31 and step S4 to step S41.
- After determining the attention area in step S2, the processing unit 44C performs the recognition processing only on the non-attention area, that is, the area of the captured image data excluding the attention area (step S31).
- The display control unit 44D then causes the display device 7 to display the result of the recognition processing of step S31 together with the captured image (step S41). After step S41, the process proceeds to step S5.
- In the third modification, the recognition processing is performed only on the areas of the captured image that the operator is not paying attention to, and the result is displayed. Operators are often experienced doctors, so for the region of the captured image the operator is actually viewing, it can be expected that lesions are detected and identified with high accuracy through the operator's own experience.
- According to the third modification, performing the recognition processing only on the areas where the line of sight is not directed can prevent missing lesion sites that would otherwise escape the operator's attention, and thus enhances examination accuracy. Furthermore, since the range over which the processing is performed is narrowed, the accuracy of the processing can be increased and the processing load on the system control unit 44 reduced.
- FIG. 12 is a flowchart for explaining a fourth modification of the operation of the system control unit 44 shown in FIG.
- The flowchart shown in FIG. 12 is the same as that of FIG. 7 except that step S3b is changed to step S32 and step S4a to step S42.
- After a lesion site is detected by the processing of step S3a, the processing unit 44C performs the identification processing only on the non-attention area, that is, the area excluding the attention area determined in step S2 (step S32).
- The display control unit 44D then causes the display device 7 to display the result of the detection processing of step S3a and the result of the identification processing of step S32 together with the captured image (step S42).
- According to the fourth modification, in the regions the operator is not looking at, the positions of lesion sites and their identification results are displayed, which can help determine the subsequent treatment.
- In addition, performing the identification processing only on the non-attention area reduces the processing load on the system control unit 44, or allows a highly accurate identification process to be adopted to improve examination accuracy.
- FIG. 13 is a flowchart for explaining the fifth modification of the operation of the system control unit 44 shown in FIG.
- The flowchart shown in FIG. 13 is the same as that of FIG. 9 except that step S3c is changed to step S33 and step S4b to step S43.
- In FIG. 13, the same processes as in FIG. 9 are assigned the same reference numerals and their descriptions are omitted.
- After determining the attention area in step S2, the processing unit 44C performs recognition processing of different performance on the attention area of the captured image data and on the non-attention area excluding it (step S33).
- Specifically, in step S33, the processing unit 44C makes the content of the recognition processing (at least one of the configuration of the image recognition model and its parameters) differ between the attention area and the non-attention area of the captured image data.
- In the fifth modification, the processing unit 44C controls the content of the recognition processing for the non-attention area so that its performance is higher than the performance of the recognition processing for the attention area.
- The display control unit 44D then causes the display device 7 to display the result of the recognition processing of step S33 together with the captured image (step S43). After step S43, the process proceeds to step S5.
- According to the fifth modification, recognition processing of relatively high performance is performed on the areas of the captured image to which the operator's line of sight is not directed, so missed lesion sites can be prevented effectively and the examination can be advanced efficiently.
- In addition, recognition processing of relatively low performance is still performed on the area of the captured image to which the operator's gaze is directed, which also helps prevent missed lesion sites and improves examination efficiency.
- In each of the operations described above, the identification processing is not essential to the recognition processing performed by the processing unit 44C. Even when the processing unit 44C performs only the detection processing, the processing load on the system control unit 44 can be reduced, the accuracy of the detection processing improved, and the examination made more efficient.
- In the above description, the line of sight of the operator is detected based on the image data obtained by the imaging device 8; however, the method of detecting the line of sight is not limited to this, and various known methods can be adopted.
- the line of sight can also be detected based on detection information of an acceleration sensor mounted on a glasses-type wearable terminal worn by the operator.
- The disclosed examination support apparatus comprises: a captured image data acquisition unit that acquires captured image data obtained by imaging the inside of a subject with an endoscope; a line-of-sight detection unit that detects the line of sight directed at a display device displaying a captured image based on the captured image data; a processing unit that performs, on the captured image data, processing comprising at least the detection among detection of a lesion site from the captured image data and identification of the detected lesion site; and a display control unit that causes the display device to display the result of the processing by the processing unit, wherein the processing unit controls the content of the processing on the captured image data based on the line of sight detected by the line-of-sight detection unit.
- In one form of the examination support apparatus, the processing unit determines, based on the line of sight detected by the line-of-sight detection unit, the attention area being focused on in the captured image data, executes the processing only on the attention area, and does not execute the processing on the non-attention area, which is the area of the captured image data excluding the attention area.
- In another form, the processing unit performs the processing for detecting the lesion site on the entire captured image data, determines the attention area in the captured image data based on the line of sight detected by the line-of-sight detection unit, and performs the processing for identifying the lesion site only on the attention area.
- In another form, the processing unit determines the attention area in the captured image data based on the line of sight detected by the line-of-sight detection unit, executes the processing on the entire captured image data, and makes the content of the processing for the attention area differ from the content of the processing for the non-attention area, which is the area excluding the attention area.
- In another form, the processing unit controls the content of the processing for the attention area such that the performance of that processing is higher than the performance of the processing for the non-attention area.
- In another form, the processing unit determines the attention area in the captured image data based on the line of sight detected by the line-of-sight detection unit, does not execute the processing on the attention area, and executes the processing only on the non-attention area, which is the area excluding the attention area.
- In another form, the processing unit determines the attention area in the captured image data based on the line of sight detected by the line-of-sight detection unit, performs the processing for detecting the lesion site on the entire captured image data, and performs the processing for identifying the lesion site only on the non-attention area, which is the area excluding the attention area.
- An endoscope apparatus comprising the examination support apparatus according to any one of (1) to (8), and the endoscope.
- The disclosed examination support method comprises: a captured image data acquisition step of acquiring captured image data obtained by imaging the inside of a subject with an endoscope; a line-of-sight detection step of detecting the line of sight directed at a display device displaying a captured image based on the captured image data; a processing step of performing, on the captured image data, processing comprising at least the detection among detection of a lesion site from the captured image data and identification of the detected lesion site; and a display control step of causing the display device to display the result of the processing, wherein the processing step controls the content of the processing on the captured image data based on the line of sight detected in the line-of-sight detection step.
- The disclosed examination support program causes a computer to execute: a captured image data acquisition step of acquiring captured image data obtained by imaging the inside of a subject with an endoscope; a line-of-sight detection step of detecting the line of sight directed at a display device displaying a captured image based on the captured image data; a processing step of performing, on the captured image data, processing comprising at least the detection among detection of a lesion site from the captured image data and identification of the detected lesion site; and a display control step of causing the display device to display the result of the processing, wherein the processing step controls the content of the processing on the captured image data based on the line of sight detected in the line-of-sight detection step.
- According to the present invention, it is possible to provide an examination support apparatus, an endoscope apparatus, an examination support method, and an examination support program that can achieve both the accuracy and the efficiency of an examination using an endoscope.
- Reference signs: 100 endoscope apparatus; 1 endoscope; 2 main body unit; 10 insertion unit; 10A flexible portion; 10B bending portion; 10C distal end portion; 11 operation unit; 12 angle knob; 13 universal cord; 13A, 13B connector portions; 6 input unit; 7 display device; 8 imaging device; 21 objective lens; Ax optical axis; 22 lens group; 23 imaging element; 24 ADC; 25 memory; 26 communication interface; 27 imaging control unit; 4 control device; 41 communication interface; 42 signal processing unit; 43 display controller; 44 system control unit; 44A captured image data acquisition unit; 44B line-of-sight detection unit; 44C processing unit; 44D display control unit; 45 recording medium; 5 light source device; 50 illumination lens; 51 light source control unit; 52 light source unit; 53 light guide; IM captured image data; im captured image; AR1, AR2, AR3, AR4 divided areas; P1, P2, P3 lesion sites
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Radiology & Medical Imaging (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Mechanical Engineering (AREA)
- Astronomy & Astrophysics (AREA)
- Endoscopes (AREA)
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019551062A JP6967602B2 (ja) | 2017-10-31 | 2018-10-18 | 検査支援装置、内視鏡装置、内視鏡装置の作動方法、及び検査支援プログラム |
| EP18873241.6A EP3705024B1 (fr) | 2017-10-31 | 2018-10-18 | Dispositif d'aide à l'inspection, dispositif d'endoscope, procédé d'aide à l'inspection et programme d'aide à l'inspection |
| CN201880070441.5A CN111295127B (zh) | 2017-10-31 | 2018-10-18 | 检查支持装置、内窥镜装置及记录介质 |
| US16/842,761 US11302092B2 (en) | 2017-10-31 | 2020-04-08 | Inspection support device, endoscope device, inspection support method, and inspection support program |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017210379 | 2017-10-31 | ||
| JP2017-210379 | 2017-10-31 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/842,761 Continuation US11302092B2 (en) | 2017-10-31 | 2020-04-08 | Inspection support device, endoscope device, inspection support method, and inspection support program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019087790A1 true WO2019087790A1 (fr) | 2019-05-09 |
Family
ID=66333201
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/038754 Ceased WO2019087790A1 (fr) | 2017-10-31 | 2018-10-18 | Dispositif d'aide à l'inspection, dispositif d'endoscope, procédé d'aide à l'inspection et programme d'aide à l'inspection |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11302092B2 (fr) |
| EP (1) | EP3705024B1 (fr) |
| JP (1) | JP6967602B2 (fr) |
| CN (1) | CN111295127B (fr) |
| WO (1) | WO2019087790A1 (fr) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110132547A (zh) * | 2019-05-14 | 2019-08-16 | 杭州电子科技大学 | 一种内窥镜头光学性能检测装置及检测方法 |
| JP2020089712A (ja) * | 2018-12-04 | 2020-06-11 | Hoya株式会社 | 情報処理装置、内視鏡用プロセッサ、情報処理方法およびプログラム |
| JP2021045337A (ja) * | 2019-09-18 | 2021-03-25 | 富士フイルム株式会社 | 医用画像処理装置、プロセッサ装置、内視鏡システム、医用画像処理方法、及びプログラム |
| WO2021149169A1 (fr) * | 2020-01-21 | 2021-07-29 | 日本電気株式会社 | Dispositif d'aide au fonctionnement, procédé d'aide au fonctionnement et support d'enregistrement lisible par ordinateur |
| CN114025674A (zh) * | 2019-08-09 | 2022-02-08 | 富士胶片株式会社 | 内窥镜装置、控制方法、控制程序及内窥镜系统 |
| JP2022541897A (ja) * | 2019-07-16 | 2022-09-28 | ドックボット, インコーポレイテッド | 機械学習システムのリアルタイム展開 |
| WO2025173164A1 (fr) * | 2024-02-15 | 2025-08-21 | 日本電気株式会社 | Dispositif d'aide à l'inspection endoscopique, procédé d'aide à l'inspection endoscopique, et support d'enregistrement |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109427060A (zh) * | 2018-10-30 | 2019-03-05 | 腾讯科技(深圳)有限公司 | 一种影像识别的方法、装置、终端设备和医疗系统 |
| CN117814732A (zh) * | 2018-12-04 | 2024-04-05 | Hoya株式会社 | 模型生成方法 |
| US11730491B2 (en) | 2020-08-10 | 2023-08-22 | Kunnskap Medical, LLC | Endoscopic image analysis and control component of an endoscopic system |
| JP7476814B2 (ja) * | 2021-01-28 | 2024-05-01 | トヨタ自動車株式会社 | 検査装置 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001258820A (ja) | 2000-01-13 | 2001-09-25 | Fuji Photo Film Co Ltd | 蛍光画像表示方法および装置 |
| JP2005034211A (ja) | 2003-07-16 | 2005-02-10 | Fuji Photo Film Co Ltd | 画像判別装置、方法およびプログラム |
| WO2013187116A1 (fr) * | 2012-06-14 | 2013-12-19 | オリンパス株式会社 | Dispositif de traitement d'image et système d'observation d'image en trois dimensions |
| JP2014094157A (ja) | 2012-11-09 | 2014-05-22 | Toshiba Corp | 医用画像処理装置および医用画像診断装置 |
| WO2016117277A1 (fr) * | 2015-01-21 | 2016-07-28 | Hoya株式会社 | Système d'endoscope |
| WO2017183353A1 (fr) * | 2016-04-19 | 2017-10-26 | オリンパス株式会社 | Système d'endoscope |
Family Cites Families (26)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| IT1194149B (it) * | 1983-03-07 | 1988-09-14 | Michele Testa | Procedimento per la determinazione dell'attivita' ossidativa di un liquido biologico e reagente relativo |
| JP2001258802A (ja) | 2000-03-16 | 2001-09-25 | Matsushita Electric Ind Co Ltd | ガス駆動掃除機 |
| US20090312817A1 (en) * | 2003-11-26 | 2009-12-17 | Wicab, Inc. | Systems and methods for altering brain and body functions and for treating conditions and diseases of the same |
| US20050240882A1 (en) * | 2004-04-21 | 2005-10-27 | Ge Medical Systems Information Technologies, Inc. | Method and system for displaying regions of pathological interest |
| CA2587804A1 (fr) * | 2007-05-08 | 2008-11-08 | William K. Mccroskey | Detecteur et systeme de tomographie modulaire multimodale |
| JP2008301968A (ja) * | 2007-06-06 | 2008-12-18 | Olympus Medical Systems Corp | 内視鏡画像処理装置 |
| CN101185603A (zh) * | 2007-06-22 | 2008-05-28 | 北京眼吧科技有限公司 | 立体视觉训练系统及方法 |
| DE102009010263B4 (de) | 2009-02-24 | 2011-01-20 | Reiner Kunz | Verfahren zur Navigation eines endoskopischen Instruments bei der technischen Endoskopie und zugehörige Vorrichtung |
| JP5529480B2 (ja) * | 2009-09-29 | 2014-06-25 | 富士フイルム株式会社 | 医用画像診断支援装置 |
| JP5535725B2 (ja) * | 2010-03-31 | 2014-07-02 | 富士フイルム株式会社 | 内視鏡観察支援システム、並びに、内視鏡観察支援装置、その作動方法およびプログラム |
| JP2012095274A (ja) * | 2010-10-28 | 2012-05-17 | Fujifilm Corp | 立体視画像表示装置および立体視画像表示方法 |
| JP5663283B2 (ja) | 2010-12-02 | 2015-02-04 | オリンパス株式会社 | 内視鏡画像処理装置及びプログラム |
| US20130023767A1 (en) * | 2011-05-12 | 2013-01-24 | Mammone Richard J | Low-cost, high fidelity ultrasound system |
| JP6112861B2 (ja) * | 2012-12-28 | 2017-04-12 | キヤノン株式会社 | 被検体情報取得装置、信号処理装置および表示装置 |
| JP6389299B2 (ja) * | 2013-09-26 | 2018-09-12 | 富士フイルム株式会社 | 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法 |
| WO2015125508A1 (fr) * | 2014-02-21 | 2015-08-27 | ソニー株式会社 | Visiocasque, dispositif et procédé de commande |
| JP6264087B2 (ja) * | 2014-02-21 | 2018-01-24 | ソニー株式会社 | 表示制御装置、表示装置および表示制御システム |
| CN104055478B (zh) * | 2014-07-08 | 2016-02-03 | 金纯� | 基于视线追踪控制的医用内窥镜操控系统 |
| WO2016057960A1 (fr) * | 2014-10-10 | 2016-04-14 | Radish Medical Solutions, Inc. | Appareil, système et procédé de diagnostic en nuage et d'archivage et d'extraction d'image |
| JP6132984B2 (ja) * | 2014-12-12 | 2017-05-24 | オリンパス株式会社 | カプセル内視鏡システム及びその撮像方法 |
| JPWO2016170604A1 (ja) * | 2015-04-21 | 2018-03-15 | オリンパス株式会社 | 内視鏡装置 |
| US10510144B2 (en) * | 2015-09-10 | 2019-12-17 | Magentiq Eye Ltd. | System and method for detection of suspicious tissue regions in an endoscopic procedure |
| JP6633641B2 (ja) * | 2015-09-29 | 2020-01-22 | 富士フイルム株式会社 | 画像処理装置、内視鏡システム、及び画像処理方法 |
| JP2017070636A (ja) * | 2015-10-09 | 2017-04-13 | ソニー株式会社 | 手術システム、並びに、手術用制御装置および手術用制御方法 |
| CN108348145B (zh) * | 2015-11-10 | 2020-06-26 | 奥林巴斯株式会社 | 内窥镜装置 |
| JP6744712B2 (ja) * | 2015-12-17 | 2020-08-19 | 富士フイルム株式会社 | 内視鏡システム、プロセッサ装置、及び内視鏡システムの作動方法 |
- 2018-10-18: CN application CN201880070441.5A, patent CN111295127B, active
- 2018-10-18: WO application PCT/JP2018/038754, publication WO2019087790A1, ceased
- 2018-10-18: JP application JP2019551062A, patent JP6967602B2, active
- 2018-10-18: EP application EP18873241.6A, patent EP3705024B1, active
- 2020-04-08: US application US16/842,761, patent US11302092B2, active
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001258820A (ja) | 2000-01-13 | 2001-09-25 | Fuji Photo Film Co Ltd | 蛍光画像表示方法および装置 |
| JP2005034211A (ja) | 2003-07-16 | 2005-02-10 | Fuji Photo Film Co Ltd | 画像判別装置、方法およびプログラム |
| WO2013187116A1 (fr) * | 2012-06-14 | 2013-12-19 | オリンパス株式会社 | Dispositif de traitement d'image et système d'observation d'image en trois dimensions |
| JP2014094157A (ja) | 2012-11-09 | 2014-05-22 | Toshiba Corp | 医用画像処理装置および医用画像診断装置 |
| WO2016117277A1 (fr) * | 2015-01-21 | 2016-07-28 | Hoya株式会社 | Système d'endoscope |
| WO2017183353A1 (fr) * | 2016-04-19 | 2017-10-26 | オリンパス株式会社 | Système d'endoscope |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3705024A4 |
Cited By (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7015275B2 (ja) | 2018-12-04 | 2022-02-02 | Hoya株式会社 | モデルの生成方法、教師データの生成方法、および、プログラム |
| JP2020089712A (ja) * | 2018-12-04 | 2020-06-11 | Hoya株式会社 | 情報処理装置、内視鏡用プロセッサ、情報処理方法およびプログラム |
| JP2020089710A (ja) * | 2018-12-04 | 2020-06-11 | Hoya株式会社 | 情報処理装置、内視鏡用プロセッサ、情報処理方法およびプログラム |
| JP2020089711A (ja) * | 2018-12-04 | 2020-06-11 | Hoya株式会社 | モデルの生成方法およびプログラム |
| CN110132547A (zh) * | 2019-05-14 | 2019-08-16 | 杭州电子科技大学 | 一种内窥镜头光学性能检测装置及检测方法 |
| JP2022541897A (ja) * | 2019-07-16 | 2022-09-28 | ドックボット, インコーポレイテッド | 機械学習システムのリアルタイム展開 |
| JP7420916B2 (ja) | 2019-07-16 | 2024-01-23 | サティスファイ ヘルス インコーポレイテッド | 機械学習システムのリアルタイム展開 |
| CN114025674A (zh) * | 2019-08-09 | 2022-02-08 | 富士胶片株式会社 | 内窥镜装置、控制方法、控制程序及内窥镜系统 |
| JP2021045337A (ja) * | 2019-09-18 | 2021-03-25 | 富士フイルム株式会社 | 医用画像処理装置、プロセッサ装置、内視鏡システム、医用画像処理方法、及びプログラム |
| JP7373335B2 (ja) | 2019-09-18 | 2023-11-02 | 富士フイルム株式会社 | 医用画像処理装置、プロセッサ装置、内視鏡システム、医用画像処理装置の作動方法、及びプログラム |
| WO2021149169A1 (fr) * | 2020-01-21 | 2021-07-29 | 日本電気株式会社 | Dispositif d'aide au fonctionnement, procédé d'aide au fonctionnement et support d'enregistrement lisible par ordinateur |
| US12295549B2 (en) | 2020-01-21 | 2025-05-13 | Nec Corporation | Surgical operation support apparatus, surgical operation support method, and computer-readable recording medium |
| WO2025173164A1 (fr) * | 2024-02-15 | 2025-08-21 | 日本電気株式会社 | Dispositif d'aide à l'inspection endoscopique, procédé d'aide à l'inspection endoscopique, et support d'enregistrement |
Also Published As
| Publication number | Publication date |
|---|---|
| JP6967602B2 (ja) | 2021-11-17 |
| US11302092B2 (en) | 2022-04-12 |
| EP3705024A1 (fr) | 2020-09-09 |
| JPWO2019087790A1 (ja) | 2020-11-12 |
| CN111295127B (zh) | 2022-10-25 |
| US20200234070A1 (en) | 2020-07-23 |
| CN111295127A (zh) | 2020-06-16 |
| EP3705024B1 (fr) | 2025-07-02 |
| EP3705024A4 (fr) | 2020-11-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6967602B2 (ja) | 検査支援装置、内視鏡装置、内視鏡装置の作動方法、及び検査支援プログラム | |
| JP7756621B2 (ja) | 内視鏡システム、医療画像処理装置の作動方法及びプログラム、記録媒体 | |
| JP6049518B2 (ja) | 画像処理装置、内視鏡装置、プログラム及び画像処理装置の作動方法 | |
| JP6348078B2 (ja) | 分岐構造判定装置、分岐構造判定装置の作動方法および分岐構造判定プログラム | |
| CN107708521B (zh) | 图像处理装置、内窥镜系统、图像处理方法以及图像处理程序 | |
| CN106659362B (zh) | 图像处理装置、图像处理方法以及内窥镜系统 | |
| JP6254053B2 (ja) | 内視鏡画像診断支援装置、システムおよびプログラム、並びに内視鏡画像診断支援装置の作動方法 | |
| CN113498323B (zh) | 医用图像处理装置、处理器装置、内窥镜系统、医用图像处理方法、及记录介质 | |
| JP7326308B2 (ja) | 医療画像処理装置及び医療画像処理装置の作動方法、内視鏡システム、プロセッサ装置、診断支援装置並びにプログラム | |
| JP6949999B2 (ja) | 画像処理装置、内視鏡システム、画像処理方法、プログラム及び記録媒体 | |
| JP6907324B2 (ja) | 診断支援システム、内視鏡システム及び診断支援方法 | |
| JP2012245157A (ja) | 内視鏡装置 | |
| JP2011177419A (ja) | 蛍光観察装置 | |
| WO2006087981A1 (fr) | Dispositif de traitement d'images medicales, dispositif et procede de traitement d'image d'une lumiere et programmes leur etant destines | |
| JPWO2020170791A1 (ja) | 医療画像処理装置及び方法 | |
| JP6840263B2 (ja) | 内視鏡システム及びプログラム | |
| JP6128989B2 (ja) | 画像処理装置、内視鏡装置及び画像処理装置の作動方法 | |
| CN114302035A (zh) | 一种图像处理方法、装置、电子设备及内窥镜系统 | |
| JP7289241B2 (ja) | ファイリング装置、ファイリング方法及びプログラム | |
| US20230410304A1 (en) | Medical image processing apparatus, medical image processing method, and program | |
| JP2009198787A (ja) | 内視鏡装置 | |
| JP6199267B2 (ja) | 内視鏡画像表示装置、その作動方法およびプログラム | |
| KR101626802B1 (ko) | 내시경기반의 특정병변 추적 및 모니터링 시스템 | |
| JP2015008781A (ja) | 画像処理装置、内視鏡装置及び画像処理方法 | |
| WO2023282143A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations, système endoscopique et dispositif d'aide à la création de rapport |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18873241; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2019551062; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2018873241; Country of ref document: EP; Effective date: 20200602 |
| | WWG | Wipo information: grant in national office | Ref document number: 2018873241; Country of ref document: EP |