
WO2025032671A1 - Endoscopic examination support device, endoscopic examination support method, and recording medium - Google Patents

Endoscopic examination support device, endoscopic examination support method, and recording medium

Info

Publication number
WO2025032671A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
endoscopic
lesion
display image
visual information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/028674
Other languages
English (en)
Japanese (ja)
Inventor
達 木村
憲一 上條
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to PCT/JP2023/028674
Publication of WO2025032671A1
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045: Control thereof

Definitions

  • This disclosure relates to technology that can be used to present information to assist in endoscopic examinations.
  • Patent Document 1 discloses a method for supporting diagnosis of diseases using endoscopic images of the digestive tract.
  • Patent Document 1 does not specifically disclose a method for presenting information that allows an operator to intuitively grasp the infiltration state of a lesion discovered during an endoscopic examination. Therefore, the technology disclosed in Patent Document 1 has the problem that, for example, an excessive burden may be imposed on the operator who treats the lesion discovered during an endoscopic examination.
  • One objective of the present disclosure is to provide an endoscopic examination support device that can reduce the burden placed on surgeons who treat lesions discovered during endoscopic examinations.
  • An endoscopic examination support device includes a lesion detection means for detecting at least one lesion contained in an endoscopic image obtained during an endoscopic examination, an estimation means for estimating the infiltration state of the lesion detected from the endoscopic image, a visual information generation means for generating visual information indicating the estimation result of the infiltration state of the lesion, and a display image generation means for generating a display image including the endoscopic image and the visual information.
  • An endoscopic examination support method detects at least one lesion included in an endoscopic image obtained during an endoscopic examination, estimates an infiltration state of the lesion detected from the endoscopic image, generates visual information indicating the estimated result of the infiltration state of the lesion, and generates a display image including the endoscopic image and the visual information.
  • A recording medium records a program that causes a computer to execute a process of detecting at least one lesion contained in an endoscopic image obtained during an endoscopic examination, estimating an infiltration state of the lesion detected from the endoscopic image, generating visual information indicating the estimated result of the infiltration state of the lesion, and generating a display image including the endoscopic image and the visual information.
  • This disclosure makes it possible to reduce the burden placed on surgeons who treat lesions discovered during endoscopic examinations.
  • FIG. 1 is a diagram showing an example of a schematic configuration of an endoscopic examination system according to the present disclosure.
  • FIG. 2 is a block diagram showing an example of a hardware configuration of an endoscopic examination support device according to the present disclosure.
  • FIG. 3 is a block diagram showing an example of a functional configuration of an endoscopic examination support device according to the present disclosure.
  • FIG. 4 is a diagram showing an example of an endoscopic image including a lesion.
  • FIG. 5 is a diagram showing an example of a display image generated using the endoscopic image of FIG. 4.
  • FIG. 6 is a diagram showing another example of an endoscopic image including a lesion.
  • FIG. 7 is a diagram showing an example of a display image generated using the endoscopic image of FIG. 6.
  • FIG. 8 is a diagram showing another example of an endoscopic image including a lesion.
  • FIG. 9 is a diagram showing an example of a display image generated using the endoscopic image of FIG. 8.
  • FIG. 10 is a diagram showing another example of a display image generated using the endoscopic image of FIG. 8.
  • FIG. 11 is a diagram showing another example of a display image generated using the endoscopic image of FIG. 8.
  • FIG. 12 is a diagram showing another example of a display image generated using the endoscopic image of FIG. 8.
  • FIG. 13 is a diagram showing another example of an endoscopic image including a lesion.
  • FIG. 14 is a diagram showing an example of a display image generated using the endoscopic image of FIG. 13.
  • FIG. 15 is a diagram showing another example of a display image generated using the endoscopic image of FIG. 13.
  • FIG. 16 is a diagram showing another example of a display image generated using the endoscopic image of FIG. 13.
  • FIG. 17 is a flowchart showing an example of processing performed in the endoscopic examination support device according to the present disclosure.
  • FIG. 18 is a block diagram showing another example of the functional configuration of the endoscopic examination support device according to the present disclosure.
  • FIG. 19 is a flowchart showing another example of processing performed in the endoscopic examination support device according to the present disclosure.
  • Fig. 1 is a diagram showing an example of a schematic configuration of an endoscopic examination system according to the present disclosure.
  • the endoscopic examination system 100 includes an endoscopic examination support device 1, a display device 2, and an endoscope 3 connected to the endoscopic examination support device 1.
  • The endoscopic examination support device 1 acquires from the endoscope 3 a video made up of time-series images obtained by imaging a subject during an endoscopic examination (hereinafter also referred to as an endoscopic video), and displays on the display device 2 a display image for confirmation by an operator such as a doctor performing the endoscopic examination. Specifically, the endoscopic examination support device 1 acquires video of the inside of the large intestine obtained during an endoscopic examination from the endoscope 3 as an endoscopic video. The endoscopic examination support device 1 also performs a process of extracting an endoscopic image from the endoscopic video and detecting a lesion from the extracted endoscopic image. In the following description, unless otherwise specified, it is assumed that a neoplastic lesion is included in the endoscopic image.
  • When a lesion is detected from an endoscopic image, the endoscopic examination support device 1 also performs a process of estimating the infiltration state of the lesion. In addition, the endoscopic examination support device 1 displays on the display device 2 a display image including information that allows the infiltration state of the lesion detected from the endoscopic image to be understood. The endoscopic examination support device 1 can be used to support the decision-making of an operator who performs treatment for a lesion discovered during an endoscopic examination.
  • the display device 2 has, for example, a liquid crystal monitor.
  • the display device 2 also displays images and the like output from the endoscopic examination support device 1.
  • The endoscope 3 mainly comprises an operating section 36 that allows the surgeon to input commands such as air supply, water supply, angle adjustment, and imaging instructions; a flexible shaft 37 that is inserted into the organ of the subject to be examined; a tip section 38 that incorporates an endoscopic camera such as a miniature image sensor; and a connection section 39 for connecting to the endoscopic examination support device 1.
  • FIG. 2 is a block diagram showing an example of a hardware configuration of the endoscopic examination support device according to the present disclosure.
  • the endoscopic examination support device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, a sound output unit 16, and a database (hereinafter, referred to as "DB") 17. These elements are connected via a data bus 19.
  • the processor 11 executes predetermined processing by executing programs stored in the memory 12.
  • the processor 11 is a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit).
  • the processor 11 may be composed of multiple processors.
  • the processor 11 is an example of a computer.
  • the processor 11 also performs processing such as generating a display image that includes information that allows the infiltration state of a lesion to be understood.
  • the memory 12 is composed of various volatile and non-volatile memories, such as RAM (Random Access Memory) used as working memory and ROM (Read Only Memory), and stores information necessary for the processing of the endoscopic examination support device 1.
  • the memory 12 may include an external storage device such as a hard disk connected to or built into the endoscopic examination support device 1, or may include a storage medium such as a removable flash memory or disk medium.
  • the memory 12 stores programs that allow the endoscopic examination support device 1 to execute each process in this embodiment.
  • the memory 12 temporarily stores a series of endoscopic images captured by the endoscope 3 during an endoscopic examination.
  • the interface 13 performs interface operations between the endoscopic examination support device 1 and an external device.
  • the interface 13 supplies the display image generated by the processor 11 to the display device 2.
  • the interface 13 also supplies illumination light generated by the light source unit 15 to the endoscope 3.
  • the interface 13 also supplies an electrical signal indicating the endoscopic video supplied from the endoscope 3 to the processor 11.
  • the interface 13 also supplies an endoscopic image extracted from the endoscopic video to the processor 11.
  • the interface 13 may be a communication interface such as a network adapter for communicating with an external device by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), etc.
  • the input unit 14 generates an input signal in response to the operation of the surgeon.
  • the input unit 14 has at least one device such as a button, a touch panel, a remote controller, a foot switch, or a voice input device.
  • the light source unit 15 generates light to be supplied to the tip 38 of the endoscope 3.
  • the light source unit 15 may also incorporate a pump for sending water or air to be supplied to the endoscope 3.
  • the sound output unit 16 outputs sound based on the control of the processor 11.
  • the DB 17 stores endoscopic images and the like acquired during the subject's past endoscopic examinations.
  • the DB 17 may include an external storage device such as a hard disk connected to or built into the endoscopic examination support device 1, or may include a removable storage medium such as a flash memory.
  • the DB 17 may be provided on an external server or the like, in which case related information is acquired from the server via communication.
  • the endoscopic examination support device 1 may also be equipped with a sensor capable of measuring the rotation and translation of the endoscopic camera, such as a magnetic sensor.
  • [Functional configuration] FIG. 3 is a block diagram showing an example of a functional configuration of an endoscopic examination support device according to the present disclosure.
  • the endoscopic examination support device 1 includes a lesion detection unit 21, an infiltration state estimation unit 22, and a display image generation unit 23.
  • The lesion detection unit 21 functions as a lesion detection means and can detect at least one lesion contained in an endoscopic image obtained during an endoscopic examination. Specifically, the lesion detection unit 21 performs processing to detect a lesion TU from the endoscopic image EG, acquires position information PJ indicating the position of the detected lesion TU, and outputs the endoscopic image EG and the position information PJ to the infiltration state estimation unit 22. Note that if the lesion detection unit 21 is unable to detect a lesion from the endoscopic image, it outputs nothing to the infiltration state estimation unit 22.
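  • The detection stage can be pictured as a function from an endoscopic image to zero or more pieces of position information. The following is a minimal sketch under stated assumptions, not the disclosed detector: the trained model is replaced by a simple score-map threshold, and all above-threshold pixels are treated as a single region.

```python
# Illustrative sketch of the lesion detection stage. The trained
# detector is replaced by thresholding a per-pixel lesion score map
# (`score_map` is an assumption, standing in for the model's output),
# and the single-region simplification is ours, not the disclosure's.
from dataclasses import dataclass
import numpy as np

@dataclass
class LesionPosition:
    bbox: tuple          # (x, y, width, height) in image coordinates
    mask: np.ndarray     # boolean mask of pixels belonging to the lesion

def detect_lesions(image: np.ndarray, score_map: np.ndarray,
                   threshold: float = 0.5) -> list:
    """Return position information PJ for each detected lesion, or an
    empty list when nothing is detected (in which case nothing is
    passed on to the infiltration state estimation stage)."""
    mask = score_map > threshold
    if not mask.any():
        return []
    ys, xs = np.nonzero(mask)
    x0, y0 = int(xs.min()), int(ys.min())
    w, h = int(xs.max()) - x0 + 1, int(ys.max()) - y0 + 1
    return [LesionPosition((x0, y0, w, h), mask)]  # one region, simplified
```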
  • the infiltration state estimation unit 22 functions as an estimation means.
  • the infiltration state estimation unit 22 can estimate the infiltration state of a lesion detected from an endoscopic image.
  • the infiltration state estimation unit 22 can estimate the infiltration state of each of the multiple lesions.
  • the infiltration state estimation unit 22 has an estimation model 22A that has been trained to be able to estimate the infiltration distance and infiltration direction of the lesion contained in the image as the infiltration state of the lesion.
  • the estimation model 22A is configured as a machine learning model trained by training data that associates, for example, an image including a lesion with the infiltration distance and infiltration direction of the lesion in each pixel constituting the lesion in the image.
  • the infiltration distance included in the training data is preferably, for example, a measured value of the distance from the boundary between lesion tissue and non-lesion tissue to the deepest infiltration position.
  • the infiltration direction included in the training data is preferably, for example, the direction along which that measured value was measured.
  • the infiltration state estimation unit 22 inputs the endoscopic image EG and position information PJ obtained from the lesion detection unit 21 to the estimation model 22A, thereby estimating the infiltration distance and infiltration direction of the lesion TU for each pixel constituting the lesion TU in the endoscopic image EG. That is, the infiltration state estimation unit 22 can estimate the infiltration state of the lesion TU contained in the endoscopic image EG using the estimation model 22A. The infiltration state estimation unit 22 then outputs the position information PJ obtained from the lesion detection unit 21 and the estimation result ER obtained by the above-mentioned estimation to the display image generation unit 23. Note that if the infiltration state estimation unit 22 does not obtain the endoscopic image and position information from the lesion detection unit 21, it does not perform processing related to estimating the infiltration distance and infiltration direction of the lesion.
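  • As a rough sketch of the estimation interface only (not the trained estimation model 22A itself), the per-pixel outputs can be represented as a distance map and a direction field; the constant values below are placeholders so the data flow can be exercised.

```python
import numpy as np

def estimate_infiltration(image: np.ndarray, mask: np.ndarray) -> dict:
    """Stand-in for estimation model 22A: for every pixel of the lesion,
    an infiltration distance (mm from the lesion boundary to the deepest
    infiltration position) and an infiltration direction (unit vector).
    A real implementation would be the trained regression model; the
    constant outputs below are placeholders, not the disclosed method."""
    h, w = mask.shape
    distance = np.zeros((h, w), dtype=np.float32)
    direction = np.zeros((h, w, 2), dtype=np.float32)
    distance[mask] = 1.0           # placeholder: 1 mm everywhere
    direction[mask] = (0.0, 1.0)   # placeholder: downward in image coords
    return {"distance_mm": distance, "direction": direction}
```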
  • the display image generating unit 23 has the functions of a display image generating means, a visual information generating means, and an infiltration state image generating means.
  • the display image generating unit 23 also identifies the position of the lesion TU included in the endoscopic image EG based on the position information PJ obtained from the infiltration state estimating unit 22.
  • the display image generating unit 23 also generates visual information VJ indicating the infiltration state of the lesion TU based on the infiltration distance and infiltration direction of the lesion TU indicated by the estimation result ER obtained from the infiltration state estimating unit 22. That is, the display image generating unit 23 can generate visual information indicating the estimation result of the infiltration state of the lesion.
  • the display image generating unit 23 also generates a display image HG including the endoscopic image EG and the visual information VJ, and outputs the generated display image HG to the display device 2. If the display image generating unit 23 does not obtain the position information and the estimation result from the infiltration state estimating unit 22, it outputs a display image including the endoscopic image EG but not including the visual information VJ to the display device 2.
  • the display image generating unit 23 may generate, for example, information indicating the infiltration state in a cross section obtained by cutting the lesion TU vertically as the visual information VJ. Also, according to this embodiment, the display image generating unit 23 may generate, for example, information indicating the infiltration state of the lesion TU in three dimensions as the visual information VJ.
  • the lesion detection unit 21 detects a lesion TUA included in the endoscopic image EGA.
  • the lesion detection unit 21 also acquires position information PJA indicating the position of the lesion TUA in the endoscopic image EGA, and outputs the endoscopic image EGA and the position information PJA to the infiltration state estimation unit 22.
  • Fig. 4 is a diagram showing an example of an endoscopic image including a lesion.
  • the infiltration state estimation unit 22 inputs the endoscopic image EGA and the position information PJA to the estimation model 22A, thereby estimating the infiltration distance and infiltration direction of the lesion TUA for each pixel constituting the lesion TUA in the endoscopic image EGA.
  • the infiltration state estimation unit 22 then outputs the position information PJA and the estimation result ERA obtained by the above-mentioned estimation to the display image generation unit 23.
  • the display image generating unit 23 identifies the position of the lesion TUA contained in the endoscopic image EGA based on the position information PJA.
  • the display image generating unit 23 also generates visual information VJA indicating the infiltration state of the lesion TUA based on the infiltration distance and infiltration direction of the lesion TUA indicated by the estimation result ERA.
  • the display image generating unit 23 also generates a display image HGA as shown in FIG. 5 as a display image including the endoscopic image EGA and the visual information VJA, and outputs the generated display image HGA to the display device 2.
  • FIG. 5 is a diagram showing an example of a display image generated using the endoscopic image of FIG. 4.
  • the visual information VJA is superimposed on a pixel group corresponding to the infiltration distance and infiltration direction of the lesion TUA in the endoscopic image EGA. Also, according to the display image HGA of FIG. 5, the area corresponding to the visual information VJA in the endoscopic image EGA is colored in a color different from the color of the biological tissue.
  • the display image generating unit 23 can generate a display image including an endoscopic image and visual information superimposed on a position corresponding to the lesion in the endoscopic image. Furthermore, according to the process described above, the surgeon can intuitively grasp the infiltration state of the lesion TUA included in the endoscopic image EGA by checking the visual information VJA of the endoscopic image EGA.
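  • As an illustration of how such an overlay can be produced, the sketch below colours the lesion pixel group with an opacity scaled by the estimated infiltration distance. It is a minimal sketch assuming an RGB image array and the distance map from the estimation sketch above; the highlight colour and the scaling rule are assumptions, not part of the disclosure.

```python
import numpy as np

def overlay_visual_info(image: np.ndarray, distance_mm: np.ndarray,
                        max_mm: float = 5.0) -> np.ndarray:
    """Colour the pixel group corresponding to the estimated infiltration
    in a colour distinct from biological tissue; opacity is scaled by
    infiltration distance (an illustrative choice)."""
    out = image.astype(np.float32)
    alpha = 0.6 * np.clip(distance_mm / max_mm, 0.0, 1.0)[..., None]
    highlight = np.array([255.0, 0.0, 255.0])  # magenta stand-in colour
    return ((1.0 - alpha) * out + alpha * highlight).astype(np.uint8)
```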
  • the lesion detection unit 21 detects a lesion TUB included in the endoscopic image EGB. Furthermore, the lesion detection unit 21 obtains position information PJB indicating the position of the lesion TUB in the endoscopic image EGB, and outputs the endoscopic image EGB and the position information PJB to the infiltration state estimation unit 22.
  • Fig. 6 is a diagram showing another example of an endoscopic image including a lesion.
  • the infiltration state estimation unit 22 inputs the endoscopic image EGB and the position information PJB to the estimation model 22A, thereby estimating the infiltration distance and infiltration direction of the lesion TUB for each pixel constituting the lesion TUB in the endoscopic image EGB.
  • the infiltration state estimation unit 22 then outputs the position information PJB and the estimation result ERB obtained by the above-mentioned estimation to the display image generation unit 23.
  • the display image generating unit 23 identifies the position of the lesion TUB contained in the endoscopic image EGB based on the position information PJB.
  • the display image generating unit 23 also generates visual information VJB indicating the infiltration state of the lesion TUB based on the infiltration distance and infiltration direction of the lesion TUB indicated by the estimation result ERB.
  • if the size of the lesion TUB in the endoscopic image EGB is equal to or larger than a predetermined size THS, the display image generating unit 23 generates a display image similar to that shown in FIG. 5 as a display image including the endoscopic image EGB and the visual information VJB, and outputs the generated display image to the display device 2.
  • the predetermined size THS may be set, for example, as the number of pixels, or may be set in units such as millimeters.
  • if the size of the lesion TUB in the endoscopic image EGB is smaller than the predetermined size THS, the display image generating unit 23 extracts a rectangular area including the lesion TUB from the endoscopic image EGB and enlarges the rectangular area to generate an enlarged image ZGB.
  • the display image generating unit 23 also generates a display image HGB as shown in FIG. 7 as a display image including the endoscopic image EGB, visual information VJB, and enlarged image ZGB, and outputs the generated display image HGB to the display device 2.
  • FIG. 7 is a diagram showing an example of a display image generated using the endoscopic image of FIG. 6.
  • the visual information VJB is superimposed on a group of pixels in the enlarged image ZGB that correspond to the infiltration distance and infiltration direction of the lesion TUB. Also, according to the display image HGB in FIG. 7, the area in the enlarged image ZGB that corresponds to the visual information VJB is colored in a color different from the color of the biological tissue.
  • the display image generating unit 23 can generate a display image including the endoscopic image, an enlarged image of an area including the lesion, and visual information superimposed on a position in the enlarged image corresponding to the lesion. Furthermore, according to the process described above, the surgeon can intuitively grasp the infiltration state of the lesion TUB included in the endoscopic image EGB by checking the visual information VJB of the enlarged image ZGB.
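  • A minimal sketch of the size-dependent branching in this example, reusing overlay_visual_info from the sketch above; the pixel threshold and the nearest-neighbour zoom factor are illustrative assumptions.

```python
import numpy as np

def make_display_image(image, lesion, distance_mm,
                       ths: int = 64, zoom: int = 3):
    """Branching described above: a lesion at least THS on a side is
    overlaid in place (FIG. 5 style); a smaller one additionally gets
    its rectangular area enlarged into ZGB (FIG. 7 style). THS here is
    in pixels; it could equally be set in millimetres."""
    overlaid = overlay_visual_info(image, distance_mm)  # sketch above
    x, y, w, h = lesion.bbox
    if min(w, h) >= ths:
        return overlaid, None
    crop = overlaid[y:y + h, x:x + w]
    enlarged = crop.repeat(zoom, axis=0).repeat(zoom, axis=1)  # nearest-neighbour zoom
    return overlaid, enlarged  # the display composes both panels
```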
  • the lesion detection unit 21 detects lesions TUC1, TUC2, and TUC3 included in the endoscopic image EGC.
  • the lesion detection unit 21 also obtains position information PJC capable of individually identifying the position of lesion TUC1 in the endoscopic image EGC, the position of lesion TUC2 in the endoscopic image EGC, and the position of lesion TUC3 in the endoscopic image EGC.
  • the lesion detection unit 21 also outputs the endoscopic image EGC and the position information PJC to the infiltration state estimation unit 22.
  • Fig. 8 is a diagram showing another example of an endoscopic image including a lesion.
  • the infiltration state estimation unit 22 inputs the endoscopic image EGC and the position information PJC to the estimation model 22A, and estimates the infiltration distance and infiltration direction of each of the lesions TUC1, TUC2, and TUC3 in the endoscopic image EGC at each pixel constituting the respective lesion.
  • the infiltration state estimation unit 22 then outputs the position information PJC and the estimation result ERC obtained by the above-mentioned estimation to the display image generation unit 23.
  • the display image generating unit 23 identifies the positions of the lesions TUC1, TUC2, and TUC3 contained in the endoscopic image EGC based on the position information PJC.
  • the display image generating unit 23 also generates visual information indicating the infiltration state of each of the lesions TUC1, TUC2, and TUC3 contained in the endoscopic image EGC based on the infiltration distance and infiltration direction of the lesion indicated by the estimation result ERC.
  • the display image generating unit 23 can generate visual information VJC1 indicating the infiltration state of lesion TUC1, visual information VJC2 indicating the infiltration state of lesion TUC2, and visual information VJC3 indicating the infiltration state of lesion TUC3.
  • the display image generating unit 23 extracts a rectangular area including the lesion from the endoscopic image EGC for each of the lesions TUC1, TUC2, and TUC3 contained in the endoscopic image EGC, and generates an enlarged image by enlarging the rectangular area.
  • the display image generating unit 23 can generate an enlarged image ZGC1 obtained by enlarging a rectangular area including the lesion TUC1 in the endoscopic image EGC, an enlarged image ZGC2 obtained by enlarging a rectangular area including the lesion TUC2 in the endoscopic image EGC, and an enlarged image ZGC3 obtained by enlarging a rectangular area including the lesion TUC3 in the endoscopic image EGC.
  • the display image generating unit 23 also generates an infiltration state image SGC1 by combining the lesion TUC1 with visual information VJC1 resized according to the size of the lesion TUC1.
  • the display image generating unit 23 generates an infiltration state image SGC2 by combining the lesion TUC2 with visual information VJC2 resized according to the size of the lesion TUC2.
  • the display image generating unit 23 generates an infiltration state image SGC3 by combining the lesion TUC3 with visual information VJC3 resized according to the size of the lesion TUC3.
  • the display image generating unit 23 can generate the infiltration state images SGC1 to SGC3 by performing image processing such as removing the background around the lesion.
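  • A minimal sketch of composing such an infiltration state image: crop to the lesion's extent, apply the visual information, and blank the surrounding background. It reuses overlay_visual_info from the earlier sketch; the black background is an illustrative choice.

```python
import numpy as np

def make_infiltration_state_image(image, mask, distance_mm):
    """Combine the lesion with its visual information, cropped to the
    lesion's extent, with the background around the lesion removed, as
    in infiltration state images SGC1 to SGC3 (illustrative only)."""
    ys, xs = np.nonzero(mask)
    y0, y1 = int(ys.min()), int(ys.max()) + 1
    x0, x1 = int(xs.min()), int(xs.max()) + 1
    patch = overlay_visual_info(image, distance_mm)[y0:y1, x0:x1].copy()
    patch[~mask[y0:y1, x0:x1]] = 0  # remove background around the lesion
    return patch
```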
  • the display image generating unit 23 generates a display image HGC1 as shown in FIG. 9 as a display image including the endoscopic image EGC, the enlarged image ZGC1, and the infiltration state image SGC1.
  • the display image generating unit 23 also generates a display image HGC2 as shown in FIG. 10 as a display image including the endoscopic image EGC, the enlarged image ZGC2, and the infiltration state image SGC2.
  • the display image generating unit 23 also generates a display image HGC3 as shown in FIG. 11 as a display image including the endoscopic image EGC, the enlarged image ZGC3, and the infiltration state image SGC3.
  • FIG. 9 is a diagram showing an example of a display image generated using the endoscopic image of FIG. 8.
  • FIGS. 10 and 11 are diagrams showing other examples of display images generated using the endoscopic image of FIG. 8.
  • the display image generating unit 23 sequentially outputs the display images HGC1, HGC2, and HGC3 to the display device 2 so that the display is switched at regular intervals. Specifically, the display image generating unit 23 switches the display image output to the display device 2 in the order HGC1 → HGC2 → HGC3 → HGC1 → ..., for example, at regular intervals. Furthermore, when outputting one of the display images HGC1, HGC2, and HGC3 to the display device 2, the display image generating unit 23 generates a frame WGC indicating the lesion corresponding to the enlarged image and infiltration state image in that display image, and superimposes the generated frame WGC around that lesion in the endoscopic image.
  • a frame WGC corresponding to the enlarged image ZGC1 and the infiltration state image SGC1 is superimposed around the lesion TUC1 in the endoscopic image EGC.
  • the orientations of the lesion TUC1 and the visual information VJC1 in the infiltration state image SGC1 are set so that the infiltration direction of the lesion TUC1 coincides with the vertical direction of the display image HGC1.
  • the infiltration state image SGC1 is displayed below the enlarged image ZGC1.
  • a frame WGC corresponding to the enlarged image ZGC2 and the infiltration state image SGC2 is superimposed around the lesion TUC2 in the endoscopic image EGC.
  • the orientation of the lesion TUC2 and the visual information VJC2 in the infiltration state image SGC2 is set so that the infiltration direction of the lesion TUC2 coincides with the vertical direction of the display image HGC2.
  • the infiltration state image SGC2 is displayed below the enlarged image ZGC2.
  • a frame WGC corresponding to the enlarged image ZGC3 and the infiltration state image SGC3 is superimposed around the lesion TUC3 in the endoscopic image EGC.
  • the orientations of the lesion TUC3 and the visual information VJC3 in the infiltration state image SGC3 are set so that the infiltration direction of the lesion TUC3 coincides with the vertical direction of the display image HGC3.
  • the infiltration state image SGC3 is displayed below the enlarged image ZGC3.
  • the display image generating unit 23 can generate a plurality of visual information pieces showing the estimated results of the infiltration state of each of a plurality of lesions. Also, according to the above-described processing, the display image generating unit 23 can generate a plurality of infiltration state images by combining a plurality of lesions and a plurality of visual information pieces. Also, according to the above-described processing, the display image generating unit 23 can generate a display image while sequentially switching between a plurality of infiltration state images. Also, according to the above-described processing, the surgeon can intuitively grasp the infiltration state of the lesion TUC1 included in the endoscopic image EGC by checking the visual information VJC1 of the infiltration state image SGC1.
  • the surgeon can intuitively grasp the infiltration state of the lesion TUC2 included in the endoscopic image EGC by checking the visual information VJC2 of the infiltration state image SGC2. Also, according to the above-described processing, the surgeon can intuitively grasp the infiltration state of the lesion TUC3 included in the endoscopic image EGC by checking the visual information VJC3 of the infiltration state image SGC3.
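  • The regular-interval switching can be pictured as cycling through the generated display images. The sketch below is illustrative only; `show` stands in for output to the display device 2, and the interval and switch count are assumptions.

```python
import time

def cycle_display_images(display_images, interval_s: float = 2.0,
                         n_switches: int = 9, show=print):
    """Switch the output in the order HGC1 -> HGC2 -> HGC3 -> HGC1 ...
    at a regular interval, as described above."""
    for i in range(n_switches):
        show(display_images[i % len(display_images)])
        time.sleep(interval_s)
```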
  • the lesion detection unit 21 acquires position information PJC by performing the same process as in specific example 3. In addition, the lesion detection unit 21 outputs the endoscopic image EGC and the position information PJC to the infiltration state estimation unit 22.
  • the infiltration state estimation unit 22 obtains the estimation result ERC by performing an estimation similar to that in specific example 3.
  • the infiltration state estimation unit 22 also outputs the position information PJC and the estimation result ERC to the display image generation unit 23.
  • the display image generating unit 23 identifies the positions of the lesions TUC1, TUC2, and TUC3 contained in the endoscopic image EGC based on the position information PJC. In addition, the display image generating unit 23 performs processing similar to that of specific example 3 to generate visual information VJC1 indicating the infiltration state of lesion TUC1, visual information VJC2 indicating the infiltration state of lesion TUC2, and visual information VJC3 indicating the infiltration state of lesion TUC3.
  • the display image generating unit 23 extracts a rectangular area including the lesion from the endoscopic image EGC for each of the lesions TUC1, TUC2, and TUC3 included in the endoscopic image EGC. With this processing, the display image generating unit 23 can extract a partial image PGC1 corresponding to the rectangular area including the lesion TUC1 in the endoscopic image EGC, a partial image PGC2 corresponding to the rectangular area including the lesion TUC2 in the endoscopic image EGC, and a partial image PGC3 corresponding to the rectangular area including the lesion TUC3 in the endoscopic image EGC.
  • the display image generating unit 23 also generates an infiltration state image SGC4 by combining the lesion TUC1 with visual information VJC1 resized according to the size of the lesion TUC1.
  • the display image generating unit 23 also generates an infiltration state image SGC5 by combining the lesion TUC2 with visual information VJC2 resized according to the size of the lesion TUC2.
  • the display image generating unit 23 also generates an infiltration state image SGC6 by combining the lesion TUC3 with visual information VJC3 resized according to the size of the lesion TUC3.
  • the display image generating unit 23 can generate the infiltration state images SGC4 to SGC6 by performing image processing such as removing the background around the lesion.
  • the display image generating unit 23 generates a display image HGC4 as shown in FIG. 12, which includes the endoscopic image EGC, the partial images PGC1 to PGC3, and the infiltration state images SGC4 to SGC6.
  • the display image generating unit 23 also superimposes, for each of the lesions TUC1, TUC2, and TUC3, an identifier indicating the correspondence between the lesion included in the endoscopic image EGC and the lesion included in the partial images PGC1 to PGC3.
  • FIG. 12 is a diagram showing another example of a display image generated using the endoscopic image of FIG. 8.
  • partial images PGC1, PGC2, and PGC3 are displayed side by side in the horizontal direction.
  • an infiltration state image SGC4 is displayed below partial image PGC1
  • an infiltration state image SGC5 is displayed below partial image PGC2
  • an infiltration state image SGC6 is displayed below partial image PGC3.
  • an identifier "1" is superimposed near lesion TUC1 in the endoscopic image EGC and on the top of partial image PGC1.
  • an identifier "2" is superimposed near lesion TUC2 in the endoscopic image EGC and on the top of partial image PGC2.
  • the identifier "3" is superimposed near the lesion TUC3 in the endoscopic image EGC and on the top of the partial image PGC3.
  • the display image generating unit 23 can generate a plurality of visual information pieces showing the estimated results of the infiltration state of each of a plurality of lesions. According to the above-described processing, the display image generating unit 23 can generate a plurality of infiltration state images by combining a plurality of lesions and a plurality of visual information pieces. According to the above-described processing, the display image generating unit 23 can generate a display image including an endoscopic image and a plurality of infiltration state images. According to the above-described processing, the surgeon can intuitively grasp the infiltration state of the lesion TUC1 included in the endoscopic image EGC by checking the visual information VJC1 of the infiltration state image SGC4.
  • the surgeon can intuitively grasp the infiltration state of the lesion TUC2 included in the endoscopic image EGC by checking the visual information VJC2 of the infiltration state image SGC5.
  • the surgeon can intuitively grasp the infiltration state of the lesion TUC3 included in the endoscopic image EGC by checking the visual information VJC3 of the infiltration state image SGC6.
  • the display image generating unit 23 may change the number of partial images and infiltration state images that are simultaneously displayed in the display image HGC4, for example, depending on the display size of the display image HGC4.
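  • A small sketch of limiting how many partial-image and infiltration-state-image pairs appear at once according to the display size, with identifiers assigned from "1" as in the example above; the panel width and the fit rule are assumptions, not disclosed values.

```python
def select_panels(partial_images, state_images, display_width: int,
                  panel_width: int = 160):
    """Pair each partial image with its infiltration state image under
    identifiers "1", "2", ..., and cap how many pairs are shown at once
    based on the available display width (illustrative layout rule)."""
    n_fit = max(1, display_width // panel_width)
    panels = [(str(i + 1), pg, sg)
              for i, (pg, sg) in enumerate(zip(partial_images, state_images))]
    return panels[:n_fit]
```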
  • the lesion detection unit 21 detects a lesion TUS included in the endoscopic image EGS.
  • the lesion detection unit 21 also acquires position information PJS indicating the position of the lesion TUS in the endoscopic image EGS, and outputs the endoscopic image EGS and the position information PJS to the infiltration state estimation unit 22.
  • the lesion detection unit 21 also determines whether or not the lesion TUS is included in an endoscopic image EGV input after the endoscopic image EGS and, when it is, acquires position information PJV indicating the position of the lesion TUS in the endoscopic image EGV and outputs the endoscopic image EGV and the position information PJV to the infiltration state estimation unit 22.
  • FIG. 13 is a diagram showing another example of an endoscopic image including a lesion.
  • the infiltration state estimation unit 22 estimates the infiltration distance and infiltration direction of the lesion TUS at each pixel constituting the lesion TUS in the endoscopic image EGS by inputting the endoscopic image EGS and position information PJS to the estimation model 22A.
  • the infiltration state estimation unit 22 then outputs the position information PJS and the estimation result ERS obtained by the above-mentioned estimation to the display image generation unit 23.
  • the infiltration state estimation unit 22 also estimates the infiltration distance and infiltration direction of the lesion TUS at each pixel constituting the lesion TUS in the endoscopic image EGV by inputting the endoscopic image EGV and position information PJV to the estimation model 22A.
  • the infiltration state estimation unit 22 then outputs the position information PJV and the estimation result ERV obtained by the above-mentioned estimation to the display image generation unit 23. If the infiltration state estimation unit 22 does not receive an endoscopic image and position information from the lesion detection unit 21, it does not perform processing related to estimating the infiltration distance and infiltration direction of the lesion.
  • the endoscopic image EGS corresponds to, for example, a still image obtained in response to the operation of the operating section 36 by the surgeon.
  • the endoscopic image EGV corresponds to an endoscopic image extracted from the endoscopic video after the acquisition of the endoscopic image EGS.
  • According to this specific example, during the period from when the endoscopic image EGS is input until detection of the lesion TUS in the endoscopic image EGV is interrupted, processing is performed by the lesion detection unit 21 and the infiltration state estimation unit 22. Also, during the period from when detection of the lesion TUS in the endoscopic image EGV is interrupted until the next endoscopic image EGS is input, no processing is performed by the lesion detection unit 21 and the infiltration state estimation unit 22.
  • During the period in which no such processing is performed, the display image generation unit 23 outputs to the display device 2 a display image that includes the endoscopic image extracted from the endoscopic video but does not include the visual information VJS and VJV described below.
  • the display image generating unit 23 identifies the position of the lesion TUS contained in the endoscopic image EGS based on the position information PJS.
  • the display image generating unit 23 also generates visual information VJS indicating the infiltration state of the lesion TUS based on the infiltration distance and infiltration direction of the lesion TUS indicated by the estimation result ERS.
  • the display image generating unit 23 also identifies the position of the lesion TUS contained in the endoscopic image EGV based on the position information PJV.
  • the display image generating unit 23 also generates visual information VJV indicating the infiltration state of the lesion TUS based on the infiltration distance and infiltration direction of the lesion TUS indicated by the estimation result ERV.
  • the display image generating unit 23 also generates a display image HGV as shown in FIG. 14 as a display image including the endoscopic image EGV, the endoscopic image EGS, and the visual information VJV.
  • FIG. 14 shows an example of a display image generated using the endoscopic image in FIG. 13.
  • In the display image HGV of FIG. 14, visual information VJV corresponding to the lesion TUS contained in both the endoscopic image EGV and the endoscopic image EGS is displayed. Also, in the display image HGV of FIG. 14, the area corresponding to the visual information VJV in the endoscopic image EGV is colored in a color different from the color of the biological tissue. Also, according to this specific example, as long as the lesion TUS is contained in the image, the visual information VJV corresponding to the position of the lesion TUS remains displayed even when the display image changes, for example to the display image shown in FIG. 15.
  • FIG. 15 is a diagram showing another example of a display image generated using the endoscopic image of FIG. 13.
  • the display image generating unit 23 can generate a display image including a first endoscopic image including a lesion, a second endoscopic image including the lesion and acquired after the first endoscopic image, and visual information superimposed on the second endoscopic image at a position corresponding to the lesion. Furthermore, according to the process described above, the surgeon can intuitively grasp the infiltration state of the lesion TUS included in the endoscopic image EGV and the endoscopic image EGS by checking the visual information VJV of the endoscopic image EGV.
  • the lesion detection unit 21 and the infiltration state estimation unit 22 may perform processing during the period from when the endoscopic image EGS is input until detection of each of the multiple lesions ceases.
  • the lesion detection unit 21 may perform a discrimination process to discriminate lesions contained in an endoscopic image.
  • in the discrimination process, for example, a machine learning model trained with training data that associates an image containing a lesion with information that can identify whether the lesion is neoplastic or non-neoplastic may be used.
  • the display image generation unit 23 may generate a display image that includes information indicating the processing result of the discrimination process by the lesion detection unit 21. According to such processing, the display image generation unit 23 can generate discrimination information DJW that includes a character string indicating whether the lesion TUS is neoplastic or non-neoplastic.
  • the lesion detection unit 21 may perform a measurement process to measure the size of a lesion contained in an endoscopic image.
  • in the measurement process, for example, a machine learning model trained using training data that associates an image containing a lesion with information capable of identifying the size of the lesion may be used.
  • the display image generation unit 23 may generate a display image including information indicating the results of the measurement process performed by the lesion detection unit 21. According to such processing, the display image generation unit 23 can generate size information SJW including a character string indicating the size of the lesion TUS.
  • the display image generating unit 23 can generate a display image HGW as shown in FIG. 16 by adding, for example, discrimination information DJW and size information SJW to the display image HGV of FIG. 14.
  • FIG. 16 is a diagram showing another example of a display image generated using the endoscopic image of FIG. 13.
  • the display image generating unit 23 may generate a display image by adding either the discrimination information DJW or the size information SJW to the display image HGV of FIG. 14, for example.
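  • As a small sketch of assembling the two annotations, the discrimination information DJW and size information SJW can be built as character strings to be placed on the display image; the exact wording below is an assumption.

```python
def annotation_strings(is_neoplastic: bool, size_mm: float):
    """Build discrimination information DJW and size information SJW as
    character strings for the display image (wording is illustrative)."""
    djw = "neoplastic" if is_neoplastic else "non-neoplastic"
    sjw = f"approx. {size_mm:.0f} mm"
    return djw, sjw
```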
  • FIG. 17 is a flowchart showing an example of processing performed in the endoscopic examination support device according to the present disclosure. Note that the endoscopic examination support device 1 repeats the processing shown in Fig. 17 during the period from the start to the end of the endoscopic examination.
  • the endoscopic examination support device 1 detects lesions from the endoscopic image (step S11).
  • the endoscopic examination support device 1 estimates the infiltration state of the lesion detected by the processing of step S11 (step S12).
  • the endoscopic examination support device 1 generates visual information indicating the infiltration state of the lesion estimated by the processing of step S12 (step S13).
  • the endoscopic examination support device 1 generates a display image including the visual information generated by the processing of step S13, and outputs the display image to the display device 2 (step S14).
  • the endoscopic examination support device 1 may generate a display image in which the visual information is superimposed at a position corresponding to the lesion in the endoscopic image.
  • the endoscopic examination support device 1 may generate an infiltration state image that combines the lesion and the visual information, and generate a display image including the infiltration state image.
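  • Putting the FIG. 17 steps together, a minimal loop under the same assumptions as the earlier sketches (frames arriving as image and score-map pairs, with `display` standing in for output to the display device 2) might look like this:

```python
def examination_support_loop(frames, display):
    """One pass of the FIG. 17 flow per endoscopic image: detect lesions
    (S11), estimate the infiltration state (S12), generate visual
    information (S13), and output a display image (S14). The helpers are
    the illustrative sketches given earlier; when no lesion is found,
    the frame is shown without visual information."""
    for image, score_map in frames:
        lesions = detect_lesions(image, score_map)                # S11
        if not lesions:
            display(image)   # no lesion: no visual information
            continue
        est = estimate_infiltration(image, lesions[0].mask)       # S12
        visual = overlay_visual_info(image, est["distance_mm"])   # S13
        display(visual)                                           # S14
```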
  • visual information indicating the infiltration state of a lesion contained in an endoscopic image obtained during an endoscopic examination can be generated, and a display image including the generated visual information can be displayed. Therefore, according to this embodiment, it is possible to reduce the burden imposed on the surgeon who treats a lesion discovered during an endoscopic examination.
  • FIG. 18 is a block diagram showing another example of the functional configuration of the endoscopic examination support device according to the present disclosure.
  • the endoscopic examination support device 500 has the same hardware configuration as the endoscopic examination support device 1.
  • the endoscopic examination support device 500 also has a lesion detection means 501, an estimation means 502, a visual information generation means 503, and a display image generation means 504.
  • the lesion detection means 501 can be realized, for example, by using the functions of the lesion detection unit 21 of the first embodiment.
  • the estimation means 502 can be realized, for example, by using the functions of the infiltration state estimation unit 22 of the first embodiment.
  • the visual information generation means 503 can be realized, for example, by using the functions of the display image generation unit 23 of the first embodiment.
  • the display image generation means 504 can be realized, for example, by using the functions of the display image generation unit 23 of the first embodiment.
  • FIG. 19 is a flowchart showing another example of processing performed in the endoscopic examination support device according to the present disclosure.
  • the lesion detection means 501 detects at least one lesion contained in an endoscopic image obtained during an endoscopic examination (step S51).
  • the estimation means 502 estimates the infiltration state of the lesion detected from the endoscopic image (step S52).
  • the visual information generating means 503 generates visual information showing the estimated result of the infiltration state of the lesion (step S53).
  • the display image generating means 504 generates a display image including the endoscopic image and visual information (step S54).
  • This embodiment can reduce the burden placed on the surgeon who treats lesions discovered during endoscopic examination.
  • Appendix 1: An endoscopic examination support device having: a lesion detection means for detecting at least one lesion included in an endoscopic image obtained during an endoscopic examination; an estimation means for estimating an infiltration state of the lesion detected from the endoscopic image; a visual information generating means for generating visual information indicating the estimated result of the infiltration state of the lesion; and a display image generating means for generating a display image including the endoscopic image and the visual information.
  • The endoscopic examination support device according to Appendix 1, wherein the display image generating means generates, as the display image, an image including the endoscopic image and the visual information superimposed at a position corresponding to the lesion in the endoscopic image.
  • The endoscopic examination support device according to Appendix 4, wherein, when a plurality of lesions are detected from the endoscopic image: the estimation means estimates an infiltration state of each of the plurality of lesions; the visual information generating means generates a plurality of pieces of visual information indicating an estimation result of the infiltration state of each of the plurality of lesions; the infiltration state image generating means generates a plurality of infiltration state images by combining the plurality of lesions and the plurality of pieces of visual information; and the display image generating means generates the display image while sequentially switching between the plurality of infiltration state images.
  • The endoscopic examination support device according to Appendix 4, wherein, when a plurality of lesions are detected from the endoscopic image: the estimation means estimates an infiltration state of each of the plurality of lesions; the visual information generating means generates a plurality of pieces of visual information indicating an estimation result of the infiltration state of each of the plurality of lesions; the infiltration state image generating means generates a plurality of infiltration state images by combining the plurality of lesions and the plurality of pieces of visual information; and the display image generating means generates, as the display image, an image including the endoscopic image and the plurality of infiltration state images.
  • The endoscopic examination support device according to Appendix 1, wherein the display image generating means generates, as the display image, an image including a first endoscopic image including the lesion, a second endoscopic image including the lesion and acquired after the first endoscopic image, and the visual information superimposed at a position corresponding to the lesion in the second endoscopic image.
  • Appendix 8: The endoscopic examination support device according to Appendix 1, wherein the estimation means estimates the infiltration distance and infiltration direction of the lesion for each pixel constituting the lesion in the endoscopic image by inputting the endoscopic image and positional information indicating the position of the lesion in the endoscopic image into a machine learning model.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

In this endoscopic examination support device, a lesion detection means detects at least one lesion included in an endoscopic image obtained during an endoscopic examination. An estimation means estimates the infiltration state of the lesion detected from the endoscopic image. A visual information generation means generates visual information indicating an estimation result of the infiltration state of the lesion. A display image generation means generates a display image including the endoscopic image and the visual information. The endoscopic examination support device can be used to support the decision-making of an operator who performs treatment of the lesion discovered during the endoscopic examination.
PCT/JP2023/028674 2023-08-07 2023-08-07 Endoscopic examination support device, endoscopic examination support method, and recording medium Pending WO2025032671A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/028674 WO2025032671A1 (fr) 2023-08-07 2023-08-07 Endoscopic examination support device, endoscopic examination support method, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/028674 WO2025032671A1 (fr) 2023-08-07 2023-08-07 Endoscopic examination support device, endoscopic examination support method, and recording medium

Publications (1)

Publication Number Publication Date
WO2025032671A1 (fr) 2025-02-13

Family

ID=94533673

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/028674 Pending WO2025032671A1 (fr) Endoscopic examination support device, endoscopic examination support method, and recording medium

Country Status (1)

Country Link
WO (1) WO2025032671A1 (fr)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010185782A (ja) * 2009-02-12 2010-08-26 Fujifilm Corp Optical structure observation apparatus and structure information processing method thereof
JP2015087167A (ja) * 2013-10-29 2015-05-07 Canon Inc Image processing method and image processing system
WO2019078204A1 (fr) * 2017-10-17 2019-04-25 Fujifilm Corp Medical image processing device and endoscope device
WO2020105699A1 (fr) * 2018-11-21 2020-05-28 AI Medical Service Inc. Disease diagnosis support method based on endoscopic images of digestive organs, diagnosis support system, diagnosis support program, and computer-readable recording medium storing the diagnosis support program
WO2020166247A1 (fr) * 2019-02-14 2020-08-20 NEC Corp Lesion area segmentation device, medical image diagnosis system, lesion area segmentation method, and non-transitory computer-readable medium storing program
WO2022196494A1 (fr) * 2021-03-17 2022-09-22 Terumo Corp Information processing method, learning model generation method, program, and information processing device


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23948391

Country of ref document: EP

Kind code of ref document: A1