
WO2024195020A1 - Endoscopic examination support device, endoscopic examination support method, and recording medium


Info

Publication number
WO2024195020A1
WO2024195020A1 (application PCT/JP2023/011075)
Authority
WO
WIPO (PCT)
Prior art keywords
lesion
endoscopic
image
guide
guide information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/011075
Other languages
English (en)
Japanese (ja)
Inventor
貴行 奥野
祐子 前川
雅弘 西光
直人 前田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to PCT/JP2023/011075 priority Critical patent/WO2024195020A1/fr
Publication of WO2024195020A1 publication Critical patent/WO2024195020A1/fr
Anticipated expiration legal-status Critical
Pending legal-status Critical Current

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045 — Control thereof

Definitions

  • This disclosure relates to technology that can be used to present information to assist in endoscopic examinations.
  • Patent Document 1 discloses a technology that superimposes a mark around a lesion area included in an endoscopic image and outputs the image to a display device.
  • Patent Document 1 does not specifically disclose presenting information that provides support according to the content of the treatment to be performed on the lesion. With the technology disclosed in Patent Document 1, therefore, the quality of the treatment for the lesion may not be ensured, depending on factors such as the knowledge and experience of the surgeon.
  • One objective of the present disclosure is to provide an endoscopic examination support device that can ensure the quality of treatment performed on lesions.
  • an endoscopic examination support device includes an image acquisition means for acquiring an endoscopic image of a lesion inside a subject's body, a guide information generation means for generating guide information for providing support according to the content of a procedure to be performed on the lesion, and a display image generation means for generating a display image including the endoscopic image and the guide information.
  • a method for supporting endoscopic examinations acquires an endoscopic image of a lesion inside a subject's body, generates guide information for providing support according to the content of a procedure to be performed on the lesion, and generates a display image including the endoscopic image and the guide information.
  • the recording medium records a program that causes a computer to execute a process of acquiring an endoscopic image of a lesion inside a subject's body, generating guide information for providing assistance according to the content of a procedure to be performed on the lesion, and generating a display image including the endoscopic image and the guide information.
  • This disclosure makes it possible to ensure the quality of treatment given to lesions.
  • FIG. 1 is a diagram showing an example of a schematic configuration of an endoscopic examination system according to a first embodiment.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of the endoscopic examination support device according to the first embodiment.
  • FIG. 3 is a block diagram showing an example of the functional configuration of the endoscopic examination support device according to the first embodiment.
  • FIG. 4 is a diagram showing an example of an endoscopic image obtained before marking is performed.
  • FIG. 5 is a diagram showing an example of an endoscopic image obtained while marking is being performed.
  • FIG. 6 is a diagram showing an example of a resection guide image corresponding to the endoscopic image in FIG. 5 .
  • FIG. 7 is a diagram showing an example of an endoscopic image obtained upon completion of marking.
  • FIG. 8 is a diagram showing an example of a resection guide image corresponding to the endoscopic image in FIG. 7 .
  • FIG. 9 is a diagram showing an example of an endoscopic image obtained before tissue for biopsy is sampled.
  • FIG. 10 is a diagram showing an example of a biopsy guide image corresponding to the endoscopic image in FIG. 9 .
  • FIG. 11 is a flowchart showing an example of processing performed in the endoscopic examination support device according to the first embodiment.
  • FIG. 12 is a block diagram showing an example of the functional configuration of an endoscopic examination support device according to a second embodiment.
  • FIG. 13 is a flowchart showing an example of processing performed in the endoscopic examination support device according to the second embodiment.
  • Fig. 1 is a diagram showing an example of a schematic configuration of an endoscopic examination system according to the first embodiment.
  • the endoscopic examination system 100 includes an endoscopic examination support device 1, a display device 2, and an endoscope 3 connected to the endoscopic examination support device 1.
  • the endoscopic examination support device 1 acquires from the endoscope 3 a video (hereinafter also referred to as an "endoscopic video") including time-series images obtained by capturing the subject during an endoscopic examination, and displays on the display device 2 a display image for confirmation by an operator such as a doctor performing the endoscopic examination. Specifically, the endoscopic examination support device 1 acquires video of the inside of the large intestine obtained during an endoscopic examination from the endoscope 3 as the endoscopic video. The endoscopic examination support device 1 also detects lesions such as tumors based on an image extracted from the endoscopic video (hereinafter also referred to as an "endoscopic image").
  • the endoscopic examination support device 1 also detects a marking position that serves as a landmark when resecting the lesion based on the endoscopic image.
  • the endoscopic examination support device 1 also estimates the depth of the lesion based on the endoscopic image.
  • the endoscopic examination support device 1 also displays on the display device 2 a display image including an endoscopic image and information regarding the marking position detected from the endoscopic image when a procedure related to the resection of the lesion is performed.
  • the endoscopic examination support device 1 causes the display device 2 to display a display image including an endoscopic image and information related to the depth of invasion of the lesion detected from the endoscopic image.
  • the display device 2 has, for example, a liquid crystal monitor.
  • the display device 2 also displays images and the like output from the endoscopic examination support device 1.
  • the endoscope scope 3 mainly comprises an operating section 36 that allows the surgeon to input commands such as air supply, water supply, angle adjustment, and imaging instructions, a flexible shaft 37 that is inserted into the subject's organ to be examined, a tip section 38 that incorporates an endoscopic camera such as a miniature imaging element, and a connection section 39 for connecting to the endoscopic examination support device 1.
  • FIG. 2 is a block diagram showing an example of a hardware configuration of the endoscopic examination support device according to the first embodiment.
  • the endoscopic examination support device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, a sound output unit 16, and a database (hereinafter, referred to as "DB") 17. These elements are connected via a data bus 19.
  • the processor 11 executes predetermined processing by executing programs stored in the memory 12.
  • the processor 11 is a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit).
  • the processor 11 may be composed of multiple processors.
  • the processor 11 is an example of a computer.
  • the processor 11 also performs processing related to the generation of a display image based on an endoscopic image included in an endoscopic video.
  • the memory 12 is composed of volatile memories such as RAM (Random Access Memory) used as working memory, and non-volatile memories such as ROM (Read Only Memory) that store information necessary for the processing of the endoscopic examination support device 1.
  • the memory 12 may include an external storage device such as a hard disk connected to or built into the endoscopic examination support device 1, or may include a storage medium such as a removable flash memory or disk medium.
  • the memory 12 stores programs that allow the endoscopic examination support device 1 to execute each process in this embodiment.
  • the memory 12 temporarily stores a series of endoscopic images captured by the endoscope scope 3 during an endoscopic examination.
  • the interface 13 performs interface operations between the endoscopic examination support device 1 and an external device.
  • the interface 13 supplies the display image generated by the processor 11 to the display device 2.
  • the interface 13 also supplies illumination light generated by the light source unit 15 to the endoscope scope 3.
  • the interface 13 also supplies an electrical signal indicating the endoscopic video supplied from the endoscope scope 3 to the processor 11.
  • the interface 13 also supplies an endoscopic image extracted from the endoscopic video to the processor 11.
  • the interface 13 may be a communication interface such as a network adapter for communicating with an external device by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), etc.
  • the input unit 14 generates an input signal in response to the operation of the surgeon.
  • the input unit 14 has at least one device among, for example, a button, a touch panel, a remote controller, a foot switch, and a voice input device.
  • the light source unit 15 generates light to be supplied to the tip 38 of the endoscope 3.
  • the light source unit 15 may also incorporate a pump for sending water or air to be supplied to the endoscope 3.
  • the sound output unit 16 outputs sound based on the control of the processor 11.
  • DB17 stores endoscopic images and the like acquired during past endoscopic examinations of the subject.
  • DB17 may include an external storage device such as a hard disk connected to or built into the endoscopic examination support device 1, or may include a removable storage medium such as a flash memory.
  • DB17 may be provided on an external server, etc., and related information may be acquired from the server via communication.
  • the endoscopic examination support device 1 may also be equipped with a sensor capable of measuring the rotation and translation of the endoscopic camera, such as a magnetic sensor.
  • FIG. 3 is a block diagram showing an example of a functional configuration of the endoscopic examination support device according to the first embodiment.
  • the endoscopic examination support device 1 has a mode setting unit 21, a lesion detection unit 22, a marking detection unit 23, an invasion depth estimation unit 24, a guide information generation unit 25, and a display image generation unit 26.
  • the mode setting unit 21 sets the display mode of the endoscopic examination support device 1 to one of the normal display mode, the resection guide display mode, and the biopsy guide display mode based on the mode setting signal.
  • the mode setting unit 21 can use the input signal generated in the input unit 14 as the mode setting signal.
  • the mode setting unit 21 can use, for example, an input signal generated by performing voice recognition on a voice input to a voice input device of the input unit 14 as the mode setting signal.
  • the mode setting unit 21 can use, for example, an input signal generated in response to the operation of a foot switch of the input unit 14 as the mode setting signal.
  • The voice input may be, for example, a word capable of identifying each display mode, such as "normal mode," "resection mode," "resection guide," "biopsy mode," or "biopsy guide."
  • the mode setting unit 21 functions as an image acquisition means. In addition, the mode setting unit 21 acquires an endoscopic image contained in the endoscopic video, and outputs the acquired endoscopic image to an output destination according to the current display mode.
  • When the current display mode is the normal display mode, the mode setting unit 21 outputs an endoscopic image to the lesion detection unit 22. When the current display mode is the resection guide display mode, the mode setting unit 21 outputs an endoscopic image to the marking detection unit 23. When the current display mode is the biopsy guide display mode, the mode setting unit 21 outputs an endoscopic image to the invasion depth estimation unit 24.
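The mode-based routing described above can be sketched as a simple dispatch. All names below are illustrative placeholders, not identifiers from the disclosure.

```python
# Hypothetical sketch of the routing performed by the mode setting unit 21:
# each display mode sends the endoscopic image to a different processing unit.

NORMAL, RESECTION_GUIDE, BIOPSY_GUIDE = "normal", "resection_guide", "biopsy_guide"

def route_endoscopic_image(image, display_mode):
    """Return the name of the unit that should receive the endoscopic image."""
    routes = {
        NORMAL: "lesion_detection_unit_22",
        RESECTION_GUIDE: "marking_detection_unit_23",
        BIOPSY_GUIDE: "invasion_depth_estimation_unit_24",
    }
    return routes[display_mode]
```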
  • the lesion detection unit 22 performs processing to detect lesions contained in an endoscopic image based on the endoscopic image output from the mode setting unit 21. In such processing, for example, a machine learning model trained to detect lesions contained in an endoscopic image can be used. Furthermore, when the endoscopic image acquired from the mode setting unit 21 contains a lesion, the lesion detection unit 22 outputs the endoscopic image and lesion detection information including information indicating the position of the lesion to the display image generation unit 26. Furthermore, when the endoscopic image acquired from the mode setting unit 21 does not contain a lesion, the lesion detection unit 22 outputs the endoscopic image to the display image generation unit 26.
  • the marking detection unit 23 performs a process of detecting the marking position included in the endoscopic image based on the endoscopic image output from the mode setting unit 21.
  • the marking detection unit 23 also outputs marking detection information including the endoscopic image acquired from the mode setting unit 21 and information indicating the marking position detected based on the endoscopic image to the guide information generation unit 25.
  • marking is performed by cauterizing biological tissue with energy supplied through an energy treatment tool such as a high-frequency knife.
  • the marking position in the endoscopic image is represented as a white area of a predetermined size or more. Therefore, the marking detection unit 23 only needs to have a function capable of detecting a white area of a predetermined size or more contained in the endoscopic image as the marking position.
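The detection described above (a white area of a predetermined size or more) can be sketched with a brightness threshold and connected-component grouping. The threshold and minimum-size values are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch of marking detection: treat pixels brighter than a threshold
# as "white" and keep connected white regions whose pixel count meets a
# minimum size, returning each region's centroid as a marking position.

def detect_marking_positions(gray, white_thresh=230, min_size=4):
    """Return centroids (row, col) of white regions of at least min_size pixels."""
    h, w = len(gray), len(gray[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for r in range(h):
        for c in range(w):
            if gray[r][c] >= white_thresh and not seen[r][c]:
                # Flood-fill the connected white region (4-connectivity).
                stack, region = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                                and gray[ny][nx] >= white_thresh):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(region) >= min_size:  # discard small specular highlights
                    row = sum(p[0] for p in region) / len(region)
                    col = sum(p[1] for p in region) / len(region)
                    centroids.append((row, col))
    return centroids
```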
  • the invasion depth estimation unit 24 performs processing to estimate the invasion depth of the lesion contained in the endoscopic image based on the endoscopic image output from the mode setting unit 21.
  • In such processing, for example, a machine learning model that has been trained to represent the invasion depth of the lesion as an estimated value (hereinafter also referred to as a "score") within a predetermined range for each pixel included in the endoscopic image can be used. The range of the aforementioned estimated value is desirably set to, for example, 0 to 1.
  • the invasion depth estimation unit 24 outputs to the guide information generation unit 25 invasion depth estimation information including the endoscopic image acquired from the mode setting unit 21 and information indicating the estimation result of the invasion depth of the lesion included in the endoscopic image.
  • the guide information generating unit 25 functions as a guide information generating means.
  • the guide information generating unit 25 also detects the current display mode of the endoscopic examination support device 1 based on a mode setting signal.
  • the guide information generating unit 25 also generates guide information according to the current display mode of the endoscopic examination support device 1 based on either the marking detection information output from the marking detection unit 23 or the invasion depth estimation information output from the invasion depth estimation unit 24.
  • When the current display mode is the resection guide display mode, the guide information generating unit 25 generates guide information capable of assisting in the resection of a lesion based on the marking detection information output from the marking detection unit 23. Then, the guide information generating unit 25 outputs to the display image generating unit 26 resection guide information including the endoscopic image of the marking detection information acquired from the marking detection unit 23 and the guide information generated based on the marking detection information.
  • When the current display mode is the biopsy guide display mode, the guide information generating unit 25 generates guide information capable of assisting in the sampling of a lesion based on the invasion depth estimation information output from the invasion depth estimation unit 24. Then, the guide information generating unit 25 outputs biopsy guide information including the endoscopic image of the invasion depth estimation information acquired from the invasion depth estimation unit 24 and guide information generated based on the invasion depth estimation information to the display image generating unit 26.
  • the guide information generating unit 25 can generate guide information to provide assistance according to the content of the treatment to be performed on the lesion.
  • the display image generating unit 26 functions as a display image generating means.
  • the display image generating unit 26 detects the current display mode of the endoscopic examination support device 1 based on a mode setting signal.
  • the display image generating unit 26 generates a display image according to the current display mode of the endoscopic examination support device 1 based on any one of the lesion detection information output from the lesion detection unit 22, the resection guide information output from the guide information generating unit 25, or the biopsy guide information output from the guide information generating unit 25.
  • the display image generating unit 26 outputs the display image generated as described above to the display device 2.
  • The surgeon, for example, starts an endoscopic examination of the subject with the display mode of the endoscopic examination support device 1 set to the normal display mode.
  • the display image generation unit 26 determines whether or not a lesion is included in the endoscopic image of the lesion detection information based on the lesion detection information output from the lesion detection unit 22.
  • When the display image generating unit 26 obtains a determination result that the endoscopic image of the lesion detection information does not include a lesion, it outputs the endoscopic image to the display device 2 as the display image. When the display image generating unit 26 obtains a determination result that the endoscopic image of the lesion detection information includes a lesion, it generates a display image including the endoscopic image and a detection frame surrounding the lesion in the endoscopic image, and outputs the generated display image to the display device 2. It is desirable that the display state of the detection frame can be switched on or off in response to, for example, an input signal from the input unit 14.
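One simple way to obtain such a detection frame is to take the bounding box of the lesion pixels. This is a hedged sketch assuming the lesion position is available as a binary mask; the disclosure does not specify the representation.

```python
# Illustrative sketch: derive a detection frame (bounding box) from a binary
# lesion mask, as a stand-in for the lesion position information described.

def detection_frame(mask):
    """Return (top, left, bottom, right) enclosing all mask pixels set to 1,
    or None when the image contains no lesion pixels."""
    coords = [(r, c) for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    if not coords:
        return None  # no lesion: display the endoscopic image unchanged
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows), max(cols))
```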
  • the surgeon checks the presence or absence of a lesion in the endoscopic image based on the endoscopic image displayed on the display device 2 in the normal display mode. In addition, when the surgeon checks that a lesion is present in the endoscopic image displayed on the display device 2 in the normal display mode, he or she determines whether resection or biopsy is the appropriate treatment to be performed on the lesion.
  • When the surgeon determines that resection is an appropriate treatment for the lesion included in the endoscopic image, the surgeon performs an operation to switch the display mode of the endoscopic examination support device 1 from the normal display mode to the resection guide display mode.
  • When the surgeon determines that biopsy is an appropriate treatment for the lesion included in the endoscopic image, the surgeon performs an operation to switch the display mode of the endoscopic examination support device 1 from the normal display mode to the biopsy guide display mode.
  • Fig. 4 is a diagram showing an example of an endoscopic image obtained before marking is performed.
  • the mode setting unit 21 acquires an endoscopic image EGB as shown in FIG. 5 while marking is being performed on the tissue surrounding the lesion LCA, and outputs the acquired endoscopic image EGB to the marking detection unit 23.
  • the endoscopic image EGB corresponds to an endoscopic image capturing an image of a lesion inside the subject's body.
  • FIG. 5 is a diagram showing an example of an endoscopic image acquired while marking is being performed.
  • the marking detection unit 23 detects three marking positions MK contained in the endoscopic image EGB based on the endoscopic image EGB.
  • the marking detection unit 23 also outputs marking detection information MJB including the endoscopic image EGB and information indicating the three marking positions MK to the guide information generation unit 25.
  • the guide information generating unit 25 generates a marking guide MGB corresponding to a line segment connecting adjacent marking positions among the three marking positions MK based on the marking detection information MJB.
  • the marking guide MGB may be generated as a broken line or a curved line, so long as it is a line segment connecting the three marking positions MK.
  • the guide information generating unit 25 then outputs resection guide information RJB, which includes the endoscopic image EGB and the marking guide MGB, to the display image generating unit 26.
  • the display image generating unit 26 generates a resection guide image RSB as shown in FIG. 6 by superimposing the marking guide MGB on the endoscopic image EGB of the resection guide information RJB.
  • the display image generating unit 26 also generates a display image including the resection guide image RSB, and outputs the generated display image to the display device 2.
  • FIG. 6 is a diagram showing an example of a resection guide image corresponding to the endoscopic image of FIG. 5.
  • the mode setting unit 21 acquires an endoscopic image EGC as shown in FIG. 7, and outputs the acquired endoscopic image EGC to the marking detection unit 23.
  • FIG. 7 is a diagram showing an example of an endoscopic image acquired when marking is completed.
  • the marking detection unit 23 detects nine marking positions MK contained in the endoscopic image EGC based on the endoscopic image EGC.
  • the marking detection unit 23 also outputs marking detection information MJC including the endoscopic image EGC and information indicating the nine marking positions MK to the guide information generation unit 25.
  • the guide information generating unit 25 generates a marking guide MGC corresponding to a line segment connecting adjacent marking positions among the nine marking positions MK based on the marking detection information MJC. In such a case, it is desirable that the marking guide MGC is generated as a closed line segment indicating the boundary of the area including the lesion LCA. Furthermore, as long as the marking guide MGC is a line segment connecting the nine marking positions MK, it may be generated as a broken line or a curved line. Then, the guide information generating unit 25 outputs resection guide information RJC including the endoscopic image EGC and the marking guide MGC to the display image generating unit 26.
  • the guide information generating unit 25 can generate information indicating the resection range of the lesion LCA as guide information.
  • the guide information generating unit 25 can generate a marking guide MGC corresponding to a line segment connecting adjacent marking positions MK among a plurality of marking positions MK detected from the endoscopic image EGC as information indicating the resection range of the lesion LCA.
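Building a closed guide from the detected marking positions can be sketched as follows. The disclosure does not fix how "adjacent" markings are determined; sorting the points by angle around their centroid is one plausible assumption for a roughly convex marking ring.

```python
import math

# Hedged sketch of building a closed marking guide (e.g. MGC) from detected
# marking positions: order the points by angle around their centroid and
# connect neighbours, repeating the first point to close the loop.

def marking_guide(points):
    """Return the marking positions as a closed polyline (first point repeated)."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    ordered = sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    return ordered + ordered[:1]
```

The returned polyline could equally be rendered as straight, broken, or curved segments, as the description allows.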
  • the display image generating unit 26 generates a resection guide image RSC as shown in FIG. 8 by superimposing the marking guide MGC on the endoscopic image EGC of the resection guide information RJC.
  • the display image generating unit 26 also generates a display image including the resection guide image RSC, and outputs the generated display image to the display device 2.
  • the display image generating unit 26 can generate a display image including the endoscopic image EGC and the marking guide MGC.
  • FIG. 8 is a diagram showing an example of a resection guide image corresponding to the endoscopic image of FIG. 7.
  • the surgeon can perform marking from the fourth location onwards while checking the marking guide MGB in the resection guide image RSB. Furthermore, according to the specific example described above, the surgeon can suitably resect the lesion LCA, for example, by applying energy from a treatment tool to the tissue at the location where the marking guide MGC in the resection guide image RSC is superimposed.
  • Fig. 9 is a diagram showing an example of an endoscopic image obtained before the tissue for the biopsy is sampled.
  • the mode setting unit 21 outputs the endoscopic image EGD acquired before the lesion LCD is sampled to the invasion depth estimation unit 24.
  • the endoscopic image EGD corresponds to an endoscopic image of a lesion inside the subject's body.
  • the depth of invasion estimation unit 24 estimates the depth of invasion of the lesion LCD included in the endoscopic image EGD, for example, using a machine learning model that has been trained to represent the depth of invasion of the lesion with a score ranging from 0 to 1 for each pixel included in the endoscopic image. According to this processing, the depth of invasion estimation unit 24 can obtain an estimation result in which, for example, the score of a pixel in the endoscopic image EGD where the depth of invasion of the lesion is relatively deep is 1 or a value close to 1, and the score of a pixel in the endoscopic image EGD where the depth of invasion of the lesion is relatively shallow is a value close to 0.
  • the depth of invasion estimation unit 24 can obtain an estimation result in which, for example, the score of a pixel in the endoscopic image EGD where no lesion is present is 0. Furthermore, the depth of invasion estimation unit 24 outputs the endoscopic image EGD and the depth of invasion estimation information SJD including information indicating the estimated result of the depth of invasion of the lesion LCD to the guide information generation unit 25.
  • Based on the invasion depth estimation information SJD, the guide information generating unit 25 generates a plurality of guide lines GL for making visible the positions of tissue recommended for collection from the lesion LCD. Specifically, the guide information generating unit 25 generates, for example, a line segment corresponding to the boundary of an area in the endoscopic image EGD of the invasion depth estimation information SJD where the score indicating the estimation result of the invasion depth of the lesion is equal to or greater than a threshold value THU, as the guide line GLU.
  • the guide information generating unit 25 also generates, for example, a line segment corresponding to the boundary of an area where the score indicating the estimated result of the invasion depth of the lesion is equal to or greater than the threshold value THD as the guide line GLD in the endoscopic image EGD of the invasion depth estimation information SJD.
  • the threshold value THD is set to a value less than the threshold value THU.
  • the guide information generating unit 25 then outputs the biopsy guide information BJD including the endoscopic image EGD and the guide lines GLU and GLD to the display image generating unit 26.
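Extracting the two guide lines from the per-pixel score map can be sketched as a boundary trace at each threshold. The threshold values below are illustrative placeholders for THU and THD, which the disclosure does not fix numerically (beyond THD being less than THU).

```python
# Minimal sketch of deriving the guide lines described above from a per-pixel
# invasion-depth score map: for each threshold, collect the pixels that lie on
# the boundary of the region whose score meets the threshold.

def guide_line(scores, thresh):
    """Return boundary pixels (row, col) of the region where score >= thresh."""
    h, w = len(scores), len(scores[0])
    boundary = []
    for r in range(h):
        for c in range(w):
            if scores[r][c] >= thresh:
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    # A region pixel next to the image edge or a sub-threshold
                    # pixel lies on the region boundary.
                    if not (0 <= nr < h and 0 <= nc < w) or scores[nr][nc] < thresh:
                        boundary.append((r, c))
                        break
    return boundary

def biopsy_guides(scores, thu=0.8, thd=0.5):
    """GLU from the higher threshold THU, GLD from the lower threshold THD."""
    return guide_line(scores, thu), guide_line(scores, thd)
```

Because THD < THU, the GLD boundary always encloses (or equals) the GLU boundary, matching the nested guide lines shown in the biopsy guide image.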
  • the guide information generating unit 25 can generate, as guide information, information indicating the position of tissue that is recommended to be collected from the lesion LCD when a biopsy is performed on the lesion LCD.
  • the guide information generating unit 25 can also generate, as information indicating the position of tissue that is recommended to be collected from the lesion LCD, a plurality of line segments indicating the boundary of an area including tissue that is estimated to have a relatively deep depth of invasion in the lesion LCD.
  • the guide information generating unit 25 can also generate, as the above-mentioned plurality of line segments, a guide line GLU indicating the boundary of an area in the lesion LCD that is estimated to have a first depth or more of invasion, and a guide line GLD indicating the boundary of an area in the lesion LCD that is estimated to have a second depth or more that is less than the first depth.
  • the display image generating unit 26 generates a biopsy guide image BSD as shown in FIG. 10 by superimposing guide lines GLU and GLD having different display modes on the endoscopic image EGD of the biopsy guide information BJD.
  • FIG. 10 is a diagram showing an example of a biopsy guide image corresponding to the endoscopic image in FIG. 9.
  • the display image generating unit 26 can display guide lines GLU and GLD corresponding to the lesion LCD that spreads in an elliptical shape around a narrow area.
  • the guide line GLU is represented by a solid line
  • the guide line GLD is represented by a dashed line. That is, the display image generating unit 26 can generate the biopsy guide image BSD of FIG. 10 by superimposing the guide lines GLU and GLD having different line types on the endoscopic image EGD of the biopsy guide information BJD.
  • the display image generating unit 26 can generate a display image including the endoscopic image EGD and the guide lines GLU and GLD.
  • the display image generating unit 26 generates a display image including the biopsy guide image BSD, and outputs the generated display image to the display device 2.
  • the display image generating unit 26 generates a biopsy guide image BSD including a plurality of guide lines GL that are displayed in different display modes.
  • the display image generating unit 26 may generate a biopsy guide image BSD that includes guide lines GLU and GLD that are colored in different colors.
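As a concrete illustration of superimposing the guide lines GLU and GLD in different display modes, the sketch below overlays the boundaries of two nested regions on an image in two different colors. This is a minimal pure-Python sketch under assumed data types (an image as a list of lists of RGB tuples, regions as boolean masks); the function name `overlay_guide_lines` and the specific colors are illustrative assumptions, not part of the disclosure.

```python
def overlay_guide_lines(image, region_glu, region_gld,
                        color_glu=(255, 0, 0), color_gld=(0, 255, 0)):
    """Superimpose the boundaries of two nested lesion regions, as guide
    lines GLU and GLD, on an RGB image in different colors."""
    h, w = len(region_gld), len(region_gld[0])

    def boundary(mask):
        # A pixel lies on the boundary if it is inside the region but at
        # least one 4-neighbour is outside the region (or off the image).
        pts = set()
        for y in range(h):
            for x in range(w):
                if not mask[y][x]:
                    continue
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                        pts.add((y, x))
                        break
        return pts

    out = [row[:] for row in image]       # leave the input image untouched
    for y, x in boundary(region_gld):     # shallower region (GLD) first
        out[y][x] = color_gld
    for y, x in boundary(region_glu):     # deeper region (GLU) drawn on top
        out[y][x] = color_glu
    return out
```

Drawing the GLD boundary first and the GLU boundary on top keeps the line for the deeper region visible where the two boundaries would coincide.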
  • while checking the guide line GLU in the biopsy guide image BSD, the surgeon can collect, as tissue for biopsy, tissue from the area of the lesion LCD surrounded by the guide line GLU.
  • the surgeon can collect tissue estimated to have the deepest depth of invasion as tissue for biopsy while checking the guide line GLU in the biopsy guide image BSD.
  • the surgeon can collect tissue usable for biopsy from the lesion LCD while checking the guide line GLD in the biopsy guide image BSD.
  • Fig. 11 is a flowchart showing an example of processing performed in the endoscopic examination support device according to the first embodiment. Note that the endoscopic examination support device 1 repeats the processing shown in Fig. 11 during the period from the start to the end of an endoscopic examination.
  • the endoscopic examination support device 1 sets the current display mode to one of the normal display mode, the resection guide display mode, and the biopsy guide display mode based on the mode setting signal (step S11).
  • When the current display mode is set to the normal display mode, the endoscopic examination support device 1 detects a lesion contained in the endoscopic image (step S12). Then, the endoscopic examination support device 1 generates a display image according to the lesion detection result obtained in step S12 (step S13). With this processing, when a lesion is contained in the endoscopic image, a display image showing the position of the lesion in the endoscopic image can be displayed on the display device 2; when no lesion is contained in the endoscopic image, the endoscopic image itself can be displayed on the display device 2 as the display image.
  • When the current display mode is set to the resection guide display mode, the endoscopic examination support device 1 detects the marking positions included in the endoscopic image (step S14). Based on the marking position detection result obtained in step S14, the endoscopic examination support device 1 generates a marking guide by connecting adjacent marking positions among the multiple marking positions (step S15). The endoscopic examination support device 1 then generates a display image including a resection guide image in which the marking guide generated in step S15 is superimposed on the endoscopic image (step S16). This process makes it possible to display on the display device 2 a display image in which the resection range of the lesion corresponding to the multiple marking positions is clearly indicated by the marking guide.
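The connection of adjacent marking positions into a marking guide (steps S14 and S15) can be sketched as follows. The disclosure does not specify how "adjacent" is determined; this sketch assumes the marking positions surround the lesion and orders them by angle around their centroid before closing the loop, which is one plausible interpretation. The function name is an assumption.

```python
import math

def marking_guide_segments(marking_positions):
    """Order marking positions around their centroid and return the line
    segments connecting each position to its neighbour, closing the loop
    so that the segments enclose the resection range."""
    n = len(marking_positions)
    cx = sum(x for x, _ in marking_positions) / n
    cy = sum(y for _, y in marking_positions) / n
    # Sorting by polar angle around the centroid places each marking
    # position between its two spatial neighbours on the loop.
    ordered = sorted(marking_positions,
                     key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    return [(ordered[i], ordered[(i + 1) % n]) for i in range(n)]
```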
  • When the current display mode is set to the biopsy guide display mode, the endoscopic examination support device 1 performs a process of estimating the depth of invasion of the lesion contained in the endoscopic image (step S17). Based on the estimation result of the invasion depth obtained in step S17, the endoscopic examination support device 1 generates a plurality of guide lines indicating the boundaries of areas including tissue that is estimated to have a relatively deep invasion depth in the lesion (step S18). The endoscopic examination support device 1 then generates a display image including a biopsy guide image in which the plurality of guide lines generated in step S18 are superimposed on the endoscopic image (step S19). This process allows the display device 2 to display a display image in which the position of tissue that is recommended to be collected from the lesion as tissue for biopsy is clearly indicated by the plurality of guide lines.
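The two-threshold structure behind the guide lines GLU and GLD (a first depth and a smaller second depth) can be sketched as below. The invasion-depth map itself would come from the estimation process of step S17, for example a trained model; here it is simply taken as input, and the function name is an assumption for illustration.

```python
def guide_regions(depth_map, first_depth, second_depth):
    """Threshold an estimated invasion-depth map into the two nested
    regions whose boundaries become the guide lines GLU and GLD."""
    if not second_depth < first_depth:
        raise ValueError("second_depth must be less than first_depth")
    # Region estimated to be at least first_depth deep (guide line GLU).
    region_glu = [[d >= first_depth for d in row] for row in depth_map]
    # Region estimated to be at least second_depth deep (guide line GLD).
    region_gld = [[d >= second_depth for d in row] for row in depth_map]
    return region_glu, region_gld
```

Because the second depth is less than the first depth, the GLU region is always contained in the GLD region, so the guide line GLU is drawn inside the guide line GLD.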
  • a marking guide can be displayed that indicates the resection range of the lesion corresponding to the position marked by the surgeon.
  • a guide line corresponding to the tissue that is recommended to be collected from the lesion can be displayed. Therefore, according to this embodiment, the quality of the treatment performed on the lesion can be ensured.
  • the mode setting unit 21 may set the display mode of the endoscopic examination support device 1 to the resection guide display mode when detecting that the surgeon has completed the marking. Specifically, the mode setting unit 21 may set the display mode of the endoscopic examination support device 1 to the resection guide display mode when detecting, for example, an input signal indicating the completion of the marking, which is generated in response to the operation of the input unit 14 by the surgeon.
  • the mode setting unit 21 may inquire of the surgeon as to whether or not to change the display mode of the endoscopic examination support device 1 to a resection guide display mode, or whether or not to change the display mode of the endoscopic examination support device 1 to a biopsy guide display mode, depending on the type of treatment tool included in the endoscopic image.
  • the mode setting unit 21 may inquire of the surgeon as to whether or not to set the display mode of the endoscopic examination support device 1 to a resection guide display mode. Also, when the mode setting unit 21 detects that an endoscopic image contains a treatment tool used to biopsy a lesion, such as a biopsy forceps, the mode setting unit 21 may inquire of the surgeon as to whether or not to set the display mode of the endoscopic examination support device 1 to a biopsy guide display mode.
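The tool-dependent inquiry described above can be sketched as a simple mapping from the detected treatment-tool type to the display mode to propose. Apart from the biopsy forceps mentioned in the text, the tool names (e.g. a high-frequency knife or snare for resection) and the function name are assumptions for illustration.

```python
def propose_display_mode(detected_tool):
    """Return the display mode the mode setting unit could propose to the
    surgeon for the treatment tool detected in the endoscopic image, or
    None when no mode change should be suggested."""
    resection_tools = {"high_frequency_knife", "snare"}  # assumed names
    biopsy_tools = {"biopsy_forceps"}
    if detected_tool in resection_tools:
        return "resection_guide"
    if detected_tool in biopsy_tools:
        return "biopsy_guide"
    return None
```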
  • the mode setting unit 21 may inquire of the surgeon as to whether the display mode of the endoscopic examination support device 1 should be set to the resection guide display mode or the biopsy guide display mode.
  • the mode setting unit 21 may inquire of the surgeon as to whether or not to change the display mode of the endoscopic examination support device 1.
  • the mode setting unit 21 may inquire of the surgeon as to whether or not to set the display mode of the endoscopic examination support device 1 to the resection guide display mode. Also, for example, when a still image to be recorded includes a treatment tool, the mode setting unit 21 may make an inquiry to the surgeon similar to that of variant example 2. For example, when a still image to be recorded includes a lesion, the mode setting unit 21 may make an inquiry to the surgeon similar to that of variant example 3 or 4.
  • FIG. 12 is a block diagram showing an example of the functional configuration of the endoscopic examination support device according to the second embodiment.
  • the endoscopic examination support device 500 has the same hardware configuration as the endoscopic examination support device 1.
  • the endoscopic examination support device 500 also includes an image acquisition means 501, a guide information generation means 502, and a display image generation means 503.
  • FIG. 13 is a flowchart showing an example of processing performed in the endoscopic examination support device according to the second embodiment.
  • the image acquisition means 501 acquires an endoscopic image of a lesion inside the subject's body (step S51).
  • the guide information generating means 502 generates guide information to provide support according to the content of the treatment to be performed on the lesion (step S52).
  • the display image generating means 503 generates a display image including an endoscopic image and guide information (step S53).
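The three means of the second embodiment and the flow of Fig. 13 (steps S51 to S53) can be summarized as an abstract pipeline. The class and method names below are illustrative assumptions; the disclosure defines only the three means and the order in which they operate.

```python
from abc import ABC, abstractmethod

class EndoscopyExamSupport(ABC):
    """Sketch of the second embodiment: image acquisition means 501,
    guide information generation means 502, and display image generation
    means 503, chained as in steps S51 to S53 of Fig. 13."""

    @abstractmethod
    def acquire_image(self):
        """Return an endoscopic image of a lesion inside the subject (S51)."""

    @abstractmethod
    def generate_guide_info(self, image):
        """Return guide information matching the planned treatment (S52)."""

    @abstractmethod
    def generate_display_image(self, image, guide_info):
        """Return a display image combining image and guide info (S53)."""

    def support_step(self):
        # One pass through steps S51 -> S52 -> S53.
        image = self.acquire_image()
        guide_info = self.generate_guide_info(image)
        return self.generate_display_image(image, guide_info)
```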
  • This embodiment ensures the quality of treatment performed on the lesion.
  • An endoscopic examination support device comprising:
  • the guide information generating means generates, as information indicating a resection range of the lesion, a line segment connecting adjacent marking positions among a plurality of marking positions detected from the endoscopic image;
  • the endoscopic examination support device wherein the display image generating means generates a display image including a guide image in which the line segment is superimposed on the endoscopic image.
  • the guide information generating means generates a plurality of line segments indicating the boundary of an area including tissue estimated to have a relatively deep penetration depth in the lesion, as information indicating the position of tissue recommended to be collected from the lesion.
  • the guide information generating means generates, as the plurality of line segments, a first line segment indicating a boundary of a region in which a depth of invasion in the lesion is estimated to be equal to or greater than a first depth, and a second line segment indicating a boundary of a region in which a depth of invasion in the lesion is estimated to be equal to or greater than a second depth that is less than the first depth;
  • Appendix 8: An endoscopic examination support method comprising: acquiring an endoscopic image of a lesion inside a subject's body; generating guide information for providing assistance according to the content of a treatment to be performed on the lesion; and generating a display image including the endoscopic image and the guide information.
  • Appendix 9: A recording medium having recorded thereon a program for causing a computer to execute processing comprising: acquiring an endoscopic image of a lesion inside a subject's body; generating guide information for providing assistance according to the content of a treatment to be performed on the lesion; and generating a display image including the endoscopic image and the guide information.


Abstract

In this endoscopic examination support device, an image acquisition means acquires an endoscopic image capturing a lesion inside a subject's body. A guide information generation means generates guide information for providing assistance according to the content of the treatment to be performed on the lesion. A display image generation means generates a display image including the endoscopic image and the guide information.
PCT/JP2023/011075 2023-03-22 2023-03-22 Endoscopic examination support device, endoscopic examination support method, and recording medium Pending WO2024195020A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/011075 WO2024195020A1 (fr) 2023-03-22 2023-03-22 Endoscopic examination support device, endoscopic examination support method, and recording medium


Publications (1)

Publication Number Publication Date
WO2024195020A1 true WO2024195020A1 (fr) 2024-09-26

Family

ID=92841099

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/011075 Pending WO2024195020A1 (fr) Endoscopic examination support device, endoscopic examination support method, and recording medium

Country Status (1)

Country Link
WO (1) WO2024195020A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018532467A (ja) * 2015-09-18 2018-11-08 Auris Health, Inc. Navigation of tubular networks
WO2022025151A1 (fr) * 2020-07-30 2022-02-03 Anaut Inc. Computer program, learning model generation method, surgical assistance device, and information processing method
WO2022195744A1 (fr) * 2021-03-17 2022-09-22 Olympus Medical Systems Corp. Control device, endoscope device, and control method
JP2023007331A (ja) * 2021-06-28 2023-01-18 Fujifilm Corporation Endoscope system, medical image processing device, and operation method therefor


Similar Documents

Publication Publication Date Title
CN109998678B (zh) Using augmented reality to assist navigation during medical procedures
JP5486432B2 (ja) Image processing device, operating method therefor, and program
EP1867271B1 (fr) Endoscope insertion shape detection device
JP2021137564A (ja) Electrophysiological user interface
EP3618747B1 (fr) Medical system
WO2019202827A1 (fr) Image processing system, image processing device, image processing method, and program
JP2020531099A (ja) Method for spatially locating points of interest during a surgical procedure
JPWO2020165978A1 (ja) Image recording device, image recording method, and image recording program
JP7493285B2 (ja) Information processing device, information processing method, and computer program
JP2006198032A (ja) Surgery support system
US12207889B2 (en) Endoscope with procedure guidance
EP4285810A1 (fr) Medical image processing device, method, and program
JPWO2022202401A5 (fr)
WO2023126999A1 (fr) Image processing device, image processing method, and storage medium
WO2024195020A1 (fr) Endoscopic examination support device, endoscopic examination support method, and recording medium
JP2003153876A (ja) Surgical operation support device
JP4546043B2 (ja) Virtual image display device
JP2003339735A (ja) Surgery support device
WO2025032671A1 (fr) Endoscopic examination support device, endoscopic examination support method, and recording medium
JP2013094173A (ja) Observation system, marking device, observation device, and endoscopic diagnosis system
WO2025104800A1 (fr) Endoscopic examination support device, endoscopic examination support method, and recording medium
US20250204763A1 (en) Medical camera system and method for capturing images and processing said images
US20240335093A1 (en) Medical support device, endoscope system, medical support method, and program
WO2025173164A1 (fr) Endoscopic inspection support device, endoscopic inspection support method, and recording medium
WO2024111106A1 (fr) Endoscopy assistance device, endoscopy assistance method, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23928603

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE