WO2019087969A1 - Endoscope system, notification method, and program - Google Patents
Endoscope system, notification method, and program
- Publication number
- WO2019087969A1 (PCT/JP2018/039901)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- notification
- feature
- endoscope
- unit
- Prior art date
- Legal status (an assumption, not a legal conclusion)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- the present invention relates to an endoscope system, a notification method, and a program, and more particularly to display of a virtual endoscopic image.
- the endoscopic image is an image captured using an imaging device such as a CCD (Charge Coupled Device).
- the endoscopic image is an image in which the color and texture of the inside of the tubular structure are clearly expressed.
- an endoscopic image is a two-dimensional image representing the inside of a tubular structure. For this reason, it is difficult to grasp which position in the tubular structure the endoscopic image represents.
- the virtual endoscopic image may be used as a navigation image to guide the endoscope to a target position in the tubular structure.
- CT is an abbreviation of Computed Tomography.
- MRI is an abbreviation of Magnetic Resonance Imaging.
- the image of the tubular structure is extracted from the three-dimensional inspection image, and a correspondence is acquired between the image of the tubular structure and the real endoscopic image obtained by imaging with the endoscope.
- a method has been proposed in which a virtual endoscopic image at the current position of the endoscope is generated from the three-dimensional inspection image of the tubular structure and displayed.
- Patent Document 1 describes an endoscopic observation support device that detects a lesion such as a polyp from three-dimensional image data and reports that the observation position of the endoscope has reached the vicinity of the lesion.
- Patent Document 2 describes a medical image processing apparatus that extracts volume data representing a region to be imaged, identifies the position and direction of the tip of the endoscope probe in the volume data, and generates and displays a virtual endoscopic image in an arbitrary field of view.
- the medical image processing apparatus described in Patent Document 2 specifies the shape and position of a tumor candidate based on volume data, and displays a marker superimposed on the position of the tumor candidate. This enables the operator to recognize the presence or absence of a tumor candidate using a marker.
- Patent Document 3 describes a medical image display apparatus that detects a blind area in a developed image of a luminal organ in a subject and notifies the operator of the presence or absence of the blind area.
- the medical image processing device described in Patent Document 3 displays, when there is a blind spot area, character information indicating that the blind spot area is present.
- Patent Document 3 describes another display mode in which the position of the blind spot area is colored and displayed using a marker when the blind spot area is present.
- Patent Document 4 describes an endoscope system that matches the composition of a real endoscopic image and a virtual endoscopic image.
- the endoscope system described in Patent Document 4 detects a characteristic shape from the virtual endoscopic image, then changes the pixel values of the area of the color endoscopic image corresponding to that characteristic shape, realizing a display form distinguishable from other areas.
- Patent Document 5 describes a medical image processing apparatus that acquires volume data from a CT apparatus and generates and displays a three-dimensional image from the acquired volume data.
- the medical image processing apparatus described in Patent Document 5 receives an input operation for marking a feature portion of a three-dimensional image displayed on the display unit, and the mark is displayed on the display unit.
- Patent Document 5 describes that a characteristic part can be automatically set using image analysis.
- Patent Document 6 describes an endoscope apparatus including an endoscope for observing the inside of a subject, and a monitor for displaying an endoscope image acquired using the endoscope.
- the endoscope apparatus described in Patent Document 6 acquires an image corresponding to the subject image captured using the endoscope, and executes processing for detecting a lesion site each time an image is acquired.
- JP 2014-230612 A; JP 2011-139797 A; WO 2010/074058; JP 2006-61274 A; JP 2016-143194 A; JP 2008-301968 A
- Although the invention described in Patent Document 1 reports that the observation position of the endoscope has reached the vicinity of a lesion, it takes no measures for a lesion located in a blind spot of the observation range of the endoscope. The invention described in Patent Document 1 may therefore overlook such a lesion.
- Although the invention described in Patent Document 2 displays a marker superimposed on the position of a tumor candidate, it does not handle the case where the position of the tumor candidate is in a blind spot of the observation range of the endoscope. The invention described in Patent Document 2 may therefore overlook a lesion located in such a blind spot.
- The invention described in Patent Document 3 informs the operator of the presence or absence of a blind area in a developed image regardless of whether a lesion is present, and does not notify whether a lesion exists in that blind area. The invention described in Patent Document 3 may therefore overlook a lesion located in a blind spot of the observation range of the endoscope.
- The invention described in Patent Document 4 changes the pixel values of a characteristic region in a virtual endoscopic image so that the characteristic region can be distinguished from other regions, but it is not applied to the discovery of a lesion or the like located in a blind spot of the observation range of the endoscope using an endoscopic image.
- Although the invention described in Patent Document 5 can automatically set a characteristic portion in an endoscopic image, Patent Document 5 does not describe the case where the characteristic region is in a blind spot of the observation range of the endoscope. The invention described in Patent Document 5 may therefore overlook a lesion located in such a blind spot.
- Although the invention described in Patent Document 6 can detect a lesion site in units of the frame images constituting an endoscopic image, Patent Document 6 does not address the case where the lesion site is in a blind spot of the observation range of the endoscope. The invention described in Patent Document 6 may therefore overlook a lesion located in such a blind spot.
- Thus, the inventions described in Patent Documents 1 to 6 share the problem that a lesion or the like that is difficult to detect in endoscopy, such as a lesion located in a blind spot of the observation range of the endoscope, may be overlooked, and this problem needs to be addressed.
- The present invention has been made in view of these circumstances, and aims to provide an endoscope system, a notification method, and a program capable of suppressing the oversight of a lesion or the like that is difficult to detect in an endoscopic examination using an endoscope.
- An endoscope system according to a first aspect includes: a first image input unit that inputs a virtual endoscopic image generated from a three-dimensional image of a subject; a second image input unit that inputs a real endoscopic image obtained by imaging an observation target of the subject using an endoscope; a matching unit that associates the virtual endoscopic image with the real endoscopic image; a first feature region extraction unit that extracts, from the virtual endoscopic image, a first feature region matching a prescribed first condition; a second feature region extraction unit that extracts, from the real endoscopic image, a second feature region matching a second condition corresponding to the first condition; and a notification unit that issues a notification when the first feature region is not associated with the second feature region.
- the first feature area is extracted from the virtual endoscopic image.
- the virtual endoscopic image is associated with the real endoscopic image.
- Notification is performed when the first feature area is not associated with the second feature area.
- From the notification, the user can recognize that the first feature region is not associated with the second feature region.
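As an illustration only (the names, coordinates, and tolerance below are assumptions, not taken from the publication), the associate-and-notify logic described above can be sketched as a nearest-neighbor pairing of regions in image coordinates:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A feature region with a position in image coordinates (hypothetical representation)."""
    x: float
    y: float
    label: str

def associate(first_regions, second_regions, tolerance=10.0):
    """Pair each first feature region (from the virtual image) with the nearest
    second feature region (from the real image) within `tolerance` pixels, and
    return the first feature regions left unmatched."""
    unmatched = []
    for f in first_regions:
        hit = any(((f.x - s.x) ** 2 + (f.y - s.y) ** 2) ** 0.5 <= tolerance
                  for s in second_regions)
        if not hit:
            unmatched.append(f)
    return unmatched

# A first feature region with no counterpart in the real image triggers a notification.
first = [Region(50, 60, "polyp candidate"), Region(200, 40, "fold back")]
second = [Region(52, 63, "lesion")]
for region in associate(first, second):
    print(f"NOTIFY: '{region.label}' was not found in the real endoscopic image")
```

Here the "fold back" region has no nearby second feature region, so it is reported; a real system would match regions via the registration between the virtual and real images rather than raw pixel distance.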
- The first image input unit may input a virtual endoscopic image generated in advance, or may acquire a three-dimensional inspection image, generate a virtual endoscopic image from the acquired image, and input the generated virtual endoscopic image.
- An example of the three-dimensional inspection image is one obtained by tomographic imaging of a subject using a CT apparatus.
- An example of the virtual endoscope is a virtual large intestine endoscope that uses the large intestine as the subject.
- An aspect including a first condition setting unit that sets the first condition applied to extraction of the first feature region is preferable.
- An aspect including a second condition setting unit that sets the second condition applied to extraction of the second feature region is preferable.
- A second aspect is the endoscope system of the first aspect in which, between the case where the first feature region is associated with a second feature region (that is, the first feature region is located in the observation range of the endoscope) and the case where it is not, at least one of the notification method and the notification level may be changed.
- notification can be made by distinguishing the case where the first feature area is associated with the second feature area and the case where the first feature area is not associated with the second feature area.
- A third aspect is the endoscope system of the second aspect further including a display unit that displays the real endoscopic image, in which the notification unit displays on the display unit a first notification image notifying that the first feature region is not associated with the second feature region and a second notification image notifying that the first feature region is associated with a second feature region located in the observation range of the endoscope, and the first notification image may be displayed enlarged relative to the second notification image.
- According to the third aspect, the case where the first feature region is not associated with the second feature region is emphasized relative to the case where it is.
- A fourth aspect is the endoscope system of the second aspect further including a display unit that displays the real endoscopic image, in which the notification unit displays the first notification image and the second notification image on the display unit, and the first notification image may be given a color different from that of the second notification image.
- According to the fourth aspect, the case where the first feature region is not associated with the second feature region is emphasized relative to the case where it is.
- A fifth aspect is the endoscope system of the second aspect further including a display unit that displays the real endoscopic image, in which the first notification image and a second notification image indicating that the first feature region is associated with the second feature region are displayed on the display unit, the first notification image being displayed blinking and the second notification image being displayed lit steadily.
- According to the fifth aspect, the case where the first feature region is not associated with the second feature region is emphasized relative to the case where it is.
- A sixth aspect is the endoscope system of the second aspect further including a display unit that displays the real endoscopic image, in which the first notification image and the second notification image indicating that the first feature region is associated with the second feature region are both displayed blinking on the display unit, and the blinking period of the first notification image may be made shorter than that of the second notification image.
- According to the sixth aspect, the case where the first feature region is not associated with the second feature region is emphasized relative to the case where it is.
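The third to sixth aspects vary the display attributes of the two notification images by association status. A minimal sketch combining the enlargement, color, and blinking schemes (all values are illustrative assumptions, not specified in the publication):

```python
def notification_style(matched: bool) -> dict:
    """Display attributes for a notification image; values are illustrative
    assumptions, not taken from the publication."""
    if matched:
        # Second notification image: the region was found in the real image.
        return {"scale": 1.0, "color": "green", "blink_period_ms": 0}  # lit steadily
    # First notification image: the region was NOT found, so it is emphasized
    # (enlarged, different color, fast blinking).
    return {"scale": 1.5, "color": "red", "blink_period_ms": 300}
```

A renderer would consume these attributes each frame; the key point is only that the unmatched case is rendered more prominently than the matched case.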
- The endoscope system according to a seventh aspect is the endoscope system according to any one of the third to sixth aspects, wherein the first notification image and the second notification image, generated separately from the real endoscopic image, may be displayed on the display unit superimposed on the real endoscopic image.
- According to the seventh aspect, the real endoscopic image can be highlighted without processing the real endoscopic image itself.
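Such non-destructive superimposition can be sketched as alpha blending a separately generated notification image onto a copy of the frame, leaving the source frame untouched (NumPy-based; the array shapes and the alpha value are illustrative assumptions):

```python
import numpy as np

def overlay_notification(frame: np.ndarray, marker: np.ndarray,
                         mask: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Alpha-blend a notification image onto a copy of the real endoscopic
    frame inside `mask`; the original frame array is not modified."""
    out = frame.copy()
    blend = (alpha * marker + (1.0 - alpha) * frame).astype(frame.dtype)
    out[mask] = blend[mask]
    return out

frame = np.zeros((4, 4, 3), dtype=np.uint8)        # stand-in endoscopic frame
marker = np.full((4, 4, 3), 255, dtype=np.uint8)   # stand-in notification image
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                              # marker region only
result = overlay_notification(frame, marker, mask)
```

Because the blend is written to a copy, the unprocessed frame remains available for recording or diagnosis, which is the point of the seventh aspect.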
- In an eighth aspect, the display unit may display the virtual endoscopic image together with the position of the endoscope in the virtual endoscopic image.
- the operator of the endoscope can recognize the position of the virtual endoscopic image corresponding to the observation position in the real endoscopic image.
- A ninth aspect is the endoscope system according to any one of the third to seventh aspects, wherein the display unit may display the virtual endoscopic image and information on the first feature region.
- the operator of the endoscope can recognize the first feature area in the virtual endoscopic image.
- In a tenth aspect, the display unit may display the first feature region in an enlarged manner.
- the first feature area in the virtual endoscopic image can be easily viewed.
- In an eleventh aspect, the display unit may display the first feature region in a blinking manner.
- the first feature area in the virtual endoscopic image can be easily viewed.
- A twelfth aspect includes a notification sound output unit that outputs a notification sound, and the notification unit may use the notification sound output unit to output a first notification sound indicating that the first feature region is not associated with the second feature region.
- the first notification sound is output when the first feature area is not associated with the second feature area.
- A thirteenth aspect is the endoscope system according to the twelfth aspect, wherein the notification unit may output, using the notification sound output unit, a second notification sound, different from the first notification sound, indicating that the first feature region is associated with the second feature region.
- notification can be made by distinguishing the case where the first feature region is not associated with the second feature region and the case where the first feature region is associated with the second feature region.
- In a fourteenth aspect, the notification unit may make the volume of the first notification sound larger than that of the second notification sound.
- According to the fourteenth aspect, the case where the first feature region is not associated with the second feature region is emphasized relative to the case where it is.
- In a fifteenth aspect, the notification unit may change the notification level as the distance between the region of the real endoscopic image associated with the first feature region and the observation position of the real endoscopic image becomes shorter.
- According to the fifteenth aspect, the user can recognize that the region of the real endoscopic image associated with the first feature region is approaching the observation position of the endoscope.
- When changing the notification level, the notification level may be increased continuously or increased stepwise.
- In the fifteenth aspect, the region of the real endoscopic image associated with the first feature region includes at least one of the second feature region associated with the first feature region and a non-extraction region of the real endoscopic image associated with the first feature region.
- The non-extraction region of the real endoscopic image is a region that has not been extracted from the real endoscopic image as the second feature region.
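The distance-dependent notification level of the fifteenth aspect, in both its continuous and stepwise variants, could be sketched as follows (the distance range and number of steps are illustrative assumptions):

```python
def notification_level(distance_mm: float, max_distance_mm: float = 50.0,
                       stepwise: bool = False) -> float:
    """Notification level that rises as the associated region approaches the
    observation position: 0.0 when at or beyond `max_distance_mm`, 1.0 when
    the distance reaches zero. Scale and thresholds are illustrative."""
    d = min(max(distance_mm, 0.0), max_distance_mm)
    level = 1.0 - d / max_distance_mm
    if stepwise:
        # Quantize the continuous level into quarter steps (0.0, 0.25, ..., 1.0).
        return round(level * 4) / 4
    return level
```

The continuous variant could drive, say, volume or blink rate directly, while the stepwise variant maps naturally to a small set of discrete alert states.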
- In a sixteenth aspect, the first feature region extraction unit may extract the first feature region from the virtual endoscopic image in advance, before observation of the real endoscopic image.
- According to the sixteenth aspect, extraction of the first feature region from the virtual endoscopic image can be omitted during observation, which reduces the image-processing load.
- In a seventeenth aspect, the first feature region extraction unit may sequentially extract the first feature region from the virtual endoscopic image in accordance with observation of the real endoscopic image.
- According to the seventeenth aspect, a virtual endoscopic image from which the first feature region has not yet been extracted can be acquired. This reduces the processing load on the first image input unit.
- An eighteenth aspect is the endoscope system according to any one of the first to seventeenth aspects, wherein the first feature region extraction unit may extract a plurality of first feature regions using the same first condition, and the plurality of first feature regions may be collectively managed.
- In a nineteenth aspect, the first feature region extraction unit may apply, as the first condition, information on a position in the virtual endoscopic image, and may extract the first feature region based on that position information.
- In a twentieth aspect, the first feature region extraction unit may apply, as the position information, the position of a blind spot in the observation range of the endoscope.
- the first feature region extraction unit can extract the first feature region at the position of the blind spot in the observation range of the endoscope. Therefore, with regard to the position of the blind spot in the observation range of the endoscope, the oversight of the area to be extracted as the second feature area can be suppressed.
- That is, the first feature region extraction unit can extract the position of a blind spot in the observation range of the endoscope as the first feature region.
- In a twenty-first aspect, the first feature region extraction unit may apply the back side of a fold as the position information.
- the first feature region extraction unit can extract the first feature region at the position on the back side of the fold.
- In a twenty-second aspect, the second feature region extraction unit may extract a lesion as the second feature region.
- A twenty-third aspect is the endoscope system according to any one of the first to twenty-second aspects, wherein the second feature region extraction unit may extract the second feature region from the real endoscopic image by applying an extraction rule generated using machine learning.
- the accuracy of the second feature region extraction in the real endoscopic image can be improved. This may improve the accuracy of endoscopy.
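As a stand-in for the machine-learned extraction rule of the twenty-third aspect (a deployed system would more plausibly use a classifier such as a CNN trained on annotated endoscopic images), the sketch below "learns" a one-dimensional decision threshold from labeled patch scores and applies it to new patches; all names and numbers are hypothetical:

```python
def fit_threshold(scores, labels):
    """Learn a 1-D decision threshold from labeled training examples by
    maximizing training accuracy; a toy stand-in for a learned extraction rule."""
    best_t, best_acc = 0.0, -1.0
    for t in sorted(set(scores)):
        acc = sum((s >= t) == bool(y) for s, y in zip(scores, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def extract_second_regions(patch_scores, threshold):
    """Apply the learned rule: keep the indices of patches whose lesion score
    clears the threshold."""
    return [i for i, s in enumerate(patch_scores) if s >= threshold]

# Toy "training set": per-patch lesion scores with ground-truth labels.
t = fit_threshold([0.1, 0.2, 0.7, 0.9], [0, 0, 1, 1])
regions = extract_second_regions([0.05, 0.8, 0.3, 0.95], t)
```

The separation between learning the rule offline and applying it per frame mirrors how a trained lesion detector would be used during an examination.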
- A notification method according to a twenty-fourth aspect includes: a first image input step of inputting a virtual endoscopic image generated from a three-dimensional image of a subject; a second image input step of inputting a real endoscopic image obtained by imaging an observation target of the subject using an endoscope; an association step of associating the virtual endoscopic image with the real endoscopic image; a first feature region extraction step of extracting, from the virtual endoscopic image, a first feature region matching a prescribed first condition; a second feature region extraction step of extracting, from the real endoscopic image, a second feature region matching a second condition corresponding to the first condition; and a notification step of notifying when the first feature region is not associated with the second feature region.
- the same matters as the matters specified in the second to twenty-third aspects can be combined as appropriate.
- the component carrying the processing or function specified in the endoscope system can be grasped as the component of the notification method carrying the processing or function corresponding thereto.
- A program according to a twenty-fifth aspect causes a computer to implement: a first image input function of inputting a virtual endoscopic image generated from a three-dimensional image of a subject; a second image input function of inputting a real endoscopic image obtained by imaging an observation target of the subject using an endoscope; an association function of associating the virtual endoscopic image with the real endoscopic image; a first feature region extraction function of extracting, from the virtual endoscopic image, a first feature region matching a prescribed first condition; a second feature region extraction function of extracting, from the real endoscopic image, a second feature region matching a second condition corresponding to the first condition; and a notification function of notifying when the first feature region is not associated with the second feature region.
- the same matters as the matters specified in the second to twenty-third aspects can be combined as appropriate.
- the component carrying the processing or function specified in the endoscope system can be grasped as the component of the program carrying the processing or function corresponding to this.
- The twenty-fifth aspect may also be configured as a system having at least one processor and at least one memory, the system implementing a first image input function of inputting a virtual endoscopic image generated from a three-dimensional image of a subject, a second feature region extraction function of extracting the second feature region, and a notification function of notifying when the first feature region is not associated with the second feature region.
- the first feature area is extracted from the virtual endoscopic image.
- the virtual endoscopic image is associated with the real endoscopic image.
- Notification is performed when the first feature area is not associated with the second feature area.
- From the notification, the user can recognize that the first feature region is not associated with the second feature region. This makes it possible to suppress the oversight of a lesion or the like that is difficult to detect in endoscopy using an endoscope, where such a lesion corresponds to a region to be extracted as the second feature region.
- FIG. 1 is a schematic view showing an entire configuration of an endoscope system.
- FIG. 2 is a functional block diagram showing functions of the medical image processing apparatus.
- FIG. 3 is a functional block diagram showing the function of the medical image analysis processing unit.
- FIG. 4 is a functional block diagram showing the function of the image storage unit.
- FIG. 5 is a schematic view of a CTC image.
- FIG. 6 is a schematic view of an endoscopic image.
- FIG. 7 is a schematic view showing a blind spot in the observation range of the endoscope.
- FIG. 8 is an explanatory view of first feature area extraction.
- FIG. 9 is an explanatory diagram of second feature region extraction.
- FIG. 10 is a schematic view showing an example of association of lesions.
- FIG. 11 is a schematic view showing an example of association of folds.
- FIG. 12 is a schematic view showing an example of the arrangement of the folds using the fold numbers.
- FIG. 13 is a schematic view of an endoscopic image and a virtual endoscopic image in the case of non notification.
- FIG. 14 is a schematic view of an endoscopic image and a virtual endoscopic image in the case of the first notification.
- FIG. 15 is a schematic view of an endoscopic image and a virtual endoscopic image in the case of the second notification.
- FIG. 16 is a flowchart showing the procedure of the notification method.
- FIG. 17 is an explanatory diagram of notification according to the first modification.
- FIG. 18 is an explanatory diagram of notification according to the second modification.
- FIG. 19 is an explanatory diagram of notification according to the third modification.
- FIG. 20 is an explanatory diagram of another display example of the first feature area.
- FIG. 21 is a functional block diagram showing functions of a medical image processing apparatus for realizing notification according to another embodiment.
- FIG. 1 is a schematic view showing an entire configuration of an endoscope system.
- An endoscope system 9 shown in FIG. 1 includes an endoscope 10, a light source device 11, a processor 12, a display device 13, a medical image processing device 14, an operation device 15, and a monitor device 16.
- the endoscope system 9 is communicably connected to the image storage device 18 via the network 17.
- the endoscope 10 is an electronic endoscope.
- the endoscope 10 is a flexible endoscope.
- the endoscope 10 includes an insertion unit 20, an operation unit 21, and a universal cord 22.
- The insertion unit 20 has a distal end and a proximal end.
- the insertion unit 20 is inserted into the subject.
- the operator holds the operation unit 21 to perform various operations.
- the operation unit 21 is continuously provided on the proximal end side of the insertion unit 20.
- the insertion part 20 is formed in a long and narrow shape as a whole.
- the insertion portion 20 includes a flexible portion 25, a bending portion 26, and a tip portion 27.
- the insertion portion 20 is configured by connecting the flexible portion 25, the bending portion 26, and the distal end portion 27 in series.
- the flexible portion 25 has flexibility in order from the proximal side to the distal side of the insertion portion 20.
- the bending portion 26 has a structure that can be bent when the operation portion 21 is operated.
- the distal end portion 27 incorporates a photographing optical system and an imaging device 28 which are not shown.
- the imaging device 28 is a CMOS imaging device or a CCD imaging device.
- CMOS is an abbreviation of Complementary Metal Oxide Semiconductor.
- CCD is an abbreviation of Charge Coupled Device.
- An observation window (not shown) is disposed on the distal end surface 27 a of the distal end portion 27.
- the observation window is an opening formed in the distal end surface 27 a of the distal end portion 27.
- a photographing optical system (not shown) is disposed behind the observation window. Image light of a region to be observed is incident on the imaging surface of the imaging element 28 through an observation window, a photographing optical system, and the like.
- the imaging device 28 images the image light of the observed region incident on the imaging surface of the imaging device 28 and outputs an imaging signal.
- imaging as used herein includes the meaning of converting image light into an electrical signal.
- the operation unit 21 includes various operation members.
- the various operating members are operated by the operator.
- the operation unit 21 includes two types of bending operation knobs 29.
- the bending operation knob 29 is used when bending the bending portion 26.
- the operation unit 21 includes an air / water feed button 30 and a suction button 31.
- the air / water supply button 30 is used at the time of air / water operation.
- the suction button 31 is used at the time of suction operation.
- the operation unit 21 includes a still image photographing instruction unit 32 and a treatment instrument introduction port 33.
- the still image photographing instruction unit 32 is used when instructing the photographing of the still image 39 of the region to be observed.
- the treatment instrument introduction port 33 is an opening for inserting the treatment instrument into the inside of the treatment instrument insertion path passing through the inside of the insertion portion 20. The treatment tool insertion path and the treatment tool are not shown.
- the universal cord 22 is a connection cord that connects the endoscope 10 to the light source device 11.
- the universal cord 22 includes the light guide 35 passing through the inside of the insertion portion 20, the signal cable 36, and a fluid tube (not shown).
- an end of the universal cord 22 includes a connector 37 a connected to the light source device 11 and a connector 37 b branched from the connector 37 a and connected to the processor 12.
- When the connector 37a is connected to the light source device 11, the light guide 35 and a fluid tube (not shown) are inserted into the light source device 11. Thereby, necessary illumination light, water, and gas are supplied from the light source device 11 to the endoscope 10 through the light guide 35 and the fluid tube (not shown).
- illumination light is emitted from the illumination window (not shown) of the distal end surface 27 a of the distal end portion 27 toward the region to be observed.
- gas or water is jetted from an air / water supply nozzle (not shown) of the distal end surface 27a of the distal end portion 27 toward an observation window (not shown) of the distal end surface 27a.
- the signal cable 36 and the processor 12 are electrically connected.
- an imaging signal of the region to be observed is output from the imaging element 28 of the endoscope 10 to the processor 12 through the signal cable 36, and a control signal is output from the processor 12 to the endoscope 10.
- Although a flexible endoscope has been described as an example of the endoscope 10, various types of electronic endoscopes capable of capturing moving images of a region to be observed, such as a rigid endoscope, may be used as the endoscope 10.
- the light source device 11 supplies illumination light to the light guide 35 of the endoscope 10 via the connector 37a.
- the illumination light may be white light or light of a specific wavelength band.
- the illumination light may combine white light and light of a specific wavelength band.
- the light source device 11 is configured to be able to appropriately select light of a wavelength band according to the purpose of observation as illumination light.
- the white light may be light of a white wavelength band or light of a plurality of wavelength bands.
- the specific wavelength band is a band narrower than the white wavelength band.
- As the light of a specific wavelength band, light of one wavelength band may be applied, or light of a plurality of wavelength bands may be applied.
- Light of the specific wavelength band may be called special light.
- the processor 12 controls the operation of the endoscope 10 via the connector 37 b and the signal cable 36.
- the processor 12 also acquires an imaging signal from the imaging element 28 of the endoscope 10 via the connector 37 b and the signal cable 36.
- the processor 12 acquires the imaging signal output from the endoscope 10 at a specified frame rate.
- the processor 12 generates a moving image 38 of the region to be observed based on the imaging signal acquired from the endoscope 10. Furthermore, when the still image photographing instruction unit 32 of the operation unit 21 of the endoscope 10 is operated, the processor 12 generates a still image 39 of the region to be observed based on the imaging signal acquired from the imaging element 28, in parallel with the generation of the moving image 38. The still image 39 may be generated at a higher resolution than the moving image 38.
- When generating the moving image 38 and the still image 39, the processor 12 performs image quality correction to which digital signal processing such as white balance adjustment and shading correction is applied.
- the processor 12 may add incidental information defined by the DICOM (Digital Imaging and Communications in Medicine) standard to the moving image 38 and the still image 39.
- the moving image 38 and the still image 39 are in-vivo images of the inside of a subject, that is, the inside of a living body. Furthermore, when the moving image 38 and the still image 39 are images obtained by imaging using light of a specific wavelength band, both are special light images. Then, the processor 12 outputs the generated moving image 38 and still image 39 to each of the display device 13 and the medical image processing device 14. The processor 12 may output the moving image 38 and the still image 39 to the image storage device 18 via the network 17 in accordance with a communication protocol conforming to the DICOM standard.
- the display device 13 is connected to the processor 12.
- the display device 13 displays the moving image 38 and the still image 39 input from the processor 12.
- A user such as a doctor can advance and retract the insertion unit 20 while checking the moving image 38 displayed on the display device 13 and, when a lesion or the like is detected in the observed region, operate the still image photographing instruction unit 32 to capture a still image of the region to be observed.
- A computer is used as the medical image processing apparatus 14.
- A keyboard, a mouse, or the like connectable to the computer is used as the operating device 15.
- The connection between the operating device 15 and the computer may be either a wired connection or a wireless connection.
- Various monitors connectable to the computer are used as the monitor device 16.
- As the medical image processing apparatus 14, a diagnosis support apparatus such as a workstation or a server apparatus may be used.
- In this case, the operating device 15 and the monitor device 16 are provided for each of a plurality of terminals connected to the workstation or the like.
- As the medical image processing apparatus 14, a medical service support apparatus that supports creation of medical reports or the like may also be used.
- the medical image processing apparatus 14 acquires a moving image 38 and stores the moving image 38.
- the medical image processing apparatus 14 acquires a still image 39 and stores the still image 39.
- the medical image processing apparatus 14 performs reproduction control of the moving image 38 and reproduction control of the still image 39.
- the operating device 15 is used to input an operation instruction to the medical image processing apparatus 14.
- the monitor device 16 displays the moving image 38 and the still image 39 under the control of the medical image processing apparatus 14.
- the monitor device 16 functions as a display unit of various information in the medical image processing apparatus 14.
- the image storage device 18 connected to the medical image processing device 14 via the network 17 stores the CTC image 19.
- the CTC image 19 is generated using a CTC image generator (not shown).
- CTC is an abbreviation for CT colonography, a three-dimensional CT examination of the large intestine.
- a CTC image generator (not shown) generates a CTC image 19 from the three-dimensional inspection image.
- the three-dimensional inspection image is generated from an imaging signal obtained by imaging a region to be inspected using a three-dimensional imaging device.
- Examples of the three-dimensional imaging apparatus include a CT apparatus, an MRI apparatus, a PET (Positron Emission Tomography) apparatus, and an ultrasonic diagnostic apparatus.
- the CTC image 19 is generated from a three-dimensional inspection image obtained by imaging the large intestine.
- the endoscope system 9 may be communicably connected to the server device via the network 17.
- A computer that stores and manages various data can be applied as the server apparatus.
- the information stored in the image storage device 18 shown in FIG. 1 may be managed using a server device.
- DICOM format, a protocol conforming to the DICOM standard, or the like can be applied to the storage format of the image data and the communication between the respective devices via the network 17.
- FIG. 2 is a functional block diagram showing functions of the medical image processing apparatus.
- the medical image processing apparatus 14 shown in FIG. 2 includes a computer (not shown).
- the computer functions as an image acquisition unit 41, an information acquisition unit 42, a medical image analysis processing unit 43, and a display control unit 44 based on the execution of a program.
- the medical image processing apparatus 14 includes a storage unit 47 that stores information used for various controls of the medical image processing apparatus 14.
- the image acquisition unit 41 includes a CTC image acquisition unit 41a and an endoscope image acquisition unit 41b.
- the CTC image acquisition unit 41a acquires a CTC image 19 via an image input / output interface (not shown).
- the endoscopic image acquisition unit 41b acquires an endoscopic image 37 via an image input / output interface (not shown).
- the connection form of the image input / output interface may be wired or wireless.
- the CTC image acquisition unit 41a and the endoscope image acquisition unit 41b will be described in detail below.
- the CTC image acquisition unit 41a acquires the CTC image 19 stored in the image storage device 18 shown in FIG.
- the CTC image 19 acquired using the CTC image acquisition unit 41 a shown in FIG. 2 is stored in the image storage unit 48.
- the CTC image acquisition unit 41a can apply the same configuration as the endoscopic image acquisition unit 41b described later.
- Reference numeral 19b represents a viewpoint image.
- The viewpoint image 19b is an image of the field of view at a viewpoint set in the CTC image 19. The viewpoint is illustrated in FIG. 5. Details of the viewpoint image and the viewpoint will be described later.
- the term image in the present embodiment includes the concept of data representing an image or the concept of a signal.
- the CTC image 19 is an example of a virtual endoscopic image.
- the CTC image 19 corresponds to a virtual colonoscopy image.
- the CTC image acquisition unit 41a is an example of a first image input unit that inputs a virtual endoscopic image.
- the endoscopic image acquisition unit 41 b acquires an endoscopic image 37 generated using the processor 12 illustrated in FIG. 1.
- the endoscopic image 37 includes the moving image 38 and the still image 39 shown in FIG.
- the endoscopic image 37 generated using the processor 12 shown in FIG. 1 is acquired, but the endoscopic image 37 stored in an external storage device may be acquired.
- the endoscopic image acquisition unit 41b illustrated in FIG. 2 may acquire the endoscopic image 37 via various information storage media such as a memory card.
- the endoscopic image acquiring unit 41b acquires the moving image 38 and the still image 39 from the processor 12 illustrated in FIG.
- the medical image processing apparatus 14 stores the moving image 38 and the still image 39 acquired by using the endoscopic image acquisition unit 41 b in the image storage unit 48.
- Reference numeral 38a represents a plurality of frame images constituting the moving image 38.
- The medical image processing apparatus 14 does not have to store the entire moving image 38 of the endoscopic image 37 input from the processor 12 or the like in the image storage unit 48; for example, only the moving image 38 for one minute before and after the operation of the still image photographing instruction unit 32 shown in FIG. 1 may be stored in the image storage unit 48 shown in FIG. 2.
- Here, one minute before and after represents the period from one minute before photographing to one minute after photographing.
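The selective storage described above, in which only the minute of moving image surrounding a still-capture instruction is retained, can be sketched as a rolling frame buffer. This is an illustrative sketch rather than the patented implementation; the class name, the assumed frame rate of 30 frames per second, and the frame representation are all hypothetical.

```python
from collections import deque

FPS = 30                    # assumed frame rate of the moving image 38
WINDOW_FRAMES = 60 * FPS    # one minute of frames

class ClipRecorder:
    """Keeps only the frames within one minute before and one minute
    after a still-image capture instruction (hypothetical sketch)."""

    def __init__(self):
        # Rolling buffer: automatically discards frames older than one minute.
        self.before = deque(maxlen=WINDOW_FRAMES)
        self.after = []               # frames collected after the instruction
        self.recording_after = False

    def push_frame(self, frame):
        if self.recording_after:
            # Collect up to one minute of frames after the instruction.
            if len(self.after) < WINDOW_FRAMES:
                self.after.append(frame)
        else:
            self.before.append(frame)

    def still_capture(self):
        # Called when the still image photographing instruction is operated.
        self.recording_after = True

    def clip(self):
        # The stored moving image: one minute before plus one minute after.
        return list(self.before) + self.after
```

A usage sketch: frames are pushed continuously during the examination, `still_capture()` is called on the operation of the instruction unit, and `clip()` yields the two-minute segment to store.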
- the endoscope image acquisition unit 41 b is an example of a second image input unit that inputs an actual endoscope image.
- the endoscopic image 37 corresponds to a real endoscopic image.
- the information acquisition unit 42 acquires information input from the outside via the operation device 15 or the like. For example, when the determination result determined by the user using the operation device 15 and the extraction result are input, the information acquisition unit 42 acquires the determination information of the user, the extraction information, and the like.
- the medical image analysis processing unit 43 analyzes the CTC image 19. Further, the medical image analysis processing unit 43 analyzes the endoscopic image 37. Details of the analysis of the CTC image 19 and the endoscopic image 37 using the medical image analysis processing unit 43 will be described later.
- the medical image analysis processing unit 43 performs an image analysis process using deep learning based on the deep learning algorithm 65.
- The deep learning algorithm 65 is an algorithm including a known convolutional neural network method, fully connected layers, and an output layer.
- A convolutional neural network applies iterative processing of convolution layers and pooling layers.
- Since image analysis processing using deep learning is a well-known technique, a specific description thereof is omitted.
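The iterative convolution and pooling that the convolutional neural network applies can be illustrated with a minimal pure-Python sketch. The function names are hypothetical, and an actual implementation of the deep learning algorithm 65 would use a trained network in a dedicated framework rather than hand-written loops; this only shows the structure of one repeated convolution-pooling stage.

```python
def convolve2d(image, kernel):
    """Valid 2-D convolution (as correlation) of a single-channel image."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)] for i in range(out_h)]

def max_pool2d(image, size=2):
    """Non-overlapping max pooling, halving the spatial resolution."""
    return [[max(image[i + di][j + dj]
                 for di in range(size) for dj in range(size))
             for j in range(0, len(image[0]) - size + 1, size)]
            for i in range(0, len(image) - size + 1, size)]

def relu(image):
    """Rectified linear activation applied element-wise."""
    return [[max(0.0, v) for v in row] for row in image]

def cnn_features(image, kernels, stages=2):
    """Iterate convolution + activation + pooling, as in a CNN feature
    extractor; fully connected and output layers would follow."""
    maps = [image]
    for _ in range(stages):
        maps = [max_pool2d(relu(convolve2d(m, k)))
                for m in maps for k in kernels]
    return maps
```

Each stage shrinks the feature maps while multiplying their number by the kernel count, which is the iterative convolution/pooling process named above.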
- the display control unit 44 controls image display of the monitor device 16.
- the display control unit 44 functions as a reproduction control unit 44a and an information display control unit 44b.
- the reproduction control unit 44a performs reproduction control of the CTC image 19 acquired using the CTC image acquisition unit 41a and the endoscope image 37 acquired using the endoscopic image acquisition unit 41b.
- the reproduction control unit 44a controls the monitor device 16 by executing a display control program.
- the display control program is included in the program stored in the program storage unit 49.
- the reproduction control unit 44a may switch between the two displays described above.
- As a display example of the CTC image 19 and the endoscopic image 37, an example in which the CTC image 19 and the endoscopic image 37 are displayed in parallel on one screen is shown in the drawings described later.
- the information display control unit 44 b performs display control of incidental information of the CTC image 19 and display control of incidental information of the endoscope image 37.
- An example of incidental information of the CTC image 19 includes information representing the first feature area.
- An example of incidental information of the endoscopic image 37 includes information representing the second feature area.
- the information display control unit 44 b performs display control of information necessary for various processes in the medical image analysis processing unit 43.
- Examples of the various processes in the medical image analysis processing unit 43 include association processing between the CTC image 19 and the endoscopic image 37, feature region extraction processing of the CTC image 19, and feature region extraction processing of the endoscopic image 37.
- the storage unit 47 includes an image storage unit 48.
- the image storage unit 48 stores the CTC image 19 acquired by the medical image processing apparatus 14 and the endoscopic image 37.
- Although the medical image processing apparatus 14 has been illustrated as including the storage unit 47, an external storage device may be used instead.
- An example of such an external storage device is the image storage device 18 communicably connected via the network 17 shown in FIG. 1.
- the storage unit 47 includes a program storage unit 49.
- the program stored using the program storage unit 49 includes an application program for causing the medical image processing apparatus 14 to execute reproduction control of the moving image 38.
- the program stored using the program storage unit 49 includes a program for causing the medical image processing apparatus 14 to execute the processing of the medical image analysis processing unit 43.
- the medical image processing apparatus 14 may be configured using a plurality of computers or the like.
- a plurality of computers and the like may be communicably connected via a network.
- the plurality of computers referred to here may be separated in terms of hardware, may be integrally configured in terms of hardware, and may be separated functionally.
- The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software and functions as various control units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively to execute specific processing, such as an ASIC (Application Specific Integrated Circuit).
- software here is synonymous with a program.
- One processing unit may be configured by one of these various processors, or may be configured using two or more processors of the same type or different types. Examples of two or more processors include a plurality of FPGAs and a combination of a CPU and an FPGA. A plurality of control units may also be configured by one processor. As an example in which a plurality of control units are configured by one processor, there is a form in which, as represented by a computer such as a client device or a server device, one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of control units.
- IC is an abbreviation of Integrated Circuit.
- FIG. 3 is a functional block diagram showing the function of the medical image analysis processing unit.
- The endoscope 10 in the following description is illustrated in FIG. 1.
- The CTC image 19, the viewpoint image 19b, the endoscopic image 37, and the frame image 38a are illustrated in FIG. 2.
- The medical image analysis processing unit 43 shown in FIG. 3 includes a first feature region extraction unit 50, a first condition setting unit 52, a second feature region extraction unit 54, a second condition setting unit 56, an association unit 58, a notification unit 59, and a notification image generation unit 60.
- the first feature region extraction unit 50 extracts, from the CTC image 19, a first feature region that is a feature region that meets the defined first condition.
- Examples of the first feature area of the CTC image 19 include a lesion, a fold, a change point between colons, and a blood vessel.
- the blood vessel includes a running pattern of the blood vessel.
- the function of the first feature area extraction unit 50 corresponds to a first feature area extraction function.
- the first condition setting unit 52 sets a first condition.
- the first condition is an extraction condition applied to the extraction process using the first feature region extraction unit 50.
- the first condition setting unit 52 can set information input using the controller device 15 shown in FIG. 2 as a first condition.
- The examples of the first feature area described above can also be understood as examples of the first condition.
- the second feature area extraction unit 54 extracts, from the endoscopic image 37 shown in FIG. 2, a second feature area that is a feature area that meets the prescribed second condition. Similar to the first feature area of the CTC image 19, examples of the second feature area of the endoscopic image 37 include a lesion, a fold, a change point between colons, and a blood vessel.
- the second feature area extraction unit 54 may automatically extract a second feature area that matches the second condition from the endoscopic image 37.
- the second feature region extraction unit 54 may obtain an extraction result in which the user manually extracts a second feature region that matches the second condition from the endoscopic image 37.
- the user may input the extraction result manually extracted using the information acquisition unit 42 shown in FIG.
- the function of the second feature area extraction unit 54 corresponds to a second feature area extraction function.
- the second condition setting unit 56 sets a second condition corresponding to the first condition as the extraction condition of the second feature area of the endoscope image 37.
- the second condition corresponding to the first condition includes the same second condition as the first condition. For example, when a lesion is set as the first condition, a lesion may be set as the second condition.
- As the first condition and the second condition, specific lesions such as polyps or inflammation may be set instead of the generic concept of a lesion.
- the first condition and the second condition may be a combination of a plurality of conditions.
- the associating unit 58 associates the CTC image 19 and the endoscopic image 37 shown in FIG.
- An example of the association is the correspondence between the first feature area of the CTC image 19 and the second feature area of the endoscopic image 37.
- For example, when a lesion is detected from the endoscopic image 37, the first feature area of the CTC image 19 corresponding to the detected lesion is associated with the second feature area of the endoscopic image 37.
- Position information can be used for the correspondence between the CTC image 19 and the endoscopic image 37. For example, by matching a reference position of the CTC image 19 with a reference position of the endoscopic image 37 and comparing the distances from the respective reference positions, the CTC image 19 and the endoscopic image 37 can be associated with each other.
- the coordinate value of the CTC image 19 may be associated with the number of the frame image 38 a of the endoscopic image 37.
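The association of a coordinate value of the CTC image 19 with a frame number of the endoscopic image 37 via distances from a common reference position might be sketched as a nearest-neighbour lookup over the distance travelled by the endoscope at each frame. The function and variable names are illustrative assumptions, not taken from the patent.

```python
import bisect

def associate_frames(ctc_points, frame_distances):
    """ctc_points: list of (distance_from_reference, (x, y, z)) along the
    path, sorted by distance. frame_distances: sorted distance travelled by
    the endoscope tip at each frame index of the moving image.
    Returns, for each CTC coordinate, the frame whose travelled distance is
    nearest (hypothetical sketch)."""
    mapping = {}
    for dist, coord in ctc_points:
        i = bisect.bisect_left(frame_distances, dist)
        # Pick the nearer of the two neighbouring frames.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_distances)]
        frame = min(candidates, key=lambda j: abs(frame_distances[j] - dist))
        mapping[coord] = frame
    return mapping
```

Under this sketch, a first feature area located at a given distance along the path is linked to the endoscope frame captured closest to that distance.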
- the correspondence between the CTC image 19 and the endoscopic image 37 includes the correspondence between the first feature area of the CTC image 19 and the non-extraction area of the endoscopic image 37. A specific example of the correspondence between the first feature area of the CTC image 19 and the non-extraction area of the endoscopic image 37 will be described later.
- the function of the association unit 58 corresponds to the association function.
- If the notification unit 59 determines that, among the regions of the endoscopic image 37 associated with the first feature region 80 extracted from the CTC image 19, there is a non-extraction region that has not been extracted from the endoscopic image 37, the notification unit 59 notifies to that effect.
- An example of a non-extraction region is a position in a blind spot of the observation range of the endoscope 10.
- The notification unit 59 displays notification information on the monitor device 16 via the display control unit 44. An example of the notification information is a notification image to be described later.
- the function of the notification unit 59 corresponds to a notification function.
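The decision the notification unit makes, flagging first feature regions of the CTC image 19 that have no corresponding second feature region in the endoscopic image 37, could be sketched as follows. The one-dimensional position representation, the tolerance value, and the message format are assumptions made for illustration only.

```python
def find_non_extraction_regions(first_regions, second_regions, tol_mm=10.0):
    """first_regions / second_regions: positions (distance from the
    reference position, in mm) of feature regions extracted from the CTC
    image and from the endoscopic image. Returns the positions of first
    feature regions with no nearby second feature region - candidates for
    a blind-spot notification (hypothetical sketch)."""
    missed = []
    for p in first_regions:
        if not any(abs(p - q) <= tol_mm for q in second_regions):
            missed.append(p)
    return missed

def notify(missed):
    """Format one notification message per non-extraction region."""
    return [f"Possible overlooked region near {p:.0f} mm from reference"
            for p in missed]
```

In this sketch, a polyp detected virtually but absent from the endoscopic extraction results triggers a message that the display control would then present on the monitor device.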
- the notification image generation unit 60 generates a notification image for notifying of the presence of the second feature region of the endoscope image 37.
- Examples of the notification image include a symbol attached to an arbitrary position of the second feature region, a closed curve representing an edge of the second feature region, and the like.
- the notification image generation unit 60 generates a notification image that can be displayed superimposed on the endoscopic image 37 without processing the endoscopic image 37.
- The first notification image 140 is illustrated in FIG. 14 as an example of the notification image.
- A second notification image 142 is illustrated as another example of the notification image. Details of the notification image will be described later.
- FIG. 4 is a functional block diagram showing the function of the image storage unit.
- The image storage unit 48 includes a first feature area storage unit 64, a second feature area storage unit 66, and an association result storage unit 68.
- the first feature area storage unit 64 stores the information of the first feature area extracted from the CTC image 19 using the first feature area extraction unit 50 shown in FIG.
- An example of the information of the first feature area is information representing the position of the first feature area in the CTC image 19.
- The position of the first feature region in the CTC image 19 can be identified using the coordinate values in the coordinate system set in the CTC image 19, the viewpoint set in the CTC image 19, or the like.
- the second feature area storage unit 66 stores the information of the second feature area extracted from the endoscopic image 37 using the second feature area extraction unit 54 shown in FIG. 3.
- An example of the information of the second feature area is information representing the position of the second feature area in the endoscopic image 37.
- the position of the second feature region in the endoscopic image 37 can be identified using the distance from the reference position of the object to be observed using detection information of a sensor provided in the endoscope 10.
- the association result storage unit 68 stores the result of association between the CTC image 19 and the endoscopic image 37 executed using the association unit 58 shown in FIG. For example, the result of associating the information of the position of the first feature area of the CTC image 19 with the information of the position of the second feature area of the endoscopic image 37 can be stored.
- FIG. 5 is a schematic view of a CTC image.
- the whole image 19a shown in FIG. 5 is one form of the CTC image 19 representing the whole of a large intestine which is a region to be observed.
- the observation site has the same meaning as the subject and the observation target of the subject.
- In the entire image 19a, one or more viewpoints P are placed on a set path 19c, and the entire image 19a is an image on the assumption that the inside of the lumen is viewed from the viewpoint P while the viewpoint P is sequentially moved from a start position PS to a goal position PG.
- The path 19c may be generated by thinning the entire image 19a.
- A known thinning method can be applied to the thinning processing. Although a plurality of viewpoints P are illustrated in FIG. 5, the arrangement and the number of the viewpoints P can be appropriately determined according to the examination conditions and the like.
- A viewpoint image representing the field of view at a designated viewpoint P can be displayed. The viewpoint images at the respective viewpoints P illustrated in FIG. 8 are denoted by reference numerals 19b1 and 19b2.
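The placement of viewpoints P along the path 19c could be sketched as arc-length sampling of the thinned centreline polyline. The equal-spacing strategy and the function name are illustrative assumptions; as noted above, the actual arrangement and number of viewpoints are determined according to the examination conditions.

```python
import math

def place_viewpoints(path, spacing):
    """path: polyline of (x, y, z) centreline points (the path obtained by
    thinning). Places viewpoints at roughly equal arc-length intervals from
    the start position toward the goal position (hypothetical sketch)."""
    viewpoints = [path[0]]  # always include the start position
    travelled = 0.0
    for a, b in zip(path, path[1:]):
        travelled += math.dist(a, b)  # arc length of this segment
        if travelled >= spacing:
            viewpoints.append(b)
            travelled = 0.0
    return viewpoints
```

Each returned point would then serve as a viewpoint P from which a viewpoint image of the lumen interior is rendered.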
- a viewpoint image in which the imaging direction of the endoscope 10 is reflected can be generated.
- a viewpoint image reflecting the imaging direction of the endoscope 10 may be generated for each of a plurality of imaging directions.
- the entire image 19a shown in FIG. 5 and the viewpoint image not shown in FIG. 5 are included in the concept of the CTC image 19 shown in FIG.
- the CTC image 19 whose whole image 19a is shown in FIG. 5 has three-dimensional coordinates not shown.
- the three-dimensional coordinates set in the CTC image 19 can be three-dimensional coordinates having an arbitrary reference position of the CTC image 19 as an origin.
- Arbitrary three-dimensional coordinates, such as rectangular coordinates, polar coordinates, and cylindrical coordinates, can be applied. Note that illustration of the three-dimensional coordinates is omitted.
- In virtual colonoscopy, the large intestine is imaged using a CT apparatus to acquire a CT image of the large intestine, and a lesion or the like is detected using the CTC image 19 generated by performing image processing on the CT image of the large intestine.
- In virtual colonoscopy, in conjunction with the movement of the endoscope 10, a pointer 19d representing the endoscope 10 is moved on the path 19c from the start position PS to the goal position PG.
- The arrows shown in FIG. 5 indicate the moving direction of the pointer 19d.
- FIG. 5 shows an example in which the cecum is applied as the start position PS and the anus is applied as the goal position PG. That is, FIG. 5 schematically shows virtual colonoscopy in the case where the endoscope 10 is inserted to the start position PS and then moved toward the goal position PG while being withdrawn.
- the position of the pointer 19 d is derived from the movement condition of the endoscope 10.
- Examples of movement conditions of the endoscope 10 include the movement speed of the endoscope 10 and a movement vector representing the movement direction of the endoscope 10.
- the endoscope 10 can grasp the position inside the observation site using a sensor (not shown). In addition, the endoscope 10 can derive the movement speed of the endoscope 10 and the movement vector representing the movement direction using a sensor (not shown). Furthermore, the endoscope 10 can derive the orientation of the endoscope 10 using a sensor (not shown).
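Deriving the position of the pointer 19d from the movement conditions of the endoscope 10, the movement speed and the movement direction obtained from the sensors, amounts to integrating the sensed motion along the path 19c. The following sketch assumes one-dimensional dead reckoning with hypothetical sensor sample tuples; it is not the patented derivation itself.

```python
def pointer_path_position(samples, total_path_length):
    """samples: iterable of (speed_mm_s, direction, dt_s) readings from the
    endoscope sensors, where direction is +1 for movement toward the goal
    position and -1 for movement back toward the start position.
    Integrates the readings into the pointer position along the path,
    clamped to [0, total_path_length] (hypothetical sketch)."""
    s = 0.0
    for speed, direction, dt in samples:
        s += direction * speed * dt          # distance increment
        s = min(max(s, 0.0), total_path_length)  # stay on the path
    return s
```

The resulting arc-length position would place the pointer 19d on the path 19c in step with the advance and retraction of the insertion unit.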
- In endoscopy, lesions such as polyps are detected from the endoscopic image 37. That is, in endoscopy, the moving image 38 generated in real time using the endoscope 10 is viewed to specify the position, shape, and the like of a lesion. The endoscopic examination may also use a reproduced image of the endoscopic image 37.
- FIG. 6 is a schematic view of an endoscopic image.
- An arbitrary frame image 38a constituting the moving image 38 is shown in FIG. 6.
- the frame image 38a shown in FIG. 6 is a two-dimensional image.
- the frame image 38a has color information and texture information.
- endoscopic examination is strong in detecting flat lesions, differences in surface condition, and the like.
- endoscopy is not good at finding a lesion on the back side of a ridged structure such as a fold.
- FIG. 7 is a schematic view showing a blind spot in the observation range of the endoscope.
- FIG. 7 illustrates a schematic cross section 100 along the path 19 c of the CTC image 19 and a schematic cross section 120 of the endoscopic image 37 corresponding to the cross section 100 of the CTC image 19.
- the endoscope 10A and the endoscope 10B illustrated using a two-dot chain line represent the endoscope 10 at the observation position which has already been observed.
- the endoscope 10 illustrated using a solid line represents the endoscope 10 at the observation position during observation. Arrow lines indicate the moving direction of the endoscope 10.
- Since the CTC image 19 has three-dimensional information, virtual colonoscopy is strong at detecting convex shapes such as polyps. It is also strong at detecting polyps and the like hidden behind folds. For example, from the CTC image 19, both the polyp 104 located on the back side of the fold 102 and the polyp 106 located on the front side of the fold 102 can be detected. However, in the viewpoint image, the polyp 104 located on the back side of the fold 102 may not be displayed.
- On the other hand, the endoscopic examination allows detection of the polyp 126 located on the front side of the fold 122 but is not good at detecting the polyp 124 located on the back side of the fold 122.
- The polyp 126 on the front side of the fold 122 is located in the observation range of the endoscope 10B or the endoscope 10.
- Therefore, the endoscope 10B or the endoscope 10 can detect the polyp 126.
- the polyp 124 on the back of the fold 122 is located at a blind spot in the observation range of the endoscope 10A, the endoscope 10B, and the endoscope 10.
- Therefore, the endoscope 10A, the endoscope 10B, and the endoscope 10 have difficulty in detecting the polyp 124 on the back side of the fold 122.
- FIG. 8 is an explanatory view of first feature area extraction.
- FIG. 8 illustrates a viewpoint image 19b1 and a viewpoint image 19b2 at arbitrary viewpoints P in the CTC image 19.
- The viewpoint image 19b is a concept including the viewpoint image 19b1 and the viewpoint image 19b2 shown in FIG. 8.
- The first feature region 80 is extracted from the CTC image 19 shown in FIG. 8 using the first feature region extraction unit 50 shown in FIG. 3. The first feature region extraction process can also detect the polyp 104 located on the back side of the fold 102 shown in FIG. 7 as the first feature region 80.
- the process of extracting the first feature area 80 from the CTC image 19 can apply a known feature area extraction technique. The same applies to second feature region extraction described later.
- As an example of a known feature area extraction technique, feature quantities are calculated for each of a plurality of areas, and an area matching the first condition is specified as an extraction target area according to the feature quantity of each area.
- the feature amount for each area can be calculated using the pixel value of each pixel included in each area.
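Computing a feature quantity for each area from the pixel values it contains, and then selecting the areas that match the first condition, can be sketched as follows. Using the mean pixel value as the feature quantity and a simple threshold predicate as the first condition are illustrative assumptions; the actual extraction would use a richer known technique.

```python
def region_features(image, regions):
    """image: 2-D list of pixel values. regions: dict mapping a region name
    to the list of (row, col) pixel positions it covers. Returns the mean
    pixel value of each region as its feature quantity (hypothetical)."""
    return {name: sum(image[r][c] for r, c in px) / len(px)
            for name, px in regions.items()}

def extract_matching(features, condition):
    """condition: predicate on the feature quantity standing in for the
    first condition (e.g. 'brighter than a threshold')."""
    return [name for name, f in features.items() if condition(f)]
```

In this sketch, each candidate area is reduced to a single number computed from its pixels, and only the areas whose number satisfies the set condition are kept as extraction target areas.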
- a convex polyp is extracted as the first feature region 80.
- For the first feature area 80 of the CTC image 19, coordinate values in the three-dimensional coordinates set in the CTC image 19 can be specified.
- When a plurality of first feature regions 80 are extracted, the plurality of first feature regions 80 are associated with the first condition and collectively managed.
- the first feature area 80 may be classified into a plurality of attributes.
- a lesion extracted as the first feature region 80 may be classified according to the position, with information on the position of the lesion as the classification condition.
- An example of information on the location of a lesion is the information on the front or back of a fold. That is, the lesion extracted as the first feature region 80 may be classified into a lesion on the front of the fold and a lesion on the back of the fold.
- That is, two types of first conditions, a lesion on the front of a fold and a lesion on the back of a fold, may be applied to extract two types of first feature regions.
- FIG. 9 is an explanatory diagram of second feature region extraction. FIG. 9 illustrates an arbitrary frame image 38a1 of the endoscopic image 37. The extraction result of the second feature area can be handled as the result of the endoscopy.
- A still image 39 may be used as the frame image 38a1 shown in FIG. 9.
- The second feature region 70 is extracted from the endoscopic image 37 using the second feature region extraction unit 54 illustrated in FIG. 3.
- In the frame image 38a1 shown in FIG. 9, a convex-shaped polyp is extracted as the second feature region 70.
- The information of the first feature region 80 shown in FIG. 8 is stored in the first feature region storage unit 64 shown in FIG. 4 as the extraction result of the first feature region. Likewise, the information of the second feature region 70 shown in FIG. 9 is stored in the second feature region storage unit 66 shown in FIG. 4 as the extraction result of the second feature region.
- FIG. 10 is a schematic view showing an example of lesion association. FIG. 10 shows an example in which a convex-shaped polyp is detected as the second feature region 70 in the frame image 38a1 of the endoscopic image 37.
- The viewpoint images 19b1 and 19b2, the first feature region 80, the frame image 38a1, and the second feature region 70 shown in FIG. 10 are the same as those shown in the preceding figures.
- The associating unit 58 shown in FIG. 3 searches the CTC image 19 for the first feature region 80 corresponding to the second feature region 70. When the first feature region 80 of the CTC image 19 corresponding to the second feature region 70 of the endoscopic image 37 is found, the associating unit 58 associates the first feature region 80 of the CTC image 19 with the second feature region 70 of the endoscopic image 37.
- The associating unit 58 shown in FIG. 3 can perform this association using position information of the CTC image 19 and position information of the endoscopic image 37. Image information of the CTC image 19 may be applied instead of the position information of the CTC image 19, and image information of the endoscopic image 37 may be applied instead of the position information of the endoscopic image 37.
- The associating unit 58 illustrated in FIG. 3 stores the association result between the first feature region 80 of the CTC image 19 shown in FIG. 10 and the second feature region 70 of the endoscopic image 37 in the association result storage unit 68 shown in FIG. 4.
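- A minimal sketch of position-based association, assuming each feature region is reduced to a representative 3D coordinate and pairing is done by nearest distance within a tolerance (the tolerance and coordinates are illustrative):

```python
import math

# Illustrative sketch: each first feature region (from the CTC image) is
# paired with the nearest second feature region (from the real endoscopic
# image) within a tolerance. First feature regions left unpaired are
# candidates for association with a non-extraction region.

def associate(first_regions, second_regions, tol=5.0):
    """Return (pairs, unmatched_first) for {region_id: (x, y, z)} inputs."""
    pairs, unmatched, used = [], [], set()
    for fid, fpos in first_regions.items():
        best, best_dist = None, tol
        for sid, spos in second_regions.items():
            if sid in used:
                continue
            d = math.dist(fpos, spos)  # Euclidean distance (Python 3.8+)
            if d <= best_dist:
                best, best_dist = sid, d
        if best is None:
            unmatched.append(fid)      # e.g. a polyp hidden behind a fold
        else:
            used.add(best)
            pairs.append((fid, best))
    return pairs, unmatched

first = {"80": (0.0, 0.0, 0.0), "80d": (100.0, 0.0, 0.0)}  # CTC-side regions
second = {"70": (1.0, 1.0, 1.0)}                           # endoscope-side
print(associate(first, second))  # prints ([('80', '70')], ['80d'])
```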
- The concept of associating the CTC image 19 with the endoscopic image 37 includes forming combinations of components of the CTC image 19 and components of the endoscopic image 37.
- It may also include searching for and identifying the component of the CTC image 19 that corresponds to a component of the endoscopic image 37.
- FIG. 11 is a schematic view showing an example of fold association.
- In the frame image 38a11 shown in FIG. 11, a fold is extracted as the second feature region 72.
- In the viewpoint image 19b11, a fold is extracted as the first feature region 82.
- FIG. 11 also illustrates the viewpoint image 19b12 and the viewpoint image 19b13 at viewpoints P contiguous with the viewpoint P of the viewpoint image 19b11.
- The associating unit 58 illustrated in FIG. 3 associates the first feature region 82 with the second feature region 72 illustrated in FIG. 11.
- The associating unit 58 shown in FIG. 3 stores the association result between the first feature region 82 and the second feature region 72 shown in FIG. 11 in the association result storage unit 68 shown in FIG. 4.
- FIG. 12 is a schematic view showing an example of fold association using fold numbers.
- The number of folds does not change between the CTC image 19 and the endoscopic image 37. Therefore, a reference fold can be set and the CTC image 19 can be associated with the endoscopic image 37 using fold numbers.
- In the frame image 38a21 shown in FIG. 12, a fold is extracted as the second feature region 74.
- In the viewpoint image 19b21, a fold is extracted as the first feature region 84.
- Folds in the viewpoint image 19b22 and the viewpoint image 19b23 shown in FIG. 12 are also extracted as first feature regions. Note that illustration of the first feature regions of the viewpoint images 19b22 and 19b23 is omitted.
- n1 attached to the viewpoint image 19b21 is an integer representing a fold number. The same applies to n2 attached to the viewpoint image 19b22, n3 attached to the viewpoint image 19b23, and n1 attached to the frame image 38a21.
- The associating unit 58 shown in FIG. 3 associates the second feature region 74 shown in FIG. 12 with the first feature region 84 having the matching fold number.
- The associating unit 58 shown in FIG. 3 stores the association result between the second feature region 74 and the first feature region 84 shown in FIG. 12 in the association result storage unit 68 shown in FIG. 4.
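- The fold-number association can be sketched as follows; the dictionaries mapping fold numbers to region IDs are illustrative:

```python
# Sketch of fold-number association: because the number of folds does not
# change between the CTC image and the endoscopic image, a region on the CTC
# side and a region on the endoscope side carrying the same fold number n can
# be paired directly. The mappings below are illustrative.

def associate_by_fold_number(first_by_fold, second_by_fold):
    """Pair first/second feature regions that share a fold number."""
    return {n: (first_by_fold[n], second_by_fold[n])
            for n in first_by_fold if n in second_by_fold}

first_by_fold = {1: "region_84", 2: "region_85"}   # from viewpoint images
second_by_fold = {1: "region_74"}                  # from frame images
print(associate_by_fold_number(first_by_fold, second_by_fold))
# prints {1: ('region_84', 'region_74')}
```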
- FIG. 13 is a schematic view of an endoscopic image and a virtual endoscopic image when no notification is performed.
- The monitor device 16 shown in FIG. 13 displays the endoscopic image 37 and the CTC image 19 corresponding to the endoscopic image 37.
- The viewpoint image 19b31 of the CTC image 19 corresponds to the frame image 38a31 of the endoscopic image 37.
- The endoscopic image 37 displayed on the monitor device 16 is sequentially updated as the endoscopic examination progresses. The CTC image 19 is sequentially updated in accordance with the update of the endoscopic image 37. There may be a delay within an allowable range between the CTC image 19 and the endoscopic image 37.
- In FIG. 13, the monitor device 16 does not display the first notification image 140 or the second notification image 142 described later.
- FIG. 14 is a schematic view of an endoscopic image and a virtual endoscopic image in the case of the first notification.
- The first notification image 140 is displayed on the frame image 38a32 of the endoscopic image 37 shown in FIG. 14.
- The first notification image 140 is displayed when a polyp (not shown) is present on the back side of the fold 150 but does not appear in the frame image 38a32.
- In the CTC image 19, the polyp on the back side of the fold 160 is extracted as the first feature region 80d.
- Although the viewpoint image 19b32 shows the same view as the frame image 38a32, the polyp extracted as the first feature region 80d is not displayed. The dashed line representing the first feature region 80d indicates that the first feature region 80d is not visible in the viewpoint image 19b32.
- The first feature region 80d illustrated with dashed lines in FIG. 14 is associated with the non-extraction region 76, which is not extracted as a second feature region 70 from the endoscopic image 37.
- An example of the non-extraction region 76 is a region of the endoscopic image 37 located at a blind spot in the observation range of the endoscope 10.
- That is, the non-extraction region 76 is a region from which a second feature region 70 should be extracted, but from which the second feature region 70 is not actually extracted because the region lies at a blind spot in the observation range of the endoscope 10.
- As the first notification, the notification unit 59 illustrated in FIG. 3 overlays the first notification image 140 on the endoscopic image 37 displayed on the monitor device 16, at the position of the non-extraction region 76.
- The first notification image 140 illustrated in FIG. 14 is an example; its shape and the like may be defined arbitrarily.
- The first notification image 140 may also be displayed on frame images 38a before and after the frame image 38a32. That is, the first notification image 140 can be displayed at any timing from when the non-extraction region 76 enters the field of view of the endoscope 10 until the non-extraction region 76 leaves the field of view of the endoscope 10 as the endoscopic examination progresses.
- The fold 150 of the endoscopic image 37 shown in FIG. 14 corresponds to the fold 122 in the cross section 120 shown in FIG. 7.
- The fold 160 of the CTC image 19 corresponds to the fold 122 in the cross section 100 shown in FIG. 7. The same applies to the folds 150 and 160 shown in the subsequent figures.
- FIG. 15 is a schematic view of an endoscopic image and a virtual endoscopic image in the case of the second notification.
- The second notification is performed when the second feature region 70 is extracted from the endoscopic image 37.
- In FIG. 15, a polyp is extracted as the second feature region 70, and the second notification image 142 is displayed as the second notification.
- The notification unit 59 illustrated in FIG. 3 overlays the second notification image 142 at the position of the second feature region 70 of the endoscopic image 37 displayed on the monitor device 16.
- The second feature region 70 of the frame image 38a33 of the endoscopic image 37 shown in FIG. 15 is associated with the first feature region 80e of the viewpoint image 19b33 of the CTC image 19.
- The first feature region 80e is a polyp on the front side of the fold 160.
- The first notification image 140 shown in FIG. 14 has a notification level changed with respect to the second notification image 142 shown in FIG. 15. Specifically, the notification level of the first notification image 140 shown in FIG. 14 is raised relative to that of the second notification image 142 shown in FIG. 15: the first notification image 140 shown in FIG. 14 is larger than the second notification image 142 shown in FIG. 15. The details of the difference between the notification levels of the first notification and the second notification will be described later.
- The first feature region 80d extracted as a polyp on the back side of the fold 160 and the first feature region 80e extracted as a polyp on the front side of the fold 160 shown in the present embodiment may be extracted in advance from the CTC image 19 by applying first conditions that combine a polyp condition with a front-of-fold condition or a back-of-fold condition.
- FIG. 16 is a flowchart showing the procedure of the notification method.
- First, a CTC image input step S10 is performed.
- In the CTC image input step S10, the CTC image 19 is input using the CTC image acquisition unit 41a shown in FIG. 2.
- the CTC image 19 is stored in the image storage unit 48.
- The CTC image input step S10 shown in FIG. 16 is an example of a first image input step.
- After the CTC image input step S10, the process proceeds to a first feature region extraction step S12.
- In the first feature region extraction step S12, the first feature region is extracted from the CTC image 19 using the first feature region extraction unit 50 shown in FIG. 3.
- The information of the first feature region is stored in the first feature region storage unit 64 shown in FIG. 4.
- After the first feature region extraction step S12, the process proceeds to an endoscopic image input step S14.
- In the endoscopic image input step S14, the endoscopic image 37 is input using the endoscopic image acquisition unit 41b shown in FIG. 2.
- The endoscopic image input step S14 shown in FIG. 16 is an example of a second image input step.
- After the endoscopic image input step S14, the process proceeds to a second feature region extraction step S16.
- In the second feature region extraction step S16, the second feature region is extracted from the endoscopic image 37 using the second feature region extraction unit 54 shown in FIG. 3, and the extraction result is stored in the second feature region storage unit 66 shown in FIG. 4.
- The endoscopic image input step S14 and the second feature region extraction step S16 shown in FIG. 16 can be regarded as the endoscopic examination itself. That is, the endoscopic image acquisition unit 41b illustrated in FIG. 2 sequentially inputs the moving image 38 captured using the endoscope 10 and displays it as the endoscopic image 37 on the monitor device 16.
- the second feature region extraction unit 54 illustrated in FIG. 3 automatically extracts a lesion as the second feature region 70 from the endoscopic image 37.
- the second feature area extraction unit 54 may extract a lesion as the second feature area 70 from the endoscopic image 37 based on the extraction information input by the user using the operation device 15.
- The first feature region extraction unit 50 shown in FIG. 3 may execute the extraction of the first feature region 80 from the CTC image 19 in parallel with the acquisition of the endoscopic image 37 and the extraction of the second feature region 70.
- Alternatively, the first feature region 80 may be extracted and stored in advance.
- In the associating step S18, the CTC image 19 and the endoscopic image 37 are associated using the associating unit 58 shown in FIG. 3. That is, the associating unit 58 associates the first feature region 80 with the second feature region 70, or associates the first feature region 80 with the non-extraction region 76.
- The result of the association in the associating step S18 shown in FIG. 16 is stored in the association result storage unit 68 shown in FIG. 4.
- Next, the determination step S20 is performed. In the determination step S20, the notification unit 59 shown in FIG. 3 determines whether to execute the first notification described using FIG. 14 or the second notification described using FIG. 15.
- When the first feature region 80 is associated with the non-extraction region 76 of the endoscopic image 37 in the determination step S20 shown in FIG. 16, the determination is Yes. In the case of a Yes determination, the process proceeds to the first notification step S22; otherwise, the process proceeds to the second notification step S24.
- In other words, in the determination step S20, it may be determined whether a lesion such as a polyp extracted from the CTC image 19 as the first feature region 80 is located at a blind spot in the observation range of the endoscope 10 or within the observation range of the endoscope 10.
- The notification unit 59 can execute the first notification when a lesion such as a polyp extracted from the CTC image 19 as the first feature region 80 is located at a blind spot in the observation range of the endoscope 10.
- Otherwise, the notification unit 59 can execute the second notification.
- In the first notification step S22, the first notification described using FIG. 14 is executed using the notification unit 59 shown in FIG. 3. After the first notification step S22 shown in FIG. 16, the process proceeds to an examination end determination step S26.
- In the second notification step S24, the second notification described using FIG. 15 is executed using the notification unit 59 shown in FIG. 3. After the second notification step S24 shown in FIG. 16, the process proceeds to the examination end determination step S26.
- In the examination end determination step S26, the medical image processing apparatus 14 shown in FIG. 3 determines whether or not the endoscopic examination is completed. If the medical image processing apparatus 14 determines in the examination end determination step S26 that the endoscopic examination is completed, the determination is Yes, and the medical image processing apparatus 14 ends the notification method.
- If the medical image processing apparatus 14 determines in the examination end determination step S26 that the endoscopic examination continues, the determination is No, and the notification method continues. That is, in the case of a No determination in the examination end determination step S26, the process returns to the endoscopic image input step S14. Thereafter, the steps from the endoscopic image input step S14 to the examination end determination step S26 are executed until the determination in the examination end determination step S26 becomes Yes.
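- The decision of steps S20 through S24 can be sketched as follows; the record fields are assumptions for illustration:

```python
# Minimal sketch of the decision in step S20: the first notification is
# chosen when a first feature region is associated with a non-extraction
# region (a blind spot of the observation range), otherwise the second
# notification is chosen. The record structure is an assumption.

def determine_notification(association):
    """Step S20: choose which notification to execute."""
    if association["partner"] == "non_extraction_region":
        return "first_notification"   # step S22: emphasized notification
    return "second_notification"      # step S24: normal notification

results = [
    {"first_region": "80d", "partner": "non_extraction_region"},
    {"first_region": "80e", "partner": "second_feature_region"},
]
print([determine_notification(r) for r in results])
# prints ['first_notification', 'second_notification']
```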
- a lesion such as a polyp is extracted as the first feature region 80 from the CTC image 19.
- the CTC image 19 and the endoscopic image 37 are associated with each other.
- When the first feature region 80 is associated with the non-extraction region 76, the first notification is performed. Due to the first notification, the user can recognize the presence of a lesion such as a polyp that is not extracted from the endoscopic image 37, for example, one located at a blind spot in the observation range of the endoscope 10. Thereby, in endoscopy using the endoscope 10, oversight of a lesion such as a polyp at a position where the observation range of the endoscope 10 has a blind spot can be suppressed.
- Further, by pushing aside a fold or the like that obstructs the observation range of the endoscope 10, it becomes possible to observe the blind spot of the observation range of the endoscope 10.
- When the first feature region 80 is associated with the second feature region 70, the second notification is performed.
- The first notification raises the notification level relative to the second notification.
- the first notification image and the second notification image are overlaid on the endoscopic image 37.
- the first notification image 140 or the second notification image 142 can be superimposed and displayed on the endoscopic image 37 without processing the endoscopic image 37 itself.
- FIG. 17 is an explanatory diagram of notification according to the first modification.
- The density of the first notification image 144 shown in FIG. 17 is changed with respect to the second notification image 146.
- Specifically, the first notification image 144 is displayed with a darker density than the second notification image 146.
- the colors of the first notification image 144 and the second notification image 146 may be changed. For example, black is used for the first notification image 144, and yellow is used for the second notification image 146. That is, a color with high visibility in the endoscopic image 37 is applied to the first notification image 144 as compared to the second notification image 146.
- At least one of the density and the color of the first notification image 144 and the second notification image 146 is changed. Thereby, the visibility of the first notification image 144 with respect to the second notification image 146 can be enhanced.
- FIG. 18 is an explanatory diagram of notification according to the second modified example.
- the first notification image 147 shown in FIG. 18 is displayed blinking.
- the second notification image 148 is lit and displayed.
- The steady lit display can be regarded as a normal display.
- the first notification image 147 is blinked and the second notification image 148 is lit and displayed. Thereby, the visibility of the first notification image 147 with respect to the second notification image 148 can be increased.
- FIG. 19 is an explanatory diagram of notification according to the third modification.
- the first notification image 147A shown in FIG. 19 is displayed blinking.
- the second notification image 147B is also blinked and displayed.
- the blinking cycle of the first notification image 147A is shorter than that of the second notification image 147B.
- the first notification image 147A is blinked and displayed, and the second notification image 147B is blinked and displayed.
- a flashing cycle of the first notification image 147A is shorter than that of the second notification image 147B. Thereby, the visibility of the first notification image 147A can be increased with respect to the second notification image 147B.
- the first modification described above can be combined with the second modification or the third modification as appropriate.
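- The combined modifications can be sketched as a style table in which the first notification raises the notification level through size, color, and blink cycle; all values and color names are illustrative assumptions:

```python
# Illustrative style table combining the modifications above: the first
# notification uses a larger size, a more visible color, and a shorter blink
# cycle than the second notification. All numeric values are placeholders.

STYLES = {
    "first_notification":  {"scale": 1.5, "color": "black",  "blink_period_s": 0.25},
    "second_notification": {"scale": 1.0, "color": "yellow", "blink_period_s": 1.0},
}

def overlay_style(kind):
    """Return the overlay style parameters for a notification kind."""
    return STYLES[kind]

first = overlay_style("first_notification")
second = overlay_style("second_notification")
assert first["scale"] > second["scale"]                    # larger image
assert first["blink_period_s"] < second["blink_period_s"]  # faster blinking
```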
- The first notification image 140 may be emphasized, for example by increasing its size continuously or in stages, as the non-extraction region 76 of the endoscopic image 37 approaches the observation range of the endoscope 10.
- the first notification image 140 and the second notification image 142 may be displayed on the CTC image 19.
- The first notification image 140 and the second notification image 142 displayed on the CTC image 19 can be displayed in the same manner as the first notification image 140 and the second notification image 142 displayed on the endoscopic image 37.
- FIG. 20 is an explanatory diagram of another display example of the first feature area.
- FIG. 20 shows an example of displaying the first feature area 80 in the CTC image 19.
- the CTC image 19 shown in FIG. 20 corresponds to the entire image 19a shown in FIG.
- The path 19c illustrated with a thin line represents the portion of the path 19c in the area where the endoscope 10 has already finished observation. The path 19c illustrated with a thick line represents the portion of the path 19c in the area to be observed by the endoscope 10 from now on.
- the first feature area 80a represents the first feature area 80 that the endoscope 10 has already observed.
- the first feature area 80 b represents a first feature area 80 to be observed next by the endoscope 10.
- The first feature region 80b to be observed next by the endoscope 10 is highlighted.
- Enlargement, color change, blinking, and the like can be applied as the highlighting.
- the first feature area 80c represents a first feature area 80 that the endoscope 10 observes next to the first feature area 80b. After the endoscope 10 observes the first feature area 80b, the first feature area 80c is highlighted.
- The monitor device 16 may display the CTC image 19 shown in FIG. 20 instead of the viewpoint-image display of the CTC image 19 described above. Thereby, the position of a first feature region 80 existing near the position of the endoscope 10 can be grasped.
- Combining this display with the first notification image 140 or the like shown in FIG. 14 contributes to the detection of lesions such as polyps that are not detected as second feature regions 70 from the endoscopic image 37.
- FIG. 21 is a functional block diagram showing functions of a medical image processing apparatus for realizing notification according to another embodiment.
- In the embodiment shown in FIG. 21, notification using a notification sound is performed.
- A notification sound control unit 200 and a sound source 202 are added to the medical image processing apparatus 14 shown in FIG. 2, and a speaker 204 is added to the endoscope system 9.
- the notification sound control unit 200 outputs a notification sound generated using the sound source 202 via the speaker 204.
- the notification sound may be voice.
- the notification sound may apply a warning sound.
- The notification sound control unit 200 outputs the notification sound when the first feature region 80 of the CTC image 19 is associated with the second feature region 70 of the endoscopic image 37, for example, a region located within the observation range of the endoscope 10. When the first feature region 80 of the CTC image 19 is associated with the non-extraction region 76 of the endoscopic image 37, for example, a region located at a blind spot in the observation range of the endoscope 10, the notification sound may be emphasized. As an example of emphasizing the notification sound, the volume may be raised.
- the notification sound control unit 200, the sound source 202, and the speaker 204 are examples of components of the notification sound output unit.
- When the first feature region 80 of the CTC image 19 is associated with the non-extraction region 76 of the endoscopic image 37, notification using sound is performed. Thereby, notification can be performed without processing the endoscopic image 37 itself.
- the medical image processing apparatus 14 illustrated in FIG. 2 may include a CTC image generation unit that generates a CTC image 19 from a three-dimensional inspection image such as a CT image.
- the medical image processing apparatus 14 may acquire a three-dimensional inspection image via the CTC image acquisition unit 41a, and generate the CTC image 19 using the CTC image generation unit.
- the viewpoint P shown in FIG. 5 is not limited to above the path 19c.
- the viewpoint P can be set at an arbitrary position.
- the viewing direction of the viewpoint image 19 b can be arbitrarily set corresponding to the imaging direction of the endoscope 10.
- the viewpoint image 19 b may be a two-dimensional inspection image obtained by converting a three-dimensional inspection image of an arbitrary cross section of the entire image 19 a into a two-dimensional image.
- As one example, the extraction of the first feature region 80 may use the three-dimensional inspection image used to generate the CTC image 19.
- the first feature region 80 may be extracted and stored in advance.
- the pre-extracted first feature area 80 may be searchably stored using information on the position of the first feature area 80 as an index.
- the extraction of the second feature area may be performed by reproducing the moving image 38.
- a first example of a particular wavelength band is the blue or green band in the visible range.
- The wavelength band of the first example includes a wavelength band of 390 nm to 450 nm or 530 nm to 550 nm, and the light of the first example has a peak wavelength within the wavelength band of 390 nm to 450 nm or 530 nm to 550 nm.
- a second example of a particular wavelength band is the red band in the visible range.
- The wavelength band of the second example includes a wavelength band of 585 nm to 615 nm or 610 nm to 730 nm, and the light of the second example has a peak wavelength within the wavelength band of 585 nm to 615 nm or 610 nm to 730 nm.
- the third example of the specific wavelength band includes wavelength bands in which the absorption coefficient is different between oxygenated hemoglobin and reduced hemoglobin, and the light of the third example has peak wavelengths in wavelength bands where the absorption coefficient is different between oxygenated hemoglobin and reduced hemoglobin.
- The wavelength band of this third example includes wavelength bands of 400 ± 10 nm, 440 ± 10 nm, 470 ± 10 nm, or 600 nm to 750 nm, and the light of the third example has a peak wavelength in a wavelength band of 400 ± 10 nm, 440 ± 10 nm, 470 ± 10 nm, or 600 nm to 750 nm.
- a fourth example of the specific wavelength band is a wavelength band of excitation light which is used to observe fluorescence emitted from a fluorescent substance in the living body and which excites the fluorescent substance.
- An example is a wavelength band of 390 nm to 470 nm.
- observation of fluorescence may be called fluorescence observation.
- the fifth example of the specific wavelength band is a wavelength band of infrared light.
- The wavelength band of the fifth example includes a wavelength band of 790 nm to 820 nm or 905 nm to 970 nm, and the light of the fifth example has a peak wavelength in a wavelength band of 790 nm to 820 nm or 905 nm to 970 nm.
- The processor 12 may generate a special light image having information of a specific wavelength band based on a normal light image obtained by imaging with white light. The generation referred to here includes acquisition. In this case, the processor 12 functions as a special light image acquisition unit. The processor 12 obtains a signal of the specific wavelength band by performing an operation based on the red, green, and blue or cyan, magenta, and yellow color information contained in the normal light image.
- red, green and blue may be represented as RGB (Red, Green, Blue).
- cyan, magenta and yellow may be expressed as CMY (Cyan, Magenta, Yellow).
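- The operation on RGB color information can be sketched as a linear combination per pixel; the weights below are placeholders, since the disclosure does not specify the actual coefficients:

```python
# Hedged sketch of deriving a specific-wavelength-band signal from the RGB
# color information of a normal (white light) image. The linear weights are
# illustrative placeholders, not coefficients from this disclosure.

def special_band_signal(r, g, b, weights=(0.1, 0.7, 0.2)):
    """Approximate a narrow-band (e.g., green 530-550 nm) signal per pixel."""
    wr, wg, wb = weights
    return wr * r + wg * g + wb * b

print(special_band_signal(0, 255, 0))  # prints 178.5
```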
- the processor 12 may generate a feature image such as a known oxygen saturation image based on at least one of the normal light image and the special light image.
- The second feature region extraction unit 54 illustrated in FIG. 3 can update the extraction rule for the second feature region by performing machine learning using, as learning data, the correspondence between the first feature region 80 of the CTC image 19 and the non-extraction region 76 of the endoscopic image 37. The deep learning algorithm 65 shown in FIG. 2 is applied to the machine learning.
- the image processing method described above can be configured as a program that implements functions corresponding to the respective steps in the image processing method using a computer.
- a program that causes a computer to realize a CTC image input function, a first feature area extraction function, an endoscope image input function, a second feature area extraction function, an association function, and a storage function can be configured.
- the CTC image input function corresponds to the first image input function.
- the endoscope image input function corresponds to the second image input function.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Animal Behavior & Ethology (AREA)
- Radiology & Medical Imaging (AREA)
- Optics & Photonics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Endoscopes (AREA)
Abstract
The invention relates to an endoscope system, a notification method, and a program capable of preventing, during endoscopy using an endoscope, the oversight of lesions that may be difficult to detect. An endoscope system according to the present invention comprises: a first image input unit (41a) for inputting a virtual endoscopic image; a second image input unit (41b) for inputting a real endoscopic image; an association unit (58) for associating the virtual endoscopic image with the real endoscopic image; a first feature region extraction unit (50) for extracting, from the virtual endoscopic image, a first feature region satisfying a prescribed first condition; a second feature region extraction unit (54) for extracting, from the real endoscopic image, a second feature region satisfying a second condition corresponding to the first condition; and a notification unit (59) for notifying that the first feature region is not associated with the second feature region.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019550323A JP6840263B2 (ja) | 2017-10-31 | 2018-10-26 | 内視鏡システム及びプログラム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017210248 | 2017-10-31 | ||
| JP2017-210248 | 2017-10-31 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019087969A1 true WO2019087969A1 (fr) | 2019-05-09 |
Family
ID=66331877
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/039901 Ceased WO2019087969A1 (fr) | 2017-10-31 | 2018-10-26 | Système endoscope, procédé de rapport et programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6840263B2 (fr) |
| WO (1) | WO2019087969A1 (fr) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021171464A1 (fr) * | 2020-02-27 | 2021-09-02 | オリンパス株式会社 | Dispositif de traitement, système d'endoscope et procédé de traitement d'image capturée |
| JP2021141969A (ja) * | 2020-03-10 | 2021-09-24 | 独立行政法人国立病院機構 | 内視鏡装置 |
| JP2023513646A (ja) * | 2021-01-14 | 2023-04-03 | コ,ジファン | 内視鏡を用いた大腸検査ガイド装置及び方法 |
| WO2025004206A1 (fr) * | 2023-06-28 | 2025-01-02 | 日本電気株式会社 | Dispositif d'aide à l'examen endoscopique, procédé d'aide à l'examen endoscopique, et support d'enregistrement |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2013009956A (ja) * | 2011-06-01 | 2013-01-17 | Toshiba Corp | Medical image display device and medical image diagnostic device |
| JP2013150650A (ja) * | 2012-01-24 | 2013-08-08 | Fujifilm Corp | Endoscopic image diagnosis support device, method, and program |
| JP2014230612A (ja) * | 2013-05-28 | 2014-12-11 | Nagoya University | Endoscopic observation support device |
| JP2016143194A (ja) * | 2015-01-30 | 2016-08-08 | Ziosoft Inc | Medical image processing device, medical image processing method, and medical image processing program |
2018
- 2018-10-26 WO PCT/JP2018/039901 patent/WO2019087969A1/fr not_active Ceased
- 2018-10-26 JP JP2019550323A patent/JP6840263B2/ja active Active
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021171464A1 (fr) * | 2020-02-27 | 2021-09-02 | Olympus Corporation | Processing device, endoscope system, and method for processing a captured image |
| CN115209783A (zh) * | 2020-02-27 | 2022-10-18 | Olympus Corporation | Processing device, endoscope system, and method for processing a captured image |
| US12433478B2 (en) | 2020-02-27 | 2025-10-07 | Olympus Corporation | Processing device, endoscope system, and method for processing captured image |
| JP2021141969A (ja) * | 2020-03-10 | 2021-09-24 | National Hospital Organization | Endoscope device |
| JP2023513646A (ja) * | 2021-01-14 | 2023-04-03 | Ko, Jihwan | Colon examination guide device and method using an endoscope |
| JP7374224B2 (ja) | 2021-01-14 | 2023-11-06 | Ko, Jihwan | Colon examination guide device using an endoscope |
| JP2023178415A (ja) * | 2021-01-14 | 2023-12-14 | Ko, Jihwan | Colon examination guide device and method using an endoscope |
| JP7550947B2 (ja) | 2021-01-14 | 2024-09-13 | Ko, Jihwan | Colon examination guide device and method using an endoscope |
| JP2024160013A (ja) * | 2021-01-14 | 2024-11-08 | Ko, Jihwan | Colon examination guide device and method using an endoscope |
| WO2025004206A1 (fr) * | 2023-06-28 | 2025-01-02 | NEC Corporation | Endoscopic examination support device, endoscopic examination support method, and recording medium |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2019087969A1 (ja) | 2020-11-12 |
| JP6840263B2 (ja) | 2021-03-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7583100B2 (ja) | Medical image processing device, endoscope system, medical image processing system, operation method of medical image processing device, program, and storage medium | |
| JP7756621B2 (ja) | Endoscope system, operation method and program for medical image processing device, and recording medium | |
| JP6890184B2 (ja) | Medical image processing device and medical image processing program | |
| JP7346693B2 (ja) | Medical image processing device, processor device, endoscope system, operation method of medical image processing device, and program | |
| JP5675227B2 (ja) | Endoscopic image processing device, operation method, and program | |
| JP7630560B2 (ja) | Medical image processing device, medical image processing system, operation method of medical image processing device, and medical image processing program | |
| US11607109B2 (en) | Endoscopic image processing device, endoscopic image processing method, endoscopic image processing program, and endoscope system | |
| JP7335157B2 (ja) | Learning data creation device, operation method of learning data creation device, learning data creation program, and medical image recognition device | |
| JP5486432B2 (ja) | Image processing device, operation method thereof, and program | |
| JP6405138B2 (ja) | Image processing device, image processing method, and image processing program | |
| JP7125479B2 (ja) | Medical image processing device, operation method of medical image processing device, and endoscope system | |
| JP7050817B2 (ja) | Image processing device, processor device, endoscope system, operation method of image processing device, and program | |
| JP6840263B2 (ja) | Endoscope system and program | |
| US12131513B2 (en) | Medical image processing apparatus and medical image processing method for utilizing a classification result relating to a medical image | |
| US11481944B2 (en) | Medical image processing apparatus, medical image processing method, program, and diagnosis support apparatus | |
| JP7122328B2 (ja) | Image processing device, processor device, image processing method, and program | |
| CN116724334A (zh) | Computer program, learning model generation method, and surgery support device | |
| JP7148534B2 (ja) | Image processing device, program, and endoscope system | |
| US20230206445A1 (en) | Learning apparatus, learning method, program, trained model, and endoscope system | |
| WO2024185468A1 (fr) | Medical support device, endoscope system, medical support method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18873161; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2019550323; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18873161; Country of ref document: EP; Kind code of ref document: A1 |