WO2014156604A1 - Endoscope system, operation method therefor, and processor device - Google Patents
Endoscope system, operation method therefor, and processor device
- Publication number
- WO2014156604A1 (PCT/JP2014/056268)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- signal
- image signal
- image
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/063—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for monochromatic or narrow-band illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/043—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances for fluorescence imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
- A61B5/1455—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
- A61B5/1455—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
- A61B5/1459—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters invasive, e.g. introduced into the body by a catheter
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
Definitions
- a high-frequency component extraction unit that performs high-frequency filtering on the first signal image signal and the second signal image signal to extract their high-frequency components
- it is preferable that the first alignment unit calculate the positional deviation amount and perform the alignment on the basis of the first signal image signal and the second signal image signal after high-frequency filtering.
- a low-frequency component extraction unit is provided that performs low-frequency filtering on the aligned first reference image signal and second reference image signal to extract their low-frequency components, and it is preferable that the second alignment unit align the first reference image signal and the second reference image signal after low-frequency filtering.
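The high- and low-frequency extraction described in the bullets above amounts to spatial filtering of the image signals. The following is a minimal NumPy sketch under stated assumptions (the box-blur kernel and its size are illustrative, not taken from the specification):

```python
import numpy as np

def high_pass(img, k=5):
    """High-frequency components: the image minus a box-blurred
    (low-frequency) version of itself."""
    pad = k // 2
    padded = np.pad(np.asarray(img, dtype=float), pad, mode="edge")
    h, w = img.shape
    blurred = np.zeros((h, w))
    for dy in range(k):                      # box blur as the low-pass stage
        for dx in range(k):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= k * k
    return img - blurred

def low_pass(img, k=5):
    """Low-frequency components: the box-blurred image itself."""
    return np.asarray(img, dtype=float) - high_pass(img, k)
```

In this sketch, the first alignment unit would run its displacement search on `high_pass` outputs of the first and second signal image signals, while the second alignment unit would align `low_pass` outputs of the first and second reference image signals.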
- the processor unit 16 is electrically connected to the monitor 18 and the console 20.
- the monitor 18 outputs and displays image information and the like.
- the console 20 functions as a UI (user interface) that receives an input operation such as function setting.
- an external recording unit (not shown) that records image information and the like may be connected to the processor device 16.
- the light source device 14 includes, as light sources, a first blue laser light source (473 LD) 34 that emits first blue laser light having a central wavelength of 473 nm and a second blue laser light source (445 LD) 36 that emits second blue laser light having a central wavelength of 445 nm.
- light emission from the semiconductor light emitting elements of the light sources 34 and 36 is individually controlled by the light source control unit 40, so the light quantity ratio between the light emitted from the first blue laser light source 34 and the light emitted from the second blue laser light source 36 is changeable.
- the light source control unit 40 drives the second blue laser light source 36 to emit the second blue laser light.
- the laser beams emitted from the light sources 34 and 36 are incident on the light guide (LG) 41 via optical members such as a condenser lens, an optical fiber, and a multiplexer (none of which are shown).
- the light guide 41 is incorporated in a universal cord (not shown) that connects the light source device 14 and the endoscope 12.
- the light guide 41 propagates the laser light from each of the light sources 34 and 36 to the tip 24 of the endoscope 12.
- a multimode fiber can be used as the light guide 41.
- the distal end portion 24 of the endoscope 12 has an illumination optical system 24 a and an imaging optical system 24 b.
- a fluorescent body 44 and an illumination lens 45 are provided in the illumination optical system 24 a.
- the laser light from the light guide 41 is incident on the phosphor 44.
- the fluorescent substance 44 emits fluorescence by being irradiated with the first or second blue laser light.
- part of the first or second blue laser light passes through the phosphor 44 as it is.
- the light emitted from the phosphor 44 is irradiated into the sample through the illumination lens 45.
- the second white light as shown in FIG. 3 is irradiated into the sample.
- the second white light is composed of the second blue laser light and the green to red second fluorescence excited and emitted from the phosphor 44 by the second blue laser light. Therefore, the second white light has a wavelength range extending to the entire visible light range.
- the first white light and the second white light are alternately emitted.
- the alternately emitted first and second white light is irradiated into the sample.
- the first white light is composed of the first blue laser light and the green-to-red first fluorescence excited and emitted from the phosphor 44 by the first blue laser light. Therefore, the first white light has a wavelength range extending over the entire visible light range.
- the second white light is similar to the second white light emitted in the normal observation mode. As shown in FIG. 5, the first fluorescence and the second fluorescence have the same waveform.
- the phosphor 44 preferably contains plural kinds of phosphors (for example, a YAG-based phosphor or BAM (BaMgAl10O17)) that absorb part of the first and second blue laser light and are excited to emit light of plural colors from green to red.
- when a semiconductor light emitting element is used as the excitation light source of the phosphor 44, as in this configuration example, first and second white light of high intensity can be obtained with high luminous efficiency, the intensity of the white light can be easily adjusted, and changes in color temperature and chromaticity can be kept small.
- the imaging optical system 24 b of the endoscope 12 has an objective lens 46, a zooming lens 47, and an image sensor 48. Reflected light from the subject is incident on the image sensor 48 through the objective lens 46 and the zooming lens 47. Thereby, a reflection image of the subject is formed on the image sensor 48.
- the zooming lens 47 moves between the tele end and the wide end by operating the zoom operation unit 22 c. When the zooming lens 47 moves to the tele end side, the reflection image of the subject is enlarged, while by moving to the wide end side, the reflection image of the subject is reduced.
- An image sensor (imaging unit) 48 is a color image sensor, which captures a reflection image of a subject and outputs an image signal.
- the image sensor 48 is a charge coupled device (CCD) image sensor, a complementary metal-oxide semiconductor (CMOS) image sensor, or the like.
- the image sensor used in the present invention is an RGB image sensor having RGB pixels, with RGB color filters provided on the imaging surface, and outputs image signals of the three colors R, G, and B obtained by photoelectric conversion in each channel.
- when the first white light is irradiated into the sample in the special observation mode, the first blue laser light and part of the green component of the first fluorescence are incident on the B pixels, part of the green component of the first fluorescence is incident on the G pixels, and the red component of the first fluorescence is incident on the R pixels.
- since the first blue laser light has much higher emission intensity than the first fluorescence, most of the B image signal is occupied by the reflected-light component of the first blue laser light.
- the light components incident on the B, G, and R pixels when the second white light is irradiated into the sample in the special observation mode are the same as in the normal observation mode.
- the imaging control unit 49 performs imaging control of the image sensor 48 according to the observation mode.
- as shown in FIG. 7A, in the normal observation mode, the inside of the sample illuminated by the second white light is imaged by the color image sensor 48 every one-frame period, and the image sensor 48 outputs RGB image signals for each frame.
- in the special observation mode, the inside of the sample illuminated with the first white light is imaged by the color image sensor 48 in the first frame, and the inside of the sample illuminated with the second white light is imaged by the color image sensor 48 in the second frame.
- the period for one frame of the image sensor 48 includes an accumulation period, in which the reflected light from the sample is photoelectrically converted and the resulting charges are accumulated, and a readout period, in which the accumulated charges are read out and output as an image signal.
- the image signal output from the image sensor 48 is transmitted to the CDS / AGC circuit 50.
- the CDS/AGC circuit 50 performs correlated double sampling (CDS) and automatic gain control (AGC) on the analog image signal.
- a gamma conversion unit 51 performs gamma conversion on the image signal that has passed through the CDS/AGC circuit 50, yielding an image signal with a gradation suited to an output device such as the monitor 18.
- the gamma-converted image signal is converted into a digital image signal by an A/D converter 52.
- the A / D converted digital image signal is input to the processor unit 16.
- the image processing switching unit 60 transmits the digital image signal to the normal light image processing unit 62 when the normal observation mode is set by the mode switching SW 22b, and transmits it to the special light image processing unit 64 when the special observation mode is set.
- a digital image signal before image processing by the normal light image processing unit 62 and the special light image processing unit 64 is referred to as an image signal
- a digital image signal after image processing is referred to as image data.
- the color emphasizing unit 70 performs various color emphasizing processing on the color converted RGB image data.
- the structure emphasizing unit 72 performs structure emphasizing processing such as spatial frequency emphasizing on the color emphasizing processed RGB image data.
- the RGB image data subjected to the structure emphasizing process by the structure emphasizing unit 72 is input from the normal light image processing unit 62 to the image display signal generating unit 66 as a normal light image.
- the special light image processing unit 64 includes an oxygen saturation image generation unit 76, which generates an oxygen saturation image based on the two input frames of image signals (the B1 and R1 image signals and the G2 and R2 image signals), and a structure emphasizing unit 78, which performs structure emphasizing processing such as spatial frequency emphasis.
- the RGB image data subjected to the structure emphasis processing by the structure emphasis unit 78 is input from the special light image processing unit 64 to the image display signal generation unit 66 as a special light image.
- the search condition setting unit 80a sets various conditions (search conditions) for the search points used in the alignment processing by the first and second alignment units 80b and 80c. After the search conditions are set, as shown in FIG. 9, the first alignment unit 80b calculates the amount of displacement of the sample image between the B1 image signal and the G2 image signal, and aligns the sample images of the B1 image signal and the G2 image signal.
- the B1 image signal is deformed so as to align it with the G2 image signal, and the B1 image signal after alignment is referred to as the “B1a image signal”.
- the second alignment unit 80c aligns the sample image between the R1 image signal and the R2 image signal based on the positional shift amount calculated by the first alignment unit 80b.
- the reference signal ratio calculation unit 80d calculates the reference signal ratio used to correct the G2 image signal and the R2 image signal from the aligned R1 image signal and the R2 image signal.
- the correction unit 80e corrects the G2 image signal and the R2 image signal to be equal to the image signal obtained under the reference inter-frame intensity ratio. Thereby, the G2a image signal and the R2a image signal are obtained.
- the B1a image signal, the G2a image signal, and the R2a image signal obtained by the above-described series of processes are used to calculate the oxygen saturation.
- the search condition setting unit 80a performs, as a search condition, position setting of a reference point, position setting of a search point, and setting of a search range.
- reference points P1 to P9 are provided at predetermined positions of nine areas A1 to A9 (3 ⁇ 3) in the B1 image signal.
- the search points D1 to D9 are provided at the same pixel positions as the reference points P1 to P9 in nine areas A1 to A9 (3 ⁇ 3) in the G2 image signal, as shown in FIG. 10B.
- These search points D1 to D9 are translated (searched) in the X direction or the Y direction within a predetermined search range within nine areas A1 to A9 in the G2 image signal.
- as shown in FIG. 12A, when one of the areas A1 to A9 contains a dark part BP in which the pixel values are equal to or less than a fixed value (“area A5” in FIG. 12A), the reference point P5 and the search point D5 are set in a part of that dark area other than the dark part BP. This is because, if the reference point P5 and the search point D5 were set inside the dark part BP, the target point T could not be detected reliably by the search, and the alignment could not be performed accurately.
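The reference-point/search-point matching with dark-part exclusion described above can be sketched as a simple block search; the window sizes, the sum-of-absolute-differences (SAD) cost, and the dark threshold below are illustrative assumptions, not values from the specification:

```python
import numpy as np

def find_shift(ref, tgt, point, patch=4, search=3, dark_thresh=16.0):
    """SAD block search: translate a search window around `point` in
    `tgt` to find the offset best matching the patch around the
    reference point in `ref`.  Returns None when the reference patch
    lies in a dark part (mean pixel value at or below the threshold)."""
    y, x = point
    block = np.asarray(ref, dtype=float)[y - patch:y + patch, x - patch:x + patch]
    if block.mean() <= dark_thresh:          # dark part: skip this point
        return None
    tgt = np.asarray(tgt, dtype=float)
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = tgt[y + dy - patch:y + dy + patch,
                       x + dx - patch:x + dx + patch]
            err = np.abs(block - cand).sum() # sum of absolute differences
            if err < best_err:
                best, best_err = (dx, dy), err
    return best
```

The offset found for the B1/G2 pair would then be reused to align the R1 and R2 image signals, as the specification describes.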
- for example, as shown in FIG. 12B, a partial area can become dark when a protrusion 86 appears in the lumen in front of the distal end portion 24 of the endoscope 12.
- the brightness of each area is calculated
- the first alignment unit 80b includes a high-frequency filtering unit (high-frequency component extraction unit) HF that performs high-frequency filtering processing on the B1 image signal and the G2 image signal.
- high-frequency component information on structures (for example, blood vessel structures) that serve as landmarks at the time of alignment is sharply extracted from the B1 image signal and the G2 image signal by the high-frequency filtering processing, so the target point T can be detected with high accuracy by the search points D1 to D9.
- the B1 image signal and the G2 image signal contain many wavelength components for which the absorption coefficient of the absorber in the mucous membrane (hemoglobin) is high, so they include many images of structures, such as blood vessels, that can serve as landmarks. Detection of the target point T by the search points D is therefore easy for the B1 and G2 image signals, and the movement amount of the search points D, that is, the positional shift amount, can be obtained accurately. For this reason, the alignment between the R1 image signal and the R2 image signal uses not the positional displacement between the R1 and R2 image signals but the positional displacement between the B1 image signal and the G2 image signal.
- the measurement signal ratio calculation unit 81 calculates, for each pixel, the measurement signal ratio B1/G2 between the B1a image signal and the G2a image signal and the measurement signal ratio R2/G2 between the R2a image signal and the G2a image signal.
- since the B1a, G2a, and R2a image signals used to calculate the measurement signal ratios B1/G2 and R2/G2 have been corrected by the signal correction processing in the signal correction unit 80, the oxygen saturation can be calculated accurately from the measurement signal ratios B1/G2 and R2/G2.
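The per-pixel measurement signal ratios can be sketched as follows; the log scale matches how the correlation is stated to be stored, while the function name and the `eps` guard against division by zero are assumptions:

```python
import numpy as np

def measurement_ratios(b1a, g2a, r2a, eps=1e-6):
    """Per-pixel measurement signal ratios B1/G2 and R2/G2, returned
    in log scale to match how the correlation is stored."""
    b1a, g2a, r2a = (np.maximum(np.asarray(a, dtype=float), eps)
                     for a in (b1a, g2a, r2a))
    return np.log(b1a / g2a), np.log(r2a / g2a)
```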
- the correlation storage unit 82 stores the correlation between the measurement signal ratios B1 / G2 and R2 / G2 and the oxygen saturation.
- This correlation is stored as a two-dimensional table in which isolines of oxygen saturation are defined in the two-dimensional space shown in FIG.
- the position and shape of this contour line are obtained by physical simulation of light scattering, and are defined to change according to blood volume. For example, when there is a change in blood volume, intervals between contour lines become wide or narrow.
- the measurement signal ratios B1 / G2 and R2 / G2 are stored in log scale.
- a graph 90 shows the absorption coefficient of oxygenated hemoglobin and a graph 91 shows the absorption coefficient of reduced hemoglobin.
- a wavelength with a large difference in absorption coefficient such as the central wavelength 473 nm of the first blue laser light
- a B1a image signal including a signal corresponding to 473 nm light is highly dependent not only on oxygen saturation but also on blood volume.
- by using the measurement signal ratios B1/G2 and R2/G2, obtained from the B1a image signal, the R2a image signal, which corresponds to light that changes mainly depending on the blood volume, and the G2a image signal, which serves as the reference signal for the B1a and R2a image signals, the oxygen saturation can be determined accurately without depending on the blood volume.
- the oxygen saturation calculation unit 83 refers to the correlation stored in the correlation storage unit 82 and finds the oxygen saturation corresponding to the measurement signal ratios B1/G2 and R2/G2 determined by the measurement signal ratio calculation unit 81. The oxygen saturation is calculated for each pixel.
- the oxygen saturation calculation unit 83 calculates the oxygen saturation as follows. For example, when the measurement signal ratios at a given pixel are B1*/G2* and R2*/G2*, the oxygen saturation corresponding to B1*/G2* and R2*/G2* in the correlation shown in FIG. 17 is 60%, so the oxygen saturation of that pixel is calculated as “60%”.
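The table lookup described above can be sketched with a toy two-dimensional correlation table; the real table is obtained by physical simulation of light scattering, so the grid, values, and nearest-neighbour lookup below are purely illustrative:

```python
import numpy as np

# Toy correlation: a 21x21 grid over log-scale ratios in [-1, 1],
# with saturation varying only along the B1/G2 axis for simplicity.
ratio_axis = np.linspace(-1.0, 1.0, 21)
sat_table = np.tile(np.linspace(0.0, 100.0, 21), (21, 1)).T

def lookup_saturation(b1g2, r2g2):
    """Nearest-neighbour lookup of oxygen saturation (%) from the two
    log-scale measurement signal ratios."""
    i = int(np.abs(ratio_axis - b1g2).argmin())
    j = int(np.abs(ratio_axis - r2g2).argmin())
    return sat_table[i, j]
```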
- the measurement signal ratios B1/G2 and R2/G2 are calculated based on the B1a image signal and the G2a and R2a image signals corrected by the signal correction unit 80.
- the measurement signal ratios B1/G2 and R2/G2 therefore hardly fluctuate. That is, in the correlation, the measurement signal ratios B1/G2 and R2/G2 almost never go beyond the lower limit line 93 of 0% oxygen saturation or beyond the upper limit line 94 of 100% oxygen saturation.
- when the measurement signal ratios B1/G2 and R2/G2 are positioned above the lower limit line 93 in the correlation, the oxygen saturation is set to 0%, and when they are positioned below the upper limit line 94, the oxygen saturation is set to 100%. If the corresponding point falls outside the range between the lower limit line 93 and the upper limit line 94, the reliability of the oxygen saturation at that pixel may be lowered and the value may not be displayed.
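The clamping and reliability handling above might be sketched as follows; the out-of-range mask would come from testing the ratios against the limit lines, and marking unreliable pixels with NaN as "do not display" is an assumption of this sketch:

```python
import numpy as np

def clamp_saturation(sat_raw, out_of_range):
    """Clamp computed saturation to the physical range at the limit
    lines (below 0% -> 0%, above 100% -> 100%) and mark pixels whose
    ratios fall outside the region between the limit lines as NaN so
    they can be excluded from display."""
    sat = np.clip(np.asarray(sat_raw, dtype=float), 0.0, 100.0)
    return np.where(np.asarray(out_of_range, dtype=bool), np.nan, sat)
```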
- the image generation unit 84 generates an oxygen saturation image, in which the oxygen saturation is visualized, using the oxygen saturation calculated by the oxygen saturation calculation unit 83 and the B2, G2, and R2 image signals.
- the image generation unit 84 applies a gain according to the oxygen saturation to the B2 image signal, the G2 image signal, and the R2 image signal. For example, when the oxygen saturation is 60% or more, the same gain of “1” is applied to all of the B2, G2, and R2 image signals. On the other hand, when the oxygen saturation is less than 60%, a gain of less than “1” is applied to the B2 image signal, while a gain of “1” or more is applied to the G2 image signal and the R2 image signal.
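The saturation-dependent gain step can be sketched as follows; the 60% threshold follows the example above, while the specific gain values and function name are assumptions:

```python
import numpy as np

def pseudo_color(b2, g2, r2, sat, thresh=60.0, low_gain=0.7, high_gain=1.2):
    """Apply oxygen-saturation-dependent gains: pixels at or above the
    threshold keep gain 1 on all channels; hypoxic pixels have the B
    channel suppressed and the G and R channels boosted."""
    b2, g2, r2, sat = (np.asarray(a, dtype=float) for a in (b2, g2, r2, sat))
    low = sat < thresh                      # hypoxic pixels
    return (np.where(low, b2 * low_gain, b2),
            np.where(low, g2 * high_gain, g2),
            np.where(low, r2 * high_gain, r2))
```

The three returned channels would then be assigned to the BGR image data, as the next bullet describes.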
- the B2 image signal, the G2 image signal, and the R2 image signal after the gain processing are allocated to the BGR image data.
- in the normal observation mode, screening is performed from the distant-view state.
- a normal light image is displayed on the monitor 18.
- the mode switching SW 22b is operated to switch to the special observation mode.
- in this special observation mode, it is diagnosed whether or not a potential lesion site is hypoxic.
- the first and second white lights are alternately emitted.
- the image sensor 48 images the sample illuminated by the first white light and outputs the B1, G1, and R1 image signals, and images the sample illuminated by the second white light and outputs the B2, G2, and R2 image signals. These two frames of image signals are used to create one frame of the oxygen saturation image.
- the reference points P1 to P9 and the search points D1 to D9 are set, and the search range of the search points D1 to D9 is set. Then, in each of the areas A1 to A9, the search points D1 to D9 are searched within the set search range. Then, the amount of movement when the search points D1 to D9 detect the target point T is taken as the amount of positional deviation between frames.
- the alignment between the B1 image signal and the G2 image signal is performed based on the positional deviation amount between the frames.
- the generated oxygen saturation image is displayed on the monitor 18 as a special light image. Based on the oxygen saturation image displayed on the monitor 18, the doctor confirms whether or not the lesion site is hypoxic. The oxygen saturation image is displayed continuously until the mode is switched back to the normal observation mode. When the diagnosis is finished, the insertion portion 21 of the endoscope 12 is withdrawn from the sample.
- the phosphor 44 is provided at the distal end portion 24 of the endoscope 12.
- the phosphor 44 is provided between the first blue laser light source (473 LD) 34 and the second blue laser light source (445 LD) 36 and the light guide 41.
- the first blue laser light source 34 or the second blue laser light source 36 irradiates the phosphor 44 with the first blue laser light or the second blue laser light.
- the first white light or the second white light is emitted.
- the first or second white light is irradiated into the sample through the light guide 41.
- the endoscope system 100 is similar to the endoscope system 10.
- the first and second blue laser beams are incident on the same phosphor 44.
- the first blue laser beam and the second blue laser beam may instead be made incident on a first phosphor and a second phosphor, respectively.
- in this case, it is necessary that the first red component of the fluorescence emitted from the first phosphor that is incident on the R pixels of the image sensor 48 and the second red component of the fluorescence emitted from the second phosphor that is incident on the R pixels have the same waveform.
- an LED light source unit 202 and an LED light source control unit 204 are provided in the light source device 14 of the endoscope system 200.
- the phosphor 44 is not provided in the illumination optical system 24a of the endoscope system 200. Otherwise, the system is the same as the endoscope system 10 of the first embodiment.
- the LED light source unit 202 has four LEDs as light sources that emit light limited to a specific wavelength range.
- the LED light source unit 202 includes an LED (B) that emits blue-band light B (hereinafter simply called blue light) in the blue region of 400 to 500 nm, an LED (473) that emits blue narrowband light limited to 473 nm ± 10 nm, an LED (G) that emits green light, and an LED (R) that emits only red light.
- the LED light source unit 202 may also be provided with a plurality of LEDs that emit narrowband light in slightly different wavelength ranges so that light of a broad wavelength range is obtained as a whole.
- the “first signal light” corresponds to the “blue narrowband light Nb”, and the “first reference light” corresponds to the “red light R”. Therefore, the “first light source” of the present invention includes the LED (473) and the LED (R).
- the “second signal light” corresponds to the "green light G”
- the “second reference light” corresponds to the “red light R”. Therefore, the “second light source” of the present invention includes the LED (G) and the LED (R).
- the LED light source control unit 204 controls each of the LEDs of the LED light source unit 202 individually. In the normal observation mode, the LED light source control unit 204 drives the LEDs (B), (G), and (R). In the special observation mode, with the LEDs (G) and (R) kept lit at all times, the LED (473) and the LED (B) are controlled to light alternately.
- the imaging control unit 49 performs the following imaging control for each observation mode.
- in the normal observation mode, the color image sensor 48 images the inside of the subject illuminated simultaneously with blue light B, green light G, and red light R every one-frame period; that is, a step of accumulating charges obtained by photoelectric conversion of the blue light B, green light G, and red light R and a step of reading out the accumulated charges as a B image signal, a G image signal, and an R image signal are performed.
- Such an operation is repeatedly performed while the normal observation mode is set. Then, based on the image signals for one frame, a normal light image is generated by the same method as the first embodiment.
- the reference signal ratio C, indicating the ratio between the R1 image signal obtained under the red light emission of the first frame and the R2 image signal obtained under the red light emission of the second frame, increases or decreases in step with the actual inter-frame intensity ratio. Therefore, also in the second embodiment, the reference signal ratio C accurately represents the actual inter-frame intensity ratio.
- instead of the first and second blue laser light sources 34 and 36 and the light source control unit 40, a broadband light source 302, a rotary filter 304, and a filter switching unit 305 are provided. Otherwise, the system is the same as the endoscope system 10 of the first embodiment.
- the broadband light source 302 is a xenon lamp, a white LED, or the like, and emits white light in a wavelength range ranging from blue to red.
- the rotary filter 304 rotates around the rotation axis 304a, and includes a normal observation mode filter 308 provided on the inner side and a special observation mode filter 309 provided on the outer side (see FIG. 23).
- the filter switching unit 305 moves the rotary filter 304 in the radial direction, and inserts the normal observation mode filter 308 of the rotary filter 304 into the optical path of the white light when the mode switching switch 22 b is set to the normal observation mode.
- when the special observation mode is set, the special observation mode filter 309 of the rotary filter 304 is inserted into the optical path of the white light.
- the rotating shaft 304a is supported by two support rods 304b.
- the normal observation mode filter 308 is provided with an opening 308a that transmits the white light as it is. Therefore, in the normal observation mode, the white light is irradiated into the subject.
- the special observation mode filter 309 is provided, along the circumferential direction, with a band pass filter (BPF) 309a that transmits band-limited light (473, GR) of a predetermined band of the white light, and an opening 309b that transmits the white light as it is. Therefore, in the special observation mode, the band-limited light (473, GR) and the white light are alternately irradiated into the subject by rotating the rotary filter 304.
- the rotation filter 304 and a drive unit (not shown) that controls the rotation speed of the rotation filter 304 constitute the “light source control unit” of the present invention.
- the band pass filter 309a transmits light at 473 ± 10 nm and at 500 to 700 nm (green region to red region), and blocks the other wavelengths. Therefore, the band-limited light (473, GR) has wavelengths of 473 ± 10 nm and 500 to 700 nm.
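The pass bands of the band pass filter 309a can be written as a simple predicate; a sketch, with an illustrative function name:

```python
def bpf_309a_passes(wavelength_nm):
    """True if the wavelength falls inside a pass band of the band pass
    filter 309a: 473 +/- 10 nm (blue narrow band) or 500-700 nm (green
    region to red region). All other wavelengths are blocked."""
    return (463 <= wavelength_nm <= 483) or (500 <= wavelength_nm <= 700)


assert bpf_309a_passes(473)      # centre of the blue narrow band
assert not bpf_309a_passes(445)  # blue outside the narrow band: blocked
assert bpf_309a_passes(650)      # red region: transmitted
assert not bpf_309a_passes(720)  # beyond the red pass band: blocked
```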
- the “first signal light” corresponds to “the light which enters the B pixel of the image sensor 48 in the band-limited light”
- the “first reference light” corresponds to “the light which enters the R pixel of the image sensor 48 in the band-limited light”.
- the “first light source” of the present invention includes the broadband light source 302 and the band pass filter 309a.
- the “second signal light” corresponds to “the light which is incident on the G pixel of the image sensor 48 in the white light”
- the “second reference light” corresponds to “the light which enters the R pixel of the image sensor 48 in the white light”.
- the “second light source” of the present invention includes the broadband light source 302.
- the imaging control unit 49 performs the following imaging control for each observation mode. As shown in FIG. 25A, in the normal observation mode, a step of accumulating charges obtained by photoelectric conversion of the white light every one-frame period, and a step of reading out the accumulated charges as a B image signal, a G image signal, and an R image signal are performed. Such an operation is repeated while the normal observation mode is set. Then, based on the image signals for one frame, a normal light image is generated by the same method as in the first embodiment.
- the red light component received by the R pixel of the image sensor under the band-limited light of the first frame and the red light component received by the R pixel under the white light of the second frame are emitted from the same broadband light source 302; therefore, the respective waveforms are the same, and their intensity ratio is the same regardless of the wavelength. Accordingly, the reference signal ratio C, which indicates the ratio between the R1 image signal obtained at the time of the first-frame band-limited light emission and the R2 image signal obtained at the time of the second-frame white light emission, increases or decreases in step with the actual inter-frame intensity ratio. Therefore, also in the third embodiment, the reference signal ratio C accurately represents the inter-frame intensity ratio.
- in the third embodiment, the endoscope system 300 provided with the broadband light source 302, the rotary filter 304, and the filter switching unit 305 has been described; however, as shown in FIG. 26, the endoscope system 400 of the fourth embodiment may instead be provided with a rotary filter 404, a semiconductor light source LD (473) 406, a semiconductor light source control unit 408, and a light merging unit 410. In other respects, it is the same as the endoscope system 300 of the third embodiment.
- the rotary filter 404 rotates around the rotation axis 404a, and includes a normal observation mode filter 412 provided on the inner side and a special observation mode filter 413 provided on the outer side.
- the normal observation mode filter 412 is provided with an opening 412a that transmits the white light as it is. Therefore, in the normal observation mode, the white light is irradiated into the subject.
- the rotating shaft 404a is supported by two support rods 404b.
- the special observation mode filter 413 is provided, along the circumferential direction, with a band pass filter (BPF) 413a that transmits band-limited light (GR) of a predetermined band of the white light and an opening 413b that transmits the white light as it is. Therefore, in the special observation mode, the band-limited light (GR) and the white light are alternately irradiated into the subject by rotating the rotary filter 404. As shown in FIG. 28, the band pass filter 413a transmits light in the wavelength range of 500 to 700 nm and blocks the other wavelengths. Thus, the band-limited light (GR) has wavelengths of 500 to 700 nm.
- the semiconductor light source LD (473) 406 emits blue narrow band light Nb of 473 nm ⁇ 10 nm.
- the semiconductor light source control unit 408 acquires a detection signal from a sensor (not shown) that detects the rotation of the rotary filter 404 and, in accordance with the acquired detection signal, performs control of the semiconductor light source LD (473) 406, such as its drive timing, synchronization timing, and turning on and off.
- the semiconductor light source control unit 408 does not emit the blue narrowband light Nb during the irradiation period in which the white light is irradiated into the subject, and emits the blue narrowband light Nb during the irradiation period in which the band-limited light (GR) is irradiated into the subject.
- the rotary filter 404, a drive unit (not shown) that controls the rotation speed of the rotary filter 404, and the semiconductor light source control unit 408 constitute the “light source control unit” of the present invention.
- the light merging unit 410 is formed of a dichroic mirror; it transmits the light from the rotary filter 404 so that it enters the LG 41, and reflects the light from the blue semiconductor light source LD (473) 406 so that it also enters the LG 41.
- the “first signal light” corresponds to the “blue narrowband light Nb”, and the “first reference light” corresponds to the “light that enters the R pixel of the image sensor 48 in the band-limited light”.
- the “first light source” of the present invention includes the broadband light source 302, the semiconductor light source LD (473) 406, and the band pass filter (BPF) 413a.
- the “second signal light” corresponds to “the light which is incident on the G pixel of the image sensor 48 in the white light”.
- the “second reference light” corresponds to “the light which is incident on the R pixel of the image sensor 48 in the white light”.
- the “second light source” of the present invention includes the broadband light source 302.
- in the normal observation mode, a step of accumulating charges obtained by photoelectric conversion of the white light every one-frame period, and a step of reading out the accumulated charges as a B image signal, a G image signal, and an R image signal are performed. Such an operation is repeated while the normal observation mode is set. Then, based on the image signals for one frame, a normal light image is generated by the same method as in the first embodiment.
- the red light component received by the R pixel of the image sensor under the band-limited light of the first frame and the red light component received by the R pixel under the white light of the second frame are emitted from the same broadband light source 302; therefore, the respective waveforms are the same, and their intensity ratio is the same regardless of the wavelength. Accordingly, the reference signal ratio C, which indicates the ratio between the R1 image signal obtained at the time of the first-frame band-limited light emission and the R2 image signal obtained at the time of the second-frame white light emission, increases or decreases in step with the actual inter-frame intensity ratio. Therefore, also in the fourth embodiment, the reference signal ratio C accurately represents the inter-frame intensity ratio.
- although the oxygen saturation is calculated from the two measurement signal ratios B1/G2 and R2/G2 in the above embodiments, the oxygen saturation may be calculated from the measurement signal ratio B1/G2 alone. In that case, a correlation storage unit storing the correlation between the measurement signal ratio B1/G2 and the oxygen saturation is used to calculate the oxygen saturation.
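Such a correlation storage unit can be sketched as a one-dimensional lookup table from the measurement signal ratio B1/G2 to the oxygen saturation. The sample values below are invented for illustration, since the actual correlation is determined experimentally:

```python
import bisect

# Hypothetical correlation: (B1/G2 ratio, oxygen saturation %) samples,
# sorted by ratio. Real values would come from the correlation storage unit.
_RATIOS = [0.5, 1.0, 1.5, 2.0, 2.5]
_SATURATIONS = [100.0, 80.0, 55.0, 30.0, 0.0]


def oxygen_saturation(b1_over_g2):
    """Linearly interpolate the oxygen saturation (%) for a B1/G2 signal
    ratio, clamping to the ends of the stored correlation."""
    if b1_over_g2 <= _RATIOS[0]:
        return _SATURATIONS[0]
    if b1_over_g2 >= _RATIOS[-1]:
        return _SATURATIONS[-1]
    i = bisect.bisect_right(_RATIOS, b1_over_g2)
    t = (b1_over_g2 - _RATIOS[i - 1]) / (_RATIOS[i] - _RATIOS[i - 1])
    return _SATURATIONS[i - 1] + t * (_SATURATIONS[i] - _SATURATIONS[i - 1])
```

Applying this per pixel to the (corrected) B1/G2 ratio image yields the oxygen saturation image.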
- although imaging of the oxygen saturation is performed in the above embodiments, imaging of the blood volume may be performed instead.
- the blood volume has a correlation with the measurement signal ratio R2 / G2 obtained by the measurement signal ratio calculation unit. Therefore, by assigning different colors according to the measurement signal ratio R2 / G2, it is possible to create a blood volume image in which the blood volume is imaged.
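Assigning different colors according to the measurement signal ratio R2/G2 can be sketched as a simple pseudo-color mapping. The gradient endpoints `lo` and `hi`, and the yellow-to-red gradient itself, are illustrative assumptions, not values from the patent:

```python
def blood_volume_color(r2_over_g2, lo=1.0, hi=3.0):
    """Map a measurement signal ratio R2/G2 to an (R, G, B) display color.
    Low blood volume -> yellow, high blood volume -> deep red; `lo` and
    `hi` are hypothetical ends of the expected ratio range."""
    t = (r2_over_g2 - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0)                    # clamp to [0, 1]
    return (255, int(round(255 * (1 - t))), 0)   # fade the green channel out


assert blood_volume_color(0.5) == (255, 255, 0)   # below range: yellow
assert blood_volume_color(10.0) == (255, 0, 0)    # above range: red
```

Applying this mapping per pixel to the R2/G2 ratio image produces the blood volume image described above.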
- besides the oxygen saturation, which is the ratio of oxygenated hemoglobin to the blood volume (the sum of oxygenated hemoglobin and reduced hemoglobin), other biological function information may be calculated, such as an oxygenated hemoglobin index obtained from “blood volume × oxygen saturation (%)” or a reduced hemoglobin index obtained from “blood volume × (100 − oxygen saturation) (%)”.
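The two hemoglobin indices are simple products of the blood volume and the saturation fractions. A minimal sketch (dividing by 100 to treat the percentage as a fraction of the blood volume is an interpretive assumption about the formulas' normalization):

```python
def oxygenated_hb_index(blood_volume, oxygen_saturation_pct):
    """Oxygenated hemoglobin index: blood volume x oxygen saturation (%).
    The /100 normalization is an assumption, not stated in the source."""
    return blood_volume * oxygen_saturation_pct / 100.0


def reduced_hb_index(blood_volume, oxygen_saturation_pct):
    """Reduced hemoglobin index: blood volume x (100 - oxygen saturation) (%)."""
    return blood_volume * (100.0 - oxygen_saturation_pct) / 100.0


# With this normalization the two indices always sum back to the blood
# volume, mirroring "blood volume = oxygenated Hb + reduced Hb".
bv, sat = 40.0, 70.0
assert oxygenated_hb_index(bv, sat) + reduced_hb_index(bv, sat) == bv
```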
Abstract
The invention concerns: an endoscope system that accurately calculates the oxygen saturation of blood hemoglobin even when an image includes regions of different brightness; an operating method therefor; and a processor device. First white light, which includes first blue laser light and first fluorescence, and second white light, which includes second blue laser light and second fluorescence, are alternately applied inside a subject. Among the image signals obtained by imaging the subject with an image sensor, the B1 image signal, the G2 image signal, the R1 image signal, and the R2 image signal are used to calculate the oxygen saturation. The movement of the subject is calculated from the B1 image signal and the G2 image signal. Based on this movement of the subject, the R1 image signal and the R2 image signal are aligned. From the aligned R1 image signal and R2 image signal, a reference signal ratio (R2/R1) is calculated. Based on this reference signal ratio, the G2 image signal or the R2 image signal is corrected. Based on the B1 image signal and the corrected G2 image signal or corrected R2 image signal, the oxygen saturation is calculated.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2013064663A JP5997643B2 (ja) | 2013-03-26 | 2013-03-26 | 内視鏡システム及びプロセッサ装置並びに作動方法 |
| JP2013-064663 | 2013-03-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014156604A1 true WO2014156604A1 (fr) | 2014-10-02 |
Family
ID=51623587
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2014/056268 Ceased WO2014156604A1 (fr) | 2013-03-26 | 2014-03-11 | Système d'endoscope, procédé opérationnel s'y rapportant et dispositif de traitement |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP5997643B2 (fr) |
| WO (1) | WO2014156604A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016202380A (ja) * | 2015-04-17 | 2016-12-08 | Hoya株式会社 | 画像処理装置および内視鏡装置 |
| EP3446619A4 (fr) * | 2016-04-21 | 2019-05-29 | FUJIFILM Corporation | Système d'endoscope, dispositif de processeur, et procédé de fonctionnement de système d'endoscope |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6616071B2 (ja) * | 2014-12-22 | 2019-12-04 | 富士フイルム株式会社 | 内視鏡用のプロセッサ装置、内視鏡用のプロセッサ装置の作動方法、内視鏡用の制御プログラム |
| JP6744712B2 (ja) | 2015-12-17 | 2020-08-19 | 富士フイルム株式会社 | 内視鏡システム、プロセッサ装置、及び内視鏡システムの作動方法 |
| JP6744713B2 (ja) * | 2015-12-17 | 2020-08-19 | 富士フイルム株式会社 | 内視鏡システム、プロセッサ装置、及び内視鏡システムの作動方法 |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2008149674A1 (fr) * | 2007-06-05 | 2008-12-11 | Olympus Corporation | Dispositif de traitement d'image, programme de traitement d'image et procédé de traitement d'image |
| JP2009011563A (ja) * | 2007-07-04 | 2009-01-22 | Olympus Corp | 画像処理装置および画像処理プログラム |
| JP2011200517A (ja) * | 2010-03-26 | 2011-10-13 | Fujifilm Corp | 電子内視鏡システム |
| JP2011217886A (ja) * | 2010-04-07 | 2011-11-04 | Olympus Corp | 蛍光観察装置 |
| JP2012125402A (ja) * | 2010-12-15 | 2012-07-05 | Fujifilm Corp | 内視鏡システム、内視鏡システムのプロセッサ装置及び機能情報取得方法 |
| JP2012205619A (ja) * | 2011-03-29 | 2012-10-25 | Olympus Medical Systems Corp | 画像処理装置、制御装置、内視鏡装置、画像処理方法及び画像処理プログラム |
| JP2013013656A (ja) * | 2011-07-06 | 2013-01-24 | Fujifilm Corp | 内視鏡システム、内視鏡システムのプロセッサ装置、及び画像表示方法 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016202380A (ja) * | 2015-04-17 | 2016-12-08 | Hoya株式会社 | 画像処理装置および内視鏡装置 |
| EP3446619A4 (fr) * | 2016-04-21 | 2019-05-29 | FUJIFILM Corporation | Système d'endoscope, dispositif de processeur, et procédé de fonctionnement de système d'endoscope |
| US11044416B2 (en) | 2016-04-21 | 2021-06-22 | Fujifilm Corporation | Endoscope system, processor device, and endoscope system operation method |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2014188083A (ja) | 2014-10-06 |
| JP5997643B2 (ja) | 2016-09-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5303012B2 (ja) | 内視鏡システム、内視鏡システムのプロセッサ装置及び内視鏡システムの作動方法 | |
| JP5654511B2 (ja) | 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法 | |
| JP5992936B2 (ja) | 内視鏡システム、内視鏡システム用プロセッサ装置、内視鏡システムの作動方法、内視鏡システム用プロセッサ装置の作動方法 | |
| EP2366327B1 (fr) | Système d'endoscope électronique et procédé d'acquisition d'informations relatives aux vaisseaux sanguins | |
| JP5977772B2 (ja) | 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法 | |
| US10335014B2 (en) | Endoscope system, processor device, and method for operating endoscope system | |
| JP5887367B2 (ja) | プロセッサ装置、内視鏡システム、及び内視鏡システムの作動方法 | |
| JP6092792B2 (ja) | 内視鏡システム用プロセッサ装置、内視鏡システム、内視鏡システム用プロセッサ装置の作動方法、内視鏡システムの作動方法 | |
| JP5670264B2 (ja) | 内視鏡システム、及び内視鏡システムの作動方法 | |
| JP5808031B2 (ja) | 内視鏡システム | |
| JP5932748B2 (ja) | 内視鏡システム | |
| JP6008812B2 (ja) | 内視鏡システム及びその作動方法 | |
| JP5997643B2 (ja) | 内視鏡システム及びプロセッサ装置並びに作動方法 | |
| JP5930474B2 (ja) | 内視鏡システム及びその作動方法 | |
| JP5881658B2 (ja) | 内視鏡システム及び光源装置 | |
| JP6245710B2 (ja) | 内視鏡システム及びその作動方法 | |
| JP5829568B2 (ja) | 内視鏡システム、画像処理装置、画像処理装置の作動方法、及び画像処理プログラム | |
| WO2015025595A1 (fr) | Système endoscope et procédé de fonctionnement | |
| JP2018051364A (ja) | 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法 | |
| JP5990141B2 (ja) | 内視鏡システム及びプロセッサ装置並びに作動方法 | |
| JP6272956B2 (ja) | 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法 | |
| JP6175538B2 (ja) | 内視鏡システム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14773975; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 14773975; Country of ref document: EP; Kind code of ref document: A1 |