WO2018211970A1 - Endoscope - Google Patents
Endoscope Download PDF Info
- Publication number
- WO2018211970A1 (PCT/JP2018/017380)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- light
- sensor
- pixels
- endoscope
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
- A61B1/05—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
- A61B1/0661—Endoscope light sources
- A61B1/0684—Endoscope light sources using light emitting diodes [LED]
Definitions
- the present technology relates to an endoscope, and more particularly to an endoscope that can acquire a high-resolution medical image with low power consumption, for example.
- In the past, an endoscope has been proposed that includes an image sensor for R (Red), an image sensor for G (Green), and an image sensor for B (Blue), each of which outputs an image signal corresponding to the light it receives.
- In that endoscope, the R image sensor is configured with a larger number of pixels than the G image sensor and the B image sensor, and the G and B image sensors are arranged so as to be shifted by 1/2 pixel of the R image sensor, so that an image with high resolution with respect to R light is acquired (see, for example, Patent Document 1).
- However, when the R image sensor is configured with more pixels than the G and B image sensors while the sizes of the light receiving surfaces (image planes) of the three sensors are the same, the pixel size of the R image sensor is smaller than the pixel size of the G and B image sensors.
- As a result, the image signal output by the R image sensor becomes a noisy image signal, and the arrangement in which the R, G, and B image sensors are shifted by a half pixel of the R image sensor may not function effectively.
- Since an image taken by the endoscope is a medical image used for medical purposes, it is desirable that the image have a high resolution so that a doctor can examine fine details.
- Consider an endoscope having an R sensor that receives R light (light in the R wavelength band), a G sensor that receives G light (light in the G wavelength band), and a B sensor that receives B light (light in the B wavelength band).
- If a high-resolution, multi-pixel image sensor is adopted for every one of these sensors, the power consumption, and hence the amount of heat generated, increases.
- Because the endoscope is handled by a user such as a doctor or a scopist, it is required to suppress the amount of heat generated, and thus the power consumption.
- The present technology has been made in view of such a situation, and makes it possible to acquire a high-resolution medical image with low power consumption.
- A first endoscope of the present technology includes a camera head having a first sensor and a second sensor.
- The first sensor has a first number of pixels and receives G light, which is light in the G (Green) wavelength band; the second sensor has a second number of pixels, smaller than the first number of pixels, and receives light other than the G light.
- A second endoscope of the present technology includes a camera head having a first sensor that has a predetermined number of pixels and receives G light, which is light in the G (Green) wavelength band, and a second sensor that has the same predetermined number of pixels and receives light other than the G light.
- The second sensor has a binning function of outputting the sum of the pixel values of a plurality of pixels as the pixel value of one pixel.
- By binning, the second sensor outputs a second image composed of pixel values corresponding to light other than the G light and having a smaller number of pixels than the first image, which is composed of pixel values corresponding to the G light.
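The binning function described above can be sketched as follows. This is an illustrative software model (an actual sensor performs binning in its readout hardware), assuming 2x2 binning on a small monochrome frame held as nested lists:

```python
def bin_2x2(frame):
    """Sum each 2x2 block of pixel values into one output pixel,
    halving the pixel count in each direction (2x2 binning)."""
    h, w = len(frame), len(frame[0])
    return [
        [frame[y][x] + frame[y][x + 1] + frame[y + 1][x] + frame[y + 1][x + 1]
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A 4x4 frame is reduced to 2x2; each output pixel sums 4 input pixels.
frame = [
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [3, 3, 4, 4],
    [3, 3, 4, 4],
]
binned = bin_2x2(frame)
```

Here the second sensor's output has 1/4 of the pixels of the first (G) image, while each output pixel collects the signal of four photosites.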
- a high-resolution medical image can be acquired.
- a high-resolution medical image can be acquired with low power consumption.
- FIG. 2 is a block diagram illustrating a first configuration example of a camera head and a first configuration example of a CCU 201.
- FIG. 3 is a plan view illustrating an outline of a configuration example of image sensors 303 to 305.
- FIG. 4 is a flowchart for explaining an example of processing of the CCU 201.
- FIG. 5 is a block diagram illustrating a second configuration example of the CCU 201.
- FIG. 3 is a diagram illustrating a second configuration example of a camera head.
- FIG. 10 is a flowchart illustrating an example of processing for setting an operation mode performed by a binning control unit 413.
- FIG. 6 is a block diagram illustrating a third configuration example of the camera head 102 and a third configuration example of the CCU 201.
- FIG. 6 is a flowchart illustrating an example of processing for setting an operation mode performed by a binning control unit 421.
- FIG. 18 is a block diagram illustrating a configuration example of an embodiment of a computer to which the present technology is applied.
- FIG. 1 is a diagram illustrating a configuration example of an embodiment of an endoscopic surgery system to which the present technology is applied.
- FIG. 1 illustrates a state in which an operator (doctor) 131 is performing surgery on a patient 132 on a patient bed 133 using the endoscopic surgery system 10.
- The endoscopic surgery system 10 includes an endoscope 100, other surgical instruments 110 such as an insufflation tube 111 and an energy treatment instrument 112, a support arm device 120 that supports the endoscope 100, and a cart 200 on which various devices for endoscopic surgery are mounted.
- the endoscope 100 includes a lens barrel 101 in which a region having a predetermined length from the distal end is inserted into a body cavity of a patient 132, and a camera head 102 connected to the proximal end of the lens barrel 101.
- In the illustrated example, the endoscope 100 is configured as a so-called rigid scope having a rigid lens barrel 101, but the endoscope 100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
- At the distal end of the lens barrel 101, an opening into which an objective lens is fitted is provided.
- A light source device 203 is connected to the endoscope 100; light generated by the light source device 203 is guided to the tip of the lens barrel 101 by a light guide extending inside the lens barrel, and is irradiated through the objective lens toward the observation target in the body cavity of the patient 132.
- The endoscope 100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- Inside the camera head 102, an optical system and an image sensor are provided, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system.
- Observation light is photoelectrically converted by the image sensor, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
- The image signal is transmitted as RAW data to a camera control unit (CCU: Camera Control Unit) 201.
- The CCU 201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 100 and the display device 202. Furthermore, the CCU 201 receives an image signal (image data) from the camera head 102 and applies to that image signal various kinds of image processing for displaying a medical image corresponding to the image signal, such as development processing (demosaic processing).
- the display device 202 displays a medical image corresponding to an image signal subjected to image processing by the CCU 201 under the control of the CCU 201.
- The light source device 203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light to the endoscope 100 when photographing an observation target such as a surgical site.
- the input device 204 is an input interface for the endoscopic surgery system 10.
- the user can input various information and instructions to the endoscopic surgery system 10 via the input device 204.
- For example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 100.
- the treatment instrument control device 205 controls driving of the energy treatment instrument 112 for tissue ablation, incision, blood vessel sealing, or the like.
- The insufflation apparatus 206 supplies gas into the body cavity of the patient 132 through the insufflation tube 111 in order to inflate the body cavity.
- the recorder 207 is a device that can record various types of information related to surgery.
- the printer 208 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
- the light source device 203 that supplies irradiation light for imaging the surgical site to the endoscope 100 can be configured from a white light source configured by, for example, an LED, a laser light source, or a combination thereof.
- When a white light source is configured by a combination of RGB (Red, Green, Blue) laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 203.
- In this case, laser light from each of the RGB laser light sources is irradiated onto the surgical site in a time-division manner, and the drive of the image sensor of the camera head 102 is controlled in synchronization with the irradiation timing, so that images corresponding to each of R, G, and B can be captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter in the image sensor.
- The driving of the light source device 203 may also be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the drive of the image sensor of the camera head 102 in synchronization with the timing of the intensity changes to acquire images in a time-division manner, and then synthesizing those images, an image with a high dynamic range, free of so-called blocked-up shadows and blown-out highlights, can be generated.
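The time-division synthesis just described can be sketched with a deliberately simple merge rule. The document does not specify how the differently exposed images are combined; the rule below (keep the long exposure unless it is saturated, otherwise rescale the short exposure by the exposure ratio) and the gain and saturation values are illustrative assumptions only:

```python
def merge_exposures(long_exp, short_exp, gain, saturation=255):
    """Merge a long and a short exposure of the same scene, pixel by
    pixel: use the long exposure where it is not blown out, otherwise
    recover the value from the short exposure scaled by the exposure
    ratio (gain). Inputs are flat lists of pixel values."""
    return [
        l if l < saturation else s * gain
        for l, s in zip(long_exp, short_exp)
    ]

# gain = 4 means the long exposure collected 4x the light of the short one.
row = merge_exposures([100, 255, 255], [30, 80, 200], gain=4)
# -> [100, 320, 800]: blown-out pixels recover detail from the short frame
```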
- the light source device 203 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
- In special light observation, for example, so-called narrow band imaging (Narrow Band Imaging) is performed: by utilizing the wavelength dependence of light absorption in body tissue, light in a band narrower than the irradiation light used during normal observation (that is, white light) is irradiated, so that a predetermined tissue such as a blood vessel in the surface layer of the mucous membrane is imaged with high contrast.
- Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiating excitation light.
- In fluorescence observation, the body tissue is irradiated with excitation light and the fluorescence from the body tissue itself is observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) is locally administered to the body tissue and the tissue is irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
- the light source device 203 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
- FIG. 2 is a block diagram illustrating a first configuration example of the camera head 102 and a first configuration example of the CCU 201.
- In FIG. 2 (and likewise in the drawings described later), the illustration of the configuration for irradiating the surgical site with irradiation light is omitted for simplicity.
- the camera head 102 is a monocular camera head, and includes a lens (group) 301, a prism 302, image sensors 303, 304, and 305, and a data transmission unit 306.
- The camera head 102 may also be configured as a compound-eye camera head instead of a monocular one.
- The camera head 102 can be configured in a small size, and the burden on the user who holds the camera head 102 can be reduced.
- A part of the camera head 102, for example the lens 301, the prism 302, and the image sensors 303 to 305, can be provided at the tip of the lens barrel 101.
- The lens (group) 301 constitutes the optical system of the camera head 102, and condenses the light incident from the lens barrel 101, that is, the irradiation light reflected back from the surgical site, onto the image sensors 303 to 305 via the prism 302.
- The prism 302 also constitutes the optical system of the camera head 102. The light condensed by the lens 301 is incident on the prism 302, which separates it into R light, G light, and B light, and makes the G light incident on the image sensor 303, the B light on the image sensor 304, and the R light on the image sensor 305.
- the image sensor 303 receives the G light incident from the prism 302, performs photoelectric conversion, and outputs a G signal that is an image signal corresponding to the G light from the prism 302.
- the image sensor 304 receives the B light incident from the prism 302, performs photoelectric conversion, and outputs a B signal that is an image signal corresponding to the B light from the prism 302.
- the image sensor 305 receives R light incident from the prism 302, performs photoelectric conversion, and outputs an R signal that is an image signal corresponding to the R light from the prism 302.
- the G signal output from the image sensor 303, the B signal output from the image sensor 304, and the R signal output from the image sensor 305 are supplied to the data transmission unit 306.
- The image sensors 303 to 305 have light receiving surfaces of the same size, capable of receiving light and performing photoelectric conversion, and are arranged so that the same subject image is formed at the same size on each light receiving surface.
- The number of pixels (second pixel number) of the image sensors 304 and 305 (second sensors), which receive light other than G light, is smaller than the number of pixels (first pixel number) of the image sensor 303 (first sensor), which receives G light.
- The image sensor 303 is, for example, a high-resolution image sensor capable of capturing a so-called 4K image of 3840 × 2160 or 4096 × 2160 pixels in the horizontal and vertical directions, whereas the image sensors 304 and 305 are, for example, image sensors of lower resolution than the image sensor 303, capable of capturing a so-called HD (High Definition) image of, for example, 1920 × 1080 pixels.
- Since the light receiving surfaces of the image sensors 303 to 305 are the same size but the image sensors 304 and 305 have fewer pixels than the image sensor 303, the size (pitch) of the pixels of the image sensors 304 and 305 is (or can be configured to be) larger than that of the pixels of the image sensor 303.
- In other words, the image sensor 303, which receives G light, has a larger number of pixels (a higher resolution) but a smaller pixel size than the image sensors 304 and 305, while the image sensors 304 and 305, which receive light other than G light (here, B light and R light, respectively), have a larger pixel size but a smaller number of pixels (a lower resolution) than the image sensor 303.
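The trade-off between pixel count and pixel size follows directly from the shared light-receiving surface. The arithmetic below assumes a hypothetical 9.6 mm sensor width (the document gives no physical dimensions) together with the 4K and HD horizontal pixel counts mentioned above:

```python
# Same light-receiving surface, different pixel counts: the pixel pitch
# of the HD sensors (304, 305) is twice that of the 4K sensor (303),
# so each pixel collects roughly 4x the light (ignoring fill factor).
surface_w_mm = 9.6  # illustrative sensor width; not from the document

pitch_4k = surface_w_mm / 3840   # G sensor (303), 4K: 3840 pixels across
pitch_hd = surface_w_mm / 1920   # R/B sensors (304, 305), HD: 1920 across

pitch_ratio = pitch_hd / pitch_4k   # linear pitch ratio
area_ratio = pitch_ratio ** 2       # light-gathering area per pixel
```

The 4x area ratio is what underlies the sensitivity and S/N advantage of the image sensors 304 and 305 discussed later.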
- the data transmission unit 306 transmits (transmits) the G signal, the B signal, and the R signal respectively supplied from the image sensors 303 to 305 to the CCU 201.
- In the camera head 102 configured as described above, light incident from the lens barrel 101 enters the prism 302 via the lens 301. The prism 302 separates the light from the lens 301 into R light, G light, and B light, and makes the G light incident on the image sensor 303, the B light on the image sensor 304, and the R light on the image sensor 305.
- the image sensors 303 to 305 receive the G light, B light, and R light from the prism 302, and output the corresponding G signal, B signal, and R signal to the data transmission unit 306, respectively.
- the G signal, the B signal, and the R signal from each of the image sensors 303 to 305 are transmitted to the CCU 201.
- As the image sensor 303, a high-resolution image sensor, that is, an image sensor capable of capturing a 4K image (hereinafter also referred to as a 4K sensor), is employed.
- As the image sensors 304 and 305, image sensors having fewer pixels than the 4K sensor, that is, image sensors capable of capturing an HD image (hereinafter also referred to as HD sensors), are employed. This makes it possible to acquire a medical image that appears to have high resolution with low power consumption.
- Human vision is more sensitive to G light (the G signal) than to light other than G light, for example R light and B light, and has the characteristic of acquiring resolution information from G light.
- In other words, G light contributes more than R light and B light to the resolution that human vision perceives.
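This property of human vision is also reflected in standard luma coefficients. The Rec. 709 weights below come from the HDTV television standard, not from this document, but they illustrate why the G channel dominates perceived brightness and sharpness:

```python
# Rec. 709 luma coefficients (from the TV standard, not this patent):
# the G channel carries about 72% of perceived luminance, which is why
# placing the high-resolution sensor on the G path preserves apparent
# sharpness while R and B can use lower-resolution sensors.
KR, KG, KB = 0.2126, 0.7152, 0.0722

def luma(r, g, b):
    """Relative luminance of a linear RGB triple under Rec. 709."""
    return KR * r + KG * g + KB * b

# G alone contributes more to luminance than R and B combined.
g_dominates = KG > KR + KB
```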
- Since the image sensor 303, which receives the G light that contributes to resolution as described above, is an image sensor of higher resolution (for example, a 4K sensor) than the image sensor 304 that receives B light and the image sensor 305 that receives R light, a high-resolution medical image can be acquired using the G signal output from the image sensor 303.
- The medical image (image signal) finally displayed on the display device 202 is generated based on the G signal output from the image sensor 303, the B signal output from the image sensor 304, and the R signal output from the image sensor 305.
- The image sensors 304 and 305 (for example, HD sensors) have fewer pixels than the image sensor 303, which is, for example, a 4K sensor; however, the B signal and the R signal they output contribute less than the G signal to the resolution that humans perceive (hereinafter also referred to as the resolution contribution).
- Since the image sensor 303, which outputs the G signal with a high resolution contribution, is a high-resolution 4K sensor, the medical image generated from the G signal output from the image sensor 303, the B signal output from the image sensor 304, and the R signal output from the image sensor 305 is perceived at almost the same resolution as when high-resolution 4K sensors are used for all of the image sensors 303 to 305.
- Meanwhile, the image sensors 304 and 305, which receive the B light and R light that do not contribute to resolution as much as G light, have fewer pixels than the image sensor 303, which receives the G light that does; they are, for example, HD sensors.
- The power consumption required to transmit the B signal and the R signal output from the image sensors 304 and 305 is therefore reduced, and the amount of generated heat can be suppressed, compared with the case where high-resolution image sensors such as 4K sensors are used as the image sensors that receive the B light and the R light.
- Suppose, for example, that image sensors having N/4 pixels, where N is the number of pixels of the image sensor 303 (for example, half the number of pixels of the image sensor 303 in each of the horizontal and vertical directions), are adopted as the image sensors 304 and 305.
- The total number of pixels of the image sensors 303 to 305 is then N + N/4 + N/4 = 1.5N, compared with 3N (= N + N + N) when image sensors having the same number of pixels as the image sensor 303 are used as the image sensors 304 and 305.
- Assuming that the power consumption required for transmitting the pixel signals (G signal, B signal, R signal) output from the image sensors 303 to 305 is proportional to their total number of pixels, adopting image sensors having N/4 pixels as the image sensors 304 and 305 suppresses the power consumption, and thus the heat generation, to 1/2 of that in the case where image sensors having the same number of pixels as the image sensor 303 are used as the image sensors 304 and 305.
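Under the stated assumption that transmission power scales with the total pixel count, the 1/2 figure can be checked directly. The 4K pixel count below is the 3840 × 2160 example given earlier:

```python
# Total pixel count (and, assuming transmission power is proportional
# to pixel count, relative power) for the two sensor configurations.
N = 3840 * 2160              # pixels of the G sensor 303 (a 4K sensor)

same_res = N + N + N         # 4K sensors for R and B as well: 3N
mixed_res = N + N // 4 + N // 4  # N/4-pixel sensors for R and B: 1.5N

power_ratio = mixed_res / same_res  # fraction of power (and heat) needed
```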
- The image sensors 304 and 305, which receive the B light and R light other than G light, have fewer pixels (a lower resolution) than the image sensor 303, but their pixel size is larger. That is, the image sensors 304 and 305 have higher sensitivity to light intensity than the image sensor 303, and can output low-noise pixel signals (the B signal and the R signal) with a good S/N (Signal to Noise ratio).
- Since the medical image finally displayed on the display device 202 is generated using these low-noise B and R signals, a medical image with a good S/N can be obtained.
- An image sensor with a small number of pixels is less expensive than an image sensor with a large number of pixels. Therefore, by adopting image sensors with fewer pixels than the image sensor 303 as the image sensors 304 and 305, the camera head 102, and hence the endoscopic surgery system 10, can be configured at lower cost than when image sensors with the same number of pixels as the image sensor 303 are employed as the image sensors 304 and 305.
- the image sensor 303 is not limited to a 4K sensor, and for example, an 8K sensor that is an image sensor capable of capturing a higher-resolution 8K image can be employed.
- the image sensors 304 and 305 are not limited to HD sensors.
- the combination of the image sensor 303 and the image sensors 304 and 305 is not limited to the combination of the 4K sensor and the HD sensor.
- For example, as a combination of the image sensor 303 and the image sensors 304 and 305, a combination of an 8K sensor and 4K sensors can also be employed.
- the size of the light receiving surface of the image sensors 303 to 305 may not be the same.
- the pixel size of the image sensor 303 is not limited to a size smaller than the pixels of the image sensors 304 and 305.
- the pixel size of the image sensor 303 and the pixel sizes of the image sensors 304 and 305 can be the same size.
- For example, the size of the light receiving surface of the image sensor 303, which has a large number of pixels, can be made larger than the size of the light receiving surfaces of the image sensors 304 and 305, which have a small number of pixels.
- In the endoscope of Patent Document 1, since the R image sensor, the G image sensor, and the B image sensor are arranged so as to be shifted by 1/2 pixel, the number of horizontal and vertical pixels of the R image sensor needs to be an integral multiple of the number of horizontal and vertical pixels of the G image sensor and the B image sensor.
- the number of pixels of the image sensor 303 may be an integer multiple of the number of pixels of the image sensors 304 and 305, or may not be an integer multiple. That is, an integer multiple is not imposed on the relationship between the number of pixels of the image sensor 303 and the number of pixels of the image sensors 304 and 305.
- the CCU 201 includes a data reception unit 311, conversion units 312 and 313, and a camera signal processing unit 314.
- the data reception unit 311 receives the G signal, the B signal, and the R signal transmitted from the data transmission unit 306, and supplies them to the necessary blocks.
- the G signal is supplied to the camera signal processing unit 314, the R signal is supplied to the conversion unit 312, and the B signal is supplied to the conversion unit 313.
- The conversion unit 312 performs image processing that converts (up-converts) the number of pixels of the R image, whose pixel values are the R signal from the data reception unit 311, into the same number of pixels as the G image, whose pixel values are the G signal output from the image sensor 303, and supplies the converted R image to the camera signal processing unit 314.
- Similarly, the conversion unit 313 performs image processing that converts the number of pixels of the B image, whose pixel values are the B signal from the data reception unit 311, into the same number of pixels as the G image, and supplies the converted B image to the camera signal processing unit 314.
- the conversion (up-conversion) of the number of pixels in the conversion units 312 and 313 can be performed using, for example, an up-conversion filter such as a bicubic filter.
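The up-conversion performed by the conversion units 312 and 313 can be sketched as follows. The document specifies a bicubic up-conversion filter; the nearest-neighbour replication below is a deliberately simplified stand-in, used only to show how the pixel count of an R or B plane is brought up to that of the G plane:

```python
def upconvert_nearest(plane, factor):
    """Up-convert an image plane by an integer factor using
    nearest-neighbour replication. The document specifies a bicubic
    filter; nearest-neighbour is used here only to keep the sketch
    short and dependency-free."""
    return [
        [plane[y // factor][x // factor]
         for x in range(len(plane[0]) * factor)]
        for y in range(len(plane) * factor)
    ]

# An HD-sized R or B plane (here 2x2) brought up to the G plane's size (4x4).
r_plane = [[1, 2],
           [3, 4]]
r_up = upconvert_nearest(r_plane, 2)
```

In practice an interpolating filter such as bicubic would also handle non-integer scaling ratios, consistent with the note that the pixel counts need not be in an integer-multiple relationship.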
- the conversion of the number of pixels in the conversion units 312 and 313 can be performed regardless of whether the number of pixels of the image sensor 303 is an integral multiple of the number of pixels of the image sensors 304 and 305. Therefore, as described above, the number of pixels of the image sensor 303 may be an integer multiple of the number of pixels of the image sensors 304 and 305 or may not be an integer multiple.
- The camera signal processing unit 314 applies predetermined camera signal processing to the G image whose pixel values are the G signal from the data reception unit 311, the R image converted by the conversion unit 312 to the same number of pixels as the G image, and the B image converted by the conversion unit 313 to the same number of pixels as the G image, thereby generating a medical image whose pixel values are the R signal, the G signal, and the B signal, and supplies the medical image to the display device 202.
- As the camera signal processing, various kinds of signal processing such as development, gamma correction, and color adjustment can be performed.
- FIG. 3 is a plan view for explaining an outline of a configuration example of the image sensors 303 to 305.
- FIG. 3 schematically shows a configuration example of the image sensors 303 to 305.
- the light receiving surfaces of the image sensors 303 to 305 have the same size.
- the number of pixels of the image sensors 304 and 305 is smaller than the number of pixels of the image sensor 303.
- The image sensor 303, which receives G light, has a smaller pixel size than the image sensors 304 and 305 but a larger number of pixels, and can therefore capture a high-resolution image.
- The image sensors 304 and 305, which receive the B light and the R light, respectively, have fewer pixels than the image sensor 303, but their pixel size is large and their sensitivity to light intensity is high.
- FIG. 4 is a flowchart for explaining an example of processing of the CCU 201 of FIG.
- In step S11, the data receiving unit 311 receives the G image having the G signal transmitted from the data transmitting unit 306 as its pixel values, the B image having the B signal as its pixel values, and the R image having the R signal as its pixel values.
- the data reception unit 311 supplies the G image to the camera signal processing unit 314, the R image to the conversion unit 312, and the B image to the conversion unit 313, and the process proceeds to step S12.
- In step S12, the conversion unit 312 converts the number of pixels of the R image from the data reception unit 311 into the same number of pixels as the G image, and the conversion unit 313 likewise converts the number of pixels of the B image from the data reception unit 311 into the same number of pixels as the G image. Further, in step S12, the conversion unit 312 supplies the R image after the pixel-number conversion to the camera signal processing unit 314, the conversion unit 313 supplies the B image after the pixel-number conversion to the camera signal processing unit 314, and the process proceeds to step S13.
- In step S13, the camera signal processing unit 314 performs predetermined camera signal processing on the G image having the G signal from the data reception unit 311 as its pixel values, the R image from the conversion unit 312, and the B image from the conversion unit 313, thereby generating a medical image having the R signal, the G signal, and the B signal as pixel values, supplies it to the display device 202, and the process ends.
- FIG. 5 is a block diagram showing a second configuration example of the CCU 201.
- the CCU 201 includes a data reception unit 311, a camera signal processing unit 314, an NR (Noise Reduction) unit 321, and conversion units 322 and 323.
- the CCU 201 in FIG. 5 is common to the case in FIG. 2 in that the data receiving unit 311 and the camera signal processing unit 314 are provided.
- However, the CCU 201 of FIG. 5 is different from the case of FIG. 2 in that an NR unit 321 is newly provided and in that conversion units 322 and 323 are provided instead of the conversion units 312 and 313.
- the G image, the R image, and the B image are supplied from the data receiving unit 311 to the NR unit 321.
- the NR unit 321 performs, for example, NR as image processing of a G image (first image) using an R image or a B image (second image).
- NR can be performed as filtering by a Bilateral filter, for example.
- The NR unit 321 performs NR of a portion of the G image whose correlation with the R image or the B image is equal to or greater than a threshold (a region with high correlation) using the R image or the B image, and performs NR of a portion whose correlation with the R image or the B image is below the threshold without using the R image or the B image.
- The B image and the R image are low-noise images obtained by the image sensors 304 and 305, which have high sensitivity to light intensity.
- the G image has the same phase (same position) as the R image and the B image.
- The noise of the G image can therefore be further reduced by performing NR, such as filtering with a Joint Bilateral filter, using not only the G image but also the low-noise R image and B image that have a high correlation with the G image. For a portion of the G image where the correlation of the waveform pattern of in-phase pixel values with the R image and the B image is low, NR can be performed without using the R image and the B image, for example by filtering with a Bilateral filter that uses only the G image.
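The guided NR described above can be sketched as follows (a 1D simplification under assumed parameter values, not the patent's exact filter): a noisy G signal is smoothed with joint-bilateral-style weights whose range term is computed from a low-noise guide signal standing in for the highly correlated R or B image.

```python
import math

# Illustrative sketch (an assumption, not the patent's exact method):
# joint-bilateral-style NR of a noisy G signal guided by a low-noise
# signal, shown in 1D for clarity. Where the guide has an edge, the
# range weight collapses, so the edge in G is preserved while nearly
# flat regions are averaged.

def joint_bilateral_1d(g, guide, radius=2, sigma_s=1.5, sigma_r=10.0):
    out = []
    for i in range(len(g)):
        wsum = vsum = 0.0
        for j in range(max(0, i - radius), min(len(g), i + radius + 1)):
            ws = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2))  # spatial weight
            # Range weight taken from the low-noise guide, not from g itself.
            wr = math.exp(-((guide[i] - guide[j]) ** 2) / (2 * sigma_r ** 2))
            wsum += ws * wr
            vsum += ws * wr * g[j]
        out.append(vsum / wsum)
    return out

# Noisy G samples around a step edge; the clean guide shares the edge.
guide = [0, 0, 0, 100, 100, 100]
g_noisy = [2, -1, 1, 99, 102, 100]
g_denoised = joint_bilateral_1d(g_noisy, guide)
```

Because the range term comes from the guide, samples on the other side of the edge get near-zero weight, which is why the step is not blurred.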
- the G image and the R image are supplied from the data reception unit 311 to the conversion unit 322.
- The conversion unit 322 performs, as image processing of the R image (second image) from the data reception unit 311, for example, up-conversion that converts the number of pixels of the R image into the same number of pixels as the G image, using the G image (first image).
- The conversion unit 322 performs up-conversion of a portion of the R image whose correlation with the G image is equal to or greater than the threshold using the G image, and performs up-conversion of a portion whose correlation with the G image is below the threshold without using the G image.
- Since the G image is a high-resolution image obtained by the image sensor 303, which has a large number of pixels, for a portion of the R image where the correlation of the waveform patterns of in-phase pixel values with the G image is high, performing up-conversion using the highly correlated G image, for example by filtering with a filter such as Joint Bilateral Upsampling, yields a higher-resolution R image than performing up-conversion without the G image.
- the G image and the B image are supplied from the data reception unit 311 to the conversion unit 323.
- Since the conversion unit 323 performs the same processing as the conversion unit 322 except that the B image is used instead of the R image, its description is omitted.
- As for the positional relationship between the G image and each of the R image and the B image, it is sufficient that the position where the same subject appears in each of the G image, the R image, and the B image can be recognized; the number of pixels of the G image need not be an integer multiple of the number of pixels of the R image or the B image, and there is no particular limitation on the relationship between the number of pixels of the image sensor 303 and the number of pixels of the image sensors 304 and 305.
- FIG. 6 is a diagram illustrating a second configuration example of the camera head 102.
- In FIG. 6, the first configuration example of the CCU 201 in FIG. 2 is adopted as the configuration of the CCU 201, but the second configuration example of the CCU 201 in FIG. 5 can also be adopted.
- the camera head 102 includes a lens 301, a prism 302, an image sensor 303, a data transmission unit 306, image sensors 401 and 402, a temperature sensor 411, a motion sensor 412, and a binning control unit 413.
- the camera head 102 of FIG. 6 is common to the case of FIG. 2 in that the lens 301, the prism 302, the image sensor 303, and the data transmission unit 306 are included.
- However, the camera head 102 of FIG. 6 is different from the case of FIG. 2 in that the image sensors 401 and 402 are provided instead of the image sensors 304 and 305, and in that a temperature sensor 411, a motion sensor 412, and a binning control unit 413 are newly provided.
- The image sensors 401 and 402 receive the B light and the R light incident from the prism 302, respectively, perform photoelectric conversion, and output the B image and the R image (their image signals) corresponding to the B light and the R light from the prism 302 to the data transmission unit 306.
- The image sensors 401 and 402 have the same light receiving surface size as the image sensor 303 and are arranged in the same manner as the image sensors 304 and 305.
- the image sensors 401 and 402 (second sensor) have the same number of pixels (predetermined number of pixels) as the image sensor 303 (first sensor). Therefore, the image sensors 401 and 402 are, for example, 4K sensors, like the image sensor 303.
- The image sensors 401 and 402 have a function of outputting the pixel value of each single pixel as it is, and a binning function of outputting the added value of the pixel values of a plurality of pixels, for example, 2 × 2 pixels in horizontal × vertical, as the pixel value of one pixel.
- Hereinafter, the operation mode that enables the function of outputting the pixel value of one pixel as it is is referred to as the normal mode, and the operation mode that enables the binning function is referred to as the binning mode.
- In the normal mode, the image sensors 401 and 402 respectively output a B image and an R image (second images) having the same number of pixels as the G image (first image) output from the image sensor 303.
- In the binning mode, by the binning function, the image sensors 401 and 402 respectively output a B image and an R image (second images) composed of pixel values corresponding to the B light and the R light and having fewer pixels than the G image (first image) composed of pixel values corresponding to the G light output from the image sensor 303.
- That is, in the normal mode, the image sensors 401 and 402 each output, for example, a 4K image having the same number of pixels as the G image as the B image and the R image, respectively.
- In the binning mode, the image sensors 401 and 402 each output, for example, an HD image having a smaller number of pixels than the G image as the B image and the R image, respectively.
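The 2 × 2 binning arithmetic described above can be sketched as follows (an illustration of the pixel addition only, not the sensors' internal circuitry):

```python
# Illustrative sketch: 2x2 binning adds the pixel values of each
# horizontal x vertical 2x2 block and outputs the sum as one pixel,
# quartering the pixel count (e.g. a 4K-class grid becomes an HD-class
# grid) while each output value collects the light of four pixels.

def bin_2x2(img):
    h, w = len(img), len(img[0])
    return [[img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

full = [[1, 2, 3, 4],
        [5, 6, 7, 8],
        [9, 10, 11, 12],
        [13, 14, 15, 16]]
binned = bin_2x2(full)   # 4x4 -> 2x2
```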
- In the normal mode, since the image sensors 401 and 402 output high-resolution 4K images similar to the G image as the B image and the R image, respectively, a larger amount of data must be transmitted for the B image and the R image. On the other hand, since the B image and the R image are high-resolution images like the G image, a medical image generated using such a B image and R image is a high-resolution image.
- In the binning mode, the image sensors 401 and 402 each output an HD image having fewer pixels than the G image as the B image and the R image; therefore, the resolution of a medical image generated using the B image and the R image is lower than in the normal mode.
- However, since the B image and the R image in the binning mode are images (HD images) having fewer pixels than in the normal mode (4K images), the power consumption, and hence the heat generation, required for transmitting the B image and the R image decreases, the sensitivity of the image sensors 401 and 402 to light intensity increases, and the S/N of the B image and the R image improves.
- the temperature sensor 411 senses the temperature of the camera head 102 and supplies temperature information representing the temperature to the binning control unit 413.
- The motion sensor 412 includes, for example, a gyroscope, senses the motion of the camera head 102, and supplies motion information representing the motion to the binning control unit 413.
- The binning control unit 413 recognizes the state of the camera head 102 based on the temperature information from the temperature sensor 411, the motion information from the motion sensor 412, and the like, and controls the operation mode of the image sensors 401 and 402 according to the state of the camera head 102.
- For example, when the temperature of the camera head 102 is high, the binning control unit 413 can set the operation mode of the image sensors 401 and 402 to the binning mode. In this case, the power consumption of the camera head 102, and thus its heat generation, can be reduced.
- On the other hand, when the temperature of the camera head 102 is not high, the binning control unit 413 can set the operation mode of the image sensors 401 and 402 to the normal mode. In this case, a high-resolution medical image can be acquired.
- Further, for example, when the camera head 102 is stationary, the binning control unit 413 can set the operation mode of the image sensors 401 and 402 to the normal mode, whereby a high-resolution medical image can be acquired.
- When the camera head 102 is stationary, it is presumed that the user is gazing at the portion shown in the medical image; therefore, by acquiring a high-resolution medical image, the user can accurately confirm the details of the desired location.
- On the other hand, when the camera head 102 is moving, the binning control unit 413 can set the operation mode of the image sensors 401 and 402 to the binning mode, whereby the power consumption and heat generation of the camera head 102 can be reduced.
- When the camera head 102 is moving at a certain speed, it is presumed that the camera head 102 is being moved to search for a place to be photographed; therefore, reducing power consumption and heat generation can be prioritized over the high resolution of the medical image obtained while the camera head 102 is stationary.
- In addition, the binning control unit 413 controls the data transmission unit 306 and the data reception unit 311 according to the operation mode of the image sensors 401 and 402, thereby controlling the transmission rate between the data transmission unit 306 and the data reception unit 311, the period during which images are transmitted, and the like.
- In FIG. 6, the temperature sensor 411 and the motion sensor 412 are provided as sensors for detecting the state of the camera head 102; however, only one of the temperature sensor 411 and the motion sensor 412 may be provided, or a sensor for detecting another state may be provided.
- the binning control unit 413 can set the operation mode of the image sensors 401 and 402 according to a state other than the temperature and movement of the camera head 102.
- the binning control unit 413 can set the operation mode of the image sensors 401 and 402 according to the user's operation. For example, the user can select the normal mode when he wants to check a fine portion shown in the medical image, and can select the binning mode when he wants to see a medical image with a good S / N.
- The conversion unit 312 of the CCU 201 up-converts the R image supplied from the data reception unit 311 when the operation mode of the image sensors 401 and 402 is the binning mode; when the operation mode is the normal mode, the R image supplied from the data reception unit 311 is supplied as it is to the camera signal processing unit 314 without up-conversion. This is because, in the normal mode, the R image is a 4K image having the same number of pixels as the G image, so up-conversion is unnecessary.
- FIG. 7 is a flowchart illustrating an example of operation mode setting processing performed by the binning control unit 413 in FIG.
- In step S21, the binning control unit 413 determines, based on the temperature information from the temperature sensor 411 and the motion information from the motion sensor 412, whether the temperature of the camera head 102 is greater than a temperature threshold THt or whether the speed of movement of the camera head 102 is greater than a speed threshold THv.
- If it is determined in step S21 that the temperature of the camera head 102 is greater than the threshold THt, or that the speed of movement of the camera head 102 is greater than the threshold THv, the process proceeds to step S22.
- In step S22, the binning control unit 413 sets the operation mode of the image sensors 401 and 402 to the binning mode, and the process returns to step S21.
- As a result, a medical image with a good S/N can be acquired.
- On the other hand, if it is determined in step S21 that the temperature of the camera head 102 is not greater than the threshold THt and that the speed of movement of the camera head 102 is not greater than the threshold THv, the process proceeds to step S23.
- In step S23, the binning control unit 413 sets the operation mode of the image sensors 401 and 402 to the normal mode, and the process returns to step S21.
- As a result, the resolution of the medical image can be increased (a high-resolution medical image can be acquired); that is, a medical image with higher resolution in blue and red than in the binning mode can be obtained.
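The decision of steps S21 to S23 can be sketched as follows (the threshold values for THt and THv are hypothetical placeholders; the patent does not specify concrete values):

```python
# Illustrative sketch of the FIG. 7 decision (steps S21-S23): the
# binning mode is chosen when the camera-head temperature exceeds THt
# or its speed of movement exceeds THv; otherwise the normal mode is
# chosen. Both threshold values below are hypothetical.

THT_DEG_C = 45.0   # temperature threshold THt (hypothetical value)
THV_MM_S = 5.0     # speed threshold THv (hypothetical value)

def select_mode(temperature, speed):
    if temperature > THT_DEG_C or speed > THV_MM_S:
        return "binning"   # step S22: reduce power/heat, improve S/N
    return "normal"        # step S23: high-resolution B and R images

mode_hot = select_mode(50.0, 0.0)     # hot head -> binning
mode_moving = select_mode(30.0, 12.0) # fast movement -> binning
mode_steady = select_mode(30.0, 1.0)  # cool and steady -> normal
```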
- FIG. 8 is a block diagram illustrating a third configuration example of the camera head 102 and a third configuration example of the CCU 201.
- In FIG. 8, the camera head 102 includes a lens 301, a prism 302, an image sensor 303, a data transmission unit 306, and image sensors 401 and 402.
- the camera head 102 of FIG. 8 is common to the case of FIG. 6 in that the lens 301, the prism 302, the image sensor 303, the data transmission unit 306, and the image sensors 401 and 402 are included.
- the camera head 102 of FIG. 8 is different from the case of FIG. 6 in that the temperature sensor 411, the motion sensor 412, and the binning control unit 413 are not provided.
- the CCU 201 includes a data reception unit 311, conversion units 312 and 313, a camera signal processing unit 314, and a binning control unit 421.
- the CCU 201 in FIG. 8 is common to the case in FIG. 2 in that the data receiving unit 311, the conversion units 312 and 313, and the camera signal processing unit 314 are included. However, the CCU 201 of FIG. 8 is different from the case of FIG. 2 in that a binning control unit 421 is newly provided.
- The binning control unit 421 controls the operation mode of the image sensors 401 and 402 according to the medical image obtained by the camera signal processing unit 314 (including the G image, the R image, and the B image used for generating the medical image).
- the binning control unit 421 can control the operation mode according to one or more of the noise amount, brightness, color, and degree of focus of the medical image.
- For example, the binning control unit 421 can set the operation mode to the binning mode when the noise amount of the medical image is large (when the S/N is equal to or less than a threshold), and can set the operation mode to the normal mode when the noise amount of the medical image is small.
- When the noise amount of the medical image is large, setting the operation mode to the binning mode allows the camera signal processing unit 314 to generate a medical image with less noise, using the low-noise HD images captured by the image sensors 401 and 402 in the binning mode as the R image and the B image, respectively.
- Further, for example, the binning control unit 421 can set the operation mode to the normal mode when the medical image is bright (when its brightness is equal to or greater than a threshold), and can set the operation mode to the binning mode when the medical image is dark. When the medical image is dark, setting the operation mode to the binning mode allows the camera signal processing unit 314 to generate a brighter medical image, using the HD images captured with high sensitivity to light intensity by the image sensors 401 and 402 in the binning mode as the R image and the B image, respectively.
- Further, for example, the binning control unit 421 can set the operation mode to the normal mode when the number of pixels in which red or blue is dominant in the medical image is large (equal to or greater than a threshold), and can set the operation mode to the binning mode when the number of such pixels is small.
- When the number of pixels dominant in red or blue is large, setting the operation mode to the normal mode allows the camera signal processing unit 314 to generate a medical image in which minute portions are clearly rendered in red and blue, using the high-resolution 4K images captured by the image sensors 401 and 402 in the normal mode as the R image and the B image, respectively.
- Further, for example, the binning control unit 421 can set the operation mode to the binning mode when the degree of focus of the medical image is high (for example, when the contrast (sharpness) of the medical image is equal to or greater than a threshold and the image is estimated to be in focus), and can set the operation mode to the normal mode when the degree of focus is low.
- When the degree of focus of the medical image is low, that is, when the image is out of focus, setting the operation mode to the normal mode allows the camera signal processing unit 314 to generate a high-resolution medical image, using the high-resolution 4K images captured by the image sensors 401 and 402 in the normal mode as the R image and the B image, and the focus can be adjusted more accurately using that medical image.
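The contrast-based focus estimation in this example can be sketched as follows (the metric and threshold are assumptions for illustration; the text only states that contrast at or above a threshold is taken to mean the image is in focus, and that an out-of-focus image is handled in the normal mode so a high-resolution image is available for focusing):

```python
# Illustrative sketch (hypothetical metric and threshold): estimate the
# degree of focus from image contrast and choose the operation mode.
# Low contrast -> presumed out of focus -> normal mode for focusing.

def mean_abs_gradient(row):
    """A simple 1D contrast measure: mean absolute neighbor difference."""
    diffs = [abs(b - a) for a, b in zip(row, row[1:])]
    return sum(diffs) / len(diffs)

CONTRAST_THRESHOLD = 5.0   # hypothetical sharpness threshold

def mode_from_focus(row):
    return "normal" if mean_abs_gradient(row) < CONTRAST_THRESHOLD else "binning"

sharp_row = [0, 40, 0, 40, 0]       # strong local variation: in focus
blurred_row = [18, 20, 21, 20, 19]  # weak variation: out of focus
m1 = mode_from_focus(sharp_row)
m2 = mode_from_focus(blurred_row)
```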
- the binning control unit 421 controls the data transmission unit 306 and the data reception unit 311 according to the operation mode of the image sensors 401 and 402, similarly to the binning control unit 413 of FIG.
- the binning control unit 421 is provided in the CCU 201, but the binning control unit 421 can be provided in the camera head 102.
- When the binning control unit 421 is provided in the camera head 102, the medical image (or the G image, the R image, and the B image used for generating the medical image) can be supplied from the camera signal processing unit 314 of the CCU 201 to the binning control unit 421 of the camera head 102, or the G image, the R image, and the B image can be supplied from the data transmission unit 306 of the camera head 102 to the binning control unit 421 of the camera head 102.
- As in FIG. 6, the up-conversion by the conversion units 312 and 313 is performed when the operation mode is the binning mode, and is not performed when the operation mode is the normal mode.
- FIG. 9 is a flowchart for explaining an example of operation mode setting processing performed by the binning control unit 421 in FIG.
- In step S31, the binning control unit 421 repeatedly sets the operation mode of the image sensors 401 and 402 to the binning mode or the normal mode according to the medical image from the camera signal processing unit 314.
- the camera head 102 includes an image sensor 303 that receives G light, an image sensor 304 or 401 that receives B light, and an image sensor 305 or 402 that receives R light.
- However, the configuration of the camera head 102 is not limited to the three-plate type and may be, for example, a two-plate type or a four-plate type.
- In the present embodiment, the camera head 102 receives B light and R light as light other than the G light; however, as light other than the G light, visible light other than the B light and the R light, or light other than visible light, for example infrared light, can also be received.
- the present technology can be applied not only when displaying a 2D (Dimensional) image but also when displaying a 3D image.
- The present technology is not limited to an endoscopic surgery system and can be applied to various systems that capture medical images, for example, a microscopic surgery system.
- the series of processes of the CCU 201 described above can be performed by hardware or software.
- When the series of processes is performed by software, a program constituting the software is installed in a microcomputer, a general-purpose computer, or the like.
- FIG. 10 is a block diagram illustrating a configuration example of an embodiment of a computer in which a program for executing the above-described series of processes is installed.
- the program can be recorded in advance in a hard disk 505 or ROM 503 as a recording medium built in the computer.
- the program can be stored (recorded) in a removable recording medium 511.
- a removable recording medium 511 can be provided as so-called package software.
- examples of the removable recording medium 511 include a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a magnetic disc, and a semiconductor memory.
- The program can be installed in the computer from the removable recording medium 511 as described above, or can be downloaded to the computer via a communication network or a broadcast network and installed in the built-in hard disk 505. That is, the program can be transferred to the computer wirelessly from a download site via an artificial satellite for digital satellite broadcasting, or by wire via a network such as a LAN (Local Area Network) or the Internet.
- the computer incorporates a CPU (Central Processing Unit) 502, and an input / output interface 510 is connected to the CPU 502 via a bus 501.
- When a command is input via the input/output interface 510, for example by the user operating the input unit 507, the CPU 502 executes a program stored in a ROM (Read Only Memory) 503 accordingly.
- Alternatively, the CPU 502 loads a program stored in the hard disk 505 into a RAM (Random Access Memory) 504 and executes it.
- The CPU 502 thereby performs the processing according to the flowcharts described above or the processing performed by the configurations of the block diagrams described above. Then, the CPU 502 outputs the processing result as necessary, for example, from the output unit 506 via the input/output interface 510, transmits it from the communication unit 508, or records it in the hard disk 505.
- the input unit 507 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 506 includes an LCD (Liquid Crystal Display), a speaker, and the like.
- The processing performed by the computer according to the program does not necessarily have to be performed chronologically in the order described in the flowcharts. That is, the processing performed by the computer according to the program also includes processing executed in parallel or individually (for example, parallel processing or processing by objects).
- In this specification, a system means a set of a plurality of components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- each step described in the above-described flowchart can be executed by one device or can be shared by a plurality of devices.
- When one step includes a plurality of processes, the plurality of processes can be executed by one apparatus or shared and executed by a plurality of apparatuses.
- Note that the present technology can take the following configurations.
- <1> An endoscope including a camera head having: a first sensor that has a first number of pixels and receives G light, which is light in the G (Green) wavelength band; and a second sensor that has a second number of pixels smaller than the first number of pixels and receives light other than the G light.
- <2> The endoscope according to <1>, wherein the sizes of the light receiving surfaces of the first sensor and the second sensor are equal.
- <3> The endoscope according to <1> or <2>, wherein the pixel size of the second sensor is larger than the pixel size of the first sensor.
- <4> The endoscope according to any one of <1> to <3>, wherein the second sensor receives R light, which is light in the R (Red) wavelength band, or B light, which is light in the B (Blue) wavelength band.
- <5> The endoscope according to any one of <1> to <4>, wherein image processing of a second image composed of pixel values corresponding to the light other than the G light output from the second sensor is performed using a first image composed of pixel values corresponding to the G light output from the first sensor.
- <6> The endoscope according to <5>, wherein image processing of a portion of the second image whose correlation with the first image is equal to or greater than a threshold is performed using the first image.
- <7> The endoscope according to any one of <1> to <6>, wherein image processing of a first image composed of pixel values corresponding to the G light output from the first sensor is performed using a second image composed of pixel values corresponding to the light other than the G light output from the second sensor.
- <8> The endoscope according to <7>, wherein image processing of a portion of the first image whose correlation with the second image is equal to or greater than a threshold is performed using the second image.
- <9> An endoscope including a camera head having: a first sensor that has a predetermined number of pixels and receives G light, which is light in the G (Green) wavelength band; and a second sensor that has the predetermined number of pixels and receives light other than the G light, wherein the second sensor has a binning function of outputting an added value of the pixel values of a plurality of pixels as the pixel value of one pixel, and, by the binning function, outputs a second image composed of pixel values corresponding to the light other than the G light and having fewer pixels than a first image composed of pixel values corresponding to the G light output from the first sensor.
- <10> The endoscope according to <9>, wherein the second sensor has, as operation modes, a binning mode in which the second image having fewer pixels than the first image is output by the binning function and a normal mode in which the second image having the same number of pixels as the first image is output, the endoscope further including a control unit that controls the operation mode of the second sensor.
- <11> The endoscope according to <10>, wherein the control unit controls the operation mode according to the state of the camera head.
- <12> The endoscope according to <11>, wherein the control unit controls the operation mode according to one or more of the temperature and the movement of the camera head.
- <13> The endoscope according to <10>, wherein the control unit controls the operation mode according to the first image and the second image.
- <14> The endoscope according to <13>, wherein the control unit controls the operation mode according to one or more of the noise amount, brightness, color, and degree of focus of the first image and the second image.
- <15> The endoscope according to any one of <9> to <14>, wherein the second sensor receives R light, which is light in the R (Red) wavelength band, or B light, which is light in the B (Blue) wavelength band.
- <16> The endoscope according to any one of <9> to <15>, wherein image processing of the second image is performed using the first image.
- <17> The endoscope according to <16>, wherein image processing of a portion of the second image whose correlation with the first image is equal to or greater than a threshold is performed using the first image.
- <18> The endoscope according to any one of <9> to <17>, wherein image processing of the first image is performed using the second image.
- <19> The endoscope according to <18>, wherein image processing of a portion of the first image whose correlation with the second image is equal to or greater than a threshold is performed using the second image.
Description
An endoscope including a camera head having:
a first sensor that has a first number of pixels and receives G light, which is light in the G (Green) wavelength band; and
a second sensor that has a second number of pixels smaller than the first number of pixels and receives light other than the G light.
<2>
The endoscope according to <1>, wherein the sizes of the light receiving surfaces of the first sensor and the second sensor are equal.
<3>
The endoscope according to <1> or <2>, wherein the pixel size of the second sensor is larger than the pixel size of the first sensor.
<4>
The endoscope according to any one of <1> to <3>, wherein the second sensor receives R light, which is light in the R (Red) wavelength band, or B light, which is light in the B (Blue) wavelength band.
<5>
The endoscope according to any one of <1> to <4>, wherein image processing of a second image composed of pixel values corresponding to the light other than the G light output from the second sensor is performed using a first image composed of pixel values corresponding to the G light output from the first sensor.
<6>
The endoscope according to <5>, wherein image processing of a portion of the second image whose correlation with the first image is equal to or greater than a threshold is performed using the first image.
<7>
The endoscope according to any one of <1> to <6>, wherein image processing of a first image composed of pixel values corresponding to the G light output from the first sensor is performed using a second image composed of pixel values corresponding to the light other than the G light output from the second sensor.
<8>
The endoscope according to <7>, wherein image processing of a portion of the first image whose correlation with the second image is equal to or greater than a threshold is performed using the second image.
<9>
An endoscope including a camera head having:
a first sensor that has a predetermined number of pixels and receives G light, which is light in the G (Green) wavelength band; and
a second sensor that has the predetermined number of pixels and receives light other than the G light,
wherein the second sensor has a binning function of outputting an added value of the pixel values of a plurality of pixels as the pixel value of one pixel, and, by the binning function, outputs a second image composed of pixel values corresponding to the light other than the G light and having fewer pixels than a first image composed of pixel values corresponding to the G light output from the first sensor.
<10>
The endoscope according to <9>, wherein the second sensor has, as operation modes, a binning mode in which the second image having fewer pixels than the first image is output by the binning function and a normal mode in which the second image having the same number of pixels as the first image is output, the endoscope further including a control unit that controls the operation mode of the second sensor.
<11>
The endoscope according to <10>, wherein the control unit controls the operation mode according to the state of the camera head.
<12>
The endoscope according to <11>, wherein the control unit controls the operation mode according to one or more of the temperature and the movement of the camera head.
<13>
The endoscope according to <10>, wherein the control unit controls the operation mode according to the first image and the second image.
<14>
The endoscope according to <13>, wherein the control unit controls the operation mode according to one or more of the noise amount, brightness, color, and degree of focus of the first image and the second image.
<15>
The endoscope according to any one of <9> to <14>, wherein the second sensor receives R light, which is light in the R (Red) wavelength band, or B light, which is light in the B (Blue) wavelength band.
<16>
The endoscope according to any one of <9> to <15>, wherein image processing of the second image is performed using the first image.
<17>
The endoscope according to <16>, wherein image processing of a portion of the second image whose correlation with the first image is equal to or greater than a threshold is performed using the first image.
<18>
The endoscope according to any one of <9> to <17>, wherein image processing of the first image is performed using the second image.
<19>
The endoscope according to <18>, wherein image processing of a portion of the first image whose correlation with the second image is equal to or greater than a threshold is performed using the second image.
Claims (19)
- 1. An endoscope comprising a camera head including: a first sensor that has a first number of pixels and receives G light, which is light in the G (Green) wavelength band; and a second sensor that has a second number of pixels smaller than the first number of pixels and receives light other than the G light.
- 2. The endoscope according to claim 1, wherein the sizes of the light receiving surfaces of the first sensor and the second sensor are equal.
- 3. The endoscope according to claim 1, wherein the pixel size of the second sensor is larger than the pixel size of the first sensor.
- 4. The endoscope according to claim 1, wherein the second sensor receives R light, which is light in the R (Red) wavelength band, or B light, which is light in the B (Blue) wavelength band.
- 5. The endoscope according to claim 1, wherein image processing of a second image composed of pixel values corresponding to the light other than the G light output from the second sensor is performed using a first image composed of pixel values corresponding to the G light output from the first sensor.
- 6. The endoscope according to claim 5, wherein image processing of a portion of the second image whose correlation with the first image is equal to or greater than a threshold is performed using the first image.
- 7. The endoscope according to claim 1, wherein image processing of a first image composed of pixel values corresponding to the G light output from the first sensor is performed using a second image composed of pixel values corresponding to the light other than the G light output from the second sensor.
- 8. The endoscope according to claim 7, wherein image processing of a portion of the first image whose correlation with the second image is equal to or greater than a threshold is performed using the second image.
- 9. An endoscope comprising a camera head including: a first sensor that has a predetermined number of pixels and receives G light, which is light in the G (Green) wavelength band; and a second sensor that has the predetermined number of pixels and receives light other than the G light, wherein the second sensor has a binning function of outputting an added value of the pixel values of a plurality of pixels as the pixel value of one pixel, and, by the binning function, outputs a second image composed of pixel values corresponding to the light other than the G light and having fewer pixels than a first image composed of pixel values corresponding to the G light output from the first sensor.
- 10. The endoscope according to claim 9, wherein the second sensor has, as operation modes, a binning mode in which the second image having fewer pixels than the first image is output by the binning function and a normal mode in which the second image having the same number of pixels as the first image is output, the endoscope further comprising a control unit that controls the operation mode of the second sensor.
- 11. The endoscope according to claim 10, wherein the control unit controls the operation mode according to the state of the camera head.
- 12. The endoscope according to claim 11, wherein the control unit controls the operation mode according to one or more of the temperature and the movement of the camera head.
- 13. The endoscope according to claim 10, wherein the control unit controls the operation mode according to the first image and the second image.
- 14. The endoscope according to claim 13, wherein the control unit controls the operation mode according to one or more of the noise amount, brightness, color, and degree of focus of the first image and the second image.
- 15. The endoscope according to claim 9, wherein the second sensor receives R light, which is light in the R (Red) wavelength band, or B light, which is light in the B (Blue) wavelength band.
- 16. The endoscope according to claim 9, wherein image processing of the second image is performed using the first image.
- 17. The endoscope according to claim 16, wherein image processing of a portion of the second image whose correlation with the first image is equal to or greater than a threshold is performed using the first image.
- 18. The endoscope according to claim 9, wherein image processing of the first image is performed using the second image.
- 19. The endoscope according to claim 18, wherein image processing of a portion of the first image whose correlation with the second image is equal to or greater than a threshold is performed using the second image.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019519165A JP7140113B2 (ja) | 2017-05-15 | 2018-05-01 | Endoscope |
| US16/611,851 US11399699B2 (en) | 2017-05-15 | 2018-05-01 | Endoscope including green light sensor with larger pixel number than pixel number of red and blue light sensors |
| EP18802646.2A EP3626156A4 (en) | 2017-05-15 | 2018-05-01 | Endoscope |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017-096200 | 2017-05-15 | ||
| JP2017096200 | 2017-05-15 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018211970A1 true WO2018211970A1 (ja) | 2018-11-22 |
Family
ID=64273568
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/017380 WO2018211970A1 (Ceased) | Endoscope | 2017-05-15 | 2018-05-01 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US11399699B2 (ja) |
| EP (1) | EP3626156A4 (ja) |
| JP (1) | JP7140113B2 (ja) |
| WO (1) | WO2018211970A1 (ja) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11363188B2 (en) * | 2020-06-17 | 2022-06-14 | Microsoft Technology Licensing, Llc | Motion-based operation of imaging devices |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008136732A | 2006-12-04 | 2008-06-19 | Pentax Corp | Three-chip electronic endoscope and electronic endoscope system |
| JP2012170639A * | 2011-02-22 | 2012-09-10 | Fujifilm Corp | Endoscope system and method for displaying an enhanced image of capillaries in the mucosal surface layer |
| JP2014233533A * | 2013-06-04 | 2014-12-15 | Fujifilm Corp | Endoscope system |
Family Cites Families (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7697975B2 (en) * | 2003-06-03 | 2010-04-13 | British Colombia Cancer Agency | Methods and apparatus for fluorescence imaging using multiple excitation-emission pairs and simultaneous multi-channel image detection |
| WO2007066628A1 * | 2005-12-06 | 2007-06-14 | Shibaura Mechatronics Corporation | Appearance inspection device |
| CN103327880B * | 2011-01-31 | 2015-08-26 | Olympus Corporation | Fluorescence observation device |
| JPWO2012153568A1 * | 2011-05-10 | 2014-07-31 | Olympus Medical Systems Corp | Medical image processing device |
| CN103491847B * | 2011-06-07 | 2016-01-20 | Olympus Corporation | Endoscope device and light quantity control method for fluorescence observation |
| US8784301B2 (en) * | 2011-08-12 | 2014-07-22 | Intuitive Surgical Operations, Inc. | Image capture unit and method with an extended depth of field |
| JP6034668B2 * | 2012-11-08 | 2016-11-30 | Fujifilm Corp | Endoscope system |
| DE102013103333A1 * | 2013-04-03 | 2014-10-09 | Karl Storz Gmbh & Co. Kg | Camera for capturing optical properties and spatial structure properties |
| WO2015041496A1 * | 2013-09-23 | 2015-03-26 | LG Innotek Co., Ltd. | Camera module and method for manufacturing same |
| JP5925169B2 * | 2013-09-27 | 2016-05-25 | Fujifilm Corp | Endoscope system, operation method therefor, and light source device for endoscope |
| JP6196900B2 * | 2013-12-18 | 2017-09-13 | Olympus Corporation | Endoscope device |
| JP6230409B2 * | 2013-12-20 | 2017-11-15 | Olympus Corporation | Endoscope device |
| CN105828693B * | 2013-12-20 | 2018-11-06 | Olympus Corporation | Endoscope device |
| JP5968944B2 | 2014-03-31 | 2016-08-10 | Fujifilm Corp | Endoscope system, processor device, light source device, and operation methods for the endoscope system, the processor device, and the light source device |
| JP6707533B2 * | 2015-05-21 | 2020-06-10 | Olympus Corporation | Image processing device, image processing method, and image processing program |
- 2018
- 2018-05-01 JP JP2019519165A patent/JP7140113B2/ja active Active
- 2018-05-01 WO PCT/JP2018/017380 patent/WO2018211970A1/ja not_active Ceased
- 2018-05-01 EP EP18802646.2A patent/EP3626156A4/en not_active Withdrawn
- 2018-05-01 US US16/611,851 patent/US11399699B2/en active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008136732A | 2006-12-04 | 2008-06-19 | Pentax Corp | Three-chip electronic endoscope and electronic endoscope system |
| JP2012170639A * | 2011-02-22 | 2012-09-10 | Fujifilm Corp | Endoscope system and method for displaying an enhanced image of capillaries in the mucosal surface layer |
| JP2014233533A * | 2013-06-04 | 2014-12-15 | Fujifilm Corp | Endoscope system |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3626156A4 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3626156A1 (en) | 2020-03-25 |
| JPWO2018211970A1 (ja) | 2020-05-14 |
| US11399699B2 (en) | 2022-08-02 |
| EP3626156A4 (en) | 2020-09-09 |
| JP7140113B2 (ja) | 2022-09-21 |
| US20200214539A1 (en) | 2020-07-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7074065B2 (ja) | Medical image processing device, medical image processing method, and program | |
| JP7095693B2 (ja) | Medical observation system | |
| US12355935B2 (en) | Medical imaging system, medical imaging device, and operation method | |
| WO2018079259A1 (ja) | Signal processing device and method, and program | |
| JP7013677B2 (ja) | Medical image processing device, operation method of medical image processing device, and endoscope system | |
| US20230248231A1 (en) | Medical system, information processing apparatus, and information processing method | |
| US12364386B2 (en) | Medical image generation apparatus, medical image generation method, and medical image generation program | |
| US11394866B2 (en) | Signal processing device, imaging device, signal processing meihod and program | |
| US11676242B2 (en) | Image processing apparatus and image processing method | |
| WO2018173605A1 (ja) | Surgical control device, control method, surgical system, and program | |
| US12102296B2 (en) | Medical imaging system, medical imaging device, and operation method | |
| US20210235968A1 (en) | Medical system, information processing apparatus, and information processing method | |
| JP7140113B2 (ja) | Endoscope | |
| US20220022728A1 (en) | Medical system, information processing device, and information processing method | |
| US11244478B2 (en) | Medical image processing device, system, method, and program | |
| WO2024122323A1 (en) | Imaging device, operating method of imaging device, and program | |
| US20240382096A1 (en) | Medical information processing apparatus, medical observation system, and medical information processing method | |
| WO2022249572A1 (ja) | Image processing device, image processing method, and recording medium | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18802646 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2019519165 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2018802646 Country of ref document: EP Effective date: 20191216 |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 2018802646 Country of ref document: EP |