WO2021152704A1 - Endoscope device, operation method for endoscope device, and image processing program
- Publication number
- WO2021152704A1, PCT/JP2020/003021
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- light
- blue
- color filter
- green
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0655—Control therefor
Definitions
- the present invention relates to an endoscope device, an operation method of the endoscope device, an image processing program, and the like.
- a method of capturing an image using an image sensor having a complementary color filter is known.
- the complementary color filter is a filter that is sensitive to a plurality of wavelength bands of blue, green, and red.
- a cyan color filter has a property of transmitting blue light and green light.
- Patent Document 1 discloses a method of generating a narrow band optical image or a white optical image by using an image sensor having a complementary color filter.
- the information obtained from light of a certain wavelength depends on the spectral sensitivity of the color filter of the image sensor.
- color mixing occurs.
- green light and blue narrow band light are mixed in a cyan pixel corresponding to a cyan color filter.
- a green image or a blue narrow band image is generated by performing subtraction by image processing, but it is difficult to completely correct the color mixing by image processing.
- One aspect of the present disclosure relates to an endoscope device comprising: a light source device that emits, at different timings, a first illumination light group including green light having a green wavelength band and a second illumination light group including blue narrow band light belonging to the blue wavelength band and not including the green light; an imaging element that has a color filter array including color filters of a plurality of colors arranged two-dimensionally and that images a subject; and a processing circuit that generates a display image based on an image captured by the imaging element when the first illumination light group is irradiated and an image captured by the imaging element when the second illumination light group is irradiated.
- the color filter array includes a cyan color filter and a blue color filter, the cyan color filter has an optical property of transmitting green light and blue light, and the processing circuit generates the display image based on a blue narrow band image obtained from the blue narrow band light transmitted through the cyan color filter and the blue color filter.
- Another aspect of the present disclosure relates to a method of operating an endoscope device including an imaging element that has a color filter array including color filters of a plurality of colors arranged two-dimensionally and that images a subject, the color filter array including a cyan color filter and a blue color filter, the cyan color filter having an optical property of transmitting green light and blue light.
- in the method, a first illumination light group including green light having a green wavelength band and a second illumination light group including blue narrow band light belonging to the blue wavelength band and not including the green light are emitted at different timings; a display image is generated based on an image captured by the imaging element when the first illumination light group is irradiated and an image captured by the imaging element when the second illumination light group is irradiated; and the display image is generated based on a blue narrow band image obtained from the blue narrow band light transmitted through the cyan color filter and the blue color filter.
- Still another aspect of the present disclosure relates to an image processing program for processing an image captured by an imaging element that has a color filter array including color filters of a plurality of colors arranged two-dimensionally and that images a subject, the color filter array including a cyan color filter and a blue color filter, the cyan color filter having an optical property of transmitting green light and blue light.
- the program causes a computer to perform steps of: generating a display image based on an image captured by the imaging element when a first illumination light group including green light having a green wavelength band is irradiated and an image captured by the imaging element when a second illumination light group including blue narrow band light belonging to the blue wavelength band and not including the green light is irradiated; and generating the display image based on a blue narrow band image obtained from the blue narrow band light transmitted through the cyan color filter and the blue color filter.
- Configuration example of the endoscope device.
- Configuration example of the color filter of the image sensor.
- An example of the spectral characteristics of a color filter.
- Detailed configuration example of the processing circuit.
- A figure explaining the frame-sequential imaging performed by the endoscope device of this embodiment.
- The spectrum of the first illumination light group.
- The spectrum of the second illumination light group.
- A flowchart illustrating processing in the processing circuit.
- A figure explaining the first interpolation processing.
- A figure explaining the second interpolation processing.
- A flowchart illustrating the enhancement processing for a white light image.
- An explanatory view of the processing that generates a white light image and a narrow band light image as display images.
- FIG. 1 is a configuration example of an endoscope device.
- the endoscope device includes an insertion unit 200, a control device 300, a display unit 400, an external I / F unit 500, and a light source device 100.
- As the endoscope device, for example, a flexible scope used for the gastrointestinal tract or the like and a rigid scope used for laparoscopy or the like can be assumed, but the endoscope device is not limited thereto.
- the insertion portion 200 is also referred to as a scope.
- the control device 300 is also referred to as a main body or a processing device.
- the display unit 400 is also called a display device.
- the external I / F unit 500 is also referred to as an operation unit or an operation device.
- the light source device 100 is also called a lighting unit or a lighting device.
- the light source device 100 is a device that generates illumination light.
- the light source device 100 includes light sources LDV, LDB, LDG, LDR, and a combiner 120.
- Each of the light sources LDV, LDB, LDG, and LDR is an LED (Light Emitting Diode) or a laser light source.
- the light generated by the light sources LDV, LDB, LDG, and LDR will be referred to as V light, B light, G light, and R light, respectively.
- V light is narrow band light having a peak wavelength of 410 nm.
- the half width of V light is several nm to several tens of nm.
- the band of V light belongs to the blue wavelength band of white light and is narrower than the blue wavelength band.
- B light is light having a blue wavelength band in white light.
- G light is light having a green wavelength band in white light.
- R light is light having a red wavelength band in white light.
- the wavelength band of B light is 430 to 500 nm
- the wavelength band of G light is 500 to 600 nm
- the wavelength band of R light is 600 to 700 nm.
- the image sensor 240 includes a plurality of color filters as described later.
- V light and B light pass through a blue color filter
- G light passes through a green color filter
- R light passes through a red color filter.
- V light, B light and G light pass through a cyan color filter.
- the above wavelength is an example.
- the peak wavelength of each light and the upper and lower limits of each wavelength band may deviate by about 10%.
- the B light, the G light and the R light may be narrow band light having a half width of several nm to several tens of nm.
- the light source device 100 may include a light source (not shown) that irradiates light in another wavelength band.
- the light source device 100 may include a light source LDA that irradiates A light.
- the A light is a narrow band light having a peak wavelength of 600 nm, and its half width is several nm to several tens of nm.
- the band of A light belongs to the red wavelength band of white light and is narrower than the red wavelength band.
- the combiner 120 combines the light emitted by the LDV, LDB, LDG, and LDR, and causes the combined light to enter the light guide 210.
- the combiner 120 is composed of, for example, a dichroic mirror and a lens.
- the light source device 100 emits one wavelength or a plurality of wavelengths of V light, B light, G light, and R light at one light emission timing.
- the light having one wavelength or a plurality of wavelengths emitted at this one light emission timing is referred to as an illumination light group. The details of lighting will be described later.
- the insertion portion 200 is a portion to be inserted into the body.
- the insertion unit 200 includes a light guide 210, an illumination lens 220, an objective lens 230, an image sensor 240, and an A / D conversion circuit 250. Further, the insertion unit 200 can include a memory 260.
- the imaging element 240 is also called an image sensor.
- the insertion portion 200 has a connector (not shown), and the insertion portion 200 is attached to and detached from the control device 300 by the connector.
- the light guide 210 guides the illumination light from the light source device 100 to the tip of the insertion portion 200.
- the illumination lens 220 irradiates the subject with the illumination light guided by the light guide 210.
- the subject is a living body.
- the reflected light from the subject is incident on the objective lens 230.
- the objective lens 230 forms an image of the subject, and the image sensor 240 captures that subject image.
- the image sensor 240 includes a plurality of pixels for photoelectric conversion of a subject image, and acquires pixel signals from the plurality of pixels.
- the image pickup device 240 is a color image sensor that can obtain pixel signals of a plurality of colors by one imaging.
- FIG. 2 is a diagram illustrating a color filter array 241 of the image sensor 240.
- the color filter array 241 is formed by arranging 4 × 4 filter units 242, each composed of 16 filters arranged in a two-dimensional grid pattern, side by side.
- the filter unit 242 is composed of four green color filters, two blue color filters, two red color filters, and eight cyan color filters.
- the green color filter corresponds to G in FIG.
- the green color filter is also referred to as a G filter.
- the blue color filter, the red color filter, and the cyan color filter correspond to B, R, and Cy in FIG. 2, respectively, and are hereinafter referred to as B filter, R filter, and Cy filter.
- the pixel in which the G filter is arranged is referred to as a G pixel.
- the pixels in which the B filter, the R filter, and the Cy filter are arranged are referred to as B pixel, R pixel, and Cy pixel, respectively.
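As an illustration, the 4 × 4 filter unit 242 and its tiling over the sensor can be sketched as follows. The counts (four G, two B, two R, and eight Cy filters) follow the description above, but the exact positions within the unit are a hypothetical layout, since FIG. 2 is not reproduced here.

```python
import numpy as np

# Hypothetical layout of one 4x4 filter unit 242. The counts (4 G, 2 B,
# 2 R, 8 Cy) follow the text; the positions are an assumption.
FILTER_UNIT = np.array([
    ["Cy", "G",  "Cy", "B"],
    ["R",  "Cy", "G",  "Cy"],
    ["Cy", "G",  "Cy", "B"],
    ["R",  "Cy", "G",  "Cy"],
])

def color_filter_array(height, width):
    """Tile the 4x4 unit side by side to build the color filter array 241."""
    reps = (height // 4 + 1, width // 4 + 1)
    return np.tile(FILTER_UNIT, reps)[:height, :width]

cfa = color_filter_array(8, 8)
unit = cfa[:4, :4]
# Each 4x4 unit contains 8 Cy, 4 G, 2 B, and 2 R filters.
counts = {c: int((unit == c).sum()) for c in ("Cy", "G", "B", "R")}
```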
- FIG. 3 is a diagram showing the spectral characteristics of each color filter included in the color filter array 241.
- the transmittance curves are simulated and normalized so that the maximum transmittance of each filter is equal.
- the curve LB represents the transmittance curve of the B filter, the curve LG represents the transmittance curve of the G filter, the curve LR represents the transmittance curve of the R filter, and the curve LCy represents the transmittance curve of the Cy filter.
- the horizontal axis represents the wavelength and the vertical axis represents the transmittance.
- the B filter transmits light in the wavelength band HB.
- the G filter transmits light in the wavelength band HG.
- the Cy filter transmits light in each of the wavelength bands HB and HG and absorbs light in the wavelength band HR. That is, the Cy filter transmits light in the cyan wavelength band, which is a complementary color.
- the R filter transmits light in the wavelength band HR.
- a complementary color here refers to a color composed of light including at least two of the wavelength bands HB, HG, and HR.
- the A / D conversion circuit 250 may be built in the image sensor 240.
- the control device 300 performs signal processing including image processing. Further, the control device 300 controls each part of the endoscope device.
- the control device 300 includes a processing circuit 310 and a control circuit 320.
- the control circuit 320 controls each part of the endoscope device. For example, the user operates the external I / F unit 500 to set the presence / absence of emphasis processing. For example, when an instruction to perform emphasis processing is input, the control circuit 320 outputs an instruction to perform emphasis processing to the processing circuit 310.
- the processing circuit 310 emphasizes the blood vessel structure in the white light image based on, for example, an image obtained by V light.
- when an instruction not to perform the emphasis processing is input, the control circuit 320 outputs that instruction to the processing circuit 310.
- the processing circuit 310 outputs the white light image to the display unit 400 without emphasizing it.
- the memory 260 of the insertion unit 200 stores information about the insertion unit 200.
- the control circuit 320 controls each part of the endoscope device based on the information read from the memory 260.
- the memory 260 stores information about the image sensor 240.
- the information about the image sensor 240 is, for example, information about the type of the image sensor 240 and the like.
- the control circuit 320 causes the processing circuit 310 to perform image processing corresponding to the information about the image sensor 240 read from the memory 260.
- the processing circuit 310 generates a display image by performing image processing based on the pixel signal from the A / D conversion circuit 250, and outputs the display image to the display unit 400.
- the display unit 400 is, for example, a liquid crystal display device or the like, and displays a display image from the processing circuit 310.
- FIG. 4 is a detailed configuration example of the processing circuit 310.
- the processing circuit 310 includes an interpolation processing unit 311 and a display image generation unit 313.
- a pixel signal is input to the interpolation processing unit 311 from the A / D conversion circuit 250.
- the interpolation processing unit 311 acquires an image corresponding to each of V light, B light, G light, and R light based on the pixel signal.
- the images corresponding to V light, B light, G light, and R light are referred to as V image, B image, G image, and R image, respectively.
- the image sensor 240 includes the color filter shown in FIG.
- the V light is imaged by blue pixels and cyan pixels.
- the interpolation processing unit 311 generates a V image by performing interpolation processing based on the pixel signals of blue pixels and cyan pixels.
- the interpolation processing unit 311 generates a B image, a G image, and an R image by performing interpolation processing based on the pixel signals corresponding to each light. The details of the interpolation process will be described later with reference to FIGS. 9 and 10.
- the display image generation unit 313 generates a display image based on the B image, the G image, the R image, and the V image.
- the display image generation unit 313 generates a white light image as described later with reference to FIGS. 11 and 12, and generates a display image by performing enhancement processing based on the V image on the white light image.
- the display image generation unit 313 may generate a white light image and a narrow band light image as display images, as will be described later with reference to FIG.
- the display image generation unit 313 outputs the generated display image to the display unit 400.
- the external I / F unit 500 is an interface for inputting from the user to the endoscope device. That is, it is an interface for operating the endoscope device, an interface for setting the operation of the endoscope device, and the like. For example, it includes buttons or dials, levers, etc. for operating the endoscope device.
- FIG. 5 is a diagram illustrating the frame-sequential imaging performed by the endoscope device of the present embodiment.
- the light source device 100 emits the first illumination light group GR1 in the first frame F1, the second illumination light group GR2 in the second frame F2, the first illumination light group GR1 in the third frame F3, and the second illumination light group GR2 in the fourth frame F4.
- the frame is a period during which the image pickup device 240 performs imaging, and corresponds to a frame in moving image shooting.
- the image sensor 240 images the subject irradiated with the first illumination light group GR1 in the first frame F1 and the third frame F3, and images the subject irradiated with the second illumination light group GR2 in the second frame F2 and the fourth frame F4.
- the processing circuit 310 generates a first display image IM1 based on the images captured in the first frame F1 and the second frame F2. Further, the processing circuit 310 generates a second display image IM2 based on the images captured in the second frame F2 and the third frame F3, and generates a third display image IM3 based on the images captured in the third frame F3 and the fourth frame F4.
- the display images IM1 to IM3 are frame images in the moving image. After that, by performing the same operation, a moving image is taken and the moving image is displayed on the display unit 400.
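The frame pairing described above can be sketched as follows; the frame indices and illumination identifiers are stand-in values for illustration only.

```python
# Sketch of the frame-sequential pairing: each display image IM_n is
# built from two consecutive frames, one lit by GR1 and one by GR2.
def illumination_for(frame_index):
    """GR1 in odd frames (F1, F3, ...), GR2 in even frames (F2, F4, ...)."""
    return "GR1" if frame_index % 2 == 1 else "GR2"

def display_images(num_frames):
    frames = [(f, illumination_for(f)) for f in range(1, num_frames + 1)]
    # Every consecutive pair of frames yields one display image, so the
    # display frame rate matches the imaging frame rate.
    return [(frames[i], frames[i + 1]) for i in range(len(frames) - 1)]

pairs = display_images(4)
# IM1 <- (F1/GR1, F2/GR2), IM2 <- (F2/GR2, F3/GR1), IM3 <- (F3/GR1, F4/GR2)
```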
- FIGS. 6 and 7 show the spectra of the illumination light.
- FIG. 6 shows the spectrum of the light included in the first illumination light group
- FIG. 7 shows the spectrum of the light included in the second illumination light group.
- the light source device 100 emits B light, G light, and R light as the first illumination light group GR1 in FIG. 5, and emits V light as the second illumination light group GR2 in FIG.
- in the frames F1 and F3 in which the first illumination light group GR1 is emitted, the interpolation processing unit 311 generates an RGB image by the interpolation processing, and outputs the B channel of the RGB image as the B image, the G channel as the G image, and the R channel as the R image.
- in the frames F2 and F4 in which the second illumination light group GR2 is emitted, the interpolation processing unit 311 generates the B channel of an RGB image by the interpolation processing and outputs it as the V image. The interpolation processing unit 311 does not generate the G channel and the R channel, or does not output them even if they are generated.
- FIG. 8 is a flowchart illustrating the processing performed in the processing circuit 310.
- the processing circuit 310 acquires image data based on the emission of the illumination light group by the light source device 100.
- the image data here corresponds to the image F10 in FIG. 9 or the image F20 in FIG. 10, described later, and is the set of signals output from the pixels corresponding to the R filters, G filters, B filters, and Cy filters of the image sensor 240.
- the processing circuit 310 acquires image data based on the emission of the first illumination light group GR1 in odd frames, and acquires image data based on the emission of the second illumination light group GR2 in even frames.
- the interpolation processing unit 311 performs interpolation processing on the image data. Specifically, when the first illumination light group GR1 is emitted, the first interpolation processing is performed, and when the second illumination light group GR2 is emitted, the second interpolation processing is performed.
- Each interpolation process will be described later with reference to FIGS. 9 and 10.
- an R image, a G image, a B image, and a V image are acquired.
- the display image generation unit 313 generates a display image based on the R image, the G image, the B image, and the V image. The display image generation process will be described later with reference to FIGS. 11 to 13.
- the endoscope device of this embodiment includes a light source device 100, an image pickup device 240, and a processing circuit 310.
- the light source device 100 causes the first illumination light group GR1 and the second illumination light group GR2 to emit light at different timings.
- the first illumination light group GR1 includes green light having a green wavelength band.
- the second illumination light group GR2 includes blue narrow band light belonging to the blue wavelength band and does not include green light.
- the image sensor 240 has a color filter array 241 including color filters of a plurality of colors arranged two-dimensionally, and images a subject.
- the color filter array 241 includes cyan and blue color filters, and the cyan color filter has an optical property of transmitting green and blue light, as described with reference to FIG. 3.
- the processing circuit 310 generates a display image based on an image captured by the image sensor 240 when the first illumination light group GR1 is irradiated and an image captured by the image sensor 240 when the second illumination light group GR2 is irradiated. In this case, the processing circuit 310 generates the display image based on the blue narrow band image obtained from the blue narrow band light transmitted through the cyan color filter and the blue color filter.
- green light corresponds to G light
- blue narrow band light corresponds to V light
- the blue narrow band image is a V image.
- the green image captured by the green light corresponds to the G image.
- the process of generating a display image based on the V image may be a process of generating the B channel image of a white light image using the V image, a process of generating the B channel image of a narrow band light image using the V image, or a process of emphasizing the B channel image of a white light image using the V image.
- the narrow band optical image may be rephrased as an NBI (Narrow Band Imaging) image.
- the color filter array 241 of the image sensor 240 includes a Cy filter that transmits G light and B light. Since the wavelength bands of G light and B light include the peak wavelength of the absorption characteristic of hemoglobin, light in the wavelength band belonging to G light and B light is of high importance in in vivo observation.
- with the image sensor 240 including the Cy filter, light belonging to the G light and the B light can be received by a large number of pixels, so it is possible to generate an image with high resolution.
- since the Cy filter transmits both G light and B light, however, color mixing occurs when light belonging to the green wavelength band and light belonging to the blue wavelength band are irradiated simultaneously.
- V light, which is narrow band light, can image a structure that is not clearly imaged by G light or broad B light, but the contrast of that structure is lowered by mixing with G light.
- in the present embodiment, the light source device 100 causes the first illumination light group GR1 including G light and the second illumination light group GR2 including V light and not including G light to emit at different timings. Since G light is not irradiated at the irradiation timing of the second illumination light group GR2, the pixel value of a Cy pixel can be treated as that of a B pixel, and a high-resolution V image can be acquired. This makes it possible to improve the visibility of a specific structure in the display image.
- the specific structure is specifically the capillaries on the surface of the mucosa.
- the image sensor 240 has color filters of a plurality of colors, and the light source device 100 causes the first illumination light group GR1 and the second illumination light group GR2 to emit light at different timings.
- with the image sensor 240 having color filters of a plurality of colors, the number of frame-sequential illuminations can be reduced, so color shift can be reduced.
- in the present embodiment, the first illumination light group GR1 and the second illumination light group GR2 are irradiated alternately, so the number of frame-sequential illuminations is two. However, a number of three or more is not precluded.
- the first illumination light group GR1 includes red light having a red wavelength band and blue light having a blue wavelength band.
- Red light corresponds to R light
- blue light corresponds to B light.
- the red image captured by the red light corresponds to the R image
- the blue image captured by the blue light corresponds to the B image.
- the B image, the G image, and the R image can be taken. Therefore, it is possible to generate a white light image having a natural color.
- a V image can be taken.
- a display image can be generated based on the V image, the B image, the G image, and the R image. At this time, for example, the B channel of the display image is generated based on the V image and the B image.
- the processing circuit 310 generates a green channel image in the display image based on the G image and the V image obtained by the G light.
- the green channel is one of a plurality of channels constituting the display image.
- the plurality of channels are a red channel, a green channel, and a blue channel, and correspond to the R channel, the G channel, and the B channel in FIGS. 11 and 13, respectively.
- the G channel image of the displayed image has a higher contribution to the luminance component than the R channel and B channel images, and greatly affects the visibility of the displayed image.
- by using the V image to generate the G channel image, it is possible to improve the visibility of structures such as capillaries.
- the processing circuit 310 generates a white light image as a display image based on the R image obtained by the R light, the G image, and the B image obtained by the B light.
- the processing circuit 310 performs enhancement processing on the G channel image based on the V image.
- the enhancement process using the V image is performed on the G channel image in the white light image. Therefore, it is possible to present an image with improved visibility of structures such as capillaries as a display image while using a white light image having a natural color as a base.
- the processing circuit 310 extracts a feature amount related to the structure of the subject based on the G image and the V image.
- the feature quantity is a quantity whose value increases at a position where the structure of interest exists.
- the feature amount is an edge component or the like.
- the processing circuit 310 extracts an edge component by applying a high-pass filter to the V image.
- the processing circuit 310 extracts the structure that is not reflected in the G image but is reflected in the V image by obtaining the difference between the G image and the V image. That is, the processing circuit 310 extracts a structure that is not reflected in the G image but is reflected in the V image based on the correlation between the G image and the V image.
- the processing circuit 310 emphasizes the displayed image based on this correlation.
- the processing circuit 310 emphasizes the structure of the subject in the displayed image by synthesizing the above feature amount with the G image.
- the feature amount extracted from the V image is the feature amount related to the blood vessels on the surface layer of the mucosa.
- the enhancement process using the V image may be performed on the B channel image in addition to the G channel image of the display image.
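A minimal sketch of such an enhancement process, assuming a 3 × 3 Laplacian as the high-pass filter and a simple additive synthesis of the extracted feature into the G channel. The kernel and the gain are illustrative choices, not taken from the source.

```python
import numpy as np

def high_pass(img):
    """3x3 Laplacian high-pass filter (one possible choice; the text
    only says a high-pass filter is applied, without naming the kernel)."""
    k = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], dtype=float)
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            out[y, x] = (padded[y:y + 3, x:x + 3] * k).sum()
    return out

def enhance_g_channel(g_image, v_image, gain=0.5):
    """Extract an edge feature from the V image and synthesize it into
    the G channel, emphasizing surface capillaries. `gain` is a
    hypothetical tuning parameter, not specified in the source."""
    feature = high_pass(v_image.astype(float))
    return np.clip(g_image + gain * feature, 0.0, 1.0)
```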
- the processing circuit 310 may generate a white light image as a display image based on the R image, the G image, and the B image, and may generate a narrow band light image as a display image based on the G image and the V image.
- the display unit 400 displays a white light image and a narrow band light image side by side on one screen.
- when the first illumination light group GR1 is irradiated, the processing circuit 310 performs first interpolation processing on the pixel signal from the image sensor 240, and when the second illumination light group GR2 is irradiated, it performs second interpolation processing, different from the first interpolation processing, on the pixel signal from the image sensor 240.
- the interpolation processing here is processing that, based on the pixel signal generated by the image sensor 240 from the light received by pixels in which a given color filter among the color filters of the plurality of colors is arranged, interpolates the signals of pixels in which a color filter different from that given color filter is arranged.
- in the first interpolation process, the image based on the Cy pixels is used as a guide image, and the signals of the other pixels are interpolated by processing such as joint bilateral interpolation or guided filter interpolation.
- when the second illumination light group GR2 is irradiated, the pixel value of a Cy pixel contains the V component without mixing with the G component, so the Cy pixel can be treated as a B pixel.
- the second interpolation process interpolates the signals of the pixels other than Cy and B, that is, of the R pixels and G pixels, based on the pixel signals generated by the image sensor 240 from the light received by the Cy pixels and the B pixels.
- in the second interpolation process, processing such as bilinear interpolation, cubic interpolation, or direction discrimination interpolation is performed.
- the peak wavelength of the blue narrow band light, that is, the V light, is within the range of 415 ± 20 nm.
- V light having a peak wavelength in the range of 415 ⁇ 20 nm is light that is easily absorbed by hemoglobin.
- with V light, it is possible to photograph regions having a high hemoglobin concentration, such as blood vessels in the mucous membrane. Since V light is scattered at relatively shallow depths in the mucosa, blood vessels in the surface layer of the mucosa can be photographed with V light.
- the light source device 100 may emit light in another wavelength band such as A light as described above. Since A light has a longer wavelength than V light, it reaches deeper in the mucous membrane than V light. Therefore, deep blood vessels can be photographed by using A light.
- A light is included in the second illumination light group GR2.
- a V image and an A image are acquired by the emission of the second illumination light group GR2, and blood vessels deep in the mucosa can be emphasized by enhancing the display image based on the A image.
- the method of operating the endoscope device is a method of operating an endoscope device including an image sensor 240 that photographs a subject and has a color filter array 241 including color filters of a plurality of colors arranged two-dimensionally, where the color filter array 241 includes cyan and blue color filters and the cyan color filter has an optical property of passing green light and blue light.
- in the method of operation, a first illumination light group GR1 including green light having a green wavelength band and a second illumination light group GR2 including blue narrow band light belonging to a blue wavelength band and not including green light are emitted at different timings.
- a display image is generated based on the image captured by the image sensor 240 when the first illumination light group GR1 is irradiated and the image captured by the image sensor 240 when the second illumination light group GR2 is irradiated. When the display image is generated, it is generated based on the blue narrow band image obtained from the blue narrow band light transmitted through the cyan color filter and the blue color filter.
- the control device 300 of the present embodiment may be configured as follows. That is, each of the processing circuit 310 and the control circuit 320 is composed of the following hardware. Further, the processing circuit 310 and the control circuit 320 may be integrally configured by the following hardware.
- the hardware can include at least one of a circuit that processes a digital signal and a circuit that processes an analog signal.
- hardware can consist of one or more circuit devices mounted on a circuit board or one or more circuit elements.
- One or more circuit devices are, for example, ICs and the like.
- One or more circuit elements are, for example, resistors, capacitors, and the like.
- each of the processing circuit 310 and the control circuit 320 may be realized by the following processors. Further, the processing circuit 310 and the control circuit 320 may be realized by one processor. That is, the control device 300 of the present embodiment includes a memory for storing information and a processor that operates based on the information stored in the memory. The information is, for example, a program and various data.
- the processor includes hardware. By controlling the light source device 100, the processor causes the first illumination light group GR1 and the second illumination light group GR2 to emit light at different timings. The processor generates a display image based on an image captured by the image sensor 240 when the first illumination light group GR1 is irradiated and an image captured by the image sensor 240 when the second illumination light group GR2 is irradiated. At this time, the processor generates the display image based on the V image captured from the V light transmitted through the Cy filter and the B filter of the image sensor 240.
- the processor may be, for example, a CPU (Central Processing Unit). However, the processor is not limited to the CPU, and various processors such as GPU (Graphics Processing Unit) or DSP (Digital Signal Processor) can be used.
- the memory may be a semiconductor memory such as SRAM or DRAM, a register, a magnetic storage device such as a hard disk device, or an optical storage device such as an optical disk device.
- the memory stores instructions that can be read by a computer, and when the instructions are executed by the processor, the functions of each part of the control device 300 are realized as processing.
- the instruction here may be an instruction of an instruction set constituting a program, or an instruction instructing an operation to a hardware circuit of a processor.
- the processor realizes the function of the processing circuit 310 in FIG.
- the processor realizes the functions of the processing circuit 310 and the control circuit 320 in FIG.
- each part of the endoscope device of the present embodiment may be realized as a module of a program that operates on a processor.
- the program includes a light source control module that causes the first illumination light group GR1 and the second illumination light group GR2 to emit light at different timings, and a processing module that generates a display image based on the image captured by the image sensor 240 when the first illumination light group GR1 is irradiated and the image captured by the image sensor 240 when the second illumination light group GR2 is irradiated.
- the processing module generates a display image based on the V image captured by the V light transmitted through the Cy filter and the B filter of the image sensor 240.
- the program that realizes the processing performed by each part of the control device 300 of the present embodiment can be stored in, for example, an information storage medium that is a medium that can be read by a computer.
- the information storage medium can be realized by, for example, an optical disk, a memory card, an HDD, a semiconductor memory, or the like.
- the semiconductor memory is, for example, a ROM.
- the processing circuit 310 and the control circuit 320 perform various processes of the present embodiment based on the programs and data stored in the information storage medium. That is, the information storage medium stores a program for operating the computer as each part of the endoscope device of the present embodiment.
- a computer is a device including an input device, a processing unit, a storage unit, and an output unit.
- the program is a program for causing a computer to execute the processing of each part.
- the program is recorded on an information storage medium.
- as the information storage medium, various recording media can be assumed, such as optical disks such as DVDs and CDs, magneto-optical disks, hard disks, and memories such as non-volatile memory and RAM.
- the interpolation processing unit 311 of the present embodiment performs different interpolation processing depending on whether the first illumination light group GR1 or the second illumination light group GR2 is emitted. Each interpolation process is described below.
- FIG. 9 is a schematic diagram illustrating an outline of the first interpolation process.
- the interpolation processing unit 311 acquires the image F10 corresponding to the image data.
- the first illumination light group GR1 is an illumination light group including G light, and specifically includes B light, G light, and R light.
- the color filter array of the image sensor 240 includes a B filter, a G filter, an R filter, and a Cy filter. Therefore, when the first illumination light group GR1 is emitted, the B pixel outputs a signal corresponding to the B light.
- the G pixel outputs a signal corresponding to the G light.
- the R pixel outputs a signal corresponding to the R light.
- the Cy pixel outputs signals corresponding to G light and B light.
- the image F10 corresponding to the image data is an image in which each pixel has a pixel value corresponding to the light according to the type of its color filter.
- the interpolation processing unit 311 generates, as a guide image, an interpolated image of the Cy pixels, which are arranged with the highest density in the image sensor 240. Specifically, as shown in FIG. 9, based on the pixel values of the Cy pixels in the separated image FCy11, in which the pixel values of the Cy pixels are separated from the image F10, the interpolation processing unit 311 calculates by interpolation the pixel value of a Cy pixel at each pixel position where a B pixel, G pixel, or R pixel is arranged. As a result, the interpolation processing unit 311 generates an interpolated image FCy12 having a Cy pixel value at every pixel position. The interpolated image FCy12 is referred to as the guide image. As shown in FIG. 9, in the separated image FCy11, the pixel positions where the B pixels, G pixels, and R pixels are arranged are each surrounded by Cy pixels in four adjacent directions.
- the four directions here are up, down, left, and right. Therefore, the interpolation processing unit 311 can generate a highly accurate guide image FCy12 by using well-known bilinear interpolation, cubic interpolation, direction discrimination interpolation, or the like.
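The guide-image generation just described — filling every non-Cy position from its four adjacent Cy pixels — can be sketched as follows. The checkerboard Cy layout and the function name are assumptions for illustration; a real implementation would follow the sensor's actual color filter array.

```python
import numpy as np

def build_guide_image(mosaic, cy_mask):
    """Fill non-Cy positions with the average of the up/down/left/right
    Cy neighbors (simple bilinear-style interpolation)."""
    guide = np.where(cy_mask, mosaic, 0.0).astype(float)
    h, w = mosaic.shape
    out = guide.copy()
    for y in range(h):
        for x in range(w):
            if not cy_mask[y, x]:
                vals = [guide[ny, nx]
                        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                        if 0 <= ny < h and 0 <= nx < w and cy_mask[ny, nx]]
                out[y, x] = sum(vals) / len(vals) if vals else 0.0
    return out
```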
- the interpolation processing unit 311 generates interpolated images of the pixels of the other colors based on the generated guide image FCy12.
- FB11 is a separated image in which the pixel values of the B pixels are separated from the image F10. Based on the guide image FCy12, the interpolation processing unit 311 calculates the pixel values of B pixels at the pixel positions where the Cy, G, and R pixels are arranged in the separated image FB11, thereby generating an interpolated image FB12 having a B pixel value at every pixel position.
- the interpolated image FB12 is a B image corresponding to the B light.
- based on the guide image FCy12, the interpolation processing unit 311 calculates the pixel values of G pixels at the pixel positions where the Cy, R, and B pixels are arranged in the separated image FG11, thereby generating an interpolated image FG12 having a G pixel value at every pixel position.
- FG11 is a separated image in which the pixel values of the G pixels are separated from the image F10.
- the interpolated image FG12 is a G image corresponding to the G light.
- based on the guide image FCy12, the interpolation processing unit 311 calculates the pixel values of R pixels at the pixel positions where the Cy, G, and B pixels are arranged in the separated image FR11, thereby generating an interpolated image FR12 having an R pixel value at every pixel position.
- FR11 is a separated image in which the pixel values of the R pixels are separated from the image F10.
- the interpolated image FR12 is an R image corresponding to the R light.
- the interpolation method based on the guide image FCy12 is, for example, a known joint bilateral interpolation process or a guided filter interpolation process.
- as a result, the interpolation processing unit 311 can generate high-precision interpolated images FB12, FG12, and FR12 for the B, G, and R pixels, which are arranged at low density in the image sensor 240.
- since the guide image FCy12 is composed of Cy pixel values captured when the first illumination light group GR1 is emitted, it contains both a G component and a B component. The correlation between the Cy pixels and the B pixels, and between the Cy pixels and the G pixels, is therefore very high, so the interpolation processing unit 311 can accurately generate the interpolated images FB12 and FG12 based on the guide image FCy12. Further, under white light the high-frequency components of the R, G, and B pixels are generally highly correlated with one another, so the interpolation processing unit 311 can also interpolate the R pixels with high accuracy using the Cy image as the guide image FCy12.
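A minimal joint bilateral interpolation of the kind mentioned above can be sketched as follows: sparse known samples are spread to unknown positions with weights that combine spatial distance and similarity of the guide-image values, so edges in the guide are preserved. The parameter values and the brute-force loops are illustrative assumptions only.

```python
import numpy as np

def joint_bilateral_fill(sparse, known_mask, guide, radius=2,
                         sigma_s=1.0, sigma_r=0.1):
    """Fill unknown pixels from known samples, weighting each sample by
    spatial closeness and by similarity of the guide image values."""
    h, w = sparse.shape
    out = sparse.astype(float).copy()
    for y in range(h):
        for x in range(w):
            if known_mask[y, x]:
                continue
            num = den = 0.0
            for ny in range(max(0, y - radius), min(h, y + radius + 1)):
                for nx in range(max(0, x - radius), min(w, x + radius + 1)):
                    if not known_mask[ny, nx]:
                        continue
                    ws = np.exp(-((ny - y)**2 + (nx - x)**2) / (2 * sigma_s**2))
                    wr = np.exp(-(guide[ny, nx] - guide[y, x])**2 / (2 * sigma_r**2))
                    num += ws * wr * sparse[ny, nx]
                    den += ws * wr
            out[y, x] = num / den if den > 0 else guide[y, x]
    return out
```

With a small range sigma, samples whose guide value differs sharply (e.g. across a vessel edge) contribute almost nothing, which is what makes the guide image useful.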
- FIG. 10 is a schematic diagram illustrating an outline of the second interpolation process.
- the interpolation processing unit 311 acquires the image F20 corresponding to the image data.
- the second illumination light group GR2 is an illumination light group containing V light and not containing G light. Specifically, the second illumination light group GR2 includes only V light. Therefore, when the second illumination light group GR2 is emitted, the B pixel outputs a signal corresponding to the V light.
- the G pixel and the R pixel do not output a signal due to the emission of the second illumination light group GR2.
- the Cy pixel outputs a signal corresponding to V light. That is, when the second illumination light group GR2 is emitted, the Cy pixel can be treated as the B pixel.
- FCy21 in FIG. 10 is a separated image in which the pixel values of the Cy pixels are separated from the image F20.
- the separated image FCy21 is an image having, at the positions of the Cy pixels, signals corresponding to the B channel, that is, pixel values corresponding to the V light.
- similarly, the separated image FB21, in which the pixel values of the B pixels are separated from the image F20, is an image having pixel values corresponding to the V light at the positions of the B pixels.
- the intermediate image FB22 is an image that has the pixel values of the separated image FCy21 at the Cy pixels and the pixel values of the separated image FB21 at the B pixels.
- the interpolation processing unit 311 calculates the pixel value of the B pixel at the pixel position where each of the G pixel and the R pixel is arranged by the interpolation processing based on the pixel values of the B pixel and the Cy pixel in the intermediate image FB22.
- the pixel positions where the R pixels are arranged are surrounded by Cy pixels in four adjacent directions. The four directions here are up, down, left, and right directions.
- the pixel positions where the G pixels are arranged are surrounded by Cy pixels or B pixels in six adjacent directions. The six directions here are up, down, left, and right, and two diagonal directions.
- the interpolation processing unit 311 can generate a highly accurate interpolation image FB23 by using well-known bilinear interpolation, cubic interpolation, direction discrimination interpolation, and the like.
- the interpolated image FB23 is a V image corresponding to V light.
- when the second illumination light group GR2 is emitted, color mixing of the G light and the V light does not occur in the Cy pixels.
- since the V image reflects structures such as capillaries in the surface layer of the mucous membrane, using the high-precision V image makes it possible to display such structures with high contrast in the display image. In other words, by using a high-precision V image, the visibility of a specific structure can be improved.
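The second interpolation process — merging the Cy and B pixels into one intermediate image and filling the remaining G/R positions — can be sketched as follows, again with an assumed checkerboard Cy layout and simple 4-neighbor averaging standing in for the more elaborate direction-discrimination interpolation.

```python
import numpy as np

def second_interpolation(image, cy_mask, b_mask):
    """Under GR2 the Cy pixels carry only the V component, so they are
    merged with the B pixels (intermediate image); the remaining G/R
    positions get the mean of their known 4-neighbors."""
    known = cy_mask | b_mask
    inter = np.where(known, image, 0.0).astype(float)  # intermediate image
    h, w = image.shape
    out = inter.copy()
    for y in range(h):
        for x in range(w):
            if not known[y, x]:
                vals = [inter[ny, nx]
                        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                        if 0 <= ny < h and 0 <= nx < w and known[ny, nx]]
                out[y, x] = sum(vals) / len(vals) if vals else 0.0
    return out
```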
- FIG. 11 is a schematic diagram illustrating an outline of the display image generation processing.
- the display image generation unit 313 generates an enhanced white light image as a display image. Specifically, the display image generation unit 313 generates a white light image based on the R image, the G image, the B image, and the V image.
- the display image generation unit 313 inputs the R image to the R channel of the color image, inputs the G image to the G channel of the color image, and inputs the B image and the V image to the B channel of the color image.
- a white light image is generated as a color image.
- the B channel of the color image is, for example, an image obtained by adding and averaging the pixel values of the B image and the V image.
- the display image generation unit 313 may input only the B image to the B channel of the color image.
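The channel composition just described (R and G images to their own channels; B channel as the average of the B and V images) amounts to a one-line stack. The sketch below assumes grayscale NumPy arrays of equal shape; the function name is hypothetical.

```python
import numpy as np

def compose_white_light(r_img, g_img, b_img, v_img=None):
    """Stack channels into an RGB display image; if a V image is given,
    the B channel is the average of the B and V images."""
    b_ch = b_img if v_img is None else 0.5 * (b_img + v_img)
    return np.stack([r_img, g_img, b_ch], axis=-1)
```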
- the display image generation unit 313 extracts the feature amount from the V image and enhances the white light image based on the feature amount.
- the display image generation unit 313 extracts a feature amount indicating the structure of the surface blood vessel by extracting a high frequency component from the V image.
- the display image generation unit 313 enhances the white light image by adding the extracted high frequency component to the G channel of the white light image. Further, the display image generation unit 313 may enhance the white light image by adding the extracted high frequency components to the G channel and the B channel of the white light image.
- FIG. 12 is a flowchart illustrating a process of calculating a feature amount used in the emphasis process.
- the display image generation unit 313 obtains the average intensity of the V image, and corrects the average intensity of the G image with reference to the average intensity of the V image. By performing this correction, the display image generation unit 313 adjusts the average brightness of the G image to the average brightness of the V image.
- in step S32, the display image generation unit 313 applies a high-pass filter to the G image whose average intensity has been corrected.
- the display image generation unit 313 outputs the output of the high-pass filter as the high-frequency component of the G image.
- in step S33, the display image generation unit 313 applies a high-pass filter to the V image.
- the display image generation unit 313 outputs the output of the high-pass filter as the high-frequency component of the V image.
- in step S34, the display image generation unit 313 calculates the difference between the high-frequency component of the V image calculated in step S33 and the high-frequency component of the G image calculated in step S32.
- in step S35, the display image generation unit 313 calculates the difference between the V image and the G image whose average intensity was corrected in step S31. This difference is called the intensity difference; the intensity is the pixel value in each image. The intensity difference is used to enhance the white light image, but if it were used as-is, the vascular structure could be overemphasized. Therefore, the intensity difference is suppressed in steps S36 to S38.
- in step S36, the display image generation unit 313 corrects the average intensity of the R image with reference to the average intensity of the V image.
- by performing this correction, the display image generation unit 313 matches the average brightness of the R image to the average brightness of the V image.
- in step S37, the display image generation unit 313 calculates the ratio between the V image and the R image whose average intensity has been corrected.
- the ratio is a ratio of intensities, calculated for each pixel, for example.
- in step S38, the display image generation unit 313 suppresses the intensity difference calculated in step S35 using the ratio calculated in step S37.
- when the ratio of the intensity of the V image to the intensity of the R image has been calculated in step S37, the display image generation unit 313 divides the intensity difference by that ratio. This calculation is performed for each pixel, for example.
- in step S39, the display image generation unit 313 combines the difference of the high-frequency components calculated in step S34 with the intensity difference suppressed in step S38. The combination is, for example, addition, performed for each pixel.
- the display image generation unit 313 adds the combined value obtained in step S39 to the G channel of the white light image, for example for each pixel. In this way, the blood vessel structure in the white light image is enhanced based on the V image. Further, the display image generation unit 313 may also add the combined value obtained in step S39 to the B channel of the white light image.
- the display image generation unit 313 outputs the white light image after the enhancement process to the display unit 400.
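Steps S31 through S39 can be condensed into a short sketch. The 3x3 box-blur high-pass filter and the epsilon guards are illustrative assumptions; the patent does not specify the filter kernel.

```python
import numpy as np

def highpass(img):
    """Simple high-pass: image minus a 3x3 box blur (illustrative filter)."""
    pad = np.pad(img, 1, mode='edge')
    blur = sum(pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    return img - blur

def vessel_feature(v_img, g_img, r_img):
    """Steps S31-S39: high-frequency difference plus ratio-suppressed
    intensity difference between the V and G images."""
    eps = 1e-8
    g_corr = g_img * (v_img.mean() / (g_img.mean() + eps))   # S31
    hf_g = highpass(g_corr)                                  # S32
    hf_v = highpass(v_img)                                   # S33
    hf_diff = hf_v - hf_g                                    # S34
    int_diff = v_img - g_corr                                # S35
    r_corr = r_img * (v_img.mean() / (r_img.mean() + eps))   # S36
    ratio = v_img / (r_corr + eps)                           # S37
    suppressed = int_diff / (ratio + eps)                    # S38
    return hf_diff + suppressed                              # S39
```

The returned feature is what gets added to the G channel (and optionally the B channel) of the white light image.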
- the processing performed by the display image generation unit 313 is not limited to the enhancement processing, and various image processing such as gradation modulation processing, enlargement processing, and noise reduction processing can be added.
- the display image generation unit 313 generates two images, a white light image and a narrow band light image, as display images. Specifically, as shown in FIG. 13, the display image generation unit 313 generates a white light image as a color image by inputting the R image to the R channel of the color image, the G image to the G channel, and the B image and the V image to the B channel.
- the B channel of the color image is, for example, an image obtained by adding and averaging the pixel values of the B image and the V image.
- the white light image here is an image that has not been subjected to structure enhancement processing based on the V image.
- the display image generation unit 313 generates a narrow band optical image as a color image by inputting the G image to the R channel of the color image and inputting the V image to the G channel and the B channel of the color image.
- the display image generation unit 313 outputs a white light image and a narrow band light image to the display unit 400.
- the display unit 400 displays, for example, a white light image and a narrow band light image side by side.
- with the white light image, it is possible to display the subject in natural colors.
- with the narrow band light image, it is possible to improve the visibility of specific subjects such as lesions compared with the white light image.
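The narrow band composition described above (G image to the R channel, V image to the G and B channels) can likewise be sketched with NumPy arrays; the function name is hypothetical.

```python
import numpy as np

def compose_narrow_band(g_img, v_img):
    """Narrow-band style display: G image to the R channel, V image to
    both the G and B channels, as described in the text."""
    return np.stack([g_img, v_img, v_img], axis=-1)
```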
- the display image generation unit 313 may output either a white light image or a narrow band light image to the display unit 400.
- the display image generation unit 313 performs a process of switching the image to be output to the display unit 400 based on the user input.
- the present invention is not limited to the above embodiments and modifications as they are; at the implementation stage, the components can be modified and embodied within a range that does not deviate from the gist of the invention.
- various inventions can be formed by appropriately combining a plurality of the components disclosed in the above embodiments and modifications. For example, some components may be deleted from all the components described in each embodiment or modification. Further, components described in different embodiments and modifications may be combined as appropriate. In this way, various modifications and applications are possible within a range that does not deviate from the gist of the invention.
- a term that appears at least once in the specification or drawings together with a broader or synonymous different term may be replaced with that different term anywhere in the specification or drawings.
Abstract
The present invention relates to an endoscope device comprising: a light source device (100) that emits, at different timings, a first illumination light group (GR1) including green light having a green wavelength band, and a second illumination light group (GR2) including blue narrow band light belonging to a blue wavelength band and not including green light; an imaging element (240) having a color filter array (241) including color filters of a plurality of colors; and a processing circuit (310) that generates a display image based on an image captured by the imaging element (240) when the first illumination light group (GR1) was emitted and an image captured by the imaging element (240) when the second illumination light group (GR2) was emitted. The color filter array (241) includes cyan and blue color filters. The processing circuit (310) generates the display image based on a blue narrow band image obtained from blue narrow band light transmitted through the cyan and blue color filters.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2020/003021 WO2021152704A1 (fr) | 2020-01-28 | 2020-01-28 | Dispositif d'endoscope, procédé de fonctionnement pour dispositif d'endoscope, et programme de traitement d'image |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021152704A1 true WO2021152704A1 (fr) | 2021-08-05 |
Family
ID=77078063
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/003021 Ceased WO2021152704A1 (fr) | 2020-01-28 | 2020-01-28 | Dispositif d'endoscope, procédé de fonctionnement pour dispositif d'endoscope, et programme de traitement d'image |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2021152704A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014221168A (ja) * | 2013-05-14 | 2014-11-27 | 富士フイルム株式会社 | プロセッサ装置、内視鏡システム、及び内視鏡システムの作動方法 |
| JP2015066050A (ja) * | 2013-09-27 | 2015-04-13 | 富士フイルム株式会社 | 内視鏡システム及びプロセッサ装置並びに作動方法 |
| JP2015171443A (ja) * | 2014-03-11 | 2015-10-01 | 富士フイルム株式会社 | 内視鏡用光源装置及び内視鏡システム |
| WO2017158692A1 (fr) * | 2016-03-14 | 2017-09-21 | オリンパス株式会社 | Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'image et programme |
| WO2019069414A1 (fr) * | 2017-10-04 | 2019-04-11 | オリンパス株式会社 | Dispositif d'endoscope, procédé de traitement d'image et programme |
- 2020-01-28: WO PCT/JP2020/003021 patent/WO2021152704A1/fr not_active Ceased
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20916535; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20916535; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |