WO2025027024A1 - Data processing device and computer implemented invention for a medical observation device, for visualization of an autofluorescence signal and a fluorescence emission signal - Google Patents
- Publication number: WO2025027024A1 (application PCT/EP2024/071556)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color
- signal
- digital
- fluorescence
- input image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B1/043—Endoscopes combined with photographic or television appliances, for fluorescence imaging
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, extracting biological structures
- A61B1/000095—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, for image enhancement
- A61B1/00186—Optical arrangements with imaging filters
- A61B1/0638—Illuminating arrangements providing two or more wavelengths
- A61B5/0071—Measuring for diagnostic purposes using light, by measuring fluorescence emission
- G01N21/6458—Fluorescence microscopy
Definitions
- the invention relates to a data processing device and a computer-implemented method for a medical observation device, such as a microscope or an endoscope, for observing an object.
- the invention also relates to a medical observation device comprising such a data processing device, and to a method for using a medical observation device, comprising the computer-implemented method.
- Medical observation devices such as microscopes or endoscopes are often used in connection with fluorophores.
- One or more fluorophores may be added artificially to the object to mark specific parts of the object.
- one or more fluorophores may be injected into a patient’s body.
- the fluorophore 5-ALA is used to mark tumors.
- the fluorophore may be used to mark pathways along which the fluorophore is transported.
- ICG may be used to highlight blood flow.
- the object observed by the medical observation device may further contain naturally one or more fluorophores.
- the fluorescence of these naturally occurring fluorophores is called autofluorescence.
- Autofluorescence may be used as an additional diagnostic tool but at the same time may obscure the fluorescence emission of the artificially added fluorophores, which makes a simultaneous usage of both autofluorescence and fluorescence emission difficult.
- a data processing device for a medical observation device, such as a microscope or an endoscope, for observing an object
- the data processing device is configured: to access input image data, the input image data containing an autofluorescence signal, the autofluorescence signal being representative of fluorescence emitted by a fluorophore naturally contained in the object, and a fluorescence emission signal, the fluorescence emission signal being representative of fluorescence emitted by a fluorophore artificially added to the object; and to generate a digital fluorescence color output image from a combination of the fluorescence emission signal colored in a first color and the autofluorescence signal colored in a second color, the second color being different from the first color.
- a computer-implemented method for processing input image data in a medical observation device comprising the steps of: accessing input image data containing an autofluorescence signal of an object, the autofluorescence signal being representative of fluorescence emitted by a fluorophore naturally contained in the object, and a fluorescence emission signal of the object, the fluorescence emission signal being representative of fluorescence emitted by a fluorophore artificially added to the object; generating a digital fluorescence color output image from a combination of the fluorescence emission signal colored in a first color and the autofluorescence signal colored in a second color, the second color being different from the first color.
- Both the above data processing device and the above computer-implemented method make it easier to differentiate between the autofluorescence signal and the fluorescence emission signal, as the two are colored differently.
- the data processing device and the computer-implemented method may be further improved by one or more of the following features, which can be combined independently of one another. Any of the features described in the following can further be independently used for improving the data processing device and/or for improving the computer-implemented method, even if the respective feature is only described in the context of the data processing device or in the context of the computer-implemented method.
- the first and/or the second color may be a pseudo color, a false color or a natural color. If the first and/or the second color is a pseudo color, different intensities of the autofluorescence signal and/or the fluorescence emission signal will be assigned different colors. If the first and/or second color is a false color or a natural color, the intensity or brightness of the color will depend on the intensity of the autofluorescence signal and the fluorescence emission signal, respectively.
- the hue of the first and/or second color is, in such a case, preferably independent of the intensity of the autofluorescence signal and/or the fluorescence emission signal, respectively.
- if the first and/or second color is a false color, it will be different from the color of the fluorescence emission signal and/or the autofluorescence signal as it is perceived by human color vision. If the first and/or second color is a natural color, it will at least closely correspond to the color of the autofluorescence signal and/or the fluorescence emission signal as it is perceived by human color vision.
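The difference between a pseudo-color and a false-color mapping can be sketched per pixel as follows; the three-entry lookup table and the fixed magenta hue are illustrative assumptions, not values from the description.

```python
def pseudo_color(intensity):
    """Pseudo color: different intensity ranges are assigned entirely
    different colors, here via a hypothetical three-entry lookup table
    (blue for low, green for medium, red for high intensities)."""
    lut = [(0, 0, 255), (0, 255, 0), (255, 0, 0)]
    index = min(int(intensity * len(lut)), len(lut) - 1)
    return lut[index]

def false_color(intensity, hue=(255, 0, 255)):
    """False color: the hue is fixed (here a magenta that differs from
    the natural fluorescence color); only the brightness follows the
    normalized signal intensity."""
    return tuple(int(intensity * c) for c in hue)

print(pseudo_color(0.1))  # -> (0, 0, 255): a different color, not a dimmer one
print(false_color(0.5))   # -> (127, 0, 127): same hue at half brightness
```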
- the distance between the first and the second color in the color space of the fluorescence color output image may be larger than the distance between the natural color of the autofluorescence signal and the natural color of the fluorescence emission signal. This facilitates visual differentiation between these two signals.
- the first color may have the same hue as the natural color of the fluorescence emission signal and/or the second color may have the same hue as the natural color of the autofluorescence signal.
- the input image data may be contained in or comprise one or more digital input images.
- Each of the digital input images may contain a plurality of input pixels. If more than one digital input image is contained in the input image data, these digital input images may have the same or a different number of pixels, be represented in the same or a different color space, and/or have the same or a different image aspect ratio.
- if a digital input image is a color image, it is termed a “digital color input image”.
- a multispectral or hyperspectral image is considered as a color image.
- if a digital input image is a monochrome image, it is termed a “digital monochrome input image”. If, in a specific context, it does not matter whether the digital input image is a color or a monochrome image, the generic term “digital input image” is used.
- a “digital input image” may thus be a “digital color input image” or a “digital monochrome input image”.
- each pixel or more specifically input pixel may be a color pixel.
- each pixel or more specifically output pixel may be a color pixel.
- a color pixel contains a set of color space coordinates which, at least in combination, represent at least some of the color appearance parameters such as hue, lightness, brightness, chroma, colorfulness, etc.
- a monochrome image may represent only light intensity values, e.g., on a grey-scale.
- the input image data may comprise at least one of: at least one digital color input image; at least one digital color input image and at least one digital monochrome input image; and at least two digital monochrome input images of the object.
- a digital color input image may contain at least one of the autofluorescence signal and the fluorescence emission signal, or at least part of the autofluorescence signal and/or at least part of the fluorescence emission signal. If a digital color input image comprises more than one signal, these signals may be separated from one another by spectral unmixing.
- a digital monochrome input image may contain only one of the autofluorescence signal and the fluorescence emission signal.
- as a monochrome image only contains light intensity information, it is difficult to separate two signals contained in a single monochrome image.
- one of the two digital monochrome input images may contain or preferably consist of the autofluorescence signal while the other one of the two digital monochrome input images may contain or preferably consist of the fluorescence emission signal.
- the autofluorescence signal and the fluorescence emission signal can be separated without much effort from one another as they are already contained in two different digital input images.
- one digital monochrome input image contains the sum of the autofluorescence signal and the fluorescence emission signal and the other digital monochrome input image contains only one of the autofluorescence signal and the fluorescence emission signal.
- the subtraction of the two digital monochrome input images yields the signal that is not contained in the digital monochrome input image containing only a single signal.
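The subtraction scheme above can be sketched for two registered single-channel images; the 2x3 toy intensity values are illustrative.

```python
def isolate_signal(sum_image, single_signal_image):
    """Pixel-wise subtraction: the monochrome image containing both
    signals minus the image containing only one signal yields the other
    signal; results are clamped at zero so noise cannot produce
    negative intensities."""
    return [
        [max(s - f, 0) for s, f in zip(sum_row, single_row)]
        for sum_row, single_row in zip(sum_image, single_signal_image)
    ]

# Toy 2x3 images: one containing both signals, one containing the
# fluorescence emission signal alone (hypothetical values).
both = [[10, 20, 30], [40, 50, 60]]
fluo = [[5, 20, 0], [10, 55, 60]]
print(isolate_signal(both, fluo))  # -> [[5, 0, 30], [30, 0, 0]]
```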
- the input image data may also contain a reflectance signal, which is representative of light reflected off the object.
- the reflectance signal, or part thereof, may in particular be contained in a digital color input image that contains at least one of the fluorescence emission signal and the autofluorescence signal.
- the reflectance signal may not be needed. Therefore it is an advantage if the data processing device is configured to separate any one, or both, of the autofluorescence signal and the fluorescence emission signal from the reflectance signal.
- if a digital input image contains more than one signal, such as any combination or subset of the group containing the autofluorescence signal, the fluorescence emission signal and the reflectance signal, contributions of these signals may be contained at any one input pixel.
- both signals need to be separated or extracted from one another if they, or parts thereof, are contained in a single digital input image.
- one method to do this is to use spectral unmixing.
- Another way to do this is to apply a linear transformation to the input pixel or input pixels containing the signals that need to be separated from one another.
- the linear transformation may comprise applying a transformation matrix to the color space coordinates of the at least one input pixel, which may be a color or a monochrome input pixel.
- extracting is considered synonymous with isolating and separating.
- the data processing device may be configured for spectral unmixing of the input image data, and/or any digital input image contained in the input image data.
- the data processing device may also be configured to apply a linear transformation to the input image data, or any digital input image contained therein.
- the data processing device comprises an extraction or separation routine which is configured to separate or extract the autofluorescence signal and/or the fluorescence emission signal.
- the extraction or separation routine may comprise a linear transformation, in particular a transformation matrix.
- the extraction or separation routine may comprise independently thereof a spectral unmixing routine.
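A minimal per-pixel sketch of such an unmixing step via a linear transformation, assuming a known 2x2 crosstalk (mixing) matrix whose values are purely illustrative:

```python
def unmix_pixel(measured, mixing):
    """Separate two signals that were mixed into two measured channels
    by applying the inverse of the 2x2 mixing matrix, i.e. a linear
    transformation of the pixel values, where
        measured = mixing @ (fluorescence, autofluorescence)."""
    (a, b), (c, d) = mixing
    det = a * d - b * c
    m0, m1 = measured
    fluo = (d * m0 - b * m1) / det
    auto = (a * m1 - c * m0) / det
    return fluo, auto

# Hypothetical crosstalk: channel 0 sees mostly the fluorescence
# emission signal, channel 1 mostly the autofluorescence signal.
mixing = [[0.9, 0.2],
          [0.1, 0.8]]
# A pixel with true signals fluo=100, auto=50 is measured as:
measured = (0.9 * 100 + 0.2 * 50, 0.1 * 100 + 0.8 * 50)
print(unmix_pixel(measured, mixing))  # approximately (100.0, 50.0)
```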
- the two or more digital input images are preferably registered with respect to one another. They preferably have an identical field of view.
- the cameras which are used for recording the different digital input images preferably have identical fields of view and/or coaxial optical axes. In this context, it is even more beneficial if the two or more digital input images have the same number of pixels and the same aspect ratio, so that there is a high correlation between the location of the input pixels in their respective digital input image and the location on the object represented by the input pixels.
- Having registered digital input images facilitates their joint processing, in particular on a pixel-by-pixel basis, as the registered digital input images have corresponding input pixels, i.e. input pixels that represent the same location of the object although they are located in different digital input images.
- corresponding pixels may be computed using a feature analysis.
- one way to identify corresponding pixels is, e.g., described in US 6,711,293 B.
- a scale-invariant feature transform or the SURF algorithm from ETH Zurich may be used.
- Both the fluorescence emission signal and the autofluorescence signal represent a color or monochrome image of the object in the respective fluorescence emission spectrum.
- the fluorescence emission signal represents an image of the object in the fluorescence emission spectrum of the at least one fluorophore that has been artificially added to the object.
- the autofluorescence signal corresponds to an image of the object in at least part of the fluorescence emission spectrum of the fluorophore that is naturally contained in the object.
- the autofluorescence signal and/or the fluorescence emission signal may, according to another aspect, contain only part of the respective fluorescence emission spectrum.
- the spectrum that is recorded in the autofluorescence signal and/or the fluorescence emission signal may be determined by optical filters placed in front of the cameras that are used to record them.
- the input pixels represent an intensity of the autofluorescence signal at a location of the object which corresponds to the input pixel.
- the input pixels represent an intensity of the fluorescence emission signal at a location of the object which corresponds to the location of the input pixel.
- an input pixel represents at least part of the color appearance parameters of the fluorescence emission signal as emitted from a location of the object which corresponds to the input pixel.
- an input pixel represents at least some color appearance parameters of the autofluorescence signal as emitted from a location of the object which corresponds to the input pixel.
- an input pixel represents a mixture of at least some color appearance parameters of the fluorescence emission signal and the autofluorescence signal as emitted by a single location of the object which corresponds to the location of the input pixel.
- the intensity of the autofluorescence signal and/or of the fluorescence emission signal may be zero at an input pixel which represents a location in the object where no fluorescence emission signal and/or no autofluorescence signal is emitted, or where the emitted fluorescence emission signal and/or autofluorescence signal is too weak to be recorded.
- each output pixel may be formed from a combination of the fluorescence emission signal at a corresponding input pixel of the digital input image containing the fluorescence emission signal colored in the first color and the autofluorescence signal at a corresponding pixel of the digital input image containing the autofluorescence signal colored in the second color.
- the input pixels and the output pixel that is generated from the input pixels are corresponding pixels.
- an intensity of the first color at an output pixel of the digital fluorescence color output image may depend on the preferably normalized intensity of the fluorescence emission signal at a corresponding input pixel of the digital input image containing the fluorescence emission signal.
- an intensity of the second color at an output pixel of the digital fluorescence color output image may depend on the preferably normalized intensity of the autofluorescence signal at a corresponding input pixel of the digital input image containing the autofluorescence signal.
- Coloring the fluorescence emission signal in the first color may comprise assigning to an output pixel an intensity to the first color which depends on the intensity of the fluorescence emission signal at the corresponding input pixel in the input image data.
- Coloring the autofluorescence signal in the second color may comprise assigning, at an output pixel, an intensity to the second color which depends on the intensity of the autofluorescence signal at a corresponding input pixel.
- the input image data may comprise a digital color input image that contains the fluorescence emission signal and the autofluorescence signal, where the data processing device is configured to extract the fluorescence emission signal from the digital color input image prior to coloring the fluorescence emission signal in the first color; and extract the autofluorescence signal from the digital color input image prior to coloring the autofluorescence signal in the second color.
- the extraction of the fluorescence emission signal and the extraction of the autofluorescence signal make it possible to process these two signals separately.
- the data processing device may be configured to extract at least one of the autofluorescence signal and the fluorescence emission signal from the input image data by spectral unmixing.
- various embodiments are described for coloring the autofluorescence signal and the fluorescence emission signal.
- the data processing device may be configured to generate the digital fluorescence color output image in a color space having a set of color space coordinates.
- the color space may be an RGB color space.
- the first color may be represented by a first subset of the set of color space coordinates of the color space whereas the second color may be represented by a second subset of the set of color space coordinates of the color space.
- the intersection of the first subset and the second subset may be empty.
- the data processing device may then form the combination of the fluorescence emission signal colored in the first color and the autofluorescence signal colored in the second color by forming the union of the first subset and the second subset.
- the intersection of the first subset and the second subset being empty means that a color space coordinate is either contained in only one of the first and the second subset or in none of the first and second subsets.
- the first color, in particular at a predetermined intensity of the fluorescence emission signal, may correspond to predetermined first values of the color space coordinates of the first subset. If, for example, the intensity of the fluorescence emission signal is normalized, then an intensity of the fluorescence emission signal of one may be assigned a predetermined value of the first color of {255, 0, 0}.
- the value of the first color space coordinate at an output pixel may then be adjusted to the intensity of the fluorescence emission signal at an input pixel. For example, if an input pixel contains the fluorescence emission signal at an intensity of 0.5, the fluorescence emission signal at the output image will be colored as {127, 0, 0}.
- the predetermined second color {0, 255, 0} may be assigned in, e.g., RGB color space.
- the autofluorescence signal in the output pixel may be colored as {0, 127, 0}.
- for an input pixel which has a fluorescence emission signal of intensity 0.5 and an autofluorescence signal of intensity 0.25, the final color will then result from a combination of the two subsets, e.g., {127, 63, 0}.
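This worked example can be sketched per output pixel, assuming the first color occupies only the R coordinate and the second color only the G coordinate, so that the two subsets are disjoint and their union is a plain channel-wise merge:

```python
def combine_pixel(fluo_intensity, auto_intensity):
    """Color the normalized fluorescence emission signal in the first
    color {255, 0, 0} and the autofluorescence signal in the second
    color {0, 255, 0}; since the channel subsets do not intersect, the
    union of the two colored signals is a simple channel-wise merge."""
    first = (int(fluo_intensity * 255), 0, 0)   # first color: R only
    second = (0, int(auto_intensity * 255), 0)  # second color: G only
    return tuple(f + s for f, s in zip(first, second))

print(combine_pixel(0.5, 0.25))  # -> (127, 63, 0)
```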
- the first subset of color space coordinates contains other color space coordinates than the second subset.
- the first subset and the second subset should of course not be empty.
- An output pixel of the digital fluorescence color output image may be formed pixel by pixel as the union of the color space coordinates of the first color, whose intensity is preferably adjusted according to the intensity of the fluorescence emission signal at the input pixel corresponding to the output pixel, and the color space coordinates of the second color, whose intensity is preferably adjusted according to the intensity of the autofluorescence signal at that input pixel.
- the data processing device may be configured to form the combination of the fluorescence emission signal colored in the first color and the autofluorescence signal colored in the second color by performing a vector addition of the color space coordinates of the fluorescence emission signal colored in the first color and the color space coordinates of the autofluorescence signal colored in the second color at each output pixel of the digital fluorescence color output image.
- the color space coordinates at an output pixel of the digital fluorescence color output image may be computed by a vector addition of the color space coordinates of the corresponding fluorescence emission signal in a corresponding input pixel and of the color space coordinates of the autofluorescence signal at a corresponding input pixel, wherein both input pixels may be in the same or in a different digital input image.
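When the first and second colors are not confined to disjoint channel subsets, the combination by vector addition can be sketched as follows; the orange and cyan colors are illustrative assumptions, and values are clipped so saturated channels do not overflow:

```python
def add_colors(color_a, color_b):
    """Vector addition of two colored signals in RGB coordinates,
    clipped to the valid 8-bit range."""
    return tuple(min(a + b, 255) for a, b in zip(color_a, color_b))

# Hypothetical colored pixels: fluorescence emission in orange,
# autofluorescence in cyan.
fluo_colored = (200, 100, 0)
auto_colored = (0, 180, 180)
print(add_colors(fluo_colored, auto_colored))  # -> (200, 255, 180)
```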
- the data processing device may be configured to form the combination of the fluorescence emission signal colored in the first color and the autofluorescence signal colored in the second color by applying a linear transformation simultaneously to the fluorescence emission signal and to the autofluorescence signal.
- the linear transformation may be performed on a pixel-by-pixel basis, wherein the linear transformation is applied both to an input pixel which contains only the autofluorescence signal and to an input pixel which contains only the fluorescence emission signal.
- a union set of the set of color space coordinates of an input pixel containing only the fluorescence emission signal and the set of color space coordinates of a corresponding input pixel containing only the autofluorescence signal may be formed.
- the resulting union set has a number of color space coordinates which correspond to the sum of the color space coordinates of the input pixels.
- the linear transformation is then applied to this union set to obtain the values of the color space coordinates at the corresponding output pixel.
- the linear transformation may, in one example, comprise multiplication of the union set, which may be processed as a vector, with a transformation matrix.
- the linear transformation in this case is reduced to a matrix multiplication which can be implemented and carried out quickly on a computer.
- the transformation matrix has a first dimension which corresponds to the sum of the numbers of color space coordinates of the two input pixels and a second dimension which corresponds to the number of color space coordinates in the digital fluorescence color output image.
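A sketch of this matrix form, assuming a transformation matrix whose columns are the first color {255, 0, 0} and the second color {0, 255, 0}; with these illustrative values the matrix multiplication reproduces the two-subset example given earlier:

```python
def transform_pixel(fluo, auto, matrix):
    """Apply a linear transformation to the union vector (fluo, auto):
    output coordinate k = matrix[k][0] * fluo + matrix[k][1] * auto.
    One matrix dimension (2 columns) matches the summed input
    coordinates; the other (3 rows) matches the RGB output."""
    return tuple(
        min(int(row[0] * fluo + row[1] * auto), 255) for row in matrix
    )

# Columns: first color {255, 0, 0} and second color {0, 255, 0}.
matrix = [[255, 0],
          [0, 255],
          [0, 0]]
print(transform_pixel(0.5, 0.25, matrix))  # -> (127, 63, 0)
```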
- the data processing device may be configured to change the ratio of the intensity of the fluorescence emission signal and the intensity of the autofluorescence signal in the digital fluorescence color output image depending on a user selection signal.
- the data processing device may be configured to change the ratio of the intensity of the first color and the second color in the digital fluorescence color output image depending on the user selection signal.
- the user selection signal may dim at least one of the autofluorescence signal and the fluorescence emission signal in the digital fluorescence color output image relative to the respective other.
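One possible mapping of a user selection signal to such a dimming ratio; the specific mapping below is an assumption, not taken from the description:

```python
def apply_user_ratio(fluo, auto, selection):
    """Dim one signal relative to the other from a user selection in
    [0, 1]: 0 keeps only the autofluorescence signal, 1 keeps only the
    fluorescence emission signal, and 0.5 leaves both unchanged."""
    fluo_gain = min(2.0 * selection, 1.0)
    auto_gain = min(2.0 * (1.0 - selection), 1.0)
    return fluo * fluo_gain, auto * auto_gain

print(apply_user_ratio(0.8, 0.6, 0.5))  # -> (0.8, 0.6): unchanged
print(apply_user_ratio(0.8, 0.6, 1.0))  # -> (0.8, 0.0): autofluorescence off
```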
- the data processing device may be configured to receive the user selection signal.
- the data processing device may comprise a data interface which may comprise connectors and/or wireless connections for uni- or bi-directional data exchange.
- the medical observation device may comprise a user input device, such as a widget or a manually operated member, such as a dial, a knob, a switch, or a button.
- the initially stated object may, in another embodiment, also be achieved by a data processing device which is configured: to access a digital color input image which contains a fluorescence emission signal representative of the fluorescence emitted by a fluorophore artificially added to the object, and an autofluorescence signal representative of the autofluorescence emitted naturally by the object; to extract the fluorescence emission signal from the digital color input image and color the extracted fluorescence emission signal in a first color; to extract the autofluorescence signal and color the extracted autofluorescence signal in a second color different from the first color; and to generate a digital fluorescence color output image from a combination of the extracted fluorescence emission signal colored in the first color and the extracted autofluorescence signal colored in the second color.
- a medical observation device such as a microscope or an endoscope, for observing an object which contains at least two fluorophores may comprise a data processing device in any of the configurations described above and at least one digital camera which is adapted to record the input image data.
- the medical observation device may comprise a digital color camera for recording at least one of the fluorescence emission signal and the autofluorescence signal.
- This digital color camera may also be configured for recording a reflectance signal instead of, or in addition to, the fluorescence emission signal and the autofluorescence signal.
- the medical observation device may, in another embodiment, comprise a digital color or monochrome camera for recording the fluorescence emission signal and another digital color or monochrome camera for recording the autofluorescence signal.
- the medical observation device may comprise a first digital monochrome camera for recording the fluorescence emission signal and a second digital monochrome camera for recording the autofluorescence signal.
- the digital color input image that contains the fluorescence emission signal and the autofluorescence signal may be a digital fluorescence color input image that is recorded by a fluorescence camera of the medical observation device.
- a fluorescence camera may be combined with a fluorescence filter set having band-pass filters of which the passbands are contained in or correspond to the respective fluorescence emission spectra of the fluorescence emission signal and the autofluorescence signal.
- the fluorescence camera may be configured to specifically record fluorescence only and not record reflectance.
- the fluorescence camera may be configured to also record a reflectance signal, or part of the reflectance signal, in particular a part of the reflectance signal that is contained in the NIR (near infrared range) and/or in the fluorescence excitation spectrum of the at least one artificially added fluorophore.
- any of the above-mentioned digital cameras does not need to be configured for recording the entirety of the fluorescence emission signal and/or the autofluorescence signal.
- a part of the fluorescence emission signal may be recorded by a first digital color or monochrome camera and the other part by a second digital color or monochrome camera.
- the initially mentioned object is also achieved by a method for using a medical observation device, such as an endoscope or microscope, wherein the method comprises the steps of recording the input image data and carrying out the computer implemented method in any of the above-described configurations.
- a medical observation device such as an endoscope or microscope
- the invention is also concerned with a computer program product and/or a computer- readable medium comprising instructions, which when executed by a computer causes the computer to carry out the computer implemented method in any of the above configurations.
- aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus.
- Fig. 1 shows a schematic representation of a medical observation device for generating a digital fluorescence color output image from at least one digital color input image;
- Fig. 2 shows a schematic representation of the generation of a digital fluorescence color output image from an autofluorescence signal and a fluorescence emission signal contained in two digital input images of the input image data;
- Fig. 3 shows a schematic representation of the generation of a digital fluorescence color output image from an autofluorescence signal and a fluorescence emission signal contained in a single digital input image of the input image data;
- Fig. 4 shows a schematic representation of an example of combining the extracted fluorescence emission signal and the extracted autofluorescence signal;
- Fig. 5 shows a schematic representation of another example of combining the extracted fluorescence emission signal and the extracted autofluorescence signal;
- Fig. 6 shows a schematic representation of another example of combining the extracted fluorescence emission signal and the extracted autofluorescence signal;
- Fig. 7 presents a schematic overview of the steps for generating a digital fluorescence color output image;
- Fig. 8 shows a schematic representation of a generic medical observation device.
- Fig. 1 shows schematically a medical observation device 100.
- the medical observation device 100 may be a microscope or an endoscope, the difference between a microscope and an endoscope being primarily that, in an endoscope (not shown), an object 106 is viewed through optical fibers that are brought into vicinity of the object 106 to be investigated, e.g. by insertion into a body containing the object, whereas, in a microscope, an objective 174 is directed onto the object.
- the medical observation device of Fig. 1 is a microscope, the following description also applies to an endoscope.
- the medical observation device 100 may be a medical observation device used in surgery.
- the medical observation device 100 may also be a medical observation device used in a laboratory, such as a laboratory microscope.
- the object 106 to be investigated may consist of or comprise biological tissue 107.
- the object 106 may be a part of a patient’s body that is located within the field of view of the medical observation device 100.
- the object 106 may contain one or more fluorophores 116, 118, 120.
- At least one fluorophore 116 may be a fluorophore that is naturally contained in the object.
- At least one fluorophore 118, 120 may be artificially added to the object 106, e.g. by injecting it into the biological tissue 107.
- fluorophores 118, 120 that may be artificially added to the object 106 are ICG, fluorescein and/or 5- ALA.
- the medical observation device 100 as shown is a fluorescence imaging device.
- the medical observation device is configured to view and preferably also excite the fluorescence of the one or more fluorophores 116, 118, 120.
- the medical observation device 100 may be a stereoscopic device as is exemplarily shown in Fig. 1. It may thus comprise two identical subassemblies 101 L and 101 R for each of the two stereoscopic channels. As the two subassemblies 101 L, 101 R are identical with respect to function and structure, the following description focuses on the right subassembly 101 R, but applies identically to the left stereoscopic channel 101 L.
- the medical observation device 100 may alternatively be a monoscopic device. In this case, only one of the two subassemblies 101 L, 101 R may be present. For a monoscopic medical observation device 100, the following description therefore applies as well.
- the medical observation device 100 in operation provides input image data 122.
- the input image data 122 are representative of an imaged scene, i.e. the part of the object that is within the field of view 184.
- the input image data 122 may comprise one or more different digital input images 130. If the input image data 122 contain a plurality of digital input images 130, the different digital input images 130 should contain different spectral information. In such a case, each digital input image 130 of the input image data may be recorded at different wavelengths, preferably with no or, synonymously, minimum spectral overlap. Preferably, the spectra, in which the different digital input images 130 of the input image data 122 are recorded, are complementary.
- the digital imaging system 102 may comprise one or more digital cameras 108.
- the number of digital input images 130 contained in the input image data 122 may depend on the number of cameras 108 used for generating the input image data 122.
- a digital input image 130 may be a color image or a monochrome image.
- the medical observation device 100 is configured to record in the input image data 122 both the fluorescence of the fluorophore 116 that occurs naturally in the object, and the fluorescence of at least one fluorophore 118, 120 that has been added artificially.
- the one or more fluorophores 118, 120 may have been injected into a patient’s body to mark specific areas of interest, such as tumors.
- the fluorescence of the naturally occurring fluorophore is represented by an autofluorescence signal in the input image data 122.
- the autofluorescence signal is a component of the input image data 122 and represents an image of the object in the fluorescence spectrum of the fluorophore 116.
- the fluorescence of the artificially added fluorophore is represented by a fluorescence emission signal in the input image data 122.
- the fluorescence emission signal is also a component of the input image data 122 and represents an image of the object in the fluorescence emission spectrum of the at least one fluorophore 118, 120.
- Each of these two fluorescence signals has a different spectral signature due to the different spectral characteristics of the respective fluorescence emission.
- the input image data 122 may comprise further signals as additional components.
- the medical observation device 100 may also be configured to record the light reflected off the object in the input image data 122.
- the light reflected off the object is represented in the input signal as a reflectance signal.
- the reflectance signal represents a reflectance image of the object.
- a single digital input image 130 may contain one or more signals, either in their respective entirety or only parts thereof. If a digital input image 130 contains more than one signal, it is preferably a digital color input image so that the different signals may be distinguished from one another, e.g. by their spectral signatures. Alternatively, a single digital input image 130 may be a digital monochrome image. In this case, it is preferred that the digital input image 130 contains only a single signal.
- the digital imaging system 102 may comprise as digital cameras 108 a digital reflectance camera 110 and one or more digital fluorescence-light cameras 111 , 111a.
- a second (or third) digital fluorescence-light camera 111a is optional.
- the second digital fluorescence-light camera 111a is only shown in the left stereoscopic channel 101 L, but of course may also be present in the right stereoscopic channel 101 R.
- the digital fluorescence-light camera of one stereoscopic channel may be used as the (first) digital fluorescence-light color camera 111 and the digital fluorescence-light camera of the other stereoscopic channel may be used as the second fluorescence-light camera 111a.
- the cameras 110, 111 , 111a may each be a color camera or a monochrome camera.
- a multispectral camera or a hyperspectral camera is considered as a color camera.
- the digital reflectance camera 110 is configured to record a digital reflectance input image 114, i.e. a digital input image 130, which is representative of the reflectance of the object 106 and thus may comprise all or at least a major part of the reflectance signal.
- the digital reflectance camera 110 is preferably configured to record a digital input image 130 in a wide spectral range within the visible light spectrum.
- the digital input image 130 recorded by the digital reflectance camera represents closely the natural colors of the object 106. This is important if the digital reflectance camera 110 is used to provide the user with an image of the object which comes as close as possible to the human perception of the object.
- the digital reflectance camera 110 may be a CCD, CMOS or multispectral or hyperspectral camera.
- the digital input image 130 recorded by the digital reflectance camera 110 may also contain at least part of the autofluorescence signal and/or the fluorescence emission signal.
- Each of the at least one digital fluorescence-light camera 111 , 111a is configured to record a different digital fluorescence-light image 112, i.e. a digital input image 130, which is representative of the fluorescence of the object 106 and thus may comprise all or at least a major part of the autofluorescence signal and the fluorescence emission signal.
- the fluorescence-light camera 111 may be configured to record the digital fluorescence-light input image 112 only in the fluorescence spectrum or the fluorescence spectra of the at least one fluorophore 116, 118, 120.
- Each fluorescence-light camera 111 , 111a may be configured to record the fluorescence of a different fluorophore. If the digital fluorescence-light image 112 is a color image, it may contain both the autofluorescence and the fluorescence-emission signal.
- the fluorescence-light camera 111 may be configured to record the digital fluorescence-light image only in one or more narrow bands of light. These narrow bands should overlap the fluorescence spectrum or spectra of the one or more fluorophores 116, 118, 120 of which fluorescence is to be recorded.
- the fluorescence spectra of the different fluorophores 116, 118, 120 are at least partly separate, preferably completely separate, i.e. nonoverlapping, so that the fluorescence-light camera 111 may record a digital color input image 130 representing two separate fluorescence bands that are spaced from one another.
- each fluorescence-light camera 111, 111a preferably captures the fluorescence emission of a different fluorophore, thus providing two digital fluorescence-light input images 112 containing different fluorescence-emission signals.
- the fluorescence-light camera 111 captures the fluorescence emission signal and the fluorescence-light camera 111a captures the autofluorescence signal.
- the fluorescence-light cameras 111 , 111a may be monochrome or color.
- the at least one fluorescence-light camera 111 , 111a may also capture part of the reflectance signal in a digital color input image 130. For example, a part of the low-frequency end of the excitation spectrum used for triggering the fluorescence of a fluorophore may overlap the fluorescence spectrum of this fluorophore and be recorded.
- the at least one fluorescence-light camera 111 , 111a may be a monochrome camera, a CCD, CMOS or multispectral or hyperspectral camera.
- the white-light color camera 110 and the at least one fluorescence-light color camera 111 are of the same type, although this is not necessary.
- Any of the cameras 110, 111 and 111a may be combined into a single multispectral or hyperspectral camera.
- the respective fields of view 184 of the cameras 110, 111 , and if present 111a, are preferably aligned or even coinciding and coaxial. It is preferred that the cameras 110, 111 provide the identical field of view 184 with the identical perspective and focal length. This results in identical representations of the object 106 in the images 112, 114 generated by the different cameras 110, 111. Both cameras 110, 111 may use the same objective 174.
- if a match of the perspectives and field of view cannot be generated optically, it may be generated by image processing by applying a matching or registering routine to the digital input images 130, as is explained further below.
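Purely as an illustration of such a matching or registering routine (not part of the disclosed embodiments), the sketch below assumes the mismatch between two digital input images 130 is a pure translation and estimates it by phase correlation; the function name `register_translation` is illustrative:

```python
import numpy as np

def register_translation(ref, mov):
    """Estimate the (row, col) shift of `mov` relative to `ref`
    by phase correlation, then align `mov` to `ref`."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(mov)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12  # keep phase information only
    corr = np.fft.ifft2(cross_power).real
    dr, dc = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the image size to negative values
    if dr > ref.shape[0] // 2:
        dr -= ref.shape[0]
    if dc > ref.shape[1] // 2:
        dc -= ref.shape[1]
    aligned = np.roll(mov, shift=(dr, dc), axis=(0, 1))
    return aligned, (dr, dc)
```

A practical registration between two cameras would also have to handle rotation, scale and lens distortion, which requires a more general transformation model than this sketch.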
- the cameras 110, 111 , and, if present, 111a are operated synchronously. Specifically, the exposure times may be synchronized.
- the medical observation device 100 may be configured to generate the digital input images 130 at the same time.
- the gain of the at least two cameras 110, 111 , 111a is synchronized, i.e. adjusted in the at least two cameras 110, 111 , 111 a at the same time.
- the ratio of the gain applied in camera 110 to the gain applied in camera 111 and, if present, in camera 111a may be constant, even if the gain is changed.
- the gamma correction and color adjustment or white balance may be switched off or kept constant.
- an optical color-separation assembly 176 may be provided for separating the light recorded in the digital reflectance input image 114 from the spectrum recorded in the at least one digital fluorescence-light input image 112, i.e. for separating the reflectance spectrum from the fluorescence spectrum.
- the color-separation assembly 176 may comprise optical elements such as a beam splitter 192, which may be dichroic.
- the color separation assembly 176 may further or alternatively comprise an optical observation filter set 188 and/or an optical fluorescence filter set 190.
- the optical observation filter set 188 and the fluorescence-filter set 190 may be part of an optical filter assembly 187.
- the fluorescence filter set 190 is preferably configured to transmit light in the fluorescence spectrum or spectra of the one or more fluorophores 116, 118, 120 and to block light outside the fluorescence spectrum or spectra.
- the fluorescence filter set 190 may comprise one or more optical band-pass filters comprising one or more passbands. Each passband should overlap the fluorescence emission spectrum of a respective fluorophore 116, 118, 120 of which the fluorescence is to be recorded. As the fluorescence-light filter set 190 is in the light path between the beam splitter 192 and the fluorescence-light color camera 111, only the wavelengths in the passbands of the fluorescence-light filter set 190 are transmitted to the fluorescence-light color camera 111.
- the fluorescence filter set 190 may comprise a different optical band-pass filter in front of each of the fluorescence-light color cameras 111 , 111a.
- the pass-band of one band-pass filter may be contained in the fluorescence-emission spectrum of one fluorophore 116, whereas the pass-band of the other band-pass filter may be contained in the fluorescence-emission spectrum of another fluorophore 116, 118 in the object 106.
- the observation filter set 188 is preferably configured to block light in the fluorescence spectrum or spectra of the one or more fluorophores 116, 118.
- the observation filter set 188 may also be configured to block light in the fluorescence-excitation spectrum.
- the observation filter set 188 is preferably configured as a band-stop filter, of which the stop bands correspond to or at least contain the passbands of the fluorescence-light filter set 190.
- the observation filter set 188 is located in the light path between the beam splitter 192 and the white-light camera 110.
- the white-light camera 110 records only wavelengths that are outside the stop-bands of the observation filter set 188 and therefore also outside of the passbands of the fluorescence-light filter set 190.
- Any one of the observation filter set 188 and the fluorescence filter set 190 may be a tunable filter.
- the beam splitter 192 is a dichroic beam splitter
- at least one of the filter sets 188, 190 may be omitted as the optical spectral filtering in this case is already integrated in the dichroic beam splitter.
- the above description of the passbands and stopbands then should apply mutatis mutandis to the dichroic beam splitter 192.
- the medical observation device 100 may further comprise an illumination assembly 178, which is configured to illuminate the object 106 preferably through the objective 174 through which the imaging system 102 records the at least one digital image 112, 114.
- the illumination assembly 178 may be configured to selectively generate white-light, i.e. light that is evenly distributed across the entire visible spectrum, and fluorescence-excitation light, which contains light only in wavelengths that stimulate fluorescence of the at least one fluorophore 116, 118.
- the illumination light generated by the illumination assembly 178 may be fed into the objective 174 using an illumination beam splitter 180.
- the illumination assembly 178 may be configured to generate illumination light simultaneously in a plurality of discrete, in particular narrow-band, wavelength bands. These wavelength bands may comprise any of or any combination of the following wavelength bands.
- One such discrete wavelength band may be entirely located in the fluorescence-excitation spectrum of a fluorophore 116.
- Another such wavelength band may be entirely located in the fluorescence-emission spectrum of another fluorophore 118.
- Another such wavelength band may be limited to wavelengths larger than 700 nm and be entirely located in the NIR range.
- the simultaneous illumination of the object with any of the discrete wavelength bands as described above may be accomplished by a light source 199, e.g. a tunable light source such as a light source comprising a plurality of LEDs in different colors, in particular in different primary colors, which is configured to generate light in these wavelength bands simultaneously.
- the wavelength bands may be generated by using an illumination filter 179 having multiple pass-bands, wherein the pass-bands preferably correspond to the above wavelength bands. If such an illumination filter 179 is used, the light source 199 may generate white-light which is then filtered by the illumination filter 179 so that only the light in the passbands illuminates the object 106.
- the illumination filter 179 may be provided depending on the at least one fluorophore, of which fluorescence is to be triggered, and its specific excitation spectrum. For example, if 5-ALA is used as a fluorophore, the illumination filter may have a transmission of 90 % to 98 % up to wavelengths of 425 nm, a transmission between 0.5 % and 0.7 % in wavelengths between 450 nm and 460 nm, a transmission of not more than 0.1 % between 460 nm and 535 nm and of practically zero for wavelengths above 535 nm.
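For illustration only, the transmission profile quoted above for a 5-ALA illumination filter can be written as a piecewise function. The midpoints of the quoted ranges are assumed, and the region between 425 nm and 450 nm, which the text does not specify, is returned as `None`:

```python
def ala_filter_transmission(wavelength_nm):
    """Approximate transmission of the 5-ALA illumination filter 179
    as quoted in the text (range midpoints assumed; 425-450 nm is
    not specified and yields None)."""
    if wavelength_nm <= 425:
        return 0.94    # quoted as 90 % to 98 %
    if wavelength_nm < 450:
        return None    # transition region, not specified in the text
    if wavelength_nm <= 460:
        return 0.006   # quoted as 0.5 % to 0.7 %
    if wavelength_nm <= 535:
        return 0.001   # quoted as not more than 0.1 %
    return 0.0         # practically zero above 535 nm
```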
- the illumination filter 179 may be configured for pass-through of NIR light.
- the illumination filter 179 may comprise a passband in the NIR.
- the illumination filter 179 may further comprise a passband, which is preferably entirely located in the fluorescence-excitation spectrum of another fluorophore.
- the medical observation device 100 may be adjusted to a different fluorophore or set of fluorophores by re-configuring the color-separation assembly 176, e.g. by exchanging its optical elements, such as the filter sets 188 and/or 190, or the dichroic beam splitter 192.
- the digital reflectance color camera 110 may be used to record at least part of the fluorescence emission signal and/or at least part of the autofluorescence signal if the respective wavelengths pass through the pass-bands of the observation filter set 188.
- the illumination of the object 106 should preferably not contain the fluorescence emission wavelengths, as the intensity of the fluorescence is often low, which may make the fluorescence signals harder to detect. The same holds mutatis mutandis for the reflectance signal and the at least one digital fluorescence-light camera 111, 111a.
- the input image data 122 are processed by a data processing device 170.
- the data processing device 170 may be an integral part of the medical observation device 100. In one example, it is a processor which is embedded in the medical observation device and also used as a controller for controlling the hardware of the medical observation device 100, such as the brightness and/or spectral emission of the light source 199 and/or any objective of the medical observation device 100 and/or any actuators of the medical observation device 100.
- the data processing device 170 is part of a general computer, which is connected to the medical observation device for unidirectional, or bidirectional data transfer by wire or wirelessly.
- the data processing device 170 may be a hardware module, such as a microprocessor, or a software module.
- the data processing device 170 may also be a combination of both a hardware module and a software module, for example by using software modules that are configured to be run on a specific processor, such as a vector processor, a floating point graphics processor, a parallel processor and/or on multiple processors.
- the data processing device 170 may be part of a general-purpose computer 186, such as a PC.
- the data processing device 170 is an embedded processor of the medical observation device 100.
- the data processing device 170 is configured to access the input image data 122, e.g. in the form of one or more digital input images 130, such as digital white-light color input image 114 and the digital fluorescence-light image 112.
- the data processing device 170 may be configured to retrieve the digital input images 130 from a memory 194 and/or directly from the cameras 110, 111 and if present 111a.
- the memory 194 may be part of the data processing device 170 or reside elsewhere in the medical observation device 100.
- the data processing device 170 is further configured to compute a digital fluorescence color output image 160 from the input image data 122, specifically from the autofluorescence signal and the fluorescence-emission signal contained in the input image data 122.
- the digital fluorescence color output image 160 is a color image, which is represented in a color space.
- the color space of the digital fluorescence color output image may be different from the color space of any digital color input image that is contained in the input image data 122.
- the color space of the digital fluorescence color output image 160 is the same color space as that of any of the digital color input images 130.
- the respective signal needs to be separated or extracted from the input image data 122.
- the digital processing device 170 is configured to separate or extract any signal of the group of signals containing the autofluorescence signal, the fluorescence emission signal and the reflectance signal from the remaining signals of the group.
- the digital processing device may comprise a separation or extraction routine 140 that may be stored in a memory 194 of the digital processing device 170.
- the separation or extraction routine 140 may comprise an unmixing routine 142 e.g. for spectrally unmixing the signals.
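As an illustrative sketch of what such an unmixing routine 142 might do, assuming a linear mixing model and known spectral signatures of the signals in the camera's color channels (the function name `unmix` and the least-squares approach are assumptions, not taken from the disclosure):

```python
import numpy as np

def unmix(pixels, signatures):
    """Linear spectral unmixing: solve pixels ~ abundances @ signatures
    for the per-pixel signal abundances by least squares.

    pixels:     (N, C) array, C color channels per pixel
    signatures: (S, C) array, one row per signal (e.g. autofluorescence,
                fluorescence emission, reflectance)
    returns:    (N, S) abundances, clipped to be non-negative
    """
    abundances, *_ = np.linalg.lstsq(signatures.T, pixels.T, rcond=None)
    return np.clip(abundances.T, 0.0, None)
```

Unmixing works best when the spectral signatures are as distinct as possible, which is why the text prefers input images recorded with minimal spectral overlap.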
- the extracted signals may be combined using a signal/image combination routine 144.
- the signal/image combination routine 144 may treat the extracted signals as images.
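One possible, purely illustrative form of such a signal/image combination routine 144 is a per-pixel weighted addition of the extracted signal images, with the weights e.g. derived from the user selection signal 164; the helper below is a hypothetical sketch for 8-bit images:

```python
import numpy as np

def combine_signals(signal_images, weights):
    """Combine extracted signal images into one output image by
    per-pixel weighted addition, clipping to the 8-bit range."""
    out = np.zeros_like(signal_images[0], dtype=float)
    for img, w in zip(signal_images, weights):
        out += w * img.astype(float)
    return np.clip(out, 0, 255).astype(np.uint8)
```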
- a linear transformation routine 146 which may comprise a transformation matrix 148, may be used to transform the colors of the extracted signals.
- the linear transformation routine 146 may be comprised by the signal/image combination routine 144.
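Illustratively, applying the transformation matrix 148 amounts to multiplying each pixel's color vector by the matrix; the sketch below assumes a 3-channel image and is not taken from the disclosure:

```python
import numpy as np

def transform_colors(image, matrix):
    """Apply a linear color transformation (cf. matrix 148) to every
    pixel of an (H, W, C) image: new_color = matrix @ old_color."""
    h, w, c = image.shape
    flat = image.reshape(-1, c).astype(float)
    return (flat @ matrix.T).reshape(h, w, c)
```

A pseudocolor assignment, e.g. remapping an extracted fluorescence signal to a single output primary, is one use of such a matrix.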
- routines 140 to 146 may be a software routine, a routine implemented in hardware, or a routine in which software and hardware components are combined.
- the medical observation device 100 may comprise a user input device 162, which, upon being operated by a user, may generate a user selection signal 164, which may be communicated to the digital processing device 170.
- the user input device 162 may e.g. be a physical button, dial, slide or lever, or a widget on a display that represents such a physical button, dial, slide or lever.
- the user may determine which signal or combination of signals extracted from the digital input images 130 is displayed and/or at which intensity each signal is displayed, either in absolute terms or relative to one or more other signals. These different modes of display are indicated by I, II, III etc.
- the digital fluorescence color output image 160 may be displayed on a display 132, which is integral with the medical observation device 100.
- the display 132 may be integrated in an ocular or eyepiece 104 of the medical observation device 100.
- the digital fluorescence color output image 160 is preferably generated in real-time, i.e. a digital fluorescence color output image 160 is generated from a set of digital color input images 130 before the next set is generated by the at least two cameras 110, 111 , 111a.
- the medical observation device 100 may comprise a direct optical path 134 from the object 106 through the objective 174 to the eyepiece 104.
- the display may be a translucent display 132 located in the direct optical path 134 or the display may be projected into the direct optical path 134.
- a beam splitter 136 may be provided to split the light between the optical eyepiece 104 and the digital imaging system 102. In one embodiment, up to 80 % of the light may be directed to the eyepiece 104.
- the medical observation device 100 may not have a direct optical path 134 but only display images from the integral display 132.
- the medical observation device may not have any display at all.
- the medical observation device 100 may comprise an output interface 172 to which one or more (external) displays 182 may be connected.
- the output interface 172 may comprise standardized connectors and data transmission protocols, such as USB, HDMI, DVI, Dis- playPort, Bluetooth and/or others.
- An external display may be a monitor, 3D goggles, oculars and the like. Any combination of external displays may be connected to output interface 172.
- the computer 186 and/or the data processing device 170 is connected to the digital imaging system 102 using one or more data transmission lines 196.
- a data transmission line may be wired or wireless, or partly wired and partly wireless.
- the computer 186 and/or the data processing device 170 may not be bodily integrated in the medical observation device 100 but be physically located remote from the digital imaging system 102.
- the digital imaging system 102 and the computer 186 and/or the data processing device 170 may be connected to a network, such as a LAN, a WLAN or a WAN, to which also at least one display 182 is connected.
- the medical observation device 100 may be stereoscopic but comprise only two cameras, one for each stereoscopic channel.
- in one stereoscopic channel, the fluorescence-light color camera 111 is used and configured to selectively record white-light reflectance, whereas in the other stereoscopic channel, the white-light color camera 110 is used.
- Such an arrangement provides a stereoscopic white-light color input image if no fluorescence is used, and a monoscopic white-light color input image and a monoscopic fluorescence-light color input image if fluorescence is used.
- the description above and below applies equally to this configuration.
- the input image data may, in one embodiment, contain at least one digital color input image 130 which contains both the autofluorescence signal and the fluorescence emission signal. This is the case if all fluorescence emitted by the object is recorded in the at least one digital color input image.
- the input image data 122 may contain at least one digital monochrome input image and at least one digital color input image.
- the digital monochrome input image contains either at least a part of the fluorescence emission signal or at least a part of the autofluorescence signal.
- the digital color input image may contain the fluorescence emission signal if the digital monochrome input image contains the autofluorescence signal, and it may contain a part of the autofluorescence signal if the digital monochrome input image contains the fluorescence emission signal.
- the input image data comprise a plurality of pixels.
- the pixels may be color pixels or monochrome pixels.
- a monochrome pixel only represents intensity, e.g. as a greyscale image.
- a color pixel comprises information about at least some of the color appearance parameters such as hue, lightness, brightness, chroma, colorfulness and saturation.
- Color pixels are recorded using color bands or, equivalently, color channels or primary colors of a color space using a digital color camera.
- a color space comprises at least three of such color channels. In the color space, each color channel is represented by a different color space coordinate. For conversion between different color spaces, color space transformations may be used. In a different color space, the same color is represented in different color space coordinates.
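For linear color spaces, such a color space transformation is itself a matrix multiplication of the color space coordinates. As a concrete, illustrative example (not part of the disclosure), a linear sRGB triple can be converted to CIE XYZ with the standard D65 matrix:

```python
import numpy as np

# standard linear-sRGB -> CIE XYZ conversion matrix (D65 white point)
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def srgb_to_xyz(rgb):
    """Convert a linear sRGB color (coordinates in [0, 1]) to XYZ."""
    return SRGB_TO_XYZ @ np.asarray(rgb, dtype=float)
```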
- Each pixel of the digital color input images 130 comprises a set of color space coordinates that together represent the color of the respective pixel.
- Each color band thus may be regarded as representing a color space axis and each color may be regarded as a point in color space, which is defined by the vector — i.e. the color space coordinates — pointing to this color.
- adding two colors corresponds to a vector addition. If one color has color space coordinates {x1, y1, z1} and a second color has color space coordinates {x2, y2, z2}, then the sum of these two colors corresponds to the color {x1+x2, y1+y2, z1+z2}.
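In code, this vector addition of two colors (illustrated here for 8-bit coordinates, clipping each sum at the channel maximum) is simply:

```python
def add_colors(c1, c2, maximum=255):
    """Add two colors component-wise as vectors in color space,
    clipping each coordinate at the channel maximum."""
    return tuple(min(a + b, maximum) for a, b in zip(c1, c2))
```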
- the digital color input images 130 or, more generally, the input image data 122 may be recorded in RGB color space using the three primary colors or color bands — or color space coordinates — R, G, B.
- the digital color input image 130 may be recorded in different color spaces, respectively, and/or represent multi-spectral or hyperspectral color input images.
- the digital input images 130 of a set of digital input images, such as the digital white-light color input image 114 and the digital fluorescence color input image 112 need not be recorded in the same color space, although this is preferred.
- in RGB color space, each color is represented by a triple of three color space coordinates in the form of integer numbers, wherein each integer number indicates the intensity of one of the primary colors R, G, B.
- the most intense red is indicated by the triple [255, 0, 0]
- the most intense green color is indicated by [0, 255, 0]
- the most intense blue by [0, 0, 255]
- RGB color space is a three-dimensional space
- CMYK color space would be a four-dimensional space.
- a color can be considered as a point in color space to which a vector such as [0, 0, 255] points.
- a multispectral or hyperspectral color space having n color bands would correspondingly result in an n-dimensional color space, in which each color is represented by an n-tuple of color space coordinates.
- Fig. 2 gives an example how a digital fluorescence color output image 160 may be generated from the input image data 122.
- the input image data 122 may comprise two digital input images 130, any of which may be a color image or a monochrome image.
- Each of the digital input images 130 comprises or consists of input pixels 230. If the respective digital input image 130 is a color image, its input pixels 230 will be color pixels. If the respective digital input image 130 is a monochrome image, the respective input pixels 230 will be monochrome pixels.
- a first digital color input image 212 of the input image data 122 may correspond to the digital reflectance input image 114 mentioned above.
- the second digital input image 214 of the input image data 122 may correspond to the fluorescence-light input image 112.
- the first digital input image 212 comprises the autofluorescence signal 224.
- each input pixel 230 contains a contribution of the autofluorescence signal 224. This contribution may be zero at some pixels, where no autofluorescence signal could be received.
- the spectrum 240 indicates the intensity I of the light received in the first digital color input image 212 as a function of wavelength. It can be seen that the spectrum 250 of the autofluorescence signal 224, which corresponds to the autofluorescence emission spectrum, is at least partly contained in a passband 220, e.g., of the optical filter assembly 187.
- the passband 220 may be wider or narrower than the spectrum 250.
- if the first digital input image 212 is a monochrome image, it should contain only a single signal. In the case of a monochrome image, the first digital input image 212 may consist of the autofluorescence signal 224. If the first digital input image 212 is a color image, additional signals may be contained. In the case of a color image, the first digital input image 212 may, e.g., also comprise at least part of a reflectance signal 202 in addition to the autofluorescence signal 224. The spectrum 252 of the reflectance signal 202 is indicated qualitatively in the spectrum 240.
- the reflectance signal 202 and the autofluorescence signal 224 may overlap.
- Such an input pixel 230 contains both (a local part of the) reflectance signal 202 and (a local part of) the autofluorescence signal 224.
- the two signals 202, 224 may be separated from one another if the input pixel 230 is a color pixel.
- the reflectance signal 202 may, e.g., be generated by illuminating the object 106 with light having wavelengths in a fluorescence excitation spectrum of another fluorophore 118, 120 of which fluorescence is to be triggered. A part of such an illumination spectrum may overlap the passband 220.
- the spectrum 240 of the first digital input image may of course comprise additional components at other wavelengths, preferably outside the passband 220.
- the first digital input image 130 includes only the autofluorescence signal 224, an extraction of the autofluorescence signal 224 is not necessary. In this case, the first digital input image 130 represents already the autofluorescence signal 224.
- the autofluorescence signal 224 must be separated or extracted if it is to be processed separately. Such an extraction or separation may be done using spectral unmixing in particular by the data processing device 170.
- the autofluorescence signal 224 may, e.g., be colored by a color 218.
- the intensity of the color 218 may depend on the intensity of the extracted autofluorescence signal 224 at an input pixel 230. If the first digital input image 130 is a color image, the color of the extracted autofluorescence signal 290 may be kept but the intensity and/or brightness may be normalized.
- the color 218 may be a pseudocolor, a false color or a natural color.
- with a pseudocolor, a different color is assigned to a pixel depending on the intensity of the autofluorescence signal 224 at this pixel.
- the brightness or intensity of this color may depend on the intensity of the autofluorescence signal 224 at this pixel.
- a false color is a color which is different from the color of the fluorescence emitted by the autofluorescing fluorophore as perceived by the human eye.
- a natural color corresponds to the color of the fluorescence emitted by the autofluorescing fluorophore as perceived by the human eye.
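The distinction between the coloring schemes above may be sketched as follows (illustrative only; the function names, the tiny pseudocolor palette and the normalization of the signal to [0, 1] are assumptions, not taken from the disclosure):

```python
import numpy as np

# `signal` is a normalized intensity map (values in [0, 1]) of an
# extracted signal such as the autofluorescence signal.

def color_false(signal, hue_rgb=(0, 255, 0)):
    """False/natural color: fixed hue, brightness follows signal intensity."""
    s = np.asarray(signal, dtype=float)[..., None]          # (H, W, 1)
    return (s * np.asarray(hue_rgb, dtype=float)).astype(np.uint8)

def color_pseudo(signal, palette=None):
    """Pseudocolor: a different hue is assigned per intensity level."""
    if palette is None:  # illustrative 4-entry palette, dark blue -> red
        palette = np.array([[0, 0, 64], [0, 128, 128],
                            [128, 128, 0], [255, 0, 0]], dtype=np.uint8)
    idx = np.clip((np.asarray(signal) * len(palette)).astype(int),
                  0, len(palette) - 1)
    return palette[idx]

img = np.array([[0.0, 0.5], [0.9, 1.0]])      # toy intensity map
green = color_false(img)    # full-intensity pixel becomes pure green
pseudo = color_pseudo(img)  # full-intensity pixel becomes the palette's red
```

Whether the fixed hue counts as a false or a natural color then depends only on whether it matches the fluorophore's emission as perceived by the human eye.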
- the fluorescence emission signal 204 is contained in the example of Fig. 2. Again, additional signals such as part of the autofluorescence signal 224 and/or part of the reflectance signal 202 may be contained in the second digital input image 214. As with the first digital input image 212 it is preferred that the second digital input image 214 is a color image to be able to separate the fluorescence emission signal 204 from any other signals contained in the second digital input image 214. If the second digital input image 214 is a monochrome image, it should contain only the fluorescence emission signal 204.
- a sample spectrum of the light recorded by the second digital input image 214 is indicated at reference number 242.
- the second digital input image 214 records the light that passes through the passbands 220 of the optical filter assembly 187. As shown just by way of example, some of the autofluorescence signal 224 may leak into the second digital input image 214 if the autofluorescence emission spectrum 250 overlaps the passband 220 which contains the fluorescence emission spectrum 254 of at least one fluorophore 116, 118, such as 5-ALA and/or ICG.
- the second digital input image 214 may correspond to the fluorescence-light input image 114.
- the object is illuminated in an additional narrow band to provide additional information.
- the object 106 may be illuminated in the NIR range 272 outside the visible light range 270. If the second digital input image 214 only contains the fluorescence emission signal 204, no extraction or separation of this signal is necessary. If, however, additional signals such as the autofluorescence signal 224 and/or the reflectance signal 202, or parts thereof, are contained in the second digital image 214, an extraction or separation of the fluorescence emission signal 204 is necessary, to be able to process it separately.
- a separation of the fluorescence emission signal 204 from the second digital input image 130 may, e.g., be done using spectral unmixing. The spectral unmixing can be performed by the data processing device 170 described above.
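The disclosure names spectral unmixing but does not prescribe an algorithm; one common approach is a per-pixel linear least-squares inversion against known endmember spectra. The following sketch uses made-up channel responses and is only an assumption of how the separation might be computed:

```python
import numpy as np

# Each RGB pixel is modeled as a linear mixture of known endmember
# spectra and inverted per pixel by least squares. The channel
# responses below are made-up illustrative values.
endmembers = np.array([[0.8, 0.1, 0.9],   # R response of each pure signal
                       [0.1, 0.8, 0.1],   # G response
                       [0.1, 0.1, 0.0]])  # B response
# columns: reflectance, autofluorescence, fluorescence emission

def unmix(pixel_rgb):
    """Estimate per-signal abundances for one RGB pixel (clipped to >= 0)."""
    abundances, *_ = np.linalg.lstsq(np.asarray(endmembers, float),
                                     np.asarray(pixel_rgb, float),
                                     rcond=None)
    return np.clip(abundances, 0.0, None)

mixed = endmembers @ np.array([0.2, 0.5, 0.3])  # synthesize a mixed pixel
print(np.round(unmix(mixed), 3))                # recovers the abundances
```

In practice the endmember spectra would come from calibration of the camera and the fluorophores involved.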
- the extracted fluorescence emission signal 280 preferably is colored using a color 216.
- the intensity and/or brightness or any other color parameter of the color 216 may depend on the intensity of the extracted fluorescence emission signal 280 at an input pixel 230. If the extracted fluorescence emission signal 280 is already a color image, its color may also be maintained. As described above in the context of the color 218, the color 216 may be a pseudocolor, a false color or a natural color.
- the digital fluorescence color output image 160 is obtained by combining the (if necessary) extracted, colored and preferably also normalized autofluorescence signal 290 with the (if necessary) extracted, colored and preferably normalized fluorescence emission signal 280.
- Each output pixel 232 of the digital fluorescence color output image 160 contains a combination of the autofluorescence signal 224 as contained in the corresponding input pixel 230 of the first digital input image 212 and the fluorescence emission signal 204 at the corresponding input pixel 230 of the second digital image 214.
- An overlap 222 may exist in the digital fluorescence color output image 160, where both the autofluorescence signal and the fluorescence emission signal overlap.
- Fig. 3 shows an embodiment, where the input image data 122 comprise only a single digital color input image 130, which, for example, may be a digital fluorescence-light color input image 112.
- the digital color input image 130 contains both the autofluorescence signal 224 and the fluorescence emission signal 204.
- the digital fluorescence input image 130 may contain a reflectance signal 202.
- a sample spectrum of such a digital color input image 130 is shown at reference numeral 300.
- the autofluorescence signal 224 and the fluorescence emission signal 204 need to be extracted at every input pixel 230, resulting in an extracted autofluorescence signal 290 and an extracted fluorescence emission signal 280.
- the further procedure may be as described with reference to Figure 2.
- the extracted autofluorescence signal 290 and the extracted fluorescence emission signal 280 can be combined using different colors 216, 218 to make them more distinguishable to a user.
- the digital fluorescence color output image 160 is represented in RGB color space having three color bands R, G, B, where the color of a pixel is accordingly represented by three color space coordinates [r, g, b].
- the fluorescence emission signal 280 and the autofluorescence signal 290 may be assigned to different color channels of a color space respectively.
- the extracted autofluorescence signal 290 may be assigned the green color channel G
- the extracted fluorescence emission signal 280 is assigned the red channel R.
- the blue color channel B may not be assigned any signal and thus be set, for example, to zero.
- the color bands correspond at least roughly to the hues of the autofluorescence signal and the fluorescence emission signal, respectively. This may be the case if the fluorescence emission of the respective fluorophores 116, 118, 120 has a hue which corresponds to the color of a color band according to human color vision.
- the fluorescence emission of 5-ALA has a red hue.
- the extracted fluorescence emission signal 280 if representing the fluorescence of 5-ALA may be assigned the red channel R.
- the autofluorescence signal 290 may be assigned the green channel, especially if the autofluorescence emission is perceived as greenish by human perception.
- the color space coordinates of two corresponding pixels 230 of the extracted autofluorescence signal 290 and the extracted fluorescence emission signal 280 may simply be added to obtain the color space coordinates of the corresponding output pixel 232 of the digital fluorescence color output image 160.
- a pixel 230 of the extracted autofluorescence signal 290 has color space coordinates {r1, g1, b1}
- the corresponding pixel 230 of the extracted fluorescence emission signal 280 has color space coordinates {r2, g2, b2}
- vector-adding these color space coordinates results in the color space coordinates {r1 + r2, g1 + g2, b1 + b2}.
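The channel assignment described above may be sketched as follows (the 8-bit signal maps and array values are illustrative assumptions):

```python
import numpy as np

# The extracted autofluorescence signal 290 feeds the green channel,
# the extracted fluorescence emission signal 280 the red channel, and
# the blue channel is set to zero, as in the example above.
def combine_to_output(autofluorescence, emission):
    """Stack two monochrome signal maps into one RGB output image."""
    af = np.asarray(autofluorescence, dtype=np.uint8)
    em = np.asarray(emission, dtype=np.uint8)
    return np.stack([em, af, np.zeros_like(af)], axis=-1)  # R, G, B

af = np.array([[0, 200], [50, 255]])
em = np.array([[100, 0], [50, 255]])
out = combine_to_output(af, em)  # out[1, 1] holds both signals: yellowish
```

Because each signal occupies its own channel, the vector addition of the two colored signals reduces to stacking the maps.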
- yj may represent any color space and even multispectral or hyperspectral color coordinates.
- the extracted autofluorescence signal 290 may be monochrome or color.
- a linear transformation 146 may be applied to the color space coordinates xi and yj.
- the linear transformation 146 may comprise a transformation matrix, which is indicated in Fig. 1 at the reference numeral 148 as part of the data processing device 170.
- the linear transformation matrix is multiplied by a vector {xi, yj} consisting of the coordinates xi and yj.
- the input vector is of the form {x1, ..., xM, y1, ..., yN} and therefore has dimension (M + N).
- the linear transformation matrix has dimension (M + N) in one direction and K in the other direction, i.e., it maps the (M + N) input coordinates to K output color space coordinates.
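One possible reading of the linear transformation 146 is a K × (M + N) matrix applied to the stacked coordinates of two corresponding pixels; the matrix entries below are arbitrary illustrative values, not values from the disclosure:

```python
import numpy as np

# Two RGB pixels in (M = N = 3), one RGB pixel out (K = 3): the matrix
# routes the emission pixel's red coordinate to the output R channel and
# the autofluorescence pixel's green coordinate to the output G channel.
M, N, K = 3, 3, 3
T = np.zeros((K, M + N))
T[0, 3:6] = [1.0, 0.0, 0.0]    # output R taken from the emission pixel
T[1, 0:3] = [0.0, 1.0, 0.0]    # output G taken from the autofluorescence
                               # output B row stays all zero

x = np.array([10.0, 200.0, 30.0])   # autofluorescence pixel (x1..xM)
y = np.array([180.0, 20.0, 5.0])    # fluorescence emission pixel (y1..yN)
out = T @ np.concatenate([x, y])    # R = 180, G = 200, B = 0
```

With other matrix entries, the same multiplication can express mixing, color changes or unmixing in one step.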
- Assigning different colors 216, 218 and using the combination schemes shown in Figs. 5 to 6 allows the spectra of the autofluorescence signal and the fluorescence emission signal to be spread over a wider range of colors, thus offering a better visualization of output pixels 232 that contain both the autofluorescence signal 224 and the fluorescence emission signal 204.
- Fig. 7 presents an overview of the steps that may be used for generating the digital fluorescence color output image 160.
- In a first step 700, the input image data 122 are recorded, retrieved or accessed. This may comprise recording one or more digital input images 130 with a corresponding number of cameras. For example, at step 702 a first digital input image 130 may be recorded, at step 704 a second digital input image 130 may be recorded and at step 706 a third digital input image 130 may be recorded. The steps 702, 704, 706 may occur simultaneously or shortly after one another.
- In a step 708, the autofluorescence signal is separated from the input image data or the respective digital input image 130.
- This step may, e.g., be carried out by the extraction routine 144.
- the extracted fluorescence emission signal 280 and the extracted autofluorescence signal 290 are obtained.
- Step 708 is not necessary, if a digital input image 130 consists of a signal 204, 224 that otherwise would have been needed to be extracted. In such a case, the digital input image 130 may be used as the signal to be extracted.
- In a step 710, intermediate processing may be carried out on the extracted fluorescence emission signal 280.
- Intermediate processing may comprise normalizing the extracted fluorescence emission signal.
- Intermediate processing 710 may alternatively or cumulatively comprise assigning a color to the extracted fluorescence emission signal.
- In a step 712, intermediate processing may be carried out on the extracted autofluorescence signal 290.
- Step 712 may correspond to step 710 described above.
- In a step 714, the extracted autofluorescence signal 290 may be combined with the extracted fluorescence emission signal 280 to generate the digital fluorescence color output image 160.
- the step 714 may, e.g., be carried out by the signal/image combination routine 144.
- the step 714 may comprise the step of coloring the extracted autofluorescence signal 290 in a different color than the extracted fluorescence emission signal 280.
- the digital fluorescence color output image 160 is displayed, e.g., on at least one of the displays 132, 182.
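The intermediate-processing and combination steps above may be sketched as a compact pipeline (hues, array values and function names are illustrative assumptions, not the routines of the disclosure):

```python
import numpy as np

# Normalize each extracted signal, color it in a fixed hue, then combine
# by pixel-wise vector addition, mirroring steps 710, 712 and 714.
RED, GREEN = np.array([255, 0, 0]), np.array([0, 255, 0])

def normalize(signal):
    """Scale a monochrome signal map to [0, 1]."""
    s = np.asarray(signal, dtype=float)
    peak = s.max()
    return s / peak if peak > 0 else s

def assign_color(signal, color):
    """Color a normalized signal map in a fixed hue."""
    return (signal[..., None] * color).astype(np.uint8)

def combine(a, b):
    """Pixel-wise vector addition of two colored signals, clipped to 8 bit."""
    return np.clip(a.astype(int) + b.astype(int), 0, 255).astype(np.uint8)

emission = np.array([[0, 128], [64, 255]])   # extracted emission signal
autofl = np.array([[255, 0], [64, 255]])     # extracted autofluorescence
output = combine(assign_color(normalize(emission), RED),
                 assign_color(normalize(autofl), GREEN))
# output[1, 1] contains both signals and appears yellowish
```

A pixel where only one signal is present keeps that signal's hue, while overlap pixels mix toward yellow, which is what makes the two signals distinguishable.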
- a microscope comprising a system as described in connection with one or more of the Figs. 1 to 7.
- a microscope may be part of or connected to a system as described in connection with one or more of the Figs. 1 to 7.
- Fig. 8 shows a schematic illustration of a system 800 configured to perform a method described herein.
- the system 800 comprises a microscope 810 and a computer system 820.
- the microscope 810 is configured to take images and is connected to the computer system 820.
- the computer system 820 is configured to execute at least a part of a method described herein.
- the computer system 820 may be configured to execute a machine learning algorithm.
- the computer system 820 and microscope 810 may be separate entities but can also be integrated together in one common housing.
- the computer system 820 may be part of a central processing system of the microscope 810 and/or the computer system 820 may be part of a subcomponent of the microscope 810, such as a sensor, an actor, a camera or an illumination unit, etc. of the microscope 810.
- the computer system 820 may be a local computer device (e.g. personal computer, laptop, tablet computer or mobile phone) with one or more processors and one or more storage devices or may be a distributed computer system (e.g. a cloud computing system with one or more processors and one or more storage devices distributed at various locations, for example, at a local client and/or one or more remote server farms and/or data centers).
- the computer system 820 may comprise any circuit or combination of circuits.
- the computer system 820 may include one or more processors which can be of any type.
- processor may mean any type of computational circuit, such as but not limited to a microprocessor, a microcontroller, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a graphics processor, a digital signal processor (DSP), multiple core processor, a field programmable gate array (FPGA), for example, of a microscope or a microscope component (e.g. camera) or any other type of processor or processing circuit.
- the computer system 820 may include one or more storage devices, which may include one or more memory elements suitable to the particular application, such as a main memory in the form of random access memory (RAM), one or more hard drives, and/or one or more drives that handle removable media such as compact disks (CD), flash memory cards, digital video disk (DVD), and the like.
- the computer system 820 may also include a display device, one or more speakers, and a keyboard and/or controller, which can include a mouse, trackball, touch screen, voice-recognition device, or any other device that permits a system user to input information into and receive information from the computer system 820.
- Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a processor, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.
- embodiments of the invention can be implemented in hardware or in software.
- the implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable.
- Some embodiments according to the invention comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.
- embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer.
- the program code may, for example, be stored on a machine readable carrier.
- Other embodiments comprise the computer program for performing one of the methods described herein, stored on a machine readable carrier.
- an embodiment of the present invention is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.
- a further embodiment of the present invention is, therefore, a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing one of the methods described herein when it is performed by a processor.
- the data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitory.
- a further embodiment of the present invention is an apparatus as described herein comprising a processor and the storage medium.
- a further embodiment of the invention is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein.
- the data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
- a further embodiment comprises a processing means, for example, a computer or a programmable logic device, configured to, or adapted to, perform one of the methods described herein.
- a further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein.
- a further embodiment according to the invention comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver.
- the receiver may, for example, be a computer, a mobile device, a memory device or the like.
- the apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.
- a programmable logic device (for example, a field programmable gate array) may be used to perform some or all of the functionalities of the methods described herein.
- a field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein.
- the methods are preferably performed by any hardware apparatus.
Abstract
The invention relates to a data processing device (170) for a medical observation device (100), such as a microscope or an endoscope, for observing an object (106). The data processing device (170) is configured to access input image data (120). The input image data contain an autofluorescence signal (224), the autofluorescence signal being representative of fluorescence emitted by a fluorophore (116) naturally contained in the object, and a fluorescence emission signal (204), the fluorescence emission signal being representative of fluorescence emitted by a fluorophore (118) artificially added to the object. Finally, the data processing device (170) is configured to generate a digital fluorescence color output image (160) from a combination of the fluorescence emission signal colored in a first color (308) and the autofluorescence signal colored in a second color (310), the second color being different from the first color.
Description
Data Processing Device and Computer Implemented Invention for a Medical Observation Device, for Visualization of an Autofluorescence Signal and a Fluorescence Emission Signal
The invention relates to a data processing device and a computer-implemented method for a medical observation device, such as a microscope or an endoscope, for observing an object. The invention also relates to a medical observation device comprising such a data processing device, and to a method for using a medical observation device, comprising the computer-implemented method.
Medical observation devices, such as microscopes or endoscopes, are often used in connection with fluorophores.
One or more fluorophores may be added artificially to the object to mark specific parts of the object. For example, in the case of medical observation devices, one or more fluorophores may be injected into a patient’s body.
Some of the artificially added fluorophores attach to some specific part of the object, e.g. to certain antibodies or (bio)chemical substances. For example, the fluorophore 5-ALA is used to mark tumors. In another example, the fluorophore may be used to mark pathways along which the fluorophore is transported. For example, ICG may be used to highlight blood flow.
The object observed by the medical observation device may further contain naturally one or more fluorophores. The fluorescence of these naturally occurring fluorophores is called autofluorescence. Autofluorescence may be used as an additional diagnostic tool but at the same time may obscure the fluorescence emission of the artificially added fluorophores, which makes a simultaneous usage of both autofluorescence and fluorescence emission difficult.
It is therefore an object of the invention to provide a device and a method which allow both autofluorescence and the fluorescence emission from artificially added fluorophores to be used more efficiently.
This object is solved by a data processing device for a medical observation device, such as a microscope or an endoscope, for observing an object, wherein the data processing device is configured: to access input image data, the input image data containing an autofluorescence signal, the autofluorescence signal being representative of fluorescence emitted by a fluorophore naturally contained in the object, and a fluorescence emission signal, the fluorescence emission signal being representative of fluorescence emitted by a fluorophore artificially added to the object; and to generate a digital fluorescence color output image from a combination of the
fluorescence emission signal colored in a first color and the autofluorescence signal colored in a second color, the second color being different from the first color.
The above object is also achieved by a computer-implemented method for processing input image data in a medical observation device, such as a microscope or endoscope, the method comprising the steps of: accessing input image data containing an autofluorescence signal of an object, the autofluorescence signal being representative of fluorescence emitted by a fluorophore naturally contained in the object, and a fluorescence emission signal of the object, the fluorescence emission signal being representative of fluorescence emitted by a fluorophore artificially added to the object; generating a digital fluorescence color output image from a combination of the fluorescence emission signal colored in a first color and the autofluorescence signal colored in a second color, the second color being different from the first color.
Both the above data processing device and the above computer-implemented method make it easier to differentiate between the autofluorescence signal and the fluorescence emission signal, as the two are colored differently.
The data processing device and the computer-implemented method may be further improved by one or more of the following features, which can be combined independently of one another. Any of the features described in the following can further be independently used for improving the data processing device and/or for improving the computer-implemented method, even if the respective feature is only described in the context of the data processing device or in the context of the computer-implemented method.
For example, the first and/or the second color may be a pseudocolor, a false color or a natural color. If the first and/or the second color is a pseudocolor, different intensities of the autofluorescence signal and/or the fluorescence emission signal will be assigned different colors. If the first and/or second color is a false color or a natural color, the intensity or brightness of the color will depend on the intensity of the autofluorescence signal and the fluorescence emission signal, respectively. The hue of the first and/or second color is, in such a case, preferably independent of the intensity of the autofluorescence signal and/or the fluorescence emission signal, respectively. If the first and/or second color is a false color, it will be different from the color of the fluorescence signal and/or the autofluorescence signal as it is perceived by human color vision. If the first and/or second color is a natural color, it will at least closely correspond to the color of the autofluorescence signal and/or the fluorescence emission signal as it is perceived by human color vision.
In another embodiment, the distance between the first and the second color in the color space of the fluorescence color output image may be larger than the distance between the natural
color of the autofluorescence signal and the natural color of the fluorescence emission signal. This facilitates visual differentiation between these two signals. According to another aspect, the first color may have the same hue as the natural color of the fluorescence emission signal and/or the second color may have the same hue as the natural color of the autofluorescence signal.
The input image data may be contained in or comprise one or more digital input images. Each of the digital input images may contain a plurality of input pixels. If more than one digital input image is contained in the input image data, these digital input images may have the same or a different number of pixels, be represented in the same or a different color space, and/or have the same or a different image aspect ratio.
Throughout this text, if a digital input image is a color image, it is termed a “digital color input image”. A multispectral or hyperspectral image is considered as a color image. If a digital input image is a monochrome image, it is termed “digital monochrome input image”. If, in a specific context, it does not matter whether the digital input image is a color or a monochrome image, the generic term “digital input image” is used. A “digital input image” may thus be a “digital color input image” or a “digital monochrome input image”.
In a digital color input image, each pixel or more specifically input pixel may be a color pixel. In the digital fluorescence color output image, each pixel or more specifically output pixel may be a color pixel. A color pixel contains a set of color space coordinates which, at least in combination, represent at least some of the color appearance parameters such as hue, lightness, brightness, chroma, colorfulness, etc. A monochrome image may represent only light intensity values, e.g., on a grey-scale.
In one specific example, the input image data may comprise at least one of: at least one digital color input image; at least one digital color input image and at least one digital monochrome input image; and at least two digital monochrome input images of the object. These are the respective minimum sets of digital input images that may contain both an autofluorescence signal and a fluorescence emission signal that can be separated from one another reliably.
A digital color input image may contain at least one of the autofluorescence signal and the fluorescence emission signal, or at least part of the autofluorescence signal and/or at least part of the fluorescence emission signal. If a digital color input image comprises more than one signal, these signals may be separated from one another by spectral unmixing.
A digital monochrome input image may contain only one of the autofluorescence signal and
the fluorescence emission signal. As a monochrome image only contains light intensity information, it is difficult to separate two signals contained in a single monochrome image. Thus, if two digital monochrome input images are comprised by the input image data, one of the two digital monochrome input images may contain or preferably consist of the autofluorescence signal while the other one of the two digital monochrome input images may contain or preferably consist of the fluorescence emission signal. In such a configuration, the autofluorescence signal and the fluorescence emission signal can be separated without much effort from one another as they are already contained in two different digital input images. Of course, it is also possible that one digital monochrome input image contains the sum of the autofluorescence signal and the fluorescence emission signal and the other digital monochrome input image contains only one of the autofluorescence signal and the fluorescence emission signal. In such a configuration, the subtraction of the two digital monochrome input images yields the signal that is not contained in the digital monochrome input image containing only a single signal.
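The subtraction scheme for two digital monochrome input images may be sketched as follows (the array values and the clip to non-negative intensities are illustrative assumptions):

```python
import numpy as np

# One digital monochrome input image contains the sum of both signals,
# the other contains only one of them; subtracting the two recovers the
# signal that has no input image of its own.
def separate_by_subtraction(sum_image, single_signal_image):
    """Recover the signal not isolated in its own input image."""
    diff = np.asarray(sum_image, int) - np.asarray(single_signal_image, int)
    return np.clip(diff, 0, None)   # intensities cannot be negative

both = np.array([[120, 30], [200, 0]])      # autofluorescence + emission
emission = np.array([[100, 30], [50, 0]])   # fluorescence emission alone
autofluorescence = separate_by_subtraction(both, emission)
```

This only works because monochrome pixels carry pure intensity values, so the two contributions add linearly at each pixel.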
The input image data may also contain a reflectance signal, which is representative of light reflected off the object. The reflectance signal, or part thereof, may in particular be contained in a digital color input image that contains at least one of the fluorescence emission signal and the autofluorescence signal. For distinguishing between the autofluorescence signal and the fluorescence emission signal, the reflectance signal may not be needed. Therefore it is an advantage if the data processing device is configured to separate any one, or both, of the autofluorescence signal and the fluorescence emission signal from the reflectance signal.
If a digital input image contains more than one signal, such as any combination or subset of the group containing the autofluorescence signal, the fluorescence emission signal and the reflectance signal, contributions of these signals may be contained at any one input pixel.
If the autofluorescence signal is to be processed separately from the fluorescence emission signal, both signals need to be separated or extracted from one another if they, or parts thereof, are contained in a single digital input image. This means that at each input pixel, the respective contributions of the signals contained in the respective digital input image need to be calculated. As stated above, one method to do this is to use spectral unmixing. Another way to do this is to apply a linear transformation to the input pixel or input pixels containing the signals that need to be separated from one another. The linear transformation may comprise applying a transformation matrix to the color space coordinates of the at least one input pixel, which may be a color or a monochrome input pixel.
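As a sketch of such a linear separation at a single pixel, the following assumes a two-channel pixel and a known 2x2 mixing matrix; the mixing coefficients are made-up illustrative spectral signatures, not values from this disclosure:

```python
# Sketch: separating two signals at one pixel by a linear transformation,
# i.e. applying the inverse of a known mixing matrix to the pixel's
# color space coordinates.

def unmix_2x2(pixel, m):
    """Invert a 2x2 mixing matrix m and apply it to a 2-channel pixel."""
    (a, b), (c, d) = m
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))
    return tuple(inv[i][0] * pixel[0] + inv[i][1] * pixel[1] for i in range(2))

# Channel values = mixing matrix applied to (fluorescence, autofluorescence).
mixing = ((0.8, 0.3),   # channel 1 response to each signal (illustrative)
          (0.2, 0.7))   # channel 2 response to each signal (illustrative)
fl, af = 0.5, 0.25
pixel = (0.8 * fl + 0.3 * af, 0.2 * fl + 0.7 * af)

recovered = unmix_2x2(pixel, mixing)  # approximately (0.5, 0.25)
```

In practice more channels than signals are typically available and a least-squares spectral unmixing would be used instead of an exact inverse.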
The term extracting is considered synonymous with isolating and separating.
The data processing device may be configured for spectral unmixing of the input image data,
and/or any digital input image contained in the input image data. The data processing device may also be configured to apply a linear transformation to the input image data, or any digital input image contained therein. This may be realized in that the data processing device comprises an extraction or separation routine which is configured to separate or extract the autofluorescence signal and/or to extract or separate the fluorescence emission signal. The extraction or separation routine may comprise a linear transformation, in particular a transformation matrix. The extraction or separation routine may, independently thereof, comprise a spectral unmixing routine.
If the input image data comprise more than one digital input image, the two or more digital input images are preferably registered with respect to one another. They preferably have an identical field of view. The cameras which are used for recording the different digital input images preferably have identical fields of view and/or coaxial optical axes. In this context it is even more beneficial if the two or more digital input images have the same number of pixels and the same aspect ratio, so that there is a high correlation between the location of the input pixels in their respective digital input image and the location on the object represented by the input pixels.
Having registered digital input images facilitates their joint processing, in particular on a pixel-by-pixel basis, as the registered digital input images have corresponding input pixels, i.e. input pixels that represent the same location of the object although they are located in different digital input images.
If the digital input images are unregistered and/or have a different number of pixels and/or aspect ratio, corresponding pixels may be computed using a feature analysis. One way to identify corresponding pixels is described, e.g., in US 6,711,293 B. Alternatively, a scale-invariant feature transform or the SURF algorithm from ETH Zurich may be used.
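For the simpler case where the images are registered but merely differ in resolution, a corresponding pixel can be found by coordinate scaling. The sketch below illustrates only this simplified case and is a stand-in for, not an implementation of, the feature-based registration mentioned above:

```python
# Simplified sketch: mapping a pixel position between two registered images
# of the same field of view that differ only in resolution. Truly unregistered
# images would require feature-based registration (e.g. SIFT/SURF) instead.

def corresponding_pixel(x, y, src_size, dst_size):
    """Map pixel (x, y) in a src_size image to the nearest pixel in a dst_size image."""
    sx = dst_size[0] / src_size[0]
    sy = dst_size[1] / src_size[1]
    return (min(int(x * sx), dst_size[0] - 1),
            min(int(y * sy), dst_size[1] - 1))

# A pixel in a 1920x1080 image mapped into a 960x540 image of the same view.
print(corresponding_pixel(100, 50, (1920, 1080), (960, 540)))  # -> (50, 25)
```

The `min(..., dst_size - 1)` clamp keeps the result inside the destination image even at the right and bottom edges.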
Both the fluorescence emission signal and the autofluorescence signal represent a color or monochrome image of the object in the respective fluorescence emission spectrum. For example, the fluorescence emission signal represents an image of the object in the fluorescence emission spectrum of the at least one fluorophore that has been artificially added to the object. The autofluorescence signal corresponds to an image of the object in at least part of the fluorescence emission spectrum of the fluorophore that is naturally contained in the object. The autofluorescence signal and/or the fluorescence emission signal may, according to another aspect, contain only part of the respective fluorescence emission spectrum. The spectrum that is recorded in the autofluorescence signal and/or the fluorescence emission signal may be determined by optical filters before the cameras that are used to record them.
In one example of a digital monochrome input image consisting of or comprising only the autofluorescence signal, the input pixels represent an intensity of the autofluorescence signal at a location of the object which corresponds to the input pixel.
In another independent example of a digital monochrome input image consisting of or comprising only the fluorescence emission signal, the input pixels represent an intensity of the fluorescence emission signal at a location of the object which corresponds to the location of the input pixel.
In an example of a digital color input image consisting of or comprising only the autofluorescence signal, an input pixel represents at least part of the color appearance parameters of the autofluorescence signal as emitted from a location of the object which corresponds to the input pixel. In a digital color input image consisting of or comprising only the fluorescence emission signal, an input pixel represents at least some color appearance parameters of the fluorescence emission signal as emitted from a location of the object which corresponds to the input pixel.
In an example of a digital color input image comprising both the fluorescence emission signal and the autofluorescence signal, an input pixel represents a mixture of at least some color appearance parameters of the fluorescence emission signal and the autofluorescence signal as emitted by a single location of the object which corresponds to the location of the input pixel.
The intensity of the autofluorescence signal and/or of the fluorescence emission signal may be zero at an input pixel which represents a location in the object where no autofluorescence signal and/or no fluorescence emission signal is emitted, or where the emitted fluorescence signal and/or the emitted autofluorescence signal is too weak to be recorded.
In the digital fluorescence color output image, each output pixel may be formed from a combination of the fluorescence emission signal at a corresponding input pixel of the digital input image containing the fluorescence emission signal colored in the first color and the autofluorescence signal at a corresponding pixel of the digital input image containing the autofluorescence signal colored in the second color. As indicated, the input pixels and the output pixel that is generated from the input pixels are corresponding pixels.
For example, an intensity of the first color at an output pixel of the digital fluorescence color output image may depend on the preferably normalized intensity of the fluorescence emission signal at a corresponding input pixel of the digital input image containing the fluorescence emission signal. Likewise, an intensity of the second color at an output pixel of the digital fluorescence color output image may depend on the preferably normalized intensity of the autofluorescence signal at a corresponding input pixel of the digital input image containing the autofluorescence signal.
Coloring the fluorescence emission signal in the first color may comprise assigning to an output pixel an intensity of the first color which depends on the intensity of the fluorescence emission signal at the corresponding input pixel in the input image data.
Coloring the autofluorescence signal in the second color may comprise assigning to an output pixel an intensity of the second color which depends on the intensity of the autofluorescence signal at a corresponding input pixel.
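Such a coloring step can be sketched as scaling a fixed base color by the normalized signal intensity; 8-bit RGB and the function name are assumptions for illustration:

```python
# Sketch: coloring a normalized signal intensity (0..1) in a fixed 8-bit
# RGB color by scaling each color component with the intensity.

def color_signal(intensity, base_color):
    """Scale an 8-bit RGB base color by a normalized signal intensity."""
    return tuple(int(intensity * c) for c in base_color)

FIRST_COLOR = (255, 0, 0)    # e.g. the fluorescence emission signal
SECOND_COLOR = (0, 255, 0)   # e.g. the autofluorescence signal

print(color_signal(0.5, FIRST_COLOR))    # -> (127, 0, 0)
print(color_signal(0.25, SECOND_COLOR))  # -> (0, 63, 0)
```

`int()` truncates (0.5 x 255 = 127.5 becomes 127), matching the {255, 0, 0} / {127, 0, 0} example given further below; a rounding convention would differ by at most one step.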
According to one embodiment, the input image data may comprise a digital color input image that contains the fluorescence emission signal and the autofluorescence signal, where the data processing device is configured to extract the fluorescence emission signal from the digital color input image prior to coloring the fluorescence emission signal in the first color; and extract the autofluorescence signal from the digital color input image prior to coloring the autofluorescence signal in the second color. As mentioned above, the extraction of the fluorescence emission signal and the extraction of the autofluorescence signal allows these two signals to be processed separately.
According to another embodiment, the data processing device may be configured to extract at least one of the autofluorescence signal and the fluorescence emission signal from the input image data by spectral unmixing. In the following, various embodiments are described for coloring the autofluorescence signal and the fluorescence emission signal.
According to one embodiment, the data processing device may be configured to generate the digital fluorescence color output image in a color space having a set of color space coordinates. According to one aspect, the color space may be a RGB color space. The first color may be represented by a first subset of the set of color space coordinates of the color space whereas the second color may be represented by a second subset of the set of color space coordinates of the color space. The intersection of the first subset and the second subset may be empty. The data processing device may then form the combination of the fluorescence emission signal colored in the first color and the autofluorescence signal colored in the second color by forming the union of the first subset and the second subset.
The intersection of the first subset and the second subset being empty means that a color space coordinate is either contained in only one of the first and the second subset or in none of the first and second subsets.
The first color in particular at a predetermined intensity of the fluorescence emission signal may correspond to predetermined first values of the color space coordinates of the first subset. If, for example, the intensity of the fluorescence emission signal is normalized, then an intensity of the fluorescence emission signal of one may be assigned a predetermined value of the first color of {255, 0, 0}. The value of the first color space coordinate at an output pixel may then be adjusted to the intensity of the fluorescence emission signal at an input pixel. For example, if an input pixel contains the fluorescence emission signal at an intensity of 0.5, the fluorescence emission signal at the output image will be colored as {127, 0, 0}.
For a preferably normalized autofluorescence signal having an intensity of 1, the predetermined second color {0, 255, 0} may be assigned in, e.g., RGB color space. For an intensity of 0.5 at an input pixel, the autofluorescence signal in the output pixel may be colored as {0, 127, 0}. For an input pixel which has a fluorescence emission signal of intensity 0.5 and an autofluorescence signal of intensity 0.25, the final color will then result from a combination of the two subsets, e.g., {127, 63, 0}.
Thus, the first subset of color space coordinates contains other color space coordinates than the second subset. The first subset and the second subset should of course not be empty.
An output pixel of the digital fluorescence color output image may be formed pixel by pixel as the union of the color space coordinates of the first color, whose intensity is preferably adjusted according to the intensity of the fluorescence emission signal at the input pixel corresponding to the output pixel, and the color space coordinates of the second color, whose intensity is preferably adjusted according to the intensity of the autofluorescence signal at that input pixel.
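A minimal sketch of the disjoint-subset combination described above: the first color occupies only the R coordinate, the second color only the G coordinate, so their intersection is empty and the output pixel is the union of both subsets. The values are illustrative, and intensities are truncated rather than rounded:

```python
# Sketch: combining two colored signals whose color space coordinate
# subsets are disjoint. First subset: {R}; second subset: {G}; B belongs
# to neither. 8-bit RGB is assumed.

def combine_disjoint(fl_intensity, af_intensity):
    """Form the union of the two disjoint coordinate subsets at one output pixel."""
    r = int(fl_intensity * 255)   # first color scaled by fluorescence intensity
    g = int(af_intensity * 255)   # second color scaled by autofluorescence intensity
    return (r, g, 0)              # union of the two subsets; B is in neither

print(combine_disjoint(0.5, 0.25))  # -> (127, 63, 0)
```

Because the subsets never overlap, the union needs no clipping or blending; each coordinate is written by exactly one signal.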
In another embodiment, the data processing device may be configured to form the combination of the fluorescence emission signal colored in the first color and the autofluorescence signal colored in the second color by performing a vector addition of the color space coordinates of the fluorescence emission signal colored in the first color and the color space coordinates of the autofluorescence signal colored in the second color at each output pixel of the digital fluorescence color output image.
In particular, the color space coordinates at an output pixel of the digital fluorescence color output image may be computed by a vector addition of the color space coordinates of the corresponding fluorescence emission signal in a corresponding input pixel and of the color space coordinates of the autofluorescence signal at a corresponding input pixel, wherein both input pixels may be in the same or in a different digital input image.
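The vector addition can be sketched as an element-wise sum of the two colored pixels, clipped to the 8-bit range; the color values are illustrative:

```python
# Sketch: combining the two colored signals by vector addition of their
# RGB coordinates at one output pixel, clipped at the 8-bit maximum.

def add_colors(c1, c2):
    """Element-wise vector addition of two RGB tuples, clipped at 255."""
    return tuple(min(a + b, 255) for a, b in zip(c1, c2))

fl_colored = (127, 0, 0)   # fluorescence emission signal in the first color
af_colored = (0, 63, 0)    # autofluorescence signal in the second color

print(add_colors(fl_colored, af_colored))  # -> (127, 63, 0)
```

Unlike the disjoint-subset union, vector addition also works when the two colors share coordinates, at the cost of possible saturation, which the clipping handles.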
In another embodiment, the data processing device may be configured to form the combination of the fluorescence emission signal colored in the first color and the autofluorescence signal colored in the second color by applying a linear transformation simultaneously to the fluorescence emission signal and to the autofluorescence signal.
The linear transformation may be performed on a pixel-by-pixel basis, wherein the linear transformation is applied both to an input pixel which contains only the autofluorescence signal and to an input pixel which contains only the fluorescence emission signal.
For example, a union set of the set of color space coordinates of an input pixel containing only the fluorescence emission signal and the set of color space coordinates of a corresponding input pixel containing only the autofluorescence signal may be formed. The resulting union set has a number of color space coordinates which correspond to the sum of the color space coordinates of the input pixels. The linear transformation is then applied to this union set to obtain the values of the color space coordinates at the corresponding output pixel. The linear transformation may, in one example, comprise multiplication of the union set, which may be processed as a vector, with a transformation matrix. The linear transformation in this case is reduced to a matrix multiplication which can be implemented and carried out quickly on a computer. The transformation matrix has a first dimension which corresponds to the sum of color space coordinates of the two input pixels and a second dimension which corresponds to the number of color space coordinates in the digital fluorescence color output image.
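A sketch of this simultaneous linear transformation for two monochrome input pixels follows. The union vector has two entries and the matrix maps it to three RGB output coordinates; the code uses the transposed matrix-times-vector convention, which is equivalent to the vector-times-matrix description above, and the matrix values are illustrative:

```python
# Sketch: one matrix multiplication maps the union of the input pixels'
# coordinates (here: two monochrome intensities) to the output pixel's
# RGB coordinates.

def transform(union_vector, matrix):
    """Multiply a transformation matrix with the union of input coordinates."""
    return tuple(
        min(int(sum(m * v for m, v in zip(row, union_vector))), 255)
        for row in matrix
    )

# Rows: output R, G, B; columns: fluorescence intensity, autofluorescence intensity.
matrix = ((255, 0),    # fluorescence -> first color (red)
          (0, 255),    # autofluorescence -> second color (green)
          (0, 0))      # nothing maps to blue

print(transform((0.5, 0.25), matrix))  # -> (127, 63, 0)
```

With this formulation, coloring and combining collapse into a single matrix multiplication per pixel, which is what makes the approach fast on a computer.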
In some cases, it may be desirable to be able to change the relative intensities of the autofluorescence signal in the second color and the fluorescence emission signal in the first color. This facilitates distinguishing between these two fluorescence signals. In order to achieve this, the data processing device may be configured to change the ratio of the intensity of the fluorescence emission signal and the intensity of the autofluorescence signal in the digital fluorescence color output image depending on a user selection signal. In particular, the data processing device may be configured to change the ratio of the intensity of the first color and the second color in the digital fluorescence color output image depending on the user selection signal.
Thus, the user selection signal may dim at least one of the autofluorescence signal and the fluorescence emission signal in the digital fluorescence color output image relative to the respective other.
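One simple way to model such a user-adjustable ratio is a single blend factor applied before coloring; the function name and the ratio convention are assumptions for illustration:

```python
# Sketch: changing the relative intensities of the two fluorescence signals
# depending on a user selection signal, modeled here as a ratio in [0, 1].

def apply_user_ratio(fl_intensity, af_intensity, ratio):
    """ratio = 1.0 keeps only the fluorescence signal, 0.0 only the autofluorescence."""
    return (fl_intensity * ratio, af_intensity * (1.0 - ratio))

# A user dial set to 0.75 emphasizes the fluorescence emission signal.
print(apply_user_ratio(0.8, 0.6, 0.75))  # -> approximately (0.6, 0.15)
```

The scaled intensities would then be colored in the first and second color as described above, so turning a dial smoothly fades one signal out of the output image.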
The data processing device may be configured to receive the user selection signal.
For receiving the input image data and/or the user selection signal, the data processing device
may comprise a data interface which may comprise connectors and/or wireless connections for uni- or bi-directional data exchange.
For generating the user selection signal, the medical observation device may comprise a user input device, such as a widget or a manually operated member, such as a dial, a knob, a switch, or a button.
The initially stated object may, in another embodiment, also be achieved by a data processing device which is configured to access a digital color input image which contains a fluorescence emission signal representative of the fluorescence emitted by a fluorophore artificially added to the object, and an autofluorescence signal representative of the autofluorescence emitted naturally by the object; to extract the fluorescence emission signal from the digital color input image and color the extracted fluorescence emission signal in a first color; to extract the autofluorescence signal and color the extracted autofluorescence signal in a second color different from the first color; and to generate a digital fluorescence color output image from a combination of the extracted fluorescence emission signal colored in the first color and the extracted autofluorescence signal colored in the second color.
According to another embodiment, a medical observation device, such as a microscope or an endoscope, for observing an object which contains at least two fluorophores may comprise a data processing device in any of the configurations described above and at least one digital camera which is adapted to record the input image data.
According to one aspect, the medical observation device may comprise a digital color camera for recording at least one of the fluorescence emission signal and the autofluorescence signal. This digital color camera may also be configured for recording a reflectance signal instead of or in addition to the fluorescence emission signal and the autofluorescence signal.
The medical observation device may, in another embodiment, comprise a digital color or monochrome camera for recording the fluorescence emission signal and another digital color or monochrome camera for recording the autofluorescence signal. In yet another embodiment, the medical observation device may comprise a first digital monochrome camera for recording the fluorescence emission signal and a second digital monochrome camera for recording the autofluorescence signal.
The digital color input image that contains the fluorescence emission signal and the autofluorescence signal may be a digital fluorescence color input image that is recorded by a fluorescence camera of the medical observation device. Such a fluorescence camera may be combined with a fluorescence filter set having band-pass filters of which the passbands are contained in or correspond to the respective fluorescence emission spectra of the fluorescence emission signal and the autofluorescence signal. Thus, the fluorescence camera may be configured to specifically record fluorescence only and not record reflectance. In another variant, the fluorescence camera may be configured to also record a reflectance signal, or part of the reflectance signal, in particular a part of the reflectance signal that is contained in the NIR (near infrared) range and/or in the fluorescence excitation spectrum of the at least one artificially added fluorophore.
None of the above-mentioned digital cameras needs to be configured for recording the entirety of the fluorescence emission signal and/or the autofluorescence signal. For example, a part of the fluorescence emission signal may be recorded by a first digital color or monochrome camera and the other part by a second digital color or monochrome camera.
The initially mentioned object is also achieved by a method for using a medical observation device, such as an endoscope or microscope, wherein the method comprises the steps of recording the input image data and carrying out the computer implemented method in any of the above-described configurations.
Finally, the invention is also concerned with a computer program product and/or a computer- readable medium comprising instructions, which when executed by a computer causes the computer to carry out the computer implemented method in any of the above configurations.
As used herein the term “and/or” includes any and all combinations of one or more of the associated listed items and may be abbreviated as “/”.
Although some aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus.
In the following, the invention is described exemplarily with reference to various examples and with reference to the drawings. The combination of features that are described and/or shown in the drawings and/or examples is not to be considered as limiting. For example, a feature may be omitted from an embodiment if its technical effect, e.g. as explained above, is not needed in a specific application. Conversely, a feature described above that is not part of an embodiment described below may be added if the technical effect associated with this particular feature is beneficial in a specific application.
Throughout the description and the drawings, the same reference numerals are used for elements that correspond to each other with respect to function and/or structure.
In the drawings,
Fig. 1 shows a schematic representation of a medical observation device for generating a digital fluorescence color output image from at least one digital color input image;
Fig. 2 shows a schematic representation of the generation of a digital fluorescence color output image from an autofluorescence signal and a fluorescence emission signal contained in two digital input images of the input image data;
Fig. 3 shows a schematic representation of the generation of a digital fluorescence color output image from an autofluorescence signal and a fluorescence emission signal contained in a single digital input image of the input image data;
Fig. 4 shows a schematic representation of an example of combining the extracted fluorescence emission signal and the extracted autofluorescence signal;
Fig. 5 shows a schematic representation of another example of combining the extracted fluorescence emission signal and the extracted autofluorescence signal;
Fig. 6 shows a schematic representation of another example of combining the extracted fluorescence emission signal and the extracted autofluorescence signal;
Fig. 7 presents a schematic overview of the steps for generating a digital fluorescence color output image;
Fig. 8 shows a schematic representation of a generic medical observation device.
Fig. 1 shows schematically a medical observation device 100. The medical observation device 100 may be a microscope or an endoscope, the difference between a microscope and an endoscope being primarily that, in an endoscope (not shown), an object 106 is viewed through optical fibers that are brought into the vicinity of the object 106 to be investigated, e.g. by insertion into a body containing the object, whereas, in a microscope, an objective 174 is directed onto the object. Although the medical observation device of Fig. 1 is a microscope, the following description also applies to an endoscope.
The medical observation device 100 may be a medical observation device used in surgery. The medical observation device 100 may also be a medical observation device used in a laboratory, such as a laboratory microscope. The object 106 to be investigated may consist of
or comprise biological tissue 107. The object 106 may be a part of a patient’s body that is located within the field of view of the medical observation device 100.
The object 106 may contain one or more fluorophores 116, 118, 120. At least one fluorophore 116 may be a fluorophore that is naturally contained in the object. For example, bone and blood naturally contain fluorophores. At least one fluorophore 118, 120 may be artificially added to the object 106, e.g. by injecting it into the biological tissue 107. Examples of fluorophores 118, 120 that may be artificially added to the object 106 are ICG, fluorescein and/or 5-ALA.
The medical observation device 100 as shown is a fluorescence imaging device. Thus, the medical observation device is configured to view and preferably also excite the fluorescence of the one or more fluorophores 116, 118, 120.
The medical observation device 100 may be a stereoscopic device as is exemplarily shown in Fig. 1. It may thus comprise two identical subassemblies 101L and 101R, one for each of the two stereoscopic channels. As the two subassemblies 101L, 101R are identical with respect to function and structure, the following description focuses on the right subassembly 101R, but applies identically to the left stereoscopic channel 101L.
The medical observation device 100 may alternatively be a monoscopic device. In this case, only one of the two subassemblies 101 L, 101 R may be present. For a monoscopic medical observation device 100, the following description therefore applies as well.
The medical observation device 100 in operation provides input image data 122. The input image data 122 are representative of an imaged scene, i.e. the part of the object that is within the field of view 184.
The input image data 122 may comprise one or more different digital input images 130. If the input image data 122 contain a plurality of digital input images 130, the different digital input images 130 should contain different spectral information. In such a case, each digital input image 130 of the input image data may be recorded at different wavelengths, preferably with no or at least minimal spectral overlap. Preferably, the spectra in which the different digital input images 130 of the input image data 122 are recorded are complementary.
For generating the input image data 122, the digital imaging system 102 may comprise one or more digital cameras 108. The number of digital input images 130 contained in the input image data 122 may depend on the number of cameras 108 used for generating the input image data 122. Depending on the type and setting of digital camera 108, a digital input image 130 may
be a color image or a monochrome image.
The medical observation device 100 is configured to record in the input image data 122 both the fluorescence of the fluorophore 116 that occurs naturally in the object, and the fluorescence of at least one fluorophore 118, 120 that has been added artificially. For example, the one or more fluorophores 118, 120 may have been injected into a patient's body to mark specific areas of interest, such as tumors.
The fluorescence of the naturally occurring fluorophore, also termed autofluorescence, is represented by an autofluorescence signal in the input image data 122. The autofluorescence signal is a component of the input image data 122 and represents an image of the object in the fluorescence spectrum of the fluorophore 116.
The fluorescence of the artificially added fluorophore is represented by a fluorescence emission signal in the input image data 122. The fluorescence emission signal is also a component of the input image data 122 and represents an image of the object in the fluorescence emission spectrum of the at least one fluorophore 118, 120.
Each of these two fluorescence signals has a different spectral signature due to the different spectral characteristics of the respective fluorescence emission.
The input image data 122 may comprise further signals as additional components. For example, the medical observation device 100 may also be configured to record the light reflected off the object in the input image data 122. The light reflected off the object is represented in the input signal as a reflectance signal. The reflectance signal represents a reflectance image of the object.
A single digital input image 130 may contain one or more signals, either in their respective entirety or only parts thereof. If a digital input image 130 contains more than one signal, it is preferably a digital color input image so that the different signals may be distinguished from one another, e.g. by their spectral signatures. Alternatively, a single digital input image 130 may contain only a single signal. In this case, the digital input image 130 may be a digital monochrome input image.
Just by way of example, the digital imaging system 102 may comprise as digital cameras 108 a digital reflectance camera 110 and one or more digital fluorescence-light cameras 111 , 111a. A second (or third) digital fluorescence-light camera 111a is optional. In Fig. 1 , the second digital fluorescence-light camera 111a is only shown in the left stereoscopic channel 101 L, but of course may also be present in the right stereoscopic channel 101 R. Alternatively, the digital
fluorescence-light camera of one stereoscopic channel may be used as the (first) digital fluorescence-light color camera 111 and the digital fluorescence-light camera of the other stereoscopic channel may be used as the second fluorescence-light camera 111a. The cameras 110, 111 , 111a may each be a color camera or a monochrome camera. A multispectral camera or a hyperspectral camera is considered as a color camera.
The digital reflectance camera 110 is configured to record a digital reflectance input image 114, i.e. a digital input image 130, which is representative of the reflectance of the object 106 and thus may comprise all or at least a major part of the reflectance signal. The digital reflectance camera 110 is preferably configured to record a digital input image 130 in a wide spectral range within the visible light spectrum. Thus, the digital input image 130 recorded by the digital reflectance camera closely represents the natural colors of the object 106. This is important if the digital reflectance camera 110 is used to provide the user with an image of the object which comes as close as possible to the human perception of the object. The digital reflectance camera 110 may be a CCD, CMOS or multispectral or hyperspectral camera.
The digital input image 130 recorded by the digital reflectance camera 110 may also contain at least part of the autofluorescence signal and/or the fluorescence emission signal.
Each of the at least one digital fluorescence-light camera 111, 111a is configured to record a different digital fluorescence-light image 112, i.e. a digital input image 130, which is representative of the fluorescence of the object 106 and thus may comprise all or at least a major part of the autofluorescence signal and the fluorescence emission signal. The fluorescence-light camera 111 may be configured to record the digital fluorescence-light input image 112 only in the fluorescence spectrum or the fluorescence spectra of the at least one fluorophore 116, 118, 120. Each fluorescence-light camera 111, 111a may be configured to record the fluorescence of a different fluorophore. If the digital fluorescence-light image 112 is a color image, it may contain both the autofluorescence signal and the fluorescence-emission signal.
The fluorescence-light camera 111 may be configured to record the digital fluorescence-light image only in one or more narrow bands of light. These narrow bands should overlap the fluorescence spectrum or spectra of the one or more fluorophores 116, 118, 120 of which fluorescence is to be recorded. Preferably, the fluorescence spectra of the different fluorophores 116, 118, 120 are at least partly separate, preferably completely separate, i.e. nonoverlapping, so that the fluorescence-light camera 111 may record a digital color input image 130 representing two separate fluorescence bands that are spaced from one another.
Alternatively, if two fluorescence-light cameras 111, 111a are provided, each fluorescence-light camera 111, 111a preferably captures the fluorescence emission of a different fluorophore, thus providing two digital fluorescence-light input images 112 containing different fluorescence emission signals. For example, the fluorescence-light camera 111 captures the fluorescence emission signal and the fluorescence-light camera 111a captures the autofluorescence signal. In this case, the fluorescence-light cameras 111, 111a may be monochrome or color cameras.
The at least one fluorescence-light camera 111, 111a may also capture part of the reflectance signal in a digital color input image 130. For example, a part of the low-frequency end of the excitation spectrum used for triggering the fluorescence of a fluorophore may overlap the fluorescence spectrum of this fluorophore and be recorded.
The at least one fluorescence-light camera 111, 111a may be a monochrome camera or a CCD, CMOS, multispectral or hyperspectral camera. Preferably, the white-light color camera 110 and the at least one fluorescence-light color camera 111 are of the same type, although this is not necessary.
Any combination of the cameras 110, 111 and 111a may be combined into a single multispectral or hyperspectral camera.
The respective fields of view 184 of the cameras 110, 111 and, if present, 111a are preferably aligned or even coinciding and coaxial. It is preferred that the cameras 110, 111 provide the identical field of view 184 with the identical perspective and focal length. This results in identical representations of the object 106 in the images 112, 114 generated by the different cameras 110, 111. Both cameras 110, 111 may use the same objective 174.
If a match of the perspectives and field of view cannot be generated optically, it may be generated by image processing by applying a matching or registering routine to the digital input images 130, as is explained further below.
It is preferred that the cameras 110, 111 and, if present, 111a are operated synchronously. Specifically, the exposure times may be synchronized. Thus, the medical observation device 100 may be configured to generate the digital input images 130 at the same time.
Preferably, the gain of the at least two cameras 110, 111, 111a is synchronized, i.e. adjusted in the at least two cameras 110, 111, 111a at the same time. Moreover, the ratio of the gain applied in camera 110 to the gain applied in camera 111 and, if present, in camera 111a may be constant, even if the gain is changed. The gamma correction and color adjustment or white balance may be switched off or kept constant.
For separating the light recorded in the digital reflectance input image 114 from the spectrum recorded in the at least one digital fluorescence-light input image 112, i.e. for separating the reflectance spectrum from the fluorescence spectrum, an optical color-separation assembly 176 may be provided. The color-separation assembly 176 may comprise optical elements such as a beam splitter 192, which may be dichroic. The color separation assembly 176 may further or alternatively comprise an optical observation filter set 188 and/or an optical fluorescence filter set 190. The optical observation filter set 188 and the fluorescence-filter set 190 may be part of an optical filter assembly 187.
The fluorescence filter set 190 is preferably configured to transmit light in the fluorescence spectrum or spectra of the one or more fluorophores 116, 118, 120 and to block light outside the fluorescence spectrum or spectra.
The fluorescence filter set 190 may comprise one or more optical band-pass filters comprising one or more passbands. Each passband should overlap the fluorescence emission spectrum of a respective fluorophore 116, 118, 120 of which the fluorescence is to be recorded. As the fluorescence-light filter set 190 is in the light path between the beam splitter 192 and the fluorescence-light color camera 111, only the wavelengths in the passbands of the fluorescence-light filter set 190 are transmitted to the fluorescence-light color camera 111.
If two fluorescence-light cameras 111, 111a are used to capture different fluorescence emission spectra, the fluorescence filter set 190 may comprise a different optical band-pass filter in front of each of the fluorescence-light color cameras 111, 111a. The pass-band of one band-pass filter may be contained in the fluorescence emission spectrum of one fluorophore 116, whereas the pass-band of the other band-pass filter may be contained in the fluorescence emission spectrum of another fluorophore 116, 118 in the object 106.
The observation filter set 188 is preferably configured to block light in the fluorescence spectrum or spectra of the one or more fluorophores 116, 118. The observation filter set 188 may also be configured to block light in the fluorescence-excitation spectrum.
The observation filter set 188 is preferably configured as a band-stop filter, of which the stop bands correspond to or at least contain the passbands of the fluorescence-light filter set 190. The observation filter set 188 is located in the light path between the beam splitter 192 and the white-light camera 110. Thus, the white-light camera 110 records only wavelengths that are outside the stop-bands of the observation filter set 188 and therefore also outside of the passbands of the fluorescence-light filter set 190.
Any one of the observation filter set 188 and the fluorescence filter set 190 may be a tunable filter.
If the beam splitter 192 is a dichroic beam splitter, at least one of the filter sets 188, 190 may be omitted as the optical spectral filtering in this case is already integrated in the dichroic beam splitter. The above description of the passbands and stopbands then should apply mutatis mutandis to the dichroic beam splitter 192.
The medical observation device 100 may further comprise an illumination assembly 178, which is configured to illuminate the object 106 preferably through the objective 174 through which the imaging system 102 records the at least one digital image 112, 114.
The illumination assembly 178 may be configured to selectively generate white light, i.e. light that is evenly distributed across the entire visible spectrum, and fluorescence-excitation light, which contains light only in wavelengths that stimulate the fluorescence of the at least one fluorophore 116, 118. The illumination light generated by the illumination assembly 178 may be fed into the objective 174 using an illumination beam splitter 180.
The illumination assembly 178 may be configured to generate illumination light simultaneously in a plurality of discrete, in particular narrow-band, wavelength bands. These wavelength bands may comprise any of or any combination of the following wavelength bands.
One such discrete wavelength band may be entirely located in the fluorescence-excitation spectrum of a fluorophore 116. Another such wavelength band may be entirely located in the fluorescence-emission spectrum of another fluorophore 118. Another such wavelength band may be limited to wavelengths larger than 700 nm and be entirely located in the NIR range.
The simultaneous illumination of the object with any of the discrete wavelength bands as described above may be accomplished by a light source 199, e.g. a tunable light source such as a light source comprising a plurality of LEDs in different colors, in particular in different primary colors, which is configured to generate light in these wavelength bands simultaneously. Alternatively or additionally, the wavelength bands may be generated by using an illumination filter 179 having multiple pass-bands, wherein the pass-bands preferably correspond to the above wavelength bands. If such an illumination filter 179 is used, the light source 199 may generate white-light which is then filtered by the illumination filter 179 so that only the light in the passbands illuminates the object 106.
The illumination filter 179 may be provided depending on the at least one fluorophore, of which the fluorescence is to be triggered, and its specific excitation spectrum. For example, if 5-ALA is used as a fluorophore, the illumination filter may have a transmission of 90 % to 98 % for wavelengths up to 425 nm, a transmission between 0.5 % and 0.7 % for wavelengths between 450 nm and 460 nm, a transmission of not more than 0.1 % between 460 nm and 535 nm and of practically zero for wavelengths above 535 nm. The illumination filter 179 may be configured for pass-through of NIR light. For example, the illumination filter 179 may comprise a passband in the NIR. The illumination filter 179 may further comprise a passband, which is preferably entirely located in the fluorescence-excitation spectrum of another fluorophore.
The medical observation device 100 may be adjusted to a different fluorophore or set of fluorophores by re-configuring the color-separation assembly 176, e.g. by exchanging its optical elements, such as the filter sets 188 and/or 190, or the dichroic beam splitter 192.
Using the observation filter system 188 as described above, the digital reflectance color camera 110 may be used to record at least part of the fluorescence emission signal and/or at least part of the autofluorescence signal if the respective wavelengths pass through the pass-bands of the observation filter system 188. In such a case, the illumination of the object 106 should preferably not contain the fluorescence emission wavelengths: as the intensity of the fluorescence is often low, reflected illumination light in these wavelengths may make the fluorescence signals harder to detect. The same holds mutatis mutandis for the reflectance signal and the at least one digital fluorescence-light camera 111, 111a.
The input image data 122 are processed by a data processing device 170. The data processing device 170 may be an integral part of the medical observation device 100. In one example, it is a processor which is embedded in the medical observation device and also used as a controller for controlling the hardware of the medical observation device 100, such as the brightness and/or spectral emission of the light source 199 and/or any objective of the medical observation device 100 and/or any actuators of the medical observation device 100. In another example, the data processing device 170 is part of a general computer, which is connected to the medical observation device for unidirectional, or bidirectional data transfer by wire or wirelessly.
The data processing device 170 may be a hardware module, such as a microprocessor, or a software module. The data processing device 170 may also be a combination of both a hardware module and a software module, for example by using software modules that are configured to be run on a specific processor, such as a vector processor, a floating point graphics processor, a parallel processor and/or on multiple processors. The data processing device 170 may be part of a general-purpose computer 186, such as a PC. In another embodiment, the data processing device 170 is an embedded processor of the medical observation device 100.
The data processing device 170 is configured to access the input image data 122, e.g. in the form of one or more digital input images 130, such as the digital white-light color input image 114 and the digital fluorescence-light image 112. The data processing device 170 may be configured to retrieve the digital input images 130 from a memory 194 and/or directly from the cameras 110, 111 and, if present, 111a. The memory 194 may be part of the data processing device 170 or reside elsewhere in the medical observation device 100.
The data processing device 170 is further configured to compute a digital fluorescence color output image 160 from the input image data 122, specifically from the autofluorescence signal and the fluorescence-emission signal contained in the input image data 122.
The digital fluorescence color output image 160 is a color image, which is represented in a color space. The color space of the digital fluorescence color output image may be different from the color space of any digital color input image that is contained in the input image data 122. Preferably, however, the color space of the digital fluorescence color output image 160 is the same color space as that of any of the digital color input images 130.
If the autofluorescence signal and/or the fluorescence emission signal need to be processed separately, the respective signal needs to be separated or extracted from the input image data 122.
Extraction is straightforward, if a digital input image consists of the entire fluorescence emission signal or the entire autofluorescence signal. Processing such a digital input image 130 corresponds to processing its constituent signal.
If, however, any two signals (or parts thereof) of the group of signals containing the autofluorescence signal, the fluorescence emission signal and the reflectance signal are contained in a single digital input image (which in this case should be a digital color input image), the respective signal may need to be separated from the other signals to be processed separately. To achieve this, the digital processing device 170 is configured to separate or extract any signal of the group of signals containing the autofluorescence signal, the fluorescence emission signal and the reflectance signal from the remaining signals of the group.
For such a separation or extraction of a signal, the digital processing device may comprise a separation or extraction routine 140 that may be stored in a memory 194 of the digital processing device 170. The separation or extraction routine 140 may comprise an unmixing routine 142 e.g. for spectrally unmixing the signals.
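As a rough illustration of what such an unmixing routine 142 may do, the following Python sketch performs a simple linear spectral unmixing by least squares. This is only a schematic model under stated assumptions, not the claimed routine: the function name `unmix` and the use of per-signal reference spectra ("endmembers") are assumptions for illustration.

```python
import numpy as np

def unmix(pixels, endmembers):
    """Separate mixed signals by linear spectral unmixing.

    pixels:      (N, C) array of per-pixel values in C color channels.
    endmembers:  (S, C) array, one reference spectrum per signal
                 (e.g. autofluorescence, fluorescence emission,
                 reflectance), assumed known for this sketch.
    Returns an (N, S) array of per-pixel signal abundances.
    """
    # Solve pixels ~ abundances @ endmembers in the least-squares sense,
    # then clip negative abundances (a common non-negativity heuristic).
    abundances, *_ = np.linalg.lstsq(endmembers.T, pixels.T, rcond=None)
    return np.clip(abundances.T, 0.0, None)
```

With, say, two non-overlapping endmembers `[[1, 0, 0], [0, 1, 0]]`, a pixel `[0.3, 0.7, 0.0]` unmixes into the abundances `[0.3, 0.7]` for the two signals.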
The extracted signals may be combined using a signal/image combination routine 144. The signal/image combination routine 144 may treat the extracted signals as images.
A linear transformation routine 146, which may comprise a transformation matrix 148, may be used to transform the colors of the extracted signals. The linear transformation routine 146 may be comprised by the signal/image combination routine 144.
Any of the routines 140 to 146 may be a software routine, a routine implemented in hardware, or a routine in which software and hardware components are combined.
The medical observation device 100 may comprise a user input device 162, which, upon being operated by a user, may generate a user selection signal 164, which may be communicated to the digital processing device 170. The user input device 162 may e.g. be a physical button, dial, slide or lever, or a widget that represents a physical button, dial, slide or lever.
By operating the user input device 162, the user may determine which signal or combination of signals extracted from the digital input images 130 is displayed and/or in which intensity each signal is displayed, either in absolute terms or relative to one or more other signals. These different modes of display are indicated by I, II, III etc.
The digital fluorescence color output image 160 may be displayed on a display 132, which is integral with the medical observation device 100. For example, the display 132 may be integrated in an ocular or eyepiece 104 of the medical observation device 100.
The digital fluorescence color output image 160 is preferably generated in real-time, i.e. a digital fluorescence color output image 160 is generated from a set of digital color input images 130 before the next set is generated by the at least two cameras 110, 111 , 111a.
The medical observation device 100 may comprise a direct optical path 134 from the object 106 through the objective 174 to the eyepiece 104. In such a case, the display may be a translucent display 132 located in the direct optical path 134 or the display may be projected into the direct optical path 134. A beam splitter 136 may be provided to split the light between the optical eyepiece 104 and the digital imaging system 102. In one embodiment, up to 80 % of the light may be directed to the eyepiece 104.
Alternatively, the medical observation device 100 may not have a direct optical path 134 but only display images from the integral display 132. As a further alternative, the medical observation device may not have any display at all.
The medical observation device 100 may comprise an output interface 172 to which one or more (external) displays 182 may be connected. For this, the output interface 172 may comprise standardized connectors and data transmission protocols, such as USB, HDMI, DVI, DisplayPort, Bluetooth and/or others. An external display may be a monitor, 3D goggles, oculars
and the like. Any combination of external displays may be connected to output interface 172.
The computer 186 and/or the data processing device 170 is connected to the digital imaging system 102 using one or more data transmission lines 196. A data transmission line may be wired or wireless, or partly wired and partly wireless. The computer 186 and/or the data processing device 170 may not be bodily integrated in the medical observation device 100 but be physically located remote from the digital imaging system 102. For this, the digital imaging system 102 and the computer 186 and/or the data processing device 170 may be connected to a network, such as a LAN, a WLAN or a WAN, to which also at least one display 182 is connected.
According to a modification, the medical observation device 100 may be stereoscopic but comprise only two cameras, one for each stereoscopic channel. In one stereoscopic channel, the fluorescence-light color camera 111 is used and configured to selectively record white-light reflectance, whereas in the other stereoscopic channel, the white-light color camera 110 is used. Such an arrangement provides a stereoscopic white-light color input image if no fluorescence is used and a monoscopic white-light color input image and a monoscopic fluorescence-light color input image if fluorescence is used. The description above and below applies equally to this configuration.
The input image data may, in one embodiment, contain at least one digital color input image 130 which contains both the autofluorescence signal and the fluorescence emission signal. This is the case if all fluorescence emitted by the object is recorded in the at least one digital color input image.
In another embodiment, the input image data 122 may contain at least one digital monochrome input image and at least one digital color input image. The digital monochrome input image contains either at least a part of the fluorescence emission signal or at least a part of the autofluorescence signal. The digital color input image may contain the fluorescence emission signal if the digital monochrome input image contains the autofluorescence signal, and it may contain at least a part of the autofluorescence signal if the digital monochrome input image contains the fluorescence emission signal.
The input image data comprise a plurality of pixels. The pixels may be color pixels or monochrome pixels. A monochrome pixel only represents intensity, e.g. as a greyscale image. A color pixel comprises information about at least some of the color appearance parameters such as hue, lightness, brightness, chroma, colorfulness and saturation. Color pixels are recorded using color bands or, equivalently, color channels or primary colors of a color space using a digital color camera.
A color space comprises at least three of such color channels. In the color space, each color channel is represented by a different color space coordinate. For conversion between different color spaces, color space transformations may be used. In a different color space, the same color is represented in different color space coordinates. Each pixel of the digital color input images 130 comprises a set of color space coordinates that together represent the color of the respective pixel. Each color band thus may be regarded as representing a color space axis and each color may be regarded as a point in color space, which is defined by the vector — i.e. the color space coordinates — pointing to this color. Thus, adding two colors corresponds to a vector addition. If one color has color space coordinates {x1, y1, z1} and a second color has color space coordinates {x2, y2, z2}, then the sum of these two colors corresponds to the color {x1+x2, y1+y2, z1+z2}.
In one example, the digital color input images 130 or, more generally, the input image data 122, may be recorded in RGB color space using the three primary colors or color bands — or color space coordinates — R, G, B. Alternatively, the digital color input images 130 may be recorded in different color spaces, respectively, and/or represent multispectral or hyperspectral color input images. The digital input images 130 of a set of digital input images, such as the digital white-light color input image 114 and the digital fluorescence color input image 112, need not be recorded in the same color space, although this is preferred.
In RGB color space, each color is represented by a triple of three color space coordinates in the form of integer numbers, wherein each integer number indicates the intensity of one of the primary colors R, G, B. For example, the most intense red is indicated by the triple [255, 0, 0], the most intense green by [0, 255, 0], and the most intense blue by [0, 0, 255]. Thus, RGB color space is a three-dimensional space, whereas CMYK color space would be a four-dimensional space. A color can be considered as a point in color space to which a vector such as [0, 0, 255] points. A multispectral or hyperspectral color space having n color bands would correspondingly result in an n-dimensional color space, in which each color is represented by an n-tuple of color space coordinates.
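The vector view of colors described above can be sketched in a few lines of Python. The helper `add_colors` is a hypothetical name, and the clamping to 255 is an added assumption reflecting the limits of 8-bit RGB coordinates; the text itself describes plain vector addition.

```python
def add_colors(c1, c2, max_val=255):
    """Vector-add two colors given as tuples of color space coordinates.

    Each coordinate of the sum is clamped to the valid range of the
    color space (an assumption for 8-bit channels)."""
    return tuple(min(a + b, max_val) for a, b in zip(c1, c2))

# The most intense red plus the most intense blue yields magenta:
print(add_colors((255, 0, 0), (0, 0, 255)))  # (255, 0, 255)
```

The same tuple-wise addition works unchanged for an n-tuple of a multispectral or hyperspectral color space.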
Fig. 2 gives an example how a digital fluorescence color output image 160 may be generated from the input image data 122. As shown, the input image data 122 may comprise two digital input images 130, any of which may be a color image or a monochrome image. Each of the digital input images 130 comprises or consists of input pixels 230. If the respective digital input image 130 is a color image, its input pixels 230 will be color pixels. If the respective digital input image 130 is a monochrome image, the respective input pixels 230 will be monochrome pixels.
A first digital color input image 212 of the input image data 122 may correspond to the digital reflectance input image 114 mentioned above. The second digital input image 214 of the input image data 122 may correspond to the fluorescence-light input image 112.
The first digital input image 212 comprises the autofluorescence signal 224. Thus, each input pixel 230 contains a contribution of the autofluorescence signal 224. This contribution may be zero at some pixels, where no autofluorescence signal could be received.
At reference numeral 240, an exemplary spectrum of the first digital input image 212 is shown. The spectrum 240 indicates the intensity I at each wavelength of light received in the first digital color input image 212. It can be seen that the spectrum 250 of the autofluorescence signal 224, which corresponds to the autofluorescence emission spectrum, is at least partly contained in a passband 220, e.g. of the optical filter assembly 187. The passband 220 may be wider or narrower than the spectrum 250.
If the first digital input image 212 is a monochrome image, only a single signal should be contained therein. In the case of a monochrome image, the first digital input image 212 may consist of the autofluorescence signal 224. If the first digital input image 212 is a color image, additional signals may be contained. In the case of a color image, the first digital input image 212 may e.g. comprise at least part of a reflectance signal 202 in addition to the autofluorescence signal 224. The spectrum 252 of the reflectance signal 202 is indicated qualitatively in the spectrum 240.
At some input pixels 230, the reflectance signal 202 and the autofluorescence signal 224 may overlap. Such an input pixel 230 contains both (a local part of the) reflectance signal 202 and (a local part of) the autofluorescence signal 224. The two signals 202, 224 may be separated from one another if the input pixel 230 is a color pixel.
The reflectance signal 202 may, e.g., be generated by illuminating the object 106 with light having wavelengths in a fluorescence excitation spectrum of another fluorophore 118, 120 of which fluorescence is to be triggered. A part of such an illumination spectrum may overlap the passband 220. The spectrum 240 of the first digital input image may of course comprise additional components at other wavelengths preferably outside the band-pass 220.
If the first digital input image 130 includes only the autofluorescence signal 224, an extraction of the autofluorescence signal 224 is not necessary. In this case, the first digital input image 130 represents already the autofluorescence signal 224.
If, however, the first digital input image 212 contains another signal, such as the reflectance signal 202, the autofluorescence signal 224 must be separated or extracted if it is to be processed separately. Such an extraction or separation may be done using spectral unmixing, in particular by the data processing device 170.
Once the autofluorescence signal 224 has been extracted or separated as shown at reference number 290, it may, e.g., be colored by a color 218. The intensity of the color 218 may depend on the intensity of the extracted autofluorescence signal 224 at an input pixel 230. If the first digital input image 130 is a color image, the color of the extracted autofluorescence signal 290 may be kept but the intensity and/or brightness may be normalized.
The color 218 may be a pseudocolor, a false color or a natural color. In case of a pseudocolor, a different color is assigned to a pixel depending on the intensity of the autofluorescence signal 224 at this pixel. In case of a false color or a natural color, the brightness or intensity of this color may depend on the intensity of the autofluorescence signal 224 at this pixel. A false color is a color which is different from the color of the fluorescence emitted by the autofluorescing fluorophore as perceived by the human eye. A natural color corresponds to the color of the fluorescence emitted by the autofluorescing fluorophore as perceived by the human eye.
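The coloring options just described can be sketched as follows. Both helper functions and the lookup table are illustrative assumptions, not part of the disclosed device; intensities are assumed to be normalized to [0, 1].

```python
def false_color(intensity, color):
    """False or natural coloring: the hue is fixed and the brightness
    scales with the signal intensity (color as an RGB triple). For a
    natural color, `color` would match the hue of the fluorescence as
    perceived by the human eye; for a false color it would differ."""
    return tuple(int(intensity * c) for c in color)

def pseudo_color(intensity, colormap):
    """Pseudocoloring: a different color is assigned to the pixel
    depending on the signal intensity, via a lookup table."""
    index = min(int(intensity * len(colormap)), len(colormap) - 1)
    return colormap[index]
```

For example, `false_color(0.5, (0, 255, 0))` dims a green hue to half brightness, while `pseudo_color` switches between entirely different hues as the intensity grows.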
In the second digital input image 214, the fluorescence emission signal 204 is contained in the example of Fig. 2. Again, additional signals such as part of the autofluorescence signal 224 and/or part of the reflectance signal 202 may be contained in the second digital input image 214. As with the first digital input image 212 it is preferred that the second digital input image 214 is a color image to be able to separate the fluorescence emission signal 204 from any other signals contained in the second digital input image 214. If the second digital input image 214 is a monochrome image, it should contain only the fluorescence emission signal 204.
A sample spectrum of the light recorded by the second digital input image 214 is indicated at reference number 242. The second digital input image 214 records the light that passes through the passbands 220 of the optical filter assembly 187. As shown just by way of example, some of the autofluorescence signal 224 may leak into the second digital input image 214 if the autofluorescence emission spectrum 250 overlaps the passband 220 which contains the fluorescence emission spectrum 254 of at least one fluorophore 116, 118, such as 5-ALA and/or ICG. The second digital input image 214 may correspond to the fluorescence-light input image 112.
In some cases, it may be preferred that the object is illuminated in an additional narrow band to provide additional information. For example, the object 106 may be illuminated in the NIR range 272 outside the visible light range 270.
If the second digital input image 214 only contains the fluorescence emission signal 204, no extraction or separation of this signal is necessary. If, however, additional signals such as the autofluorescence signal 224 and/or the reflectance signal 202, or parts thereof, are contained in the second digital input image 214, an extraction or separation of the fluorescence emission signal 204 is necessary to be able to process it separately. A separation of the fluorescence emission signal 204 from the second digital input image 214 may, e.g., be done using spectral unmixing. The spectral unmixing can be performed by the data processing device 170 described above.
The extracted fluorescence emission signal 280 is preferably colored using a color 216. The intensity and/or brightness or any other color parameter of the color 216 may depend on the intensity of the extracted fluorescence emission signal 280 at an input pixel 230. If the extracted fluorescence emission signal 280 is already a color image, its color may also be maintained. As described above in the context of the color 218, the color 216 may be a pseudocolor, a false color or a natural color.
The digital fluorescence color output image 160 is obtained by combining the (if necessary) extracted, colored and preferably also normalized autofluorescence signal 290 with the (if necessary) extracted, colored and preferably normalized fluorescence emission signal 280. Each output pixel 232 of the digital fluorescence color output image 160 contains a combination of the autofluorescence signal 224 as contained in the corresponding input pixel 230 of the first digital input image 212 and the fluorescence emission signal 204 at the corresponding input pixel 230 of the second digital input image 214. An overlap 222 may exist in the digital fluorescence color output image 160, where the autofluorescence signal and the fluorescence emission signal overlap.
Fig. 3 shows an embodiment, where the input image data 122 comprise only a single digital color input image 130, which, for example, may be a digital fluorescence-light color input image 112. The digital color input image 130 contains both the autofluorescence signal 224 and the fluorescence emission signal 204. In addition, the digital fluorescence input image 130 may contain a reflectance signal 202. A sample spectrum of such a digital color input image 130 is shown at reference numeral 300. As before, the autofluorescence signal 224 and the fluorescence emission signal 204 need to be extracted at every input pixel 230, resulting in an extracted autofluorescence signal 290 and an extracted fluorescence emission signal 280. Once the extracted signals 280, 290 are obtained, the further procedure may be as described with reference to Figure 2.
With reference to Figs. 4 to 6, it is now explained how the extracted autofluorescence signal 290 and the extracted fluorescence emission signal 280 can be combined using different colors 216, 218 to make them more distinguishable to a user. Just by way of example, the digital fluorescence color output image 160 is represented in RGB color space having three color bands R, G, B, where the color of a pixel is accordingly represented by three color space coordinates [r, g, b].
According to Fig. 4, the fluorescence emission signal 280 and the autofluorescence signal 290 may be assigned to different color channels of a color space respectively. For example, the extracted autofluorescence signal 290 may be assigned the green color channel G, whereas the extracted fluorescence emission signal 280 is assigned the red channel R. The blue color channel B may not be assigned any signal and thus be set, for example, to zero.
This combination is simple and useful in particular if the color bands correspond at least roughly to the hues of the autofluorescence signal and the fluorescence emission signal, respectively. This may be the case if the fluorescence emission of the respective fluorophores 116, 118, 120 has a hue which corresponds to the color of a color band according to human color vision.
For example, the fluorescence emission of 5-ALA has a red hue. Thus, the extracted fluorescence emission signal 280 if representing the fluorescence of 5-ALA may be assigned the red channel R. The autofluorescence signal 290 may be assigned the green channel, especially if the autofluorescence emission is perceived as greenish by human perception.
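The channel assignment of Fig. 4 can be sketched as follows; the function name and the assumption that both intensity maps are already normalized to [0, 1] are illustrative, not prescribed by the application:

```python
import numpy as np

# Fig. 4 scheme: each extracted (H, W) intensity map is assigned to its own
# RGB channel; the blue channel is not assigned any signal and stays at zero.
def compose_channels(fluo, auto):
    out = np.zeros(fluo.shape + (3,))
    out[..., 0] = fluo   # red channel   <- fluorescence emission (e.g. 5-ALA)
    out[..., 1] = auto   # green channel <- autofluorescence
    return out           # blue channel remains zero

fluo = np.array([[0.8, 0.0]])
auto = np.array([[0.2, 0.5]])
rgb = compose_channels(fluo, auto)
```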
According to Fig. 5, the color space coordinates of two corresponding pixels 230 of the extracted autofluorescence signal 290 and the extracted fluorescence emission signal 280 may simply be added to obtain the color space coordinates of the corresponding output pixel 232 of the digital fluorescence color output image 160. Thus, if a pixel 230 of the extracted autofluorescence signal 290 has color space coordinates {r1, g1, b1} and if the corresponding pixel 230 of the extracted fluorescence emission signal 280 has color space coordinates {r2, g2, b2}, then vector-adding these color space coordinates results in the color space coordinates {r1 + r2, g1 + g2, b1 + b2}.
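This vector addition may be sketched as follows for a single-pixel image; the clipping to the valid RGB range is an assumption added here so that the sum remains a displayable color:

```python
import numpy as np

# Fig. 5 scheme: corresponding pixels of the two colored signals are combined
# by plain vector addition of their color space coordinates.
auto_rgb = np.array([[[0.1, 0.6, 0.0]]])           # {r1, g1, b1}
fluo_rgb = np.array([[[0.7, 0.1, 0.0]]])           # {r2, g2, b2}
out_rgb = np.clip(auto_rgb + fluo_rgb, 0.0, 1.0)   # {r1+r2, g1+g2, b1+b2}
```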
According to Fig. 6, the extracted autofluorescence signal 290 may be represented at each pixel 230 by color space coordinates xi, where i = 1, ..., M. Thus, for a monochrome image, M = 1 and x1 simply represents the intensity or brightness of the pixel 230.
Independent of the representation of the extracted autofluorescence signal 290, the extracted fluorescence emission signal 280 may be monochrome or color. Its color space coordinates at each pixel may be represented as yj, where j = 1, ..., N. If the extracted fluorescence emission signal 280 is represented in RGB color space, then y1 = r, y2 = g and y3 = b. Of course, yj may represent any color space and even multi- or hyperspectral color coordinates. N does not necessarily need to be equal to M, so that both signals may be represented in different color spaces (considering a monochrome representation as a very simple color space).
For converting the color space coordinates xi and yj of corresponding pixels 230 of the extracted autofluorescence signal 290 and the extracted fluorescence emission signal 280, a linear transformation 146 may be applied to the color space coordinates xi and yj. The linear transformation 146 may comprise a transformation matrix Mik, which is indicated in Fig. 1 at the reference numeral 148 as part of the data processing device 170.
The linear transformation matrix is multiplied by a vector {{xi}, {yj}} consisting of the coordinates xi and yj. Thus, the input vector is of the form {x1, ..., xM, y1, ..., yN} and therefore has dimension (M + N). The result of the linear transformation is the color space coordinates or the color vector {rk}, k = 1, ..., K, at the corresponding output pixel 232 of the digital fluorescence color output image 160. If the digital fluorescence color output image 160 is represented in RGB color space, K = 3 and {rk} corresponds to the three RGB color space coordinates. Thus, the linear transformation matrix has dimensions of (M + N) in one direction and K in the other direction.
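A minimal sketch of this matrix combination, assuming a monochrome autofluorescence signal (M = 1), an RGB fluorescence emission signal (N = 3) and an RGB output (K = 3); the matrix entries are placeholders chosen so that the autofluorescence maps to green and the fluorescence emission to red and blue:

```python
import numpy as np

M, N, K = 1, 3, 3                      # monochrome auto, RGB fluo, RGB out
T = np.array([[0.0, 1.0, 0.0, 0.0],   # r1 <- y1 (red of fluorescence emission)
              [1.0, 0.0, 0.0, 0.0],   # r2 <- x1 (autofluorescence -> green)
              [0.0, 0.0, 0.0, 1.0]])  # r3 <- y3 (blue passes through)

x = np.array([0.5])                    # xi, i = 1..M
y = np.array([0.9, 0.1, 0.2])          # yj, j = 1..N
r = T @ np.concatenate([x, y])         # {rk}, k = 1..K; T has shape (K, M+N)
```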
Assigning different colors 216, 218 and using the combination schemes shown in Figs. 5 and 6 makes it possible to spread the spectra of the autofluorescence signal and the fluorescence emission signal over a wider range of colors, thus offering a better visualization of output pixels 232 that contain both the autofluorescence signal 224 and the fluorescence emission signal 204.
Fig. 7 presents an overview of the steps that may be used for generating the digital fluorescence color output image 160.
In a first step 700, the input image data 122 are recorded, retrieved or accessed. This may comprise recording one or more digital input images 130 with a corresponding number of cameras. For example, at step 702 a first digital input image 130 may be recorded, at step 704 a second digital input image 130 may be recorded and at step 706 a third digital input image 130 may be recorded. The steps 702, 704, 706 may occur simultaneously or shortly after one another.
At step 708, the autofluorescence signal is separated from the input image data or the respective digital input image 130. This step may, e.g., be carried out by the extraction/separation routine 140. As a result, the extracted fluorescence emission signal 280 and the extracted autofluorescence signal 290 are obtained. Step 708 is not necessary if a digital input image 130 already consists of a signal 204, 224 that would otherwise have to be extracted. In such a case, the digital input image 130 may be used directly as the extracted signal.
At step 710, intermediate processing may be carried out on the extracted fluorescence emission signal 280. Intermediate processing may comprise normalizing the extracted fluorescence emission signal. Intermediate processing 710 may alternatively or cumulatively comprise assigning a color to the extracted fluorescence emission signal.
At step 712, intermediate processing may be carried out on the extracted autofluorescence signal 290. Step 712 may correspond to step 710 described above.
At step 714, the extracted autofluorescence signal 290 may be combined with the extracted fluorescence emission signal 280 to generate the digital fluorescence color output image 160. The step 714 may, e.g., be carried out by the signal/image combination routine 144. The step 714 may comprise the step of coloring the extracted autofluorescence signal 290 in a different color than the extracted fluorescence emission signal 280.
At step 716, the digital fluorescence color output image 160 is displayed, e.g., on at least one of the displays 132, 182.
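The sequence of steps 700 to 716 may be sketched end to end as follows, with recording and display stubbed out; the normalization and all function names are assumptions for illustration:

```python
import numpy as np

def normalize(signal):                         # steps 710/712: optional
    peak = signal.max()                        # intermediate processing
    return signal / peak if peak > 0 else signal

def process(fluo, auto):
    fluo_rgb = np.zeros(fluo.shape + (3,))
    fluo_rgb[..., 0] = normalize(fluo)         # first color: red
    auto_rgb = np.zeros(auto.shape + (3,))
    auto_rgb[..., 1] = normalize(auto)         # second color: green
    return np.clip(fluo_rgb + auto_rgb, 0, 1)  # step 714: combine

# Stand-ins for the extracted signals obtained in steps 700-708.
output = process(np.array([[2.0, 0.0]]), np.array([[1.0, 1.0]]))
```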
Some embodiments relate to a microscope comprising a system as described in connection with one or more of the Figs. 1 to 7. Alternatively, a microscope may be part of or connected to a system as described in connection with one or more of the Figs. 1 to 7. Fig. 8 shows a schematic illustration of a system 800 configured to perform a method described herein. The system 800 comprises a microscope 810 and a computer system 820. The microscope 810 is configured to take images and is connected to the computer system 820. The computer system 820 is configured to execute at least a part of a method described herein. The computer system 820 may be configured to execute a machine learning algorithm. The computer system 820 and microscope 810 may be separate entities but can also be integrated together in one common housing. The computer system 820 may be part of a central processing system of the microscope 810 and/or the computer system 820 may be part of a subcomponent of the microscope 810, such as a sensor, an actuator, a camera or an illumination unit, etc. of the microscope 810.
The computer system 820 may be a local computer device (e.g. personal computer, laptop, tablet computer or mobile phone) with one or more processors and one or more storage devices or may be a distributed computer system (e.g. a cloud computing system with one or more processors and one or more storage devices distributed at various locations, for example, at a local client and/or one or more remote server farms and/or data centers). The computer system 820 may comprise any circuit or combination of circuits. In one embodiment, the
computer system 820 may include one or more processors which can be of any type. As used herein, processor may mean any type of computational circuit, such as but not limited to a microprocessor, a microcontroller, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a graphics processor, a digital signal processor (DSP), multiple core processor, a field programmable gate array (FPGA), for example, of a microscope or a microscope component (e.g. camera) or any other type of processor or processing circuit. Other types of circuits that may be included in the computer system 820 may be a custom circuit, an application-specific integrated circuit (ASIC), or the like, such as, for example, one or more circuits (such as a communication circuit) for use in wireless devices like mobile telephones, tablet computers, laptop computers, two-way radios, and similar electronic systems. The computer system 820 may include one or more storage devices, which may include one or more memory elements suitable to the particular application, such as a main memory in the form of random access memory (RAM), one or more hard drives, and/or one or more drives that handle removable media such as compact disks (CD), flash memory cards, digital video disk (DVD), and the like. The computer system 820 may also include a display device, one or more speakers, and a keyboard and/or controller, which can include a mouse, trackball, touch screen, voice-recognition device, or any other device that permits a system user to input information into and receive information from the computer system 820.
Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a processor, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.
Depending on certain implementation requirements, embodiments of the invention can be implemented in hardware or in software. The implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable.
Some embodiments according to the invention comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.
Generally, embodiments of the present invention can be implemented as a computer program
product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer. The program code may, for example, be stored on a machine readable carrier.
Other embodiments comprise the computer program for performing one of the methods described herein, stored on a machine readable carrier.
In other words, an embodiment of the present invention is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.
A further embodiment of the present invention is, therefore, a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing one of the methods described herein when it is performed by a processor. The data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitory. A further embodiment of the present invention is an apparatus as described herein comprising a processor and the storage medium.
A further embodiment of the invention is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein. The data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
A further embodiment comprises a processing means, for example, a computer or a programmable logic device, configured to, or adapted to, perform one of the methods described herein.
A further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein.
A further embodiment according to the invention comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver. The receiver may, for example, be a computer, a mobile device, a memory device or the like. The apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.
In some embodiments, a programmable logic device (for example, a field programmable gate array) may be used to perform some or all of the functionalities of the methods described herein. In some embodiments, a field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein. Generally, the methods are preferably performed by any hardware apparatus.
Reference Numerals (tentative)
100 medical observation device
101 L stereoscopic subassembly for the left channel
101 R stereoscopic subassembly for the right channel
102 digital imaging system
104 eyepiece
106 object
107 biological tissue
108 digital camera
110 digital reflectance camera
111 digital fluorescence-light camera
111a second digital fluorescence-light camera
112 digital fluorescence-light input image
114 digital reflectance input image
116 fluorophore naturally contained in the object
118 fluorophore artificially added to the object
120 another fluorophore artificially added to the object
122 input image data
130 digital input image
132 integral display
134 direct optical path
136 beam splitter
140 extraction/separation routine
142 unmixing routine
144 signal/image combination routine
146 linear transformation
148 transformation matrix
160 digital fluorescence color output image
162 user input device
164 user selection signal
170 data processing device
172 output interface
174 objective
176 color-separation assembly
178 illumination assembly
179 illumination filter
180 illumination beam splitter
182 display
184 field of view
186 computer
187 optical filter assembly
188 observation filter set
190 fluorescence filter set
192 dichroic beam splitter
194 memory
196 data transmission line
198 illumination light
199 light source
200 input image set
202 reflectance signal
204 fluorescence emission signal
212 first digital input image
214 second digital input image
216 first color
218 second color
220 passband
222 overlap
224 autofluorescence signal
230 input pixel
232 output pixel
240 spectrum of first digital input image
242 spectrum of second digital input image
250 spectrum of autofluorescence signal
252 spectrum of reflectance signal
254 spectrum of fluorescence emission signal
270 visible light
272 NIR range
280 extracted fluorescence emission signal
290 extracted autofluorescence signal
300 spectrum of digital color input image
402 sensitivity curve of a first color space coordinate, e.g. B
404 sensitivity curve of a second color space coordinate, e.g. G
406 sensitivity curve of a third color space coordinate, e.g. R
700 recording digital input image data
702 recording a first digital input image
704 recording a second digital input image
706 recording a third digital input image
708 extracting/separating fluorescence emission signal and autofluorescence signal
710 intermediate processing
712 intermediate processing
714 combining and creating digital fluorescence color output image
716 displaying the digital fluorescence color output image
900 system
910 microscope
920 computer
x, B, B1 color space coordinate
y, G, G1 color space coordinate
I intensity
I, II, III... imaging modes
i, j, k coordinate indices
Mik linear transformation matrix
R, R1 color space coordinate
rk color space coordinates
xi color space coordinates
yj color space coordinates
λ wavelength
Claims
1. Data processing device (170) for a medical observation device (100), such as a microscope or an endoscope, for observing an object (106), wherein the data processing device (170) is configured:
- to access input image data (120), the input image data containing
= an autofluorescence signal (224), the autofluorescence signal being representative of fluorescence emitted by a fluorophore (116) naturally contained in the object, and
= a fluorescence emission signal (204), the fluorescence emission signal being representative of fluorescence emitted by a fluorophore (118) artificially added to the object;
- to generate a digital fluorescence color output image (160) from a combination of the fluorescence emission signal colored in a first color (308) and the autofluorescence signal colored in a second color (310), the second color being different from the first color.
2. Data processing device (170) according to claim 1, wherein the input image data (120) comprise a digital color input image (114), wherein the digital color input image contains the fluorescence emission signal (204) and the autofluorescence signal (224); and wherein the data processing device is configured to:
- extract the fluorescence emission signal from the digital color input image prior to coloring the fluorescence emission signal in the first color; and
- extract the autofluorescence signal from the digital color input image prior to coloring the autofluorescence signal in the second color (218).
3. Data processing device (170) according to claim 1 or 2, wherein the data processing device is configured:
- to extract at least one of the autofluorescence signal (224) and the fluorescence emission signal (204) from the input image data (120) by spectral unmixing.
4. Data processing device (170) according to any one of claims 1 to 3, wherein the input image data (120) comprises input pixels (230); wherein the digital fluorescence color output image (160) comprises output pixels (232); wherein the intensity (I) of the first color (216) at an output pixel (232) depends on the intensity (I) of the fluorescence emission signal (204) at at least one corresponding input pixel (230); and wherein the intensity of the second color (218) at an output pixel depends on the intensity of the autofluorescence emission signal (224) at at least one corresponding input pixel.
5. Data processing device (170) according to any one of claims 1 to 4, wherein the data processing device is configured:
- to generate the digital fluorescence color output image (160) in a color space (410) having a set ({R, G, B}) of color space coordinates (402, 404, 406); wherein the first color (216) is represented by a first subset of the set of color space coordinates of the color space; wherein the second color (218) is represented by a second subset of the set of color space coordinates of the color space, the intersection of the first subset and the second subset being empty; and wherein the data processing device is configured to form the combination of the fluorescence emission signal (204) colored in the first color (308) and the autofluorescence signal (224) colored in the second color (310) by forming the union of the first subset and the second subset.
6. Data processing device (170) according to any one of claims 1 to 4, wherein the data processing device is configured to form the combination of the fluorescence emission signal (204) colored in the first color (216) and the autofluorescence signal (224) colored in the second color (218) by performing a vector addition (140) of
the color space coordinates of the fluorescence emission signal colored in the first color and the color space coordinates of the autofluorescence signal colored in the second color at each pixel of the digital fluorescence color output image.
7. Data processing device (170) according to any one of claims 1 to 4, wherein the data processing device is configured to
- form the combination of the fluorescence emission signal (204) colored in the first color (308) and the autofluorescence signal (224) colored in the second color (310) by applying a linear transformation (144) simultaneously to the fluorescence emission signal and to the autofluorescence signal.
8. Data processing device (170) according to claim 7, wherein the linear transformation comprises multiplying the intermediate image with a transformation matrix having a first dimension corresponding to the total number of color space coordinates in the fluorescence emission signal and the autofluorescence signal and a second dimension corresponding to the total number of color space coordinates of the color space of the digital fluorescence color output image.
9. Data processing device (170) according to any one of claims 1 to 8, wherein the data processing device is configured:
- to change the ratio of the intensity of the fluorescence emission signal and the intensity of the autofluorescence signal in the digital fluorescence color output image (160) depending on a user selection signal (164).
10. Data processing device (170) according to any one of claims 1 to 9, wherein the input image data (120) comprise one of
- at least one digital color input image (212),
- at least one digital color input image, and at least one digital monochrome input image (214),
- at least two digital input images (214), which are monochrome images, of the object.
11. Medical observation device (100), such as a microscope or an endoscope, for observing an object (106) which contains at least two fluorophores (116, 118,120), wherein the medical observation device (100) comprises:
- a data processing device (170) according to any one of claims 1 to 9;
- at least one digital camera (110, 111 , 111a), the at least one digital camera being adapted to record the input image data (120).
12. Computer-implemented method for processing input image data (122) in a medical observation device (100), such as a microscope or endoscope, the method comprising the steps of:
- accessing input image data (120) containing
= an autofluorescence signal (224) of an object (106), the autofluorescence signal being representative of fluorescence emitted by a fluorophore (116) naturally contained in the object, and
= a fluorescence emission signal (204) of the object, the fluorescence emission signal being representative of fluorescence emitted by a fluorophore (118) artificially added to the object;
- generating a digital fluorescence color output image (160) from a combination of the fluorescence emission signal colored in a first color (308) and the autofluorescence signal colored in a second color (310), the second color being different from the first color.
13. Method for using a medical observation device, the method comprising the steps of
- recording the input image data (120); and
- carrying out the computer implemented method of claim 12.
14. A computer program product comprising instructions, which, when the program is executed by a computer, cause the computer to carry out the method of claim 12.
15. A computer-readable medium comprising instructions, which, when the program is executed by a computer, cause the computer to carry out the method of claim 12.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102023120559.5 | 2023-08-02 | ||
| DE102023120559 | 2023-08-02 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025027024A1 true WO2025027024A1 (en) | 2025-02-06 |
Family
ID=92212882
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2024/071556 Pending WO2025027024A1 (en) | 2023-08-02 | 2024-07-30 | Data processing device and computer implemented invention for a medical observation device, for visualization of an autofluorescence signal and a fluorescence emission signal |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025027024A1 (en) |