WO2025056578A1 - Data processing device and computer-implemented method for displaying blood oxygenation and concentration values in a medical observation device, and medical observation device and method of use thereof - Google Patents
Data processing device and computer-implemented method for displaying blood oxygenation and concentration values in a medical observation device, and medical observation device and method of use thereof
- Publication number
- WO2025056578A1 (PCT/EP2024/075313)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color
- digital
- input
- image
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
- A61B5/1455—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
- A61B5/14551—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0075—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0084—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for introduction into the body, e.g. by catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
- A61B5/14535—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue for measuring haematocrit
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
- A61B5/1455—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
- A61B5/1459—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue using optical sensors, e.g. spectral photometrical oximeters invasive, e.g. introduced into the body by a catheter
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4887—Locating particular structures in or on the body
- A61B5/489—Blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/742—Details of notification to user or communication with user or patient; User input means using visual displays
- A61B5/7425—Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
Definitions
- the invention relates to a medical observation device, such as a microscope or endoscope.
- this object is addressed by a data processing device for a medical observation device, such as a microscope or endoscope, for imaging a biological object
- the data processing device is configured: to access at least one digital input image, the at least one digital input image being representative of an image of the object formed by light reflected off the object and comprising a plurality of input pixels; to determine a blood concentration value at an input pixel of the plurality of input pixels, the blood concentration value being representative of the amount of blood at a location of the object which location is imaged in the input pixel; to determine a blood oxygenation value at the input pixel, the blood oxygenation value being representative of the amount of deoxyhemoglobin and/or oxyhemoglobin at the location of the object and to generate a digital output color image having a plurality of output pixels; wherein an output pixel of the plurality is generated by assigning a color to the output pixel, the color depending on the blood oxygenation value and the blood concentration value.
- a computer-implemented method for processing images of a medical observation device comprising the following steps: accessing at least one digital input image, the at least one digital input image being representative of an image of the object formed by light reflected off the object and comprising a plurality of input pixels; determining a blood concentration value at an input pixel of the plurality of input pixels, the blood concentration value being representative of the amount of blood at a location of the object, which location is imaged in the input pixel; determining a blood oxygenation value at the input pixel, the blood oxygenation value being representative of the amount of deoxyhemoglobin and/or oxyhemoglobin at the location of the object generating a digital output color image having a plurality of output pixels; and generating an output pixel of the plurality by assigning a color to the output pixel, the color depending on the blood oxygenation value and the blood concentration value.
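The two determining steps and the color assignment above can be sketched end to end as follows. This is a minimal illustration under assumptions chosen from the options the description enumerates (a ratio-of-sums oxygenation, a sum-based concentration, and a red/blue coloring), not the patented implementation itself.

```python
import numpy as np

def generate_output_image(oxy, deoxy):
    """Sketch of the claimed per-pixel pipeline (illustrative assumptions:
    ratio-of-sums oxygenation, sum-based concentration, red/blue coloring).
    `oxy` and `deoxy` are per-pixel oxy-/deoxyhemoglobin maps."""
    # blood concentration value: aggregate of oxy- and deoxyhemoglobin
    concentration = oxy + deoxy
    # blood oxygenation value: oxygen saturation in [0, 1]
    oxygenation = oxy / np.maximum(concentration, 1e-9)
    out = np.zeros(oxy.shape + (3,))
    out[..., 0] = oxygenation          # red channel follows oxygenated blood
    out[..., 2] = 1.0 - oxygenation    # blue channel follows deoxygenated blood
    # the concentration scales the intensity, so vessel areas stand out
    out *= (concentration / max(concentration.max(), 1e-9))[..., None]
    return out
```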
- the above device and method facilitate the identification of blood vessels as the blood vessels are now not only made visible by the blood oxygenation value, i.e. the amount of oxygen contained in the blood, or, synonymously, the oxygen saturation of the blood, at a location of the object, but also by the blood concentration value.
- the above-identified device and method may be further improved by any one of the following features which may be arbitrarily combined with one another, each feature having a technical effect of its own.
- Each of the following features may be used both in connection with the data processing device and the computer-implemented method, independent of whether the particular feature has been described in the context of the data processing device or the computer-implemented method. If, for example, a feature has been described in the context of the computer- implemented method, the data processing device may be likewise configured to execute the feature. If a feature has been described in the context of the data processing device, it may be executed as a step in the computer-implemented method.
- the input image data may contain a plurality of input pixels. Each input pixel represents the light received from a specific location on the object. Each input pixel corresponds to a specific location on the object. If the input image data contain more than one digital input image, a specific location on the object may be represented by more than one input pixel. Input pixels that represent the same location on the object are termed “corresponding pixels” as is usual in the art.
- the input pixels may be color pixels or monochrome pixels.
- a color input pixel comprises color space coordinates that define the color of the input pixel using, e.g., a color space.
- the color space coordinates represent the color appearance parameters, such as hue, lightness, brightness, chroma, colorfulness and saturation. If a tristimulus color space such as RGB is used, each input pixel comprises three color space coordinates R, G, B.
- the color space coordinate R defines the intensity of the red color band
- the color space coordinate G defines the intensity of the green color band
- the color space coordinate B defines the intensity of the blue color band.
- Other color spaces, such as HSV, CIELAB or CIELUV, use other color space coordinates.
- a monochrome input pixel just indicates light intensity in the spectral band where the light is recorded. It thus has only a single color space coordinate.
- If a digital input image is a color image, it is termed a “digital color input image”.
- a multispectral or hyperspectral image is considered a color image.
- If a digital input image is a monochrome image, it is termed a “digital monochrome input image”. If, in a specific context, it does not matter whether the digital input image is a color or a monochrome image, the generic term “digital input image” is used.
- a “digital input image” may thus be a “digital color input image” or a “digital monochrome input image”. The same terminology is used for the digital output image.
- the color that is assigned to the output pixel may be a pseudocolor or a false color.
- With a pseudocolor, a different hue may be assigned to the output pixel depending on the blood oxygenation value and/or the blood concentration value.
- With a false color, the same hue, but at a different intensity, may be assigned to the output pixel depending on the blood oxygenation value and/or the blood concentration value.
- the color assigned to the output pixel may be black or have a very low intensity if the blood concentration value is smaller than a predetermined threshold. Using the latter may help to identify blood vessels, as areas in which the blood concentration value is below the threshold are less visible or even blackened out.
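A minimal sketch of this blackening step, with an assumed threshold value:

```python
import numpy as np

def apply_concentration_threshold(colors, concentration, threshold=0.1):
    """Sets the assigned color to black wherever the blood concentration
    value falls below a predetermined threshold, so that low-blood areas are
    blacked out and vessels remain prominent. The threshold value is an
    illustrative assumption."""
    out = colors.copy()
    out[concentration < threshold] = 0.0
    return out
```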
- any color appearance parameter may be assigned to the output pixel depending on the blood oxygenation and the blood concentration value.
- the color appearance parameters comprise any one of hue, lightness, brightness, chroma, colorfulness and saturation.
- the output pixel and the input pixel are preferably corresponding pixels.
- the digital output color image and the at least one digital input image have the same number of pixels and/or the same aspect ratio and/or are represented in the same color space.
- the blood concentration value and/or the blood oxygenation value are determined solely based on the at least one digital input image.
- the digital output image may comprise a plurality of output pixels.
- the digital output image may be a monochrome image or a color image.
- the output pixels may be monochrome or color. For each output pixel, there is at least one corresponding input pixel in the input image data.
- the at least one digital input image is a set of two or more monochrome images, a set comprising at least one digital color input image and at least one digital monochrome input image or a set comprising at least one digital color input image.
- the blood concentration value may be considered indicative of the presence of a blood vessel, where blood is transported and thus accumulated. Therefore, according to another aspect, the data processing device may be configured to assign a hue to the output pixel depending on the blood oxygenation value of the input pixel and an intensity, lightness and/or saturation to the output pixel depending on the blood concentration value of the input pixel when assigning the color to the output pixel.
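One way to realize this assignment is through the HSV color space: the hue is taken from the blood oxygenation value and the value (brightness) from the blood concentration value. The blue-to-red hue range below is an illustrative assumption, not prescribed by the text.

```python
import colorsys

def assign_color(oxygenation, concentration):
    """Maps a blood oxygenation value in [0, 1] to a hue (blue for
    deoxygenated, red for oxygenated blood) and a blood concentration value
    in [0, 1] to the brightness of the output pixel."""
    hue = (1.0 - oxygenation) * (240.0 / 360.0)   # 0 deg = red, 240 deg = blue
    value = min(max(concentration, 0.0), 1.0)      # brightness from concentration
    return colorsys.hsv_to_rgb(hue, 1.0, value)
```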
- the at least one digital input image is recorded in as many color bands as possible.
- each color band may be represented as a color space coordinate in a color space, this is equivalent to the at least one digital input image being recorded in a color space having as many color space coordinates as possible.
- the at least one digital input image may be a digital multispectral input image.
- the term multispectral input image also comprises hyperspectral input images.
- the multispectral input image may be represented in a color space having more than three or more than four spectral bands or color space coordinates, respectively.
- the multispectral input image comprises six color bands, or color space coordinates, respectively.
- the digital multispectral input image may be generated from a plurality of digital color input images, which are preferably recorded in different imaged spectra.
- Each of the plurality of digital color input images may represent a reflectance image of the object.
- each of the plurality of digital color input images may have been recorded in a color space having three color space coordinates, such as an RGB or HSV color space.
- the digital color input images are registered with respect to each other.
- pixels with corresponding locations in the images represent corresponding locations of the imaged object.
- the registered images may be obtained using an identical field of view, identical optical axes and identical focal lengths when the plurality of digital input images is recorded by a corresponding number of digital cameras.
- the digital input images may be registered using software.
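A common software approach for translation-only registration is phase correlation; the sketch below is one illustrative possibility, not a method prescribed by the text, and assumes a pure integer-pixel translation between the images.

```python
import numpy as np

def estimate_translation(ref, moving):
    """Estimates the integer pixel shift that aligns `moving` onto `ref`
    via np.roll, using phase correlation. Real registration may also need
    rotation and scale handling."""
    f1 = np.fft.fft2(ref)
    f2 = np.fft.fft2(moving)
    cross_power = f1 * np.conj(f2)
    cross_power /= np.abs(cross_power) + 1e-12   # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the image size to negative values
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx
```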
- the plurality of digital input images that are used to generate the multispectral input image are preferably all represented in the same color space, such as an RGB or any other color space.
- the digital multispectral input image may comprise a plurality of input pixels, each input pixel of the multispectral input image comprising a set of color space coordinates.
- the set of color space coordinates of a multispectral image may be obtained in one case by forming a union set of the set of color space coordinates of the corresponding color input pixels of the plurality of digital color images.
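Since the images are registered, forming this union set amounts to concatenating the channel axes of the color input images:

```python
import numpy as np

def build_multispectral(images):
    """Forms the multispectral input image as the union of the color space
    coordinates of registered color input images: channel axes are
    concatenated, so two RGB images yield six color bands per pixel."""
    return np.concatenate(images, axis=-1)
```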
- each of the plurality of digital input images represents a different spectrum.
- the spectra of different digital input images of the plurality of digital input images do not overlap or at least have only a minimum overlap.
- the spectra of the different digital input images of the plurality of digital input images may be complementary to each other.
- the nonoverlapping image spectra of the plurality of digital input images complement each other to form a continuous spectrum.
- Each imaged spectrum may comprise at least one passband and at least one stopband.
- where one digital input image of the plurality has a passband, the other digital input images of the plurality may all have a stopband.
- At least one of the imaged spectra may comprise a passband which comprises, is limited to or contained within the fluorescence emission spectrum of at least one fluorophore contained in the biological object.
- the at least one fluorophore may be a fluorophore that is naturally contained in the biological object. Additionally or cumulatively, the at least one fluorophore may have been artificially added to the biological object. Examples of fluorophores that may have been artificially added to the biological object are ICG, fluorescein or 5-ALA/PpIX. For example, artificially added fluorophores may have been injected into the patient.
- an imaged spectrum of the at least two digital color input images that are used to generate the digital multispectral input image may comprise NIR (near infrared) wavelengths.
- the NIR wavelengths may be used to add additional reflectance information that may allow more accurately determining the blood concentration and oxygenation value.
- the data processing device may be configured to determine a deoxyhemoglobin value at the input pixel, where the deoxyhemoglobin value is representative of the concentration of deoxyhemoglobin at the location of the object represented by the input pixel.
- the deoxyhemoglobin value may be obtained for example by spectral unmixing. In spectral unmixing, a reference reflectance spectrum of deoxyhemoglobin may be used as an endmember.
- the data processing device may be configured to determine an oxyhemoglobin value at the input pixel, where the oxyhemoglobin value is representative of the concentration of oxyhemoglobin at the location of the object represented by the input pixel.
- the oxyhemoglobin value may be determined for example by spectral unmixing, using the reflectance spectrum of oxyhemoglobin as an endmember. If spectral unmixing is used to determine the deoxyhemoglobin or oxyhemoglobin value, the values of the signal descriptors obtained from the spectral unmixing may be used to determine the blood concentration value. The larger the contribution of the endmember to the spectrum at the input pixel, the larger the blood concentration value may be.
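A minimal linear-unmixing sketch of this step. The endmember spectra in the test are made-up placeholders, and ordinary least squares with clipping stands in for a proper non-negative solver:

```python
import numpy as np

def unmix(pixel_spectrum, endmembers):
    """Linear spectral unmixing sketch: the measured reflectance spectrum at
    an input pixel is modeled as a linear mix of endmember reference spectra
    (e.g. oxyhemoglobin and deoxyhemoglobin). The returned abundances act as
    the signal descriptors: the larger an endmember's contribution to the
    spectrum, the larger the corresponding hemoglobin value."""
    # endmembers has shape (n_endmembers, n_bands); solve for abundances
    coeffs, *_ = np.linalg.lstsq(endmembers.T, pixel_spectrum, rcond=None)
    return np.clip(coeffs, 0.0, None)   # abundances cannot be negative
```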
- the oxyhemoglobin value and/or the deoxyhemoglobin value at the location of the object corresponding to the input pixel may be obtained by a different modality.
- the data processing device may be configured to determine the blood oxygenation value at the input pixel by computing a discrepancy between the oxyhemoglobin value at the input pixel and the deoxyhemoglobin value at the input pixel.
- the discrepancy may be a ratio, a difference, or a combination of ratio and difference of the oxyhemoglobin value and the deoxyhemoglobin value.
- the blood oxygenation value is thus representative of the oxygen saturation at the location of the object which is mapped or imaged at the input pixel.
- the data processing device may be configured to compute the blood concentration value at the input pixel by computing an aggregate value of the oxyhemoglobin value at the input pixel and the deoxyhemoglobin value at the input pixel.
- the aggregate value may be a sum, a product, or a combination of sum and product of the oxyhemoglobin value and the deoxyhemoglobin value. It is thus representative of the amount of hemoglobin at the location. The amount of hemoglobin at a location in turn is representative for the amount of blood at this location.
- the digital output color image may be combined with the at least one digital input image, which represents the reflectance of the object.
- the digital output image may be overlaid on one of the digital input images.
- the digital input image which is combined with the digital color output image may itself have been generated by a combination of a plurality of digital input images each being representative of the light reflected off the object preferably in a spectrum.
- the multispectral input image generated from the plurality of input images may be converted into a color input image having three color space coordinates, such as an RGB color image, before combining it with the digital color output image.
- the data processing device may be configured to generate the output pixel by mixing the color assigned to the output pixel with the color of the corresponding input pixel.
- the mixing may comprise alpha blending, such as assigning a transparency to the color of the output pixel, where the transparency may depend on the intensity of the color, and overlaying the color of the output pixel onto the color of the input pixel.
- the mixing may comprise a vector addition of the color space coordinates of the color assigned to the output pixel and the color space coordinates of the color of the input pixel.
- mixing may comprise a linear transformation, which maps the color space coordinates of the pseudocolor and the color space coordinates of the input pixel to a set of color space coordinates of the output pixel, such as RGB color space coordinates.
- the linear transformation may comprise a color conversion matrix, which is multiplied with the color space coordinates of the color assigned to the output pixel and with the color space coordinates of the corresponding input pixel.
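An intensity-driven alpha-blending sketch of this mixing step, with the transparency derived from the intensity of the pseudocolor as the text suggests:

```python
import numpy as np

def blend_overlay(input_rgb, output_rgb, alpha=None):
    """Overlays the pseudocolor assigned to the output pixel onto the
    reflectance color of the corresponding input pixel. By default the
    opacity follows the intensity of the pseudocolor: strong pseudocolors
    dominate, weak ones let the reflectance image show through."""
    if alpha is None:
        alpha = output_rgb.max(axis=-1, keepdims=True)  # intensity-driven opacity
    return alpha * output_rgb + (1.0 - alpha) * input_rgb
```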
- the invention also relates to a medical observation device, such as an endoscope or a microscope for imaging a biological object, wherein the medical observation device comprises a data processing device according to any of the embodiments and/or comprising any of the features described above, and at least one color camera for recording the digital input image.
- a medical observation device such as an endoscope or a microscope for imaging a biological object
- the medical observation device comprises a data processing device according to any of the embodiments and/or comprising any of the features described above, and at least one color camera for recording the digital input image.
- the medical observation device may, in a further embodiment, comprise two color cameras, which are configured to record two digital color input images in two different imaged spectra, which are preferably non-overlapping and/or complementary.
- the medical observation device may further comprise a third camera for recording a third digital input image in an imaged spectrum, which is complementary to the imaged spectra of the two digital cameras.
- At least one camera may be configured to record the fluorescence emission of a fluorophore contained in the biological object.
- the imaged spectrum that is recorded by this camera preferably comprises a passband which comprises or is limited to the fluorescence emission spectrum of the at least one fluorophore.
- At least one camera of the medical observation device may be configured to record light in the NIR range. Such a camera may be configured to also record the fluorescence emission of a fluorophore.
- the medical observation device may comprise an illumination system that is configured to illuminate the object with a standard illuminant. Further, the illumination system may be configured to illuminate the object in the NIR range. In one example, the illumination system is configured to illuminate the object using a bandpass which is configured to pass light having a wavelength larger than about 600 nm, in particular larger than about 750 nm.
- One camera of the medical observation device may be a digital fluorescence-light color camera which is configured to record a first color image which is representative of a reflectance image of the object in a first imaged spectrum, the first imaged spectrum comprising wavelengths that are comprised in the fluorescence spectrum of at least one fluorophore in the biological object.
- the first imaged spectrum may comprise the NIR range.
- the digital fluorescence-light color camera may be a camera which is usually employed to record a fluorescence image of a fluorophore.
- the medical observation device may comprise a digital white-light color camera adapted to record a second color image in a second imaged spectrum complementary to the first spectrum.
- the medical observation device may comprise a first optical observation filter set configured to transmit the first imaged spectrum and a second optical observation set configured to transmit the second imaged spectrum.
- Passbands in the second imaged spectrum correspond preferably to stopbands in the first imaged spectrum.
- the stopbands in the second imaged spectrum thus comprise the fluorescence excitation spectrum of the at least one fluorophore.
- the invention further relates to a method for operating a medical observation device, such as a microscope or an endoscope, for observing a biological object, wherein the method comprises the steps of recording at least one digital input image, which is preferably representative of light reflected off the object and carrying out the computer-implemented method in any of the embodiments described above.
- a medical observation device such as a microscope or an endoscope
- the invention relates to a computer program product and/or a computer-readable medium comprising instructions, which, when the program is executed by a computer, such as the data processing device described above, causes the computer to carry out the computer-implemented method in any of its embodiments described above.
- Fig. 1 shows a schematic representation of a medical observation device for generating a digital color output image from at least one digital color input image
- Fig. 2 shows a schematic representation of the generation of a digital color output
- Fig. 3 shows another schematic representation of the generation of a digital color output image
- Fig. 4 shows a schematic representation of the generation of a digital multispectral input image from two digital color input images
- Fig. 5 shows a schematic representation of the method of generating a digital color output image
- Fig. 6 shows a schematic representation of a microscope system.
- Fig. 1 shows schematically a medical observation device 100.
- the medical observation device 100 may be a microscope or an endoscope, the difference between a microscope and an endoscope being primarily that, in an endoscope (not shown), an object 106 is viewed through optical fibers that are brought into vicinity of the object 106 to be investigated, e.g. by insertion into a body containing the object, whereas, in a microscope, an objective 174 is directed onto the object.
- the medical observation device of Fig. 1 is a microscope, the following description also applies to an endoscope.
- the medical observation device 100 may be a medical observation device used in surgery.
- the medical observation device 100 may also be a medical observation device used in a laboratory, such as a laboratory microscope.
- the object 106 to be investigated may consist of or comprise biological tissue 107.
- the object 106 may be a part of a patient’s body that is located within the field of view of the medical observation device 100.
- the object 106 may contain one or more fluorophores 116, 118, 120.
- At least one fluorophore 116 may be a fluorophore that is naturally contained in the object.
- At least one fluorophore 118, 120 may be artificially added to the object 106, e.g. by injecting it into the biological tissue 107.
- fluorophores 118, 120 that may be artificially added to the object 106 are ICG, fluorescein and/or 5-ALA.
- the medical observation device 100 as shown may be a fluorescence imaging device.
- the medical observation device may be configured to view and preferably also excite the fluorescence of the one or more fluorophores 116, 118, 120.
- the medical observation device 100 may be a stereoscopic device, as is exemplarily shown in Fig. 1. It may thus comprise two identical subassemblies 101L and 101R for each of the two stereoscopic channels. As the two subassemblies 101L, 101R are identical with respect to function and structure, the following description focuses on the right subassembly 101R, but applies identically to the left stereoscopic channel 101L.
- the medical observation device 100 may alternatively be a monoscopic device. In this case, only one of the two subassemblies 101L, 101R may be present. For a monoscopic medical observation device 100, the following description therefore applies as well.
- the medical observation device 100 in operation provides input image data 122.
- the input image data 122 are representative of an imaged scene, i.e. the part of the object that is within the field of view 184 of the medical observation device 100.
- the input image data 122 may comprise one or more different digital input images 130, which in particular may be different digital color input images 130, different digital monochrome input images 130 or a combination of at least one digital color input image 130 and at least one digital monochrome input image 130. If the input image data 122 contain a plurality of digital input images 130, the different digital input images 130 should contain different spectral information. In such a case, each digital input image 130 of the input image data may be recorded at different wavelengths, preferably with no or at most minimal spectral overlap.
- the imaged spectra, in which the different digital input images 130 of the input image data 122 are recorded are non-overlapping. According to another aspect, the imaged spectra are complementary. In this case the stopbands and passbands of the imaged spectra complete each other to form seamlessly or at least almost seamlessly a continuous input spectrum which is split into the imaged spectra.
- the digital imaging system 102 may comprise one or more digital cameras 108, which are preferably digital color cameras.
- the number of digital input images 130 contained in the input image data 122 may depend on, in particular be equal to, the number of cameras 108 used for generating the input image data 122.
- a digital input image 130 may be a color image or a monochrome image.
- the medical observation device 100 may be configured to record in the input image data 122 both the fluorescence of the fluorophore 116 that occurs naturally in the object, and the fluorescence of at least one fluorophore 118, 120 that has been added artificially.
- the one or more fluorophores 118, 120 may have been injected into a patient’s body to mark specific areas of interest, such as tumors.
- For recording a fluorescing fluorophore, at least one of the digital cameras 108, a digital fluorescence-light camera 111, has an imaged spectrum that comprises or is contained in the (known) fluorescence emission spectrum of the fluorophore of which fluorescence is to be recorded.
- the medical observation device 100 may also be configured to record the light reflected off the object in the input image data 122. This may be done simultaneously or sequentially to recording the fluorescence emission of the fluorophore.
- another digital camera may be used than the one for recording the at least one fluorescing fluorophore 116, 118.
- This digital camera 108, a digital reflectance camera 110, may be used to provide a preferably white-light reflectance image of the object 106, in particular in the wavelengths excluding those recorded by the other digital camera 108, i.e. the fluorescence emission spectrum or part thereof.
- the medical observation device 100 is configured to use both the digital camera that is used for recording the fluorescence emission of the at least one fluorophore, and the digital camera that is used for recording the reflectance image, to record a reflectance image of the object.
- the reflectance image recorded by the two or more cameras 108 has additional spectral information over a reflectance image recorded by just one camera 108.
- the medical observation device 100 may be configured to generate a digital multispectral reflectance input image from the digital input images 130 generated from the two or more digital cameras 108.
- the digital imaging system 102 may comprise in one embodiment as digital cameras 108 a digital reflectance camera 110 and one or more digital fluorescence-light cameras 111 , 111a.
- a second (or third) digital fluorescence-light camera 111a is optional.
- Each fluorescence-light camera should record light in a different, preferably non-overlapping imaged spectrum that is preferably complementary to all other imaged spectra.
- the second digital fluorescence-light camera 111a is only shown in the left stereoscopic channel 101 L, but of course may also be present in the right stereoscopic channel 101 R.
- the digital fluorescence-light camera of one stereoscopic channel may be used as the (first) digital fluorescence-light color camera 111 and the digital fluorescence-light camera of the other stereoscopic channel may be used as the second fluorescence-light camera 111a.
- the cameras 110, 111 , 111a may each be a color camera or a monochrome camera. A multispectral camera or a hyperspectral camera is considered a color camera.
- the digital reflectance camera 110 is configured to record a digital reflectance input image 114, i.e. a digital input image 130, which is representative of the reflectance of the object 106.
- the digital reflectance camera 110 is preferably configured to record a digital input image 130 in a wide spectral range within the visible light spectrum.
- the digital input image 130 recorded by the digital reflectance camera represents closely the natural colors of the object 106. This is important if the digital reflectance camera 110 is used to provide the user with an image of the object which comes as close as possible to the human perception of the object.
- the digital reflectance camera 110 may be a CCD, CMOS or multispectral or hyperspectral camera.
- Each of the at least one digital fluorescence-light camera 111 , 111a is configured to record a different digital fluorescence-light image 112, i.e. a digital input image 130, which is recorded in the fluorescence spectrum or the fluorescence spectra of the at least one fluorophore 116, 118, 120.
- Each fluorescence-light camera 111 , 111a may be configured to record the fluorescence of a different fluorophore.
- the fluorescence-light camera 111 may be configured to record the digital fluorescence-light image only in one or more narrow bands of light. These narrow bands should overlap the fluorescence spectrum or spectra of the one or more fluorophores 116, 118, 120 of which fluorescence is to be recorded.
- the fluorescence spectra of the different fluorophores 116, 118, 120 are at least partly separate, preferably completely separate, i.e. non-overlapping, so that the fluorescence-light camera 111 may record a digital color input image 130 representing two separate fluorescence bands that are spaced from one another.
- each fluorescence-light camera 111 , 111a preferably captures the fluorescence emission of a different fluorophore.
- the at least one fluorescence-light camera 111 , 111a may be a monochrome camera, a CCD, CMOS or multispectral or hyperspectral camera.
- the white-light color camera 110 and the at least one fluorescence-light color camera 111 are of the same type, although this is not necessary.
- the digital reflectance camera 110 and the at least one fluorescence-light camera 111 , 111a may be used to record a reflectance input image of the object 106.
- any combination of the cameras 110, 111 and 111a may be combined into a single multispectral or hyperspectral camera, either virtually, in that the separate images recorded by the cameras 110, 111, 111a are processed as a single multispectral image, or as a single real multispectral camera which performs the functions of the different cameras 110, 111, 111a.
- the respective fields of view 184 of the cameras 110, 111, and if present 111a, are preferably aligned or even coinciding and coaxial. It is preferred that the cameras 110, 111 provide the identical field of view 184 with the identical perspective and focal length. This results in identical representations of the object 106 in the images 112, 114 generated by the different cameras 110, 111.
- Both cameras 110, 111 may use the same objective 174.
- If a match of the perspectives and fields of view cannot be generated optically, it may be generated by image processing by applying a matching or registering routine to the digital input images 130, as is explained further below.
- the cameras 110, 111 , and, if present, 111 a are operated synchronously. Specifically, the exposure times may be synchronized.
- the medical observation device 100 may be configured to generate the digital input images 130 at the same time.
- the gain of the at least two cameras 110, 111 , 111a is synchronized, i.e. adjusted in the at least two cameras 110, 111 , 111a at the same time.
- the ratio of the gain applied in camera 110 to the gain applied in camera 111 and, if present, in camera 111a may be constant, even if the gain is changed.
- the gamma correction and color adjustment or white balance may be switched off or kept constant.
- For separating the light recorded in the digital reflectance input image 114 from the spectrum recorded in the at least one digital fluorescence-light input image 112, i.e. for separating the reflectance spectrum from the fluorescence spectrum, an optical color-separation assembly 176 may be provided.
- the color-separation assembly 176 may comprise optical elements such as a beam splitter 192, which may be dichroic.
- the color separation assembly 176 may further or alternatively comprise an optical observation filter set 188 and/or an optical fluorescence filter set 190.
- the fluorescence filter set 190 is preferably configured to transmit light in the fluorescence spectrum or spectra of the one or more fluorophores 116, 118, 120 and to block light outside the fluorescence spectrum or spectra.
- the fluorescence filter set 190 may comprise one or more optical band-pass filters comprising one or more passbands. Each passband should overlap the fluorescence emission spectrum of a respective fluorophore 116, 118, 120 of which the fluorescence is to be recorded. As the fluorescence-light filter set 190 is in the light path between the beam splitter 192 and the fluorescence-light color camera 111, only the wavelengths in the passbands of the fluorescence-light filter set 190 are transmitted to the fluorescence-light color camera 111.
- the fluorescence filter set 190 may comprise a different optical band-pass filter in front of each of the fluorescence-light color cameras 111 , 111a.
- the passband of one band-pass filter may be contained in the fluorescence-emission spectrum of one fluorophore 116, whereas the passband of the other band-pass filter may be contained in the fluorescence-emission spectrum of another fluorophore 116, 118 in the object 106.
- the observation filter set 188 is preferably configured to block light in the fluorescence spectrum or spectra of the one or more fluorophores 116, 118.
- the observation filter set 188 may also be configured to block light in the fluorescence-excitation spectrum.
- the observation filter set 188 is preferably configured as a band-stop filter, of which the stopbands correspond to or at least contain the passbands of the fluorescence-light filter set 190.
- the observation filter set 188 is located in the light path between the beam splitter 192 and the whitelight camera 110.
- the white-light camera 110 records only wavelengths that are outside the stopbands of the observation filter set 188 and therefore also outside the passbands of the fluorescence-light filter set 190.
- Any one of the observation filter set 188 and the fluorescence filter set 190 may be a tunable filter.
- if the beam splitter 192 is a dichroic beam splitter, at least one of the filter sets 188, 190 may be omitted, as the optical spectral filtering in this case is already integrated in the dichroic beam splitter.
- the above description of the passbands and stopbands then should apply mutatis mutandis to the dichroic beam splitter 192.
- the medical observation device 100 may further comprise an illumination assembly 178, which is configured to illuminate the object 106 preferably through the objective 174 through which the imaging system 102 records the at least one digital image 112, 114.
- the illumination assembly 178 may be configured to selectively generate white-light, i.e. light that is evenly distributed across the entire visible spectrum, and fluorescence-excitation light, which contains light only in wavelengths that stimulate fluorescence of the at least one fluorophore 116, 118.
- the illumination light generated by the illumination assembly 178 may be fed into the objective 174 using an illumination beam splitter 180.
- the illumination assembly 178 may be configured to generate illumination light simultaneously in a plurality of discrete, in particular narrow-band, wavelength bands. These wavelength bands may comprise any or any combination of the following wavelength bands.
- One such discrete wavelength band may be entirely located in the fluorescence-excitation spectrum of a fluorophore 116.
- Another such wavelength band may be entirely located in the fluorescence-emission spectrum of another fluorophore 118.
- Another such wavelength band may be limited to wavelengths larger than 700 nm and be entirely located in the NIR range.
- the simultaneous illumination of the object with any of the discrete wavelength bands as described above may be accomplished by a light source 199, e.g. a tunable light source such as a light source comprising a plurality of LEDs in different colors, in particular in different primary colors, which is configured to generate light in these wavelength bands simultaneously.
- the wavelength bands may be generated by using an illumination filter 179 having multiple passbands, wherein the passbands preferably correspond to the above wavelength bands. If such an illumination filter 179 is used, the light source 199 may generate white-light, which is then filtered by the illumination filter 179 so that only the light in the passbands illuminates the object 106.
- the illumination filter 179 may be provided depending on the at least one fluorophore, of which fluorescence is to be triggered, and its specific excitation spectrum. For example, if 5-ALA is used as a fluorophore, the illumination filter may have a transmission of 90 % to 98 % up to wavelengths of 425 nm, a transmission between 0.5 % and 0.7 % in wavelengths between 450 nm and 460 nm, a transmission of not more than 0.1 % between 460 nm and 535 nm and of practically zero for wavelengths above 535 nm.
- the illumination filter 179 may be configured for pass-through of NIR light.
- the illumination filter 179 may comprise a passband in the NIR.
- the illumination filter 178 may further comprise a passband, which is preferably entirely located in the fluorescence-excitation spectrum of another fluorophore.
- the medical observation device 100 may be adjusted to a different fluorophore or set of fluorophores by re-configuring the color-separation assembly 176, e.g. by exchanging its optical elements, such as the filter sets 188 and/or 190, or the dichroic beam splitter 192.
- the input image data 122 are processed by a data processing device 170.
- the data processing device 170 may be an integral part of the medical observation device 100.
- the data processing device may be a processor, which is embedded in the medical observation device and also used as a controller for controlling the hardware of the medical observation device 100, such as the brightness and/or spectral emission of the light source 199 and/or any objective of the medical observation device 100 and/or any actuators of the medical observation device 100.
- the data processing device 170 is part of a general computer, which is connected to the medical observation device for unidirectional or bidirectional data transfer by wire or wirelessly.
- the data processing device 170 may be a hardware module, such as a microprocessor, or a software module.
- the data processing device 170 may also be a combination of both a hardware module and a software module, for example by using software modules that are configured to be run on a specific processor, such as a vector processor, a floating point graphics processor, a parallel processor and/or on multiple processors.
- the data processing device 170 may be part of a general-purpose computer 186, such as a PC.
- the data processing device 170 is an embedded system or embedded processor of the medical observation device 100.
- the data processing device 170 is configured to access the input image data 122, e.g. in the form of the one or more digital input images 130, such as the digital white-light color input image 114 and the digital fluorescence-light image 112.
- the data processing device 170 may be configured to retrieve the digital input images 130 from a memory 194 and/or directly from the cameras 110, 111 and if present 111a.
- the memory 194 may be part of the data processing device 170 or reside elsewhere in the medical observation device 100.
- the data processing device 170 is further configured to compute a digital color output image 160 from the input image data 122.
- the digital color output image 160 is a color image, which is represented in a color space.
- the color space of the digital color output image may be different from the color space of any digital color input image that is contained in the input image data 122.
- the color space of the digital color output image 160 is the same color space as that of any of the digital color input images 130.
- a color space comprises at least three such color channels. In the color space, each color channel is represented by a different color space coordinate. For conversion between different color spaces, color space transformations may be used. In a different color space, the same color is represented by different color space coordinates.
- Each pixel of the digital color input images 130 comprises a set of color space coordinates that together represent the color of the respective pixel.
- Each color band thus may be regarded as representing a color space axis and each color may be regarded as a point in color space, which is defined by the vector — i.e. the color space coordinates — pointing to this color.
- adding two colors corresponds to a vector addition. If one color has color space coordinates {x1, y1, z1} and a second color has color space coordinates {x2, y2, z2}, then the sum of these two colors corresponds to the color {x1+x2, y1+y2, z1+z2}.
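The vector addition of colors described above can be sketched in a few lines. This is a minimal illustration, not part of the disclosed device; the function name `add_colors` and the clipping to the 8-bit channel maximum are assumptions for the example.

```python
# Minimal sketch: a color as a vector of color space coordinates.
# Adding two colors is a component-wise vector addition; here the sum
# is clipped to the channel maximum (255 for 8-bit RGB channels).

def add_colors(c1, c2, channel_max=255):
    """Component-wise sum of two color tuples, clipped to channel_max."""
    return tuple(min(a + b, channel_max) for a, b in zip(c1, c2))

red = (200, 0, 0)
dark_blue = (0, 0, 100)
print(add_colors(red, dark_blue))  # (200, 0, 100)
```

The same component-wise addition works unchanged for an n-dimensional multispectral color space, since `zip` iterates over however many coordinates the tuples carry.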
- the digital color input images 130 or, more generally the input image data 122 may be recorded in RGB color space using the three primary colors or color bands — or color space coordinates — R, G, B.
- the digital color input image 130 may be recorded in different color spaces, respectively, and/or represent multispectral or hyperspectral color input images.
- the digital input images 130 of a set of digital input images, such as the digital whitelight color input image 114 and the digital fluorescence color input image 112 need not be recorded in the same color space, although this is preferred.
- in RGB color space, each color is represented by a triple of three color space coordinates in the form of integer numbers, wherein each integer number indicates the intensity of one of the primary colors R, G, B.
- the most intense red is indicated by the triple [255, 0, 0]
- the most intense green color is indicated by [0, 255, 0]
- the most intense blue by [0, 0, 255]
- RGB color space is a three-dimensional space
- CMYK color space would be a four-dimensional space.
- a color can be considered as a point in color space to which a vector such as [0, 0, 255] points.
- a multispectral or hyperspectral color space having n color bands would correspondingly result in an n-dimensional color space, in which each color is represented by an n-tuple of color space coordinates.
- the data processing device 170 may comprise a routine 140 for determining oxyhemoglobin values and/or deoxyhemoglobin values at a pixel in the at least one digital input image 130 as e.g. described in Hashimoto M. et al. (1987): “Color analysis method for estimating the oxygen saturation of hemoglobin using an image-input and processing system”, Analytical Biochemistry, 162(1), p. 178-184.
- the accuracy of determining oxyhemoglobin and/or deoxyhemoglobin values is improved if more than one digital (color) image with different imaged spectra, or a multi- or hyperspectral image, is used, as there are then more color bands available for spectral analysis.
- the oxyhemoglobin value is representative of the concentration or amount of oxyhemoglobin at a location of the object, which is mapped onto or imaged in the input pixel.
- the deoxyhemoglobin value is representative of the concentration or amount of deoxyhemoglobin at the location of the object, which is mapped onto or imaged in the input pixel.
- the data processing device 170 may comprise a routine 142 for determining a blood oxygenation value at an input pixel.
- the blood oxygenation value is representative of how saturated the blood is with oxygen. It may thus be representative of the concentration or amount of deoxyhemoglobin and oxyhemoglobin at the location of the object which is mapped onto the input pixel.
- the routine 142 may be configured to obtain the blood oxygenation value from the oxyhemoglobin value and the deoxyhemoglobin value at an input pixel.
- the blood oxygenation value at a pixel may be representative of a discrepancy between the deoxyhemoglobin value and the oxyhemoglobin value at this input pixel, such as a ratio or the difference of the oxyhemoglobin value and the deoxyhemoglobin value, or any combination of the ratio and the difference.
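One possible realization of the blood oxygenation value described above can be sketched as follows. The source allows a ratio, a difference, or any combination of the two; the fractional saturation HbO2 / (HbO2 + Hb) shown here is one common choice, and the function name and epsilon guard are assumptions for the example.

```python
# Sketch of routine 142 under the assumption that the blood oxygenation
# value is computed as the fraction of oxygenated hemoglobin at one
# input pixel. A small epsilon avoids division by zero where no
# hemoglobin was detected.

def blood_oxygenation(hbo2: float, hb: float, eps: float = 1e-9) -> float:
    """Return the fraction of oxyhemoglobin in the total hemoglobin."""
    return hbo2 / (hbo2 + hb + eps)

print(blood_oxygenation(0.9, 0.1))  # close to 0.9: well-oxygenated blood
```

A difference-based variant, `hbo2 - hb`, would equally satisfy the description; which discrepancy measure is used is a design choice of the concrete embodiment.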
- the data processing device 170 may comprise a routine 144 for determining a blood concentration value at a pixel, where the blood concentration value is representative of the absolute or relative amount of blood contained at a location of the object 106 mapped onto the respective input pixel, preferably the location at which the blood oxygenation value is obtained.
- the relative amount may be determined as e.g. density, mass, area or volume of blood relative to the density, mass, area or volume of non-blood matter.
- the absolute amount may be determined in mass per unit volume or area.
- the routine 144 may be configured to obtain the blood concentration value at an input pixel by forming an aggregate value from the oxyhemoglobin value and the deoxyhemoglobin value at the input pixel.
- the aggregate value may be a sum or a product.
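The aggregation step of routine 144 can be sketched as below. Both aggregates named by the description are shown; the function name and the `mode` parameter are illustrative assumptions, not part of the disclosure.

```python
# Sketch of routine 144: the blood concentration value at an input
# pixel as an aggregate of the oxyhemoglobin and deoxyhemoglobin
# values. The description names a sum or a product as possible
# aggregates; the sum corresponds to total hemoglobin as a proxy for
# the amount of blood at the imaged location.

def blood_concentration(hbo2: float, hb: float, mode: str = "sum") -> float:
    """Aggregate the two hemoglobin values at one input pixel."""
    if mode == "sum":
        return hbo2 + hb
    if mode == "product":
        return hbo2 * hb
    raise ValueError(f"unknown aggregation mode: {mode}")
```

Applied to every input pixel, this yields the image 216 of blood concentration values referred to later in the description.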
- the data processing device 170 may comprise a routine 148 for assigning a color such as a pseudocolor or a false color to the values obtained by any of the routines 142 and 144.
- in a pseudocolor representation, a different color is assigned to each value.
- in a false-color representation, a different intensity of the same hue is assigned to each value.
- the expression "color" is assumed to refer to any color appearance value or any combination of color appearance values.
- a routine 150 may be used to generate the digital color output image 160 from the blood concentration value and the blood oxygenation value at each pixel.
- the intensity at the output pixel may correspond to a combination of blood oxygenation value and the blood concentration value at corresponding pixels.
- the routine 150 may comprise a filtering step.
- the blood concentration value may be used as a filter mask for filtering the blood oxygenation value or the color of the blood oxygenation value.
- the intensity of the blood oxygenation value at a pixel is multiplied by the blood concentration value at this pixel.
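The filtering step described above, using the blood concentration image as an intensity mask, can be sketched as a pixel-wise multiplication. The function name, the nested-list image representation and the normalization by the image maximum are assumptions for the example.

```python
# Sketch of the filtering step in routine 150: the blood concentration
# image is normalized to [0, 1] and used as an intensity mask for the
# blood oxygenation image, implemented as a pixel-wise multiplication.
# Images are modeled as nested lists of floats of equal size.

def mask_oxygenation(oxygenation, concentration):
    """Multiply each oxygenation pixel by the normalized concentration."""
    max_c = max(max(row) for row in concentration) or 1.0  # avoid /0
    return [
        [o * (c / max_c) for o, c in zip(o_row, c_row)]
        for o_row, c_row in zip(oxygenation, concentration)
    ]
```

Pixels with little detected blood are thereby darkened, so that the oxygenation coloring remains visible mainly where blood vessels are likely present.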
- the data processing device may comprise a color image combination routine 152.
- a routine 154 may be provided that is configured to combine two digital color images. Such a routine may be used to combine the digital output image 160 with at least one digital input image. The digital image resulting from this combination thus represents the location of blood vessels and their oxygenation level and the background anatomy.
- the routine 154 may comprise alpha blending, vector addition of color space coordinates and/or linear transformation.
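The alpha-blending option of routine 154 can be sketched per pixel as below. The blend weight `alpha` and the function name are assumptions; the description does not fix a particular weight.

```python
# Sketch of routine 154 combining two digital color images per pixel
# by alpha blending: result = alpha * overlay + (1 - alpha) * background,
# e.g. the digital color output image 160 blended over a digital input
# image showing the background anatomy.

def alpha_blend(overlay, background, alpha=0.6):
    """Blend two RGB tuples; alpha in [0, 1] weights the overlay."""
    return tuple(
        round(alpha * o + (1.0 - alpha) * b)
        for o, b in zip(overlay, background)
    )

print(alpha_blend((255, 0, 0), (0, 0, 255), alpha=0.5))  # (128, 0, 128)
```

The vector-addition and linear-transformation variants mentioned in the description would replace the weighted sum with a plain component-wise sum or a matrix applied to the stacked color space coordinates, respectively.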
- routines 140 to 154 may be a software routine, a routine implemented in hardware, or a routine in which software and hardware components are combined. Any of the routines 140 to 148 may be stored in a memory 194 of the data processing device 170 or the medical observation device 100.
- the medical observation device 100 may comprise a user input device 162, which, upon being operated by a user, may generate a user selection signal 164, which may be communicated to the digital processing device 170.
- the user input device 162 may e.g. be a physical button, dial, slide or lever, or a widget that represents such a control.
- the user may determine which false colors or pseudocolors are assigned to the blood oxygenation values and/or blood concentration values and at which intensities, and/or which values and images are combined in a single view.
- the digital color output image 160 may be displayed on a display 132, which is integral with the medical observation device 100.
- the display 132 may be integrated in an ocular or eyepiece 104 of the medical observation device 100.
- the digital color output image 160 is preferably generated in real-time, i.e. a digital color output image 160 is generated from a set of digital color input images 130 before the next set is generated by the at least two cameras 110, 111 , 111a.
- the medical observation device 100 may comprise a direct optical path 134 from the object 106 through the objective 174 to the eyepiece 104.
- the display may be a translucent display 132 located in the direct optical path 134 or the display may be projected into the direct optical path 134.
- a beam splitter 136 may be provided to split the light between the optical eyepiece 104 and the digital imaging system 102. In one embodiment, up to 80 % of the light may be directed to the eyepiece 104.
- the medical observation device 100 may not have a direct optical path 134 but only display images from the integral display 132.
- the medical observation device may not have any display at all.
- the medical observation device 100 may comprise an output interface 172 to which one or more (external) displays 182 may be connected.
- the output interface 172 may comprise standardized connectors and data transmission protocols, such as USB, HDMI, DVI, DisplayPort, Bluetooth and/or others.
- An external display may be a monitor, 3D goggles, oculars and the like. Any combination of external displays may be connected to output interface 172.
- the computer 186 and/or the data processing device 170 is connected to the digital imaging system 102 using one or more data transmission lines 196.
- a data transmission line may be wired or wireless, or partly wired and partly wireless.
- the computer 186 and/or the data processing device 170 may not be bodily integrated in the medical observation device 100 but be physically located remote from the digital imaging system 102.
- the digital imaging system 102 and the computer 186 and/or the data processing device 170 may be connected to a network, such as a LAN, a WLAN or a WAN, to which also at least one display 182 is connected.
- the medical observation device 100 may be stereoscopic but comprise only two cameras, one for each stereoscopic channel.
- in one stereoscopic channel, the fluorescence-light color camera 111 is used and configured to selectively record white-light reflectance, whereas in the other stereoscopic channel, the white-light color camera 110 is used.
- Such an arrangement provides a stereoscopic white-light color input image if no fluorescence is used and a monoscopic white-light color input image and a monoscopic fluorescence-light color input image if fluorescence is used.
- the description above and below applies equally to this configuration.
- the input image data 122 comprise a plurality of pixels.
- the pixels may be color pixels or monochrome pixels.
- a monochrome pixel only represents intensity, e.g. as a greyscale image.
- a color pixel comprises information about at least some of the color appearance parameters such as hue, lightness, brightness, chroma, colorfulness and saturation.
- Color pixels are recorded using color bands or, equivalently, color channels or primary colors of a color space using a digital color camera. Each color band is represented by a different color space coordinate.
- Fig. 4 shows how a multispectral image may be generated from two digital input color images, which may be represented in RGB color space.
- the data processing device 170 may employ routine 152.
- the first digital color input image 130 may be a digital reflectance input image 114 and the second digital color input image 130 may be a digital fluorescence-light input image 112 as described above.
- both digital input images 112, 114 represent a reflectance image of the object 106 (Fig. 1) taken under identical illumination, preferably recorded simultaneously.
- the first input image 130, 114 is recorded in a first imaged spectrum 420, which comprises one or more passbands 430 and one or more stopbands 432.
- a first imaged spectrum 420 which comprises one or more passbands 430 and one or more stopbands 432.
- Fig. 4 indicates that the first imaged spectrum 420 may comprise two passbands 430 that are separated from one another by a stopband 432.
- the passbands 430 are preferably located in the visible light range 440, i.e. comprise or are limited to wavelengths λ from about 380 nm to about 750 nm.
- Sensitivity curve 402 represents the wavelength-dependent sensitivity of the blue (B) sensor
- sensitivity curve 404 represents the wavelength-dependent sensitivity of the green (G) sensor
- sensitivity curve 406 represents the wavelength-dependent sensitivity of the red (R) sensor.
- the color space coordinates at each input pixel are generated by a set of such sensors. For a given range of wavelengths λ0 that is recorded at an input pixel, each of the sensors will record a different intensity I in its color band. Thus, the range of wavelengths λ0 will be represented by a single set 460 of color space coordinates {R1, G1, B1}.
- the second digital input image 130, 112 is recorded in a second imaged spectrum 422.
- the second imaged spectrum 422 has at least one passband 430 and at least one stopband 432.
- the first and the second imaged spectra 420, 422 are preferably complementary to each other.
- a stopband 432 in one imaged spectrum 420, 422 will correspond to a passband in the other spectrum 422, 420.
- the range λ0 of wavelengths that is recorded at an input pixel of the second digital image 130, 112 will generate a set 460 of color space coordinates {R2, G2, B2}.
- the color space coordinates {R2, G2, B2} in the second digital input image 130, 112 will differ from the color space coordinates {R1, G1, B1} in the first digital input image 130, 114, even though the same range λ0 is recorded.
- the second imaged spectrum 422 may comprise a passband 430 in the NIR (near infrared) range
- a multispectral image may be generated by forming a union set 462 from the union of the two sets 460 of the color space coordinates for each pair of corresponding input pixels of the first and second digital input image 112, 114.
- the union set 462 may comprise the color space coordinates {R1, R2, G1, G2, B1, B2} of both the set 460 of an input pixel {R1, G1, B1} in the first digital input image 130, 114 and the set 460 of the color space coordinates {R2, G2, B2} of the corresponding input pixel in the second digital input image 130, 112.
- the spectrum 450 of the multispectral image comprises the separate spectral bands 452 of the two digital input images 112, 114, allowing for improved color resolution, as the range λ0 of wavelengths is now resolved by six color space coordinates instead of being represented by just three color space coordinates, as would be the case if only a single digital input image were used.
- Each of the spectral bands 452 corresponds to a passband 430 in either the first or the second imaged spectrum 420, 422.
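The formation of the union set 462 from the two per-pixel coordinate sets can be sketched as below. The interleaved coordinate order and the function names follow the notation of the description but are otherwise illustrative assumptions.

```python
# Sketch of Fig. 4: for each pair of corresponding input pixels, the
# RGB color space coordinates of both digital input images are merged
# into one six-band multispectral pixel {R1, R2, G1, G2, B1, B2}.

def union_pixel(rgb1, rgb2):
    """Merge {R1, G1, B1} and {R2, G2, B2} into one six-tuple."""
    r1, g1, b1 = rgb1
    r2, g2, b2 = rgb2
    return (r1, r2, g1, g2, b1, b2)

def multispectral_image(img1, img2):
    """Apply union_pixel to every pair of corresponding pixels of two
    equally sized images given as nested lists of RGB tuples."""
    return [
        [union_pixel(p1, p2) for p1, p2 in zip(row1, row2)]
        for row1, row2 in zip(img1, img2)
    ]
```

Because the two imaged spectra 420, 422 are complementary, the six coordinates of each union pixel carry genuinely different spectral information rather than duplicating the same bands.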
- a digital output image 160 is generated in which an output pixel 234 represents the blood oxygenation value and the blood concentration value as described above.
- one or more digital input images 130 are obtained, each of which represents an image of the object 106 formed by light reflected off the object.
- the at least one digital input image 130 may be a simple RGB image, but is preferably a multispectral image, such as a multispectral image generated from two or more color images as described with reference to Fig. 4.
- the one or more digital input images may be retrieved directly from a camera or from temporary or permanent storage, such as a computer memory or a disc drive.
- the at least one digital input image 130 is composed of a plurality of different color channels 204, which together form a digital reflectance color input image 212.
- a color channel 204 may correspond to a spectral band 452 in Fig. 4.
- the digital reflectance color input image 212 comprises a plurality of input pixels 232.
- the routine 140 for determining the oxyhemoglobin value and/or the deoxyhemoglobin value is applied. This results in an intermediate image 218 where, at each pixel 230, the oxyhemoglobin value is represented. Further, application of the routine 140 to the digital reflectance color input image 212 results in an intermediate image 220 representing deoxyhemoglobin values.
- the images 218, 220 may be gray-scale images, in which the intensity at each pixel 230 corresponds to the concentration of oxyhemoglobin and deoxyhemoglobin, respectively.
- a blood concentration value may be computed, resulting in an image 216, where each pixel 230 represents a blood concentration value.
- the blood concentration value may be computed by forming an aggregate value such as a sum for each pair of corresponding pixels in the images 218, 220 of the oxyhemoglobin value and the deoxyhemoglobin value.
- a location at the object 106 where the aggregate value of the oxyhemoglobin value and the deoxyhemoglobin value is large indicates a location where there is a higher concentration of blood.
- the discrepancy between the oxyhemoglobin value and the deoxyhemoglobin value indicates how much of the hemoglobin at this location is oxygenated.
- pixels 230 in the image 216 which represent a high blood concentration value very likely represent blood vessels.
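The aggregation described above can be sketched as follows, assuming the intermediate images 218, 220 are available as arrays. A sum is used as the aggregate value, and the oxygenated fraction is one common way to express the blood oxygenation value (the names and the epsilon guard are illustrative, not taken from the patent):

```python
import numpy as np

def blood_maps(oxy: np.ndarray, deoxy: np.ndarray, eps: float = 1e-8):
    """oxy, deoxy: gray-scale maps of oxyhemoglobin and deoxyhemoglobin
    values (images 218 and 220). Returns (concentration, oxygenation)."""
    # aggregate value (here: a sum) -> blood concentration, image 216
    concentration = oxy + deoxy
    # oxygenated fraction of the total hemoglobin -> blood oxygenation
    oxygenation = oxy / (concentration + eps)
    return concentration, oxygenation
```

A location with a large aggregate then shows up bright in the concentration map, while the oxygenation map is independent of the absolute amount of blood.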
- the image 216 is used e.g. in routine 150 for mixing the two images 214, 216.
- the routine 150 may use, for example, the image 216 as a mask for masking the image 214, e.g. by using the image 216 as an intensity mask for the image 214:
- the blood oxygenation value, or the color assigned thereto, is multiplied by the preferably normalized blood concentration value. If the result falls below a predetermined threshold at an output pixel 234, the pixel may be set to black or any other color.
- if routine 148 was not carried out before obtaining the image 214, a color 222 may be assigned after the image 216 has been used to mask the image 214. In this case, the routine 148 may be applied after routine 150 has been applied.
- the resulting digital output color image 160 then represents the blood vessels and their oxygenation level.
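The masking described for routine 150 can be sketched like this: the colorized oxygenation image 214 is multiplied by the normalized blood concentration image 216, and output pixels whose concentration falls below a threshold are set to black. This is a sketch under those assumptions, not the actual routine:

```python
import numpy as np

def mask_oxygenation(oxy_rgb: np.ndarray, concentration: np.ndarray,
                     threshold: float = 0.1) -> np.ndarray:
    """oxy_rgb: (H, W, 3) colorized image 214; concentration: (H, W)
    image 216. Returns a sketch of the digital output image 160."""
    # normalize the blood concentration values to [0, 1]
    conc = concentration / (concentration.max() + 1e-8)
    # use image 216 as an intensity mask for image 214
    out = oxy_rgb * conc[..., None]
    # below the predetermined threshold, set the output pixel to black
    out[conc < threshold] = 0.0
    return out
```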
- the digital output color image 160 shown in Fig. 3 may be combined with the digital input image 212 or any color channel 204 thereof. This is shown in Fig. 4.
- routine 154 is applied to the digital color output image 160 and the digital input image 212. Both images 160, 212 are combined to result in a digital color output image 160a.
- the combination of images 160, 212 may comprise a mixing by alpha blending, e.g. by assigning a transparency to the color of the digital input image 212 depending on the intensity of the color.
- the color at a pixel 230 of image 160 is overlaid onto the color of the corresponding input pixel 232 of image 212.
- the images 160, 212 may be mixed by vector addition of the color space coordinates of the pair of corresponding pixels 230, 232 of images 160, 212, respectively.
- a linear transformation may be applied simultaneously to the color space coordinates of pixel 230 of the image 160 and to the color space coordinates of the corresponding pixel 232 of image 212 to result in the color space coordinates of the output pixel 234 in image 160a.
- the linear transformation may comprise a color conversion matrix with which the color space coordinates of the corresponding pixels 230, 232 are multiplied to result in the color space coordinates of the output pixel 234 and image 160a.
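The two combination variants described above, alpha blending with an intensity-derived transparency and a linear transformation with a color conversion matrix, might be sketched as follows. The alpha definition and the default matrix are assumptions for illustration; the patent does not fix them in this excerpt:

```python
import numpy as np

def blend_overlay(overlay_rgb: np.ndarray, base_rgb: np.ndarray) -> np.ndarray:
    """Alpha-blend image 160 (overlay) onto image 212 (base); the
    transparency is derived from the overlay color's intensity."""
    alpha = overlay_rgb.max(axis=-1, keepdims=True)  # intensity -> opacity
    return alpha * overlay_rgb + (1.0 - alpha) * base_rgb

def blend_matrix(overlay_rgb: np.ndarray, base_rgb: np.ndarray, M=None) -> np.ndarray:
    """Apply a 3x6 color conversion matrix to the stacked color space
    coordinates of each pair of corresponding pixels 230, 232."""
    if M is None:
        M = 0.5 * np.hstack([np.eye(3), np.eye(3)])  # simple averaging matrix
    stacked = np.concatenate([overlay_rgb, base_rgb], axis=-1)  # (H, W, 6)
    return stacked @ M.T
```

With a fully transparent overlay, `blend_overlay` returns the base image unchanged; with the default matrix, `blend_matrix` averages the two pixel colors.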
- in Fig. 5, a schematic overview of the method for obtaining an image representing blood oxygenation and blood concentration values is shown.
- a digital input image 130 is obtained.
- the digital input image 130 may be a digital reflectance input image 114.
- the digital input image 130 may be a multispectral or hyperspectral image 212.
- further digital input images 130 such as a digital fluorescence-light input image 112 may be obtained at step 502 or at a further optional step 504.
- two or more digital input images 130 may be combined into a single multispectral input image 202. This may be done, e.g. using the routine 152.
- the digital input images 130 which are input into step 506 may be color images or monochrome images.
- the oxyhemoglobin values and the deoxyhemoglobin values are computed at various locations of the digital input image 202, if optional step 506 was executed, or of the one or more digital input images 130, if optional step 506 was not executed.
- the oxyhemoglobin values and deoxyhemoglobin values are determined at each pixel of the at least one digital input image 130, 212 that is input at step 508.
- routine 140 may be executed at step 508.
- a set of oxyhemoglobin values and a set of deoxyhemoglobin values is obtained for each pixel or each set of corresponding input pixels. This set of values can be considered as constituting an image 218 representing oxyhemoglobin values and an image 220 representing deoxyhemoglobin values.
- blood oxygenation values are determined from the oxyhemoglobin values and the deoxyhemoglobin values for the various input pixels, for example by using routine 142.
- the resulting set of blood oxygenation values may be considered as constituting an image 214.
- blood concentration values are computed using the images 218, 220. This may be done using the routine 144. As a result of step 512, a set of blood concentration values is obtained that again may be interpreted as an image 216.
- a false color or pseudocolor may be assigned to the blood oxygenation values.
- the color of a pixel in image 214 thus depends on its blood oxygenation value.
- Step 514 may make use of routine 148.
- the selection of pseudocolors and/or false colors and/or their intensity levels may depend on a user selection signal 164.
- as a result of step 514, a colorized image 214 representing blood oxygenation values is obtained.
- at step 516, the image 214 representing blood oxygenation values and the image 216 representing blood concentration values are combined. This may be done using routine 150. As a result, the digital output image 160 is obtained. If step 514 has not been carried out before, it may be carried out now, assigning a color value to each value in the digital output image 160.
- the digital output image 160 may be combined with the reflectance image 130, 212, at optional step 518. This may be done using routine 154.
- the digital output image 160 or, if step 518 is executed, the image 160a may be displayed at step 522 using, for example, the display 182.
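Steps 502 to 516 can be strung together in a compact sketch. The spectral unmixing of routine 140 at step 508 is replaced here by a trivial stand-in, since the actual routine is not specified in this excerpt; the pseudocolor choice at step 514 is likewise only an illustration:

```python
import numpy as np

def oxygenation_pipeline(img1, img2, threshold=0.1):
    """img1, img2: (H, W, 3) digital input images 130. Returns a sketch
    of the digital output image 160 (steps 506 to 516)."""
    multispectral = np.concatenate([img1, img2], axis=-1)      # step 506
    # step 508: stand-in for routine 140 (real unmixing is not shown here)
    oxy = multispectral[..., :3].mean(axis=-1)                 # image 218
    deoxy = multispectral[..., 3:].mean(axis=-1)               # image 220
    conc = oxy + deoxy                                         # step 512
    sat = oxy / (conc + 1e-8)                                  # step 510
    # step 514: pseudocolor, red for oxygenated, blue for deoxygenated
    colorized = np.stack([sat, np.zeros_like(sat), 1.0 - sat], axis=-1)
    conc_n = conc / (conc.max() + 1e-8)
    out = colorized * conc_n[..., None]                        # step 516
    out[conc_n < threshold] = 0.0
    return out
```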
- a microscope comprising a system as described in connection with one or more of the Figs. 1 to 5.
- a microscope may be part of or connected to a system as described in connection with one or more of the Figs. 1 to 5.
- Fig. 6 shows a schematic illustration of a system 600 configured to perform a method described herein.
- the system 600 comprises a microscope 610 and a computer system 620.
- the microscope 610 is configured to take images and is connected to the computer system 620.
- the computer system 620 is configured to execute at least a part of a method described herein.
- the computer system 620 may be configured to execute a machine learning algorithm.
- the computer system 620 and microscope 610 may be separate entities but can also be integrated together in one common housing.
- the computer system 620 may be part of a central processing system of the microscope 610 and/or the computer system 620 may be part of a subcomponent of the microscope 610, such as a sensor, an actuator, a camera or an illumination unit, etc. of the microscope 610.
- the computer system 620 may be a local computer device (e.g. personal computer, laptop, tablet computer or mobile phone) with one or more processors and one or more storage devices or may be a distributed computer system (e.g. a cloud computing system with one or more processors and one or more storage devices distributed at various locations, for example, at a local client and/or one or more remote server farms and/or data centers).
- the computer system 620 may comprise any circuit or combination of circuits.
- the computer system 620 may include one or more processors which can be of any type.
- processor may mean any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a graphics processor, a digital signal processor (DSP), a multi-core processor, a field programmable gate array (FPGA), for example of a microscope or a microscope component (e.g. a camera), or any other type of processor or processing circuit.
- one or more of the circuits may be a custom circuit, an application-specific integrated circuit (ASIC), or the like, such as, for example, one or more circuits (such as a communication circuit) for use in wireless devices like mobile telephones, tablet computers, laptop computers, two-way radios, and similar electronic systems.
- the computer system 620 may include one or more storage devices, which may include one or more memory elements suitable to the particular application, such as a main memory in the form of random access memory (RAM), one or more hard drives, and/or one or more drives that handle removable media such as compact disks (CD), flash memory cards, digital video disk (DVD), and the like.
- the computer system 620 may also include a display device, one or more speakers, and a keyboard and/or controller, which can include a mouse, trackball, touch screen, voice-recognition device, or any other device that permits a system user to input information into and receive information from the computer system 620.
- Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a processor, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.
- embodiments of the invention can be implemented in hardware or in software.
- the implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable.
- Some embodiments according to the invention comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.
- embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer.
- the program code may, for example, be stored on a machine readable carrier.
- other embodiments comprise the computer program for performing one of the methods described herein, stored on a machine readable carrier.
- an embodiment of the present invention is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.
- a further embodiment of the present invention is, therefore, a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing one of the methods described herein when it is performed by a processor.
- the data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitory.
- a further embodiment of the present invention is an apparatus as described herein comprising a processor and the storage medium.
- a further embodiment of the invention is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein.
- the data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
- a further embodiment comprises a processing means, for example, a computer or a programmable logic device, configured to, or adapted to, perform one of the methods described herein.
- a further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein.
- a further embodiment according to the invention comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver.
- the receiver may, for example, be a computer, a mobile device, a memory device or the like.
- the apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.
- a programmable logic device, for example a field programmable gate array, may be used to perform some or all of the functionalities of the methods described herein.
- a field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein.
- the methods are preferably performed by any hardware apparatus.
- step 518: step of combining the result of step 516 with the reflectance image
Abstract
A data processing device (170) is configured to access at least one digital input image (130) representative of a reflected-light image of a biological object (106) and comprising a plurality of input pixels (230, 232). A blood concentration value is determined at an input pixel (230, 232) of the plurality of input pixels (230, 232), the blood concentration value representing the amount of blood at a location of the object (106), which location is imaged in the input pixel (230, 232). Further, a blood oxygenation value is determined at the input pixel (230, 232), the blood oxygenation value representing the amount of deoxyhemoglobin and/or oxyhemoglobin at the location of the object (106). Output pixels (230, 234) of a digital output color image (160) are generated by assigning a color (222) to each output pixel (230, 234), the color (222) depending on the blood oxygenation value and the blood concentration value.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102023124622 | 2023-09-12 | ||
| DE102023124622.4 | 2023-09-12 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025056578A1 true WO2025056578A1 (fr) | 2025-03-20 |
Family
ID=92762311
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2024/075313 Pending WO2025056578A1 (fr) | 2023-09-12 | 2024-09-11 | Dispositif de traitement de données et procédé mis en œuvre par ordinateur pour afficher des valeurs d'oxygénation et de concentration de sang dans un dispositif d'observation médicale et dispositif d'observation médicale et son procédé d'utilisation |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025056578A1 (fr) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130131517A1 (en) * | 2004-11-29 | 2013-05-23 | Hypermed Imaging, Inc. | Medical hyperspectral imaging for evaluation of tissue and tumor |
| US20140063216A1 (en) * | 2012-09-05 | 2014-03-06 | Fujifilm Corporation | Endoscope system, processor device thereof, and image processing method |
Non-Patent Citations (1)
| Title |
|---|
| HASHIMOTO M. ET AL.: "Color analysis method for estimating the oxygen saturation of hemoglobin using an image-input and processing system", ANALYTICAL BIOCHEMISTRY, vol. 162, no. 1, 1987, pages 178 - 184, XP025650380, DOI: 10.1016/0003-2697(87)90025-X |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20250350700A1 (en) | Image processor and computer-implemented method for a medical observation device, using a location-dependent color conversion function | |
| US20250143652A1 (en) | Method, processor, and medical observation device using two color images and color cameras for fluorescence and white-light | |
| EP4275580A1 | Method, processor, and medical fluorescence observation device using two color images for recording fluorescence | |
| WO2025056578A1 | Data processing device and computer-implemented method for displaying oxygenation and blood concentration values in a medical observation device, and medical observation device and method of using the same | |
| EP4574006A1 | Data processing device, computer-implemented method, and medical observation device | |
| WO2025027024A1 | Data processing device and computer-implemented method for a medical observation device, for visualizing an autofluorescence signal and a fluorescence emission signal | |
| EP4502578A1 | Data processing device and computer-implemented method for combining a fluorescence emission signal with a specular reflection signal in a medical observation device | |
| EP4501203A1 | Data processing device and computer-implemented method for combining a fluorescence emission signal with an edge detection signal in a medical observation device | |
| EP4480385A1 | Data processing device for a medical observation apparatus, such as a microscope or an endoscope, and computer-implemented method for generating a color output image | |
| US20250331722A1 (en) | Method, processor, and medical fluorescence observation device using two color images and color cameras for fluorescence and white-light | |
| US20250331709A1 (en) | Method, processor, and medical fluorescence observation device for toggling images | |
| US20250169677A1 (en) | Data processing device and computer-implemented method combining two images and an overlay color using a uniform color space | |
| US20250331746A1 (en) | Method, processor, and medical fluorescence observation device using a color-dependent color conversion function | |
| EP4275581A1 | Method, processor, and medical fluorescence observation device for toggling images | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24769326; Country of ref document: EP; Kind code of ref document: A1 |