
WO2024256838A1 - A method and apparatus for calibrating pixel sensitivity for pixels of a hyperspectral image sensor - Google Patents


Info

Publication number
WO2024256838A1
Authority
WO
WIPO (PCT)
Prior art keywords
reference object
pixel
spectral
image sensor
intensity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/GB2024/051534
Other languages
French (fr)
Inventor
Anisha BAHL
Mirek JANATKA
Tom Vercauteren
Jonathan SHAPEY
Mads BERGHOLT
Conor HORGAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hypervision Surgical Ltd
Kings College London
Original Assignee
Hypervision Surgical Ltd
Kings College London
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hypervision Surgical Ltd, Kings College London filed Critical Hypervision Surgical Ltd
Publication of WO2024256838A1 publication Critical patent/WO2024256838A1/en
Anticipated expiration legal-status Critical
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02 Details
    • G01J3/0297 Constructional arrangements for removing other types of optical noise or for performing calibration
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00057 Operational features of endoscopes provided with means for testing or calibration
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/313 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
    • A61B1/3132 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes for laparoscopy
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • G01J2003/2826 Multispectral imaging, e.g. filter imaging
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10036 Multispectral image; Hyperspectral image

Definitions

  • Hyperspectral imaging (HSI) is also referred to as multispectral imaging or simply spectral imaging.
  • RGB images appear realistic to the human eye only because the human eye also captures the optical spectrum through 3 similar wavelength bands.
  • An RGB image provides insufficient data to reconstruct the optical spectrum of a target.
  • HSI imaging systems capture the optical spectrum through many wavelength bands (typically at least 10), allowing the optical spectrum to be accurately calculated.
  • HSI provides multi-channel spectral imaging data across a field of view, where each channel represents a narrow spectral measurement centred around a given wavelength, i.e. a spectral band.
  • the diffusely reflected light is measured across at least one spectral band for any given pixel of an image and all spectral bands are captured across the spatial field of view.
  • the diffuse reflection is mostly determined by the scattering properties and absorbing or emitting species in the target being examined. HSI therefore provides the opportunity for non-invasive, quantitative analysis of a target to investigate relevant parameters.
  • White balancing typically uses a white and dark reference to account for the lighting conditions in the environment (including light source and illumination distribution), the positioning of the target object with respect to the camera, vignetting, optical transmission through the HSI system, exposure time of the camera, quantum efficiency of the sensors, inherent background sensor noise and the like. It is known that improper white balancing leads to increases in all quantitative and qualitative errors.
  • Typical methods for white balancing involve taking a white reference image at the appropriate distance using a reference plate, which is a well characterized, uniform, highly reflective, Lambertian surface.
  • the reference plate is positioned at the same distance from the camera as the target object will be and sized so as to fill the whole field of view at this position.
  • the reference plate is held in place while the HSI system captures an image.
  • the white reference image capture may take much longer than a single camera image, because of the requirement for appropriate positioning, the need to minimise noise (which often leads to frame averaging), and the overall complexity of introducing the reference plate into the scene of interest.
  • a method for calibrating pixel sensitivity for pixels of a hyperspectral image sensor wherein the hyperspectral image sensor is configured to capture image data, the image data including spectral intensities measured within a plurality of spectral bands, and wherein the method comprises the steps of: providing a reference object; moving the reference object and hyperspectral image sensor relative to each other; capturing a plurality of reference images within a field of view of the hyperspectral image sensor, such that in each reference image the reference object covers a different partial portion of the field of view; selecting a plurality of pixel locations from the field of view; generating a composite reference; and calibrating the pixel sensitivity of the pixels of the hyperspectral image sensor using the composite reference; wherein generating the composite reference comprises, for each of the plurality of pixel locations: selecting, from each image in the plurality of images, image data corresponding to the pixel location; generating, from the selected image data, an intensity profile for the pixel location; and identifying within the intensity profile a segment
  • an HSI system can be used to measure physiologically relevant parameters such as tissue oxygen saturation.
  • the method may be used to calibrate the pixels of an HSI system which has a lens located in the surgical field (that is, the environment surrounding and including the patient in which sterilisation is maintained) of an open surgery or inserted inside a patient's body (either through a surgical incision, or through a patient's orifice such as the mouth, ear, or anus).
  • a small, sterile reference object can be inserted in the surgical field near the HSI lens and used to perform the calibration.
  • Such a reference object is significantly easier to insert and manipulate than the reference plates known in the art, which must be sufficiently large and positioned sufficiently close to the lens to cover the entire field of view while a conventional white balancing procedure is carried out.
  • a segment for a selected pixel location comprises a portion of the intensity profile in which an intensity value is between an upper and a lower reference object threshold, both thresholds being higher in intensity than a background value.
  • the upper reference object threshold is lower than a saturation or specular reflection value.
  • generating a segment for a selected pixel location comprises: calculating a derivative profile of the intensity profile; and identifying a positive peak and a negative peak in the derivative profile, the positive and negative peaks each having an absolute value greater than a derivative threshold, the segment comprising the period defined between the positive and negative peaks.
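The derivative-based segment identification above can be sketched in a few lines of NumPy. This is an illustrative reduction, not the claimed method: the function name `find_segment`, the synthetic profile, and the threshold value are all hypothetical.

```python
import numpy as np

def find_segment(profile, derivative_threshold):
    """Locate the reference-object observation segment in a temporal
    intensity profile: a large positive step in the derivative marks the
    object entering the pixel, a large negative step marks it leaving."""
    d = np.diff(np.asarray(profile, dtype=float))
    rising = np.where(d > derivative_threshold)[0]
    falling = np.where(d < -derivative_threshold)[0]
    if len(rising) == 0 or len(falling) == 0:
        return None  # reference object never observed at this pixel
    start = rising[0] + 1                  # first frame after the positive peak
    later_falls = falling[falling >= start]
    if len(later_falls) == 0:
        return None
    end = later_falls[0]                   # last frame before the negative peak
    return start, end

# Synthetic profile: background around 10, reference object visible in frames 5-11
profile = np.array([10, 11, 9, 10, 10, 80, 82, 81, 80, 79, 81, 80, 10, 9, 10])
seg = find_segment(profile, derivative_threshold=30)
```

With the synthetic profile above, `seg` spans the frames in which the high (reference) intensity is observed.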
  • generating a segment for a selected pixel location comprises: applying an image segmentation method for each image in the plurality of images to locate the reference object in the image; and considering an image data point at the selected pixel location to be part of the segment if it falls within the detected reference object.
  • identifying a segment further comprises calculating an average intensity of the intensity profile within the reference object observation period; in other examples the average is a median, an M-estimator-based average, or another form of robust averaging.
  • the method further comprises correcting a reflectivity of the generated composite reference using a reflectivity for the reference object, the reflectivity comprising a correction factor corresponding to at least one spectral band of the composite reference.
  • combining the selected segments further comprises the steps of: generating a parametric white reference model; and applying the white reference model to the composite reference to generate a synthetic reference.
  • the synthetic reference is generated by extrapolating the white reference model from the composite reference so as to cover substantially all pixel locations in the field of view.
  • the white reference model comprises a product of a spectral sensitivity function and a spatial vignetting function.
  • the spectral sensitivity function is parametrically modelled as a scalar value per spectral band and calculating the spectral sensitivity function includes calculating a spatial average of the composite reference for at least one spectral band of the composite reference.
  • calculating the spatial vignetting function includes fitting a two-dimensional parametric function to a spectral average of the composite reference.
  • the two-dimensional parametric function is an isotropic Gaussian function.
  • the method further comprises pre-processing each of the intensity profiles using a threshold or smoothing filter.
  • calibrating the sensitivity of the pixels composing the hyperspectral image sensor is performed by using only the calculated spectral sensitivity function.
  • the pixel locations selected from the field of view, and from which segments are selected to generate the composite reference, collectively cover between 10% and 100% of the total field of view.
  • the field of view is automatically determined by a segmentation method for content area detection.
  • the reference object has a reproducible reflectivity. In an example, the reference object is white.
  • the reference object has a matte reflectance spectrum. That is, the reference object has a diffuse reflectivity.
  • calibrating pixel sensitivity involves calibrating the white balance of the pixels.
  • each pixel of the hyperspectral image sensor is configured for measuring photometric intensity corresponding to a single spectral band.
  • the HSI image sensor is configured to capture images in a snapshot mode.
  • pixels of the HSI image sensor are configured to capture images such that spectral bands are arranged in a mosaic pattern.
  • Figure 1(a) is a flowchart illustrating an exemplary method 100 for calibrating pixel sensitivity.
  • Figure 1(b) is a sub-flowchart of Figure 1(a), showing the process for generating composite reference images.
  • Figure 2 is a graph showing exemplary intensity profiles calculated for a pixel at a particular spectral band.
  • Figure 3 shows a sterile ruler 300 for use as an exemplary reference object.
  • Figure 4 shows an exemplary HSI imaging system.
  • Figure 5 shows an exemplary reference image taken by a HSI imaging system.
  • Figure 6 shows a computer-readable medium.
  • An exemplary HSI system is configured to capture image data, by measuring spectral intensities at each pixel of an I by J image sensor.
  • the image data captures information across space, wavelength, and time.
  • Each pixel measures the spectral intensity of light at N distinct wavelengths in the optical spectrum. That is, the image data captured by the HSI imaging system at an imaging time T can be represented as an I x J x N array, with each I x J layer of the array representing the spectral intensity of each pixel for a particular wavelength.
  • N is at least 9. In some embodiments, N is at least 16.
  • each pixel of the HSI sensor is configured for measuring photometric intensity corresponding to a single spectral band with the N spectral bands being scattered across all the pixel locations.
  • An interpolation strategy is then used to form an HSI system able to reconstruct all N spectral bands at each pixel location, that is, to create an I x J x N array.
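As a concrete illustration of the mosaic arrangement and interpolation strategy, the sketch below reconstructs a full I x J x N cube from a raw frame carrying a 4 x 4 mosaic (N = 16) using nearest-neighbour replication. The helper name `demosaic_nearest` is hypothetical, real systems typically use more sophisticated interpolation, and I and J are assumed to be multiples of the pattern size.

```python
import numpy as np

def demosaic_nearest(raw, pattern_size=4):
    """Reconstruct an I x J x N cube from a mosaic-patterned raw frame.
    Each band's sparse samples are replicated across their
    pattern_size x pattern_size tile (the crudest interpolation).
    Assumes I and J are multiples of pattern_size."""
    I, J = raw.shape
    n_bands = pattern_size * pattern_size
    cube = np.empty((I, J, n_bands), dtype=raw.dtype)
    for di in range(pattern_size):
        for dj in range(pattern_size):
            band = di * pattern_size + dj
            sparse = raw[di::pattern_size, dj::pattern_size]
            dense = np.repeat(np.repeat(sparse, pattern_size, axis=0),
                              pattern_size, axis=1)
            cube[:, :, band] = dense[:I, :J]
    return cube
```

A smoother interpolation (e.g. bilinear between mosaic samples) would replace the `np.repeat` replication while keeping the same I x J x N output layout.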
  • the HSI system captures images in a field of view through a lens.
  • the lens may be positioned distally from the rest of the HSI system, in a difficult-to-access environment.
  • the lens may be positioned inside a confined space such as a patient's body cavity.
  • the lens is positioned outside of the body but still in the surgical field, thereby requiring careful manipulation.
  • the method 100 for calibrating pixel sensitivity involves providing 101 a reference object (also referred to as a partial reference).
  • the reference object is provided in the sense that it can be positioned within the environment of interest (that is, the environment for which the HSI imaging system is being white-balanced).
  • the reference object has a known reflectance spectrum, allowing calibration to be performed.
  • the pixel calibration method is sufficiently robust to allow a wide range of objects to be used as a reference object.
  • the reference object is white or close to white, ensuring that it reflects light equally well across at least the optical spectrum.
  • the reference object is diffusively reflective. This ensures that the reference object can be easily seen in low-light conditions, but does not direct bright, localised specular reflections into the lens of the HSI system. In other words, the reference object is matte rather than mirror-like or shiny.
  • the reference object has a consistent surface texture across the majority of its surface, ensuring that the reflectance spectrum of the reference object does not change substantially when viewed from different orientations.
  • the reference object has a primarily flat or convex geometry, to ensure that the reference object does not cast shadows or reflections on itself when lit, which would cause the reference object to have an inconsistent brightness.
  • a reference object is selected such that the spectral profile of the reference differs significantly from that of the background. This enables the reference object to be more easily distinguished from the background.
  • the reference object is moved 102 relative to the lens of the HSI imaging system over a time period. This may be done by manually manipulating the reference object or the HSI imaging system. Alternatively, a mechanical actuator (such as a robotic arm or mechanical mount) may be used to manipulate the reference object or HSI imaging system.
  • the HSI imaging system is held steady relative to the environment. This means that only the reference object is moving relative to the HSI imaging system, avoiding motion blurring of the environment. In alternative embodiments, it may be easier to move the lens of the HSI system while the reference object is held steady relative to the environment.
  • the HSI imaging system is configured to capture a plurality of images of the environment.
  • the HSI imaging system captures a video of the environment over a given time period, with each frame of the video comprising an image.
  • the HSI imaging system captures images within a field of view, the field of view being defined by the lens, optical pathways, sensor, and other components of the HSI imaging system.
  • the field of view is a cone with a fixed internal angle typically between 5 and 65 degrees. It will be understood that the field of view does not necessarily refer to a specific viewpoint within the environment. For example, the viewpoint of the HSI imaging system can be changed without changing the field of view, by moving the lens of the HSI imaging system to point towards a different part of the environment.
  • the HSI imaging system captures 103 a plurality of reference images within the field of view, such that in each reference image the reference object covers a different partial portion of the field of view. That is, at the start of the acquisition, the reference object is positioned at a first location relative to the lens, such that the reference object partially covers a first portion of the field of view. During the acquisition, the reference object is moved to a second portion relative to the lens, such that the reference object covers at least a second portion of the field of view. While the reference object is moved, the HSI imaging system captures a plurality of reference images. In capturing the reference images, the HSI imaging system may record spectral intensities at each pixel of the imaging sensor, and any other typical metadata such as time of acquisition.
  • the intensities measured by the HSI imaging system represent the reflectance intensities of the objects within the field of view, but are confounded by the sensitivity of the imaging setup, which the calibration seeks to compensate for.
  • the HSI imaging system is configurable to perform a main imaging procedure, in which the HSI imaging system captures imagery of the environment or subject for analysis.
  • the pixel calibration method 100 is carried out prior to or after the main imaging procedure.
  • the pixel calibration method 100 may be carried out alongside the main imaging procedure. That is, the reference images could be a sub-set taken from the images captured during the main imaging procedure. For example, the period of the main imaging procedure could begin with, or include as an intermediate step, passing a reference object around the field of view of the HSI imaging system.
  • the pixel calibration method 100 comprises a method for white balancing the HSI system, but it will be understood that the pixel calibration method 100 is not limited only to white balancing.
  • the reference images may be a sub-set of the images captured by the HSI imaging system over a given time period.
  • the reference object may be moved through the field of view slowly so that the reference object covers substantially the same portion of the field of view in two successive frames. Only one of these two frames may be used as a reference image, with the other image discarded.
  • Each reference image can be arranged as an array of spatiospectral pixels, that is, each pixel has an associated location (i, j) and at least one measured spectral intensity.
  • the reference images are arranged to form a hypercube where an additional axis is included to index the collection of reference images.
  • the image data in the reference images may be arranged to form an I x J x N x T hypercube, where T is the number of steps at which a reference image was captured during the time period.
  • This hypercube has two spatial axes with lengths I and J, corresponding to the two- dimensional sensor array.
  • the hypercube has one spectral wavelength axis with length N, corresponding to the wavelength bands captured.
  • the hypercube has one temporal axis with length T, corresponding to the time period over which the data was captured.
  • the intensity profile of each spatiospectral pixel or plurality of pixels can be easily generated from the hypercube, as each row of the hypercube along the temporal axis comprises the intensity profile for an individual spatiospectral pixel. That is, each intensity profile may be a temporal intensity profile for an individual spatiospectral pixel.
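In NumPy terms (array names hypothetical), extracting a temporal intensity profile from such a hypercube is a single slice along the temporal axis:

```python
import numpy as np

# Toy I x J x N x T hypercube of reference images (values are arbitrary)
rng = np.random.default_rng(0)
cube = rng.integers(0, 100, size=(4, 4, 3, 6)).astype(float)

# Temporal intensity profile of spatiospectral pixel (i, j) in band n:
# one intensity value per reference image, shape (T,)
i, j, n = 2, 1, 0
profile = cube[i, j, n, :]
```

This is why the hypercube arrangement is convenient: every intensity profile needed by the segment-identification step is a contiguous row of the array.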
  • a composite reference is then generated 104 using the reference images. A plurality of pixel locations from the field of view are selected 1041.
  • selecting a plurality of pixel locations in the field of view involves selecting, from the reference images, the pixels corresponding with each location. For example, pixel (i, j) and spectral band n could be selected from each I x J x N reference image. Image data corresponding to the location and spectral band is then selected 1042.
  • the selected 1042 image data and/or parameters are used to generate 1043 an intensity profile for the selected image data for a selected spectral band.
  • the intensity profile represents the spectral intensity in a given spectral band over time at a particular location. That is, the intensity profile can be graphically displayed as a graph of spectral intensity against time (as will be discussed in relation to Figure 2).
  • the intensity of the intensity profile changes as the reference object passes through the location. That is, for each spatiospectral pixel in the image plane, there is a temporal distribution of measured intensities, which alternates between background and reference intensities over time. This allows a segment of the intensity profile to be identified 1044, corresponding to the period of time in which the reference object was observed in a particular location in the field of view. In some embodiments, the segment is identified from one or more intensity profiles corresponding to one set of wavelength bands, and image data is extracted from intensity profiles corresponding to a second set of wavelength bands.
  • the method comprises processing the locations in the field of view in parallel, generating 1043 an intensity profile for each location and identifying and extracting a segment.
  • the method may comprise looping iteratively through locations in the field of view.
  • an intensity profile is generated for every pixel of the HSI sensor array.
  • a composite reference can be outputted.
  • an average intensity is calculated from the segments at each location, and the composite reference is an image which combines these average intensities.
  • the composite reference represents the measured reflectance intensity of the reference object at each spatiospectral location within the field of view.
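A minimal sketch of assembling the composite reference from per-pixel segments, assuming a median as the robust average and a `segments` mapping produced by the segment-identification step (both names are hypothetical):

```python
import numpy as np

def composite_reference(cube, segments):
    """Build a composite reference from an I x J x N x T hypercube.
    `segments` maps (i, j) -> (start, end) frame indices (inclusive) in
    which the reference object was observed at that pixel.  At each such
    pixel the median over the segment frames is taken as the robust
    average; locations never covered by the reference stay NaN."""
    I, J, N, T = cube.shape
    ref = np.full((I, J, N), np.nan)
    for (i, j), (start, end) in segments.items():
        ref[i, j, :] = np.median(cube[i, j, :, start:end + 1], axis=1)
    return ref
```

The NaN entries make explicit that the composite reference may be incomplete, which is what the synthetic-reference model construction later compensates for.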
  • the composite reference is then compared against the a priori known reflectivity of the reference object.
  • the reflectivity of an object is a measure of reflectance averaged over the spectral bands of interest. That is, we integrate the reflectance of the object over the spectral band of the imaging setup as disclosed in detail hereafter.
  • the reflectance of an object as given by its reflectance spectrum, is generally only a property of the surface material of the object, and therefore, the reflectivity only depends on the object properties and the spectral bands of the imaging system but does not otherwise depend on the imaging configuration (distance to target, vignetting, etc.).
  • the composite reference can be corrected 105 using the known reflectivity. In some embodiments, the reflectivity is less than 100%, and varies across the spectral bands of interest. As such, the reflectivity correction 105 may be referred to as a bandwise reflectivity correction.
  • the reflectivity correction factor for a spectral band n may be calculated as p_n^a = ∫ t(λ) b_n(λ) dλ / ∫ b_n(λ) dλ (1), where t(λ) is the known reflectance spectrum of the reference object for a wavelength λ, and b_n(λ) is the band response as obtained 110 experimentally or from the HSI imaging sensor manufacturer.
  • Each pixel of the initial composite reference W'_c(i, j, n) may then be divided by the a priori correction factor for the appropriate band to form the corrected composite reference: W_c(i, j, n) = W'_c(i, j, n) / p_n^a.
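Discretising the integral in equation (1) over sampled wavelengths, the bandwise reflectivity correction can be sketched as follows (the function and array names are illustrative):

```python
import numpy as np

def correct_reflectivity(w_composite, t, b):
    """Bandwise reflectivity correction of a composite reference.

    w_composite : (I, J, N) initial composite reference W'_c
    t           : (L,) reference object's reflectance spectrum t(lambda),
                  sampled at L wavelengths
    b           : (N, L) band responses b_n(lambda) at the same wavelengths

    The correction factor per band is the band-response-weighted average
    of the reflectance, p_n = sum(t * b_n) / sum(b_n); the corrected
    composite reference is W_c = W'_c / p_n, band by band."""
    p = (b * t).sum(axis=1) / b.sum(axis=1)          # shape (N,)
    return w_composite / p[np.newaxis, np.newaxis, :]
```

For a perfectly white reference (t(λ) = 1 everywhere) every p_n is 1 and the correction is a no-op, as expected.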
  • the reflectance spectrum of the reference object is measured 111. That is, an a priori measurement 111 of the reference object's reflectance spectrum is made.
  • the reflectance spectrum of the reference object is measured 111 using a calibrated measurement device such as a spectrophotometer.
  • the reflectivity correction factors may then be calculated from the reflectance spectrum measured by the calibrated measurement device.
  • the reference object can be imaged with the HSI imaging system and compared with an HSI image of a calibration object with a known reflectance spectrum.
  • a calibration object is a Spectralon 95% reflective tile. Spectralon is a material which acts as a highly reflective Lambertian surface using a specific formulation, but it will be understood that other reflectance standards are also available and could be used.
  • the calibration object has a uniform reflectance spectrum across its whole surface.
  • the calibration object is positioned relative to the HSI imaging system such that the whole field of view of the HSI imaging system is covered by the calibration object.
  • the reflectivity correction factors may be calculated 112 by taking a reflectivity profile (i.e. a well calibrated reflectance spectrum t(2)) of the calibration object and using equation (1) with the band responses b n (A) of the HSI imaging system.
  • the reflectivity correction factors are calculated by using the HSI imaging system to capture HSI images of both the reference object and of a calibration object with a known reflectance spectrum (such as a Spectralon tile) and dividing the former by the latter.
  • the reflectance spectrum of the reference object and/or calibration object are measured under known lighting conditions, that is, when they are lit by a light source with a known spectral profile.
  • a dark balancing step is performed to account for any bias in sensor noise.
  • a dark reference may be obtained by covering the lens with an opaque object (such as the lens cap), such that no light is transmitted into the sensor.
  • One or more dark images are then captured in this configuration.
  • the dark image (or an average of the dark images) is then subtracted from the reference and/or composite reference images to account for the sensor noise bias.
  • a mathematical model may be used to estimate a synthetic dark image based on exposure time, temperature of the sensor or any other known parameter of influence.
  • the images may also be corrected for exposure time using methods known in the art.
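The dark balancing and exposure correction steps can be combined in a simple sketch. The names and the linear exposure model are assumptions; as the text notes, a fuller model might also account for sensor temperature and other parameters of influence.

```python
import numpy as np

def dark_and_exposure_correct(image, dark_frames, exposure_ms, ref_exposure_ms):
    """Subtract the averaged dark reference (sensor noise bias), clip any
    negative residuals, and rescale linearly to a common exposure time."""
    dark = np.mean(np.asarray(dark_frames, dtype=float), axis=0)
    corrected = np.clip(np.asarray(image, dtype=float) - dark, 0.0, None)
    return corrected * (ref_exposure_ms / exposure_ms)
```

Averaging several dark frames before subtraction reduces the noise injected by the dark reference itself, which is why more than one dark image is typically captured.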
  • a synthetic reference can then be constructed 108.
  • the synthetic reference is simply the composite reference or the reflectivity-corrected 105 composite reference.
  • the composite reference may still retain imperfections from the input video. It may also be incomplete, in the sense that correction factors may not have been calculated for all spatiospectral locations in the field of view. These imperfections may be removed by fitting a white reference mathematical model to the composite reference. The white reference model is applied to the composite reference to construct 108 a synthetic reference.
  • the white reference is modelled as the separable product of a spatially dependent vignetting, spectrally dependent spectral sensitivities, and a scalar multiplicative factor with noise treated as negligible. That is, the white reference model is a product of a spectral sensitivity function and a spatial vignetting function.
  • the white reference model may be given by the formula: W(i, j, n) = M · V(i, j) · S(n) + N(i, j, n), where:
  • i and j are the spatial co-ordinates and n is the spectral co-ordinate (i.e. the wavelength band index)
  • V(i, j) is the spatial vignetting function
  • S(n) is the spectral sensitivity function
  • M is a scalar factor to account for the overall light intensity.
  • a noise function N(i, j, n) accounts for noise in both the spatial and spectral domains.
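Under the separable model (noise treated as negligible), a synthetic reference covering every pixel location is just an outer product of the estimated factors. The function name is hypothetical and assumes V and S have already been estimated:

```python
import numpy as np

def synthetic_reference(V, S, M=1.0):
    """Separable white reference model W(i, j, n) = M * V(i, j) * S(n),
    with noise treated as negligible.  Because V is defined over the full
    sensor, the synthetic reference covers every pixel location, even
    those the composite reference left uncovered."""
    return M * V[:, :, np.newaxis] * S[np.newaxis, np.newaxis, :]
```

This is also why extrapolation is cheap in this model: filling a missing pixel only requires evaluating the fitted V at that (i, j).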
  • the spatial vignetting function may be calculated 107 by fitting a two-dimensional parametric function to a spectral average of the reflectivity-corrected composite reference.
  • a weighted spectral average may be used, for example by compensating for spectral sensitivities.
  • a single spectral band may be used for the fit.
  • the two-dimensional parametric function is an isotropic Gaussian function.
  • the Gaussian function is represented by the equation: V(i, j) = exp(-((i - μ_i)^2 + (j - μ_j)^2) / (2σ^2)), where μ_i and μ_j represent the co-ordinates of the centre of the Gaussian in the i and j directions respectively and σ represents the standard deviation.
  • the white reference model is fitted to the composite reference using a joint least squares approach. In other examples a different normalisation choice could be used.
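As an illustration of recovering the isotropic Gaussian parameters, the sketch below uses closed-form intensity-weighted moments rather than the joint least-squares fit described in the text; it is a reasonable estimate (or initialiser for the full fit) when the Gaussian is well contained in the image. All names are hypothetical.

```python
import numpy as np

def fit_isotropic_gaussian(v):
    """Estimate the centre (mu_i, mu_j) and width sigma of an isotropic
    Gaussian vignetting profile from a 2-D spectral-average image using
    intensity-weighted moments."""
    v = np.clip(np.asarray(v, dtype=float), 0.0, None)
    total = v.sum()
    ii, jj = np.indices(v.shape)
    mu_i = (ii * v).sum() / total
    mu_j = (jj * v).sum() / total
    # For a 2-D isotropic Gaussian the mean squared radius equals 2*sigma^2
    r2 = (ii - mu_i) ** 2 + (jj - mu_j) ** 2
    sigma = np.sqrt((r2 * v).sum() / total / 2.0)
    return mu_i, mu_j, sigma
```

When the Gaussian is truncated by the image border (or by the circular content area), these moment estimates become biased, which is one reason a constrained least-squares fit is preferred in practice.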
  • the HSI imaging system includes a circular field of view, as may result from coupling an exoscope or laparoscope to a rectangular HSI imaging sensor.
  • the content area and field of view are circular while the sensor area is rectangular. This may be accounted for when constructing 108 the synthetic reference.
  • the field of view is automatically detected 113 by a segmentation method for content area detection or may be known beforehand through a calibration step.
  • a content area disk is detected 113 from a single image of the reference images. This may be performed using any method known in the art from simple methods such as thresholding to more advanced approaches such as a pre-trained neural network.
  • the radius of the content area disk is reduced 114 to a percentage of its original radius, such as a proportion between 90% and 98%. In some examples, the radius of the content area disk is reduced 114 to below 90% of its original radius. The reduced-radius disk is then used to mask the content area as a step in constructing 108 the synthetic white reference. In other examples, the radius of the content area disk is not reduced, thereby utilizing all of the information from the content area disk.
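A minimal sketch of the content area disk detection and radius reduction described above, using simple thresholding (more advanced approaches such as a pre-trained neural network are equally applicable); the function name, threshold value, and synthetic frame are illustrative assumptions:

```python
import numpy as np

def detect_content_disk(frame, thresh=0.1, shrink=0.95):
    """Detect a bright circular content area by thresholding a single
    reference image, then reduce the radius to discount edge effects."""
    mask = frame > thresh
    ii, jj = np.nonzero(mask)
    ci, cj = ii.mean(), jj.mean()            # disk centre from the mask
    r = np.sqrt(mask.sum() / np.pi)          # radius recovered from disk area
    r_reduced = shrink * r                   # e.g. 90-98% of the original radius
    yy, xx = np.indices(frame.shape)
    reduced = (yy - ci)**2 + (xx - cj)**2 <= r_reduced**2
    return reduced, (ci, cj, r)

# Synthetic frame: bright disk (field of view) on a dark rectangular sensor.
frame = np.zeros((100, 120))
yy, xx = np.indices(frame.shape)
frame[(yy - 50)**2 + (xx - 60)**2 <= 40**2] = 1.0
mask, (ci, cj, r) = detect_content_disk(frame)
```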
  • Figure 2 shows an unfiltered intensity profile 201. Note that since in this example the time interval between image frames is consistent, the frame number can be used as a substitute for the time on the x-axis.
  • the intensity profile is pre-processed before the segment is identified 1044, to remove noise and allow the reference and background intensities to be distinguished from each other more easily.
  • the pre-processing may comprise a smoothing filter and/or a threshold filter.
  • the filtered temporal profile 202 is generated using a smoothing filter and a threshold filter, wherein the threshold is set well below the average reference intensity and well above the average background intensity.
  • the threshold filter discards all intensity values whose intensity lie below a lower threshold. This suppresses the impact of background intensity values.
  • the lower threshold is calculated using the classical, parameter-free Otsu method.
  • the threshold filter also discards all intensity values with an intensity above an upper threshold. This mitigates the impact of specular reflections, which can saturate the HSI imaging sensor.
  • the smoothing filter is a Savitzky-Golay filter, with window size 15 and order 2. Note that in some embodiments, the reference object may pass multiple times through the location. As such, multiple segments 206 can be identified 1044.
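The pre-processing described above may be sketched as follows. For self-containment, this sketch substitutes a plain moving average for the Savitzky-Golay filter and implements the parameter-free Otsu threshold directly; the function names and the synthetic profile are illustrative assumptions:

```python
import numpy as np

def otsu_threshold(values, bins=64):
    """Classical parameter-free Otsu threshold: maximise between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                       # class-0 weight per candidate cut
    m = np.cumsum(p * centers)              # cumulative first moment
    mt = m[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mt * w0 - m)**2 / (w0 * (1 - w0))
    return centers[np.nanargmax(var_between)]

def preprocess_profile(profile, window=15):
    """Smooth the temporal intensity profile, then suppress values below the
    Otsu threshold (background) by setting them to zero."""
    kernel = np.ones(window) / window            # moving average stands in for
    smoothed = np.convolve(profile, kernel, "same")  # a Savitzky-Golay filter
    t = otsu_threshold(smoothed)
    return np.where(smoothed >= t, smoothed, 0.0)

# Synthetic profile: background ~0.1, reference object in view over frames 40-80.
profile = np.full(120, 0.1)
profile[40:80] = 0.9
filtered = preprocess_profile(profile)
```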
  • a segment 206 is identified 1044 by calculating a temporal gradient of the intensity profile, that is, the time derivative of the intensity profile.
  • the temporal gradient 203 is the time derivative of the filtered intensity profile 202.
  • the intensity in one or more wavelengths will change quickly over time. Specifically, if the reference object has a higher reflectance intensity than the background (as shown in Figure 2), the temporal gradient 203 will have a large positive peak 203p when the reference object comes into view, and a large negative peak 203n when the reference object goes out of view.
  • a positive peak 203p followed by a negative peak 203n, which can be detected using a peak-picking algorithm, identifies a segment 206.
  • the positive and negative peak are only identified if they each have an absolute value greater than a threshold value. This ensures that small peaks in the temporal gradient (e.g. a gradual change in intensity caused by shadowing on the reference object) are not mis-identified as the start or end of a segment 206.
  • the highlighted datapoints 205 comprise image data in the segments 206.
  • An average of the datapoints 205, such as a median, an M-estimator-based average, or any other form of robust averaging (that is, any averaging method not easily swayed by outliers), can be utilised to give an intensity for the reference object at that location.
  • the segments 206 are located from the filtered (e.g. by smoothing and thresholding) intensity profile, but the image data for the segments 206 is taken from the unfiltered raw intensity profile.
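The gradient-based segment identification and robust averaging described above may be sketched as follows, with the segment located on the filtered profile while the intensity is taken as a median of the raw profile; the names, threshold value, and synthetic profiles are illustrative assumptions:

```python
import numpy as np

def segment_intensity(raw, filtered, grad_thresh=0.2):
    """Locate a segment from the temporal gradient of the filtered profile
    (large positive peak at entry, large negative peak at exit), then take a
    robust average (median) of the *raw* profile inside the segment."""
    grad = np.diff(filtered)
    start = int(np.argmax(grad))           # largest positive peak
    end = int(np.argmin(grad))             # largest negative peak
    if grad[start] < grad_thresh or -grad[end] < grad_thresh or end <= start:
        return None                        # peaks too small: no segment found
    return float(np.median(raw[start + 1:end + 1]))

# Synthetic profiles: reference object in view over frames 40-80.
raw = np.full(120, 0.1)
raw[40:80] = 0.9
raw[55] = 2.0                              # outlier (e.g. specular reflection)
filtered = np.full(120, 0.1)               # filtered profile, outlier removed
filtered[40:80] = 0.9
intensity = segment_intensity(raw, filtered)
```

The median leaves the estimated intensity unaffected by the specular outlier that survives in the raw data.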
  • a temporal intensity profile is generated for a plurality of pixel locations, and then the temporal location of the reference object is identified for each intensity profile.
  • a segment 206 is identified 1044 by applying an image segmentation method for each image in the plurality of images to locate the reference object in the image. In other words, rather than finding the time a reference object passes through a location, the location of the reference object is found for a particular time. The pixel locations are selected if the reference object is detected within them using the image segmentation method, and the corresponding image data for those locations at the timestamp of the image is selected for the segments 206.
  • a method known in the art such as based on deep learning is used to spatially segment the area corresponding to the reference object in the reference images.
  • the segmented spatial areas may be reduced to a percentage of their original size so as to discount edge effects.
  • the synthetic white reference may be used to apply 109 pixel calibration on the images of the target object.
  • Pixel calibration may be performed according to the following formula:

R = ρ_n · (I / τ_I − D / τ_D) / (W / τ_W − D / τ_D)

where I is the captured image of the target object, W is the synthetic white reference in any of the disclosed embodiments (e.g. composite white reference or white reference model), D is a dark image, and R is the white-balanced reflectance estimation of the target object of interest.
  • τ_I, τ_W, and τ_D describe the exposure times used for acquisition of each of the above, and ρ_n is the band-wise reflectivity of the white reference, used when the white reference has not been pre-emptively compensated for it.
  • relative pixel calibration may be sufficient. In such instances, normalization is performed only based on the estimated spectral sensitivity function with no influence from the vignetting function.
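A minimal sketch of the exposure-normalised calibration formula above, assuming a dark signal expressed as a rate per unit exposure time; the function name and numeric values are illustrative assumptions:

```python
import numpy as np

def calibrate(img, white, dark, tau_i, tau_w, tau_d, rho=1.0):
    """White-balanced reflectance estimate:
    R = rho * (img / tau_i - dark / tau_d) / (white / tau_w - dark / tau_d)."""
    num = img / tau_i - dark / tau_d
    den = white / tau_w - dark / tau_d
    return rho * num / den

# A target reflecting half as much light as the white reference should yield
# R = 0.5 even when acquired at a different exposure time (here doubled).
white = np.full((4, 4, 3), 0.8)        # synthetic white reference, tau_w = 1
dark = np.full((4, 4, 3), 0.02)        # dark image (dark rate), tau_d = 1
img = np.full((4, 4, 3), 2.0 * 0.41)   # signal 0.39 + dark rate 0.02, x2 exposure
R = calibrate(img, white, dark, tau_i=2.0, tau_w=1.0, tau_d=1.0)
```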
  • the composite reference is generated for only a partial portion of the field of view.
  • the partial reference may only pass through a subset of the field of view, and the composite reference may be generated corresponding only for that subset.
  • Computation of a synthetic white reference may be performed by calculating the white reference model from the composite reference, and then applying the white reference model to the whole field of view by means of extrapolation.
  • Figure 3 shows a sterile ruler 300, which may be used as a reference object in some embodiments.
  • Sterile rulers are widely available in surgical theatres and can be placed on tissue in the surgical field. Furthermore, it has been found that the spectral profile of mass-produced sterile rulers is generally consistent across individual units. Thus, in some embodiments the spectral profile of a first sterile ruler can be measured as previously described.
  • a second sterile ruler of the same design can be used as the reference object. This ensures that the second sterile ruler can remain in a sealed sterile container until just before insertion into a patient.
  • the unmarked side 300b of the sterile ruler 300 is faced towards the lens, rather than the marked side 300a.
  • the unmarked side 300b of many mass-produced sterile rulers has a more consistent spectral profile, because it has a more matte aspect than the marked side across its whole surface.
  • the reflectivity of one or more reference objects is stored onboard the HSI system. An operator then selects the reflectivity corresponding to the reference object provided. For example, the reflectivity of various designs of commercially available sterile rulers may be measured. The user then selects the reflectivity corresponding to the particular design of sterile ruler which is provided.
  • the reflectivity of one reference object may be used to calculate 112 the reflectivity correction factors for a different reference object. This is because the same reflectance spectrum can generally be used for any reference object made of a particular material, so long as the geometry of the reference object is not so extreme as to distort the reflectance spectrum of the object.
  • the HSI imaging system 400 comprises an exoscope or laparoscope 401, operating as a lens located at a distal end of the HSI system.
  • the exoscope or laparoscope is optically coupled via an adapter 404 to an HSI imaging sensor 402.
  • the HSI imaging sensor 402 is connected to a processor 405, either directly through a cable 403 as shown, or through a wireless communications interface.
  • the processor is configured to generate images for a display 409.
  • the HSI imaging system 400 is configured to capture images within an environment 407.
  • a reference object 408 is provided within the environment, so that a pixel calibration method as previously described can be carried out.
  • the reference object 408 is moved relative to the field of view 410 of the HSI imaging system 400.
  • the exoscope or laparoscope 401 is optically connected to a light source 406, providing illumination within the environment 407.
  • There are three broad categories of HSI image sensors 402: spatial scan (including linescan), spectral scan, and snapshot.
  • Linescan cameras acquire data for all wavelengths simultaneously for each line of pixels sequentially, whereas spectral scan cameras collect all pixels at each wavelength in turn using band pass filters. These methods both measure complete hypercubes.
  • These HSI approaches are however not able to provide real-time data as their respective scanning mechanisms result in long acquisition times that are prone to motion artefacts.
  • snapshot mosaic cameras utilize sensors where each pixel has a dedicated band pass filter. This provides information on one band per pixel but captures all pixels in a single shot. This allows highly time resolved data to be obtained, at the cost of lower spatial and spectral resolution.
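By way of illustration, extracting per-band sub-images from a snapshot mosaic frame (here a hypothetical 4 x 4 filter pattern giving 16 bands, one band per pixel) may be sketched as follows; interpolation back to full resolution is omitted:

```python
import numpy as np

def mosaic_to_bands(frame, pattern=4):
    """Split a snapshot-mosaic frame into per-band sub-images: pixel (i, j)
    carries band (i % pattern) * pattern + (j % pattern). Each band image has
    1/pattern of the spatial resolution in each direction."""
    return np.stack([frame[i::pattern, j::pattern]
                     for i in range(pattern)
                     for j in range(pattern)], axis=-1)

# Synthetic 4x4-mosaic frame where each pixel records its own band index.
I, J, P = 16, 16, 4
ii, jj = np.meshgrid(np.arange(I), np.arange(J), indexing="ij")
frame = ((ii % P) * P + (jj % P)).astype(float)
bands = mosaic_to_bands(frame)
```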
  • a reference image 500 has a field of view 501 defined by a circular lens and exoscope or laparoscope.
  • the reference object 502 partially covers the field of view 501, in the sense that the reference object covers a partial portion of the field of view.
  • the rest of the field of view 501 is taken up by the background environment 503.
  • the reference object 502 is more reflective than the environment 503 across the visual spectrum, meaning that the reference object 502 appears generally whiter and brighter to the HSI imaging system than the environment 503 (excluding the specular reflections 504 from the background environment).
  • Figure 6 shows a computer readable medium 600 comprising computer program code configured to cause a controller comprising a processor and a memory to operate as described herein. It will be understood that in some embodiments, processing operations which have been described as being carried out sequentially may be carried out in parallel. Other optimisations for computational efficiency are also envisioned.


Abstract

A method for calibrating pixel sensitivity for pixels of a hyperspectral image sensor, wherein the hyperspectral image sensor is configured to capture image data, the image data including spectral intensities measured within a plurality of spectral bands, and wherein the method comprises the steps of: providing a reference object; moving the reference object and hyperspectral image sensor relative to each other; capturing a plurality of reference images within a field of view of the hyperspectral image sensor, such that in each reference image the reference object covers a different partial portion of the field of view; selecting a plurality of pixel locations from the field of view; generating a composite reference; and calibrating the pixel sensitivity of the pixels of the hyperspectral image sensor using the composite reference; wherein generating the composite reference comprises, for each of the plurality of pixel locations: selecting, from each image in the plurality of images, image data corresponding to the pixel location; generating, from the selected image data, an intensity profile for the pixel location; and identifying within the intensity profile a segment corresponding to the reference object being observed at the pixel location; and combining the selected segments.

Description

A METHOD AND APARATUS FOR CALIBRATING PIXEL SENSITIVITY FOR PIXELS OF A HYPERSPECTRAL IMAGE SENSOR
BACKGROUND TO THE INVENTION
Hyperspectral imaging (HSI), also referred to as multispectral imaging or simply spectral imaging, is becoming increasingly common in a wide range of applications. In conventional photography, the optical spectrum is typically only captured through three broad wavelength bands in the visible range (red, green and blue (RGB)). RGB images appear realistic to the human eye only because the human eye also captures the optical spectrum through three similar wavelength bands. However, an RGB image provides insufficient data to reconstruct the optical spectrum of a target. By contrast, HSI imaging systems capture the optical spectrum through many wavelength bands (typically at least 10), allowing the optical spectrum to be accurately calculated.
HSI provides multi-channel spectral imaging data across a field of view, where each channel represents a narrow spectral measurement centred around a given wavelength, i.e. a spectral band. The diffusely reflected light is measured across at least one spectral band for any given pixel of an image and all spectral bands are captured across the spatial field of view. The diffuse reflection is mostly determined by the scattering properties and absorbing or emitting species in the target being examined. HSI therefore provides the opportunity for non-invasive, quantitative analysis of a target to investigate relevant parameters.
To enable parameter extraction from HSI images, accurate spectra must typically be extracted from the images. This requires white balancing the HSI system for the particular target/imaging geometry and environment as an initial step. White balancing typically uses a white and dark reference to account for the lighting conditions in the environment (including light source and illumination distribution), the positioning of the target object with respect to the camera, vignetting, optical transmission through the HSI system, exposure time of the camera, quantum efficiency of the sensors, inherent background sensor noise and the like. It is known that improper white balancing leads to increases in all quantitative and qualitative errors. Typical methods for white balancing involve taking a white reference image at the appropriate distance using a reference plate, which is a well characterized, uniform, highly reflective, Lambertian surface. The reference plate is positioned at the same distance from the camera as the target object will be and sized so as to fill the whole field of view at this position. The reference plate is held in place while the HSI system captures an image. The white reference image capture may take much longer than a single camera image, because of the requirement for appropriate positioning, the need to minimise noise, which often leads to using frame averaging, and the overall complexity of introducing the reference plate into the scene of interest.
Existing HSI systems relying on standard white balancing face limitations in certain applications. It is difficult to perform white balance calibration in confined spaces, because there may be inadequate space for inserting the reference plate and holding it in place in front of the HSI system's lens. It is also difficult to ensure that the same camera-to-object distance and lighting conditions are kept across the white reference and the target image. It can also be difficult to hold the reference plate in place relative to the lens as an image is captured, particularly (as is often required for imaging a confined space) where the lens is mounted on the distal end of some kind of flexible or handheld mount such as an exoscope or laparoscope. In some cases, there is simply insufficient space to perform white balancing in situ, and an alternative method must be used for calibration (for example, using a preset calibration profile which appears to be similar to the current environment). These alternative methods provide less accurate white balancing, reducing the accuracy of the parameters extracted from the HSI images.
Accordingly, there is a need for systems and methods which enable accurate HSI imaging in confined spaces.
SUMMARY OF THE INVENTION
According to a first example, there comprises a method for calibrating pixel sensitivity for pixels of a hyperspectral image sensor, wherein the hyperspectral image sensor is configured to capture image data, the image data including spectral intensities measured within a plurality of spectral bands, and wherein the method comprises the steps of: providing a reference object; moving the reference object and hyperspectral image sensor relative to each other; capturing a plurality of reference images within a field of view of the hyperspectral image sensor, such that in each reference image the reference object covers a different partial portion of the field of view; selecting a plurality of pixel locations from the field of view; generating a composite reference; and calibrating the pixel sensitivity of the pixels of the hyperspectral image sensor using the composite reference; wherein generating the composite reference comprises, for each of the plurality of pixel locations: selecting, from each image in the plurality of images, image data corresponding to the pixel location; generating, from the selected image data, an intensity profile for the pixel location; and identifying within the intensity profile a segment corresponding to the reference object being observed at the pixel location; and combining the selected segments.
One potential application of the method is in pre-clinical and clinical studies to non-invasively provide information for disease diagnosis and surgical guidance. Existing methods are limited in their capability to provide absolute, quantitative pixel calibration matching the acquisition setup used for the target object of interest. By providing correctly calibrated images of a tissue, an HSI system can be used to measure physiologically relevant parameters such as tissue oxygen saturation. In some embodiments, the method may be used to calibrate the pixels of an HSI system which has a lens located in the surgical field (that is, the environment surrounding and including the patient in which sterilisation is maintained) of an open surgery or inserted inside a patient's body (either through a surgical incision, or through a patient's orifice such as the mouth, ear, or anus). A small, sterile reference object can be inserted in the surgical field near the HSI lens and used to perform the calibration. Such a reference object is significantly easier to insert and manipulate than the reference plates known in the art, which must be sufficiently large and positioned sufficiently close to the lens to cover the entire field of view while a conventional white balancing procedure is carried out.
Moreover, conventional reference plates, such as 95% reflective Spectralon tiles, cannot be sterilized and are very sensitive, making them challenging to integrate in a surgical environment. This necessitates that white balancing is acquired outside of the surgical field, resulting in different imaging conditions relative to the subject. Ideally, a new white reference should be acquired whenever the imaging conditions are altered which, at best, disrupts the clinical workflow and, at worst, is impossible. This forces the settings to be kept constant throughout a surgical procedure, thereby limiting the HSI system's use, or risks losing the ability to acquire quantitative spectral data. The method can use a wider range of objects as the reference object, including standard sterile objects typically used. This enables calibration in situ inside the body under the correct imaging conditions, as well as re-calibration during the procedure if the lighting conditions change.
In an example, a segment for a selected pixel location comprises a portion of the intensity profile in which an intensity value is between an upper and lower reference object threshold, the upper and lower reference object threshold being higher intensity than a background value. In an example the upper reference object threshold is lower than a saturation or specular reflection value.
In an example, generating a segment for a selected pixel location comprises: calculating a derivative profile of the intensity profile; and identifying a positive peak and a negative peak in the derivative profile, the positive and negative peak each having an absolute value greater than a derivative threshold, and the segment comprises a period defined between the positive and negative peaks.
In an example, generating a segment for a selected pixel location comprises: applying an image segmentation method for each image in the plurality of images to locate the reference object in the image; and considering an image data point at the selected pixel location to be part of the segment if it falls within the detected reference object.
In an example, identifying a segment further comprises calculating an average intensity of the intensity profile within the reference object observation period. In another example, the average is a median, an M-estimator-based average, or any other form of robust averaging.
In an example, the method further comprises correcting a reflectivity of the generated composite reference using a reflectivity for the reference object, the reflectivity comprising a correction factor corresponding to at least one spectral band of the composite reference.
In an example, combining the selected segments further comprises the steps of: generating a parametric white reference model; and applying the white reference model to the composite reference to generate a synthetic reference.
In an example, the synthetic reference is generated by extrapolating the white reference model from the composite reference so as to cover substantially all pixel locations in the field of view.
In an example, the white reference model comprises a product of a spectral sensitivity function and a spatial vignetting function.
In an example, the spectral sensitivity function is parametrically modelled as a scalar value per spectral band and calculating the spectral sensitivity function includes calculating a spatial average of the composite reference for at least one spectral band of the composite reference.
In an example, calculating the spatial vignetting function includes fitting a two-dimensional parametric function to a spectral average of the composite reference.
In an example, the two-dimensional parametric function is an isotropic Gaussian function.
In an example, the method further comprises pre-processing each of the intensity profiles using a threshold or smoothing filter.
In an example, calibrating the sensitivity of the pixels composing the hyperspectral image sensor is performed by using only the calculated spectral sensitivity function. This means that relative data can be acquired with spectral sensitivities alone using values that can be pre-calculated for a given configuration or calculated in situ. While spectral sensitivities are mostly independent of the distance to the imaged object, this is not the case for the spatial function. It is thus significantly harder or even impossible to calculate a spatial function if the distance to the imaged object cannot be maintained and so it is likely that in many situations it would be advantageous to only use precalculated spectral sensitivities.
In an example, the pixel locations selected from the field of view, and from which segments are selected to generate the composite reference, collectively cover between 10% and 100% of the total field of view.
In an example, the field of view is automatically determined by a segmentation method for content area detection.
In an example, the reference object has a reproducible reflectivity. In an example, the reference object is white.
In an example, the reference object has a matte reflectance spectrum. That is, the reference object has a diffuse reflectivity.
In an example, calibrating pixel sensitivity involves calibrating the white balance of the pixels. In an example, each pixel of the hyperspectral image sensor is configured for measuring photometric intensity corresponding to a single spectral band.
In a second example, there comprises an apparatus for calibrating an HSI image sensor using a method according to any preceding example.
In an example, the HSI image sensor is configured to capture images in a snapshot mode.
In an example, pixels of the HSI image sensor are configured to capture images such that spectral bands are arranged in a mosaic pattern.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1(a) is a flowchart illustrating an exemplary method 100 for calibrating pixel sensitivity. Figure 1(b) is a sub-flowchart of Figure 1(a), showing the process for generating composite reference images.
Figure 2 is a graph showing exemplary intensity profiles calculated for a pixel at a particular spectral band.
Figure 3 shows a sterile ruler 300 for use as an exemplary reference object.
Figure 4 shows an exemplary HSI imaging system.
Figure 5 shows an exemplary reference image taken by a HSI imaging system.
Figure 6 shows a computer readable medium.
DETAILED DESCRIPTION
An exemplary HSI system is configured to capture image data, by measuring spectral intensities at each pixel of an I by J image sensor. The image data captures information across space, wavelength, and time. Each pixel measures the spectral intensity of light at N distinct wavelengths in the optical spectrum. That is, the image data captured by the HSI imaging system at an imaging time T can be represented as an I x J x N array, with each I x J layer of the array representing the spectral intensity of each pixel for a particular wavelength. In some embodiments, N is at least 9. In some embodiments, N is at least 16. In some examples, each pixel of the HSI sensor is configured for measuring photometric intensity corresponding to a single spectral band, with the N spectral bands being scattered across all the pixel locations. An interpolation strategy is then used to form an HSI system able to reconstruct all N spectral bands at each pixel location, that is, the creation of an I x J x N array.
The HSI system captures images in a field of view through a lens. The lens may be positioned distally from the rest of the HSI system, in a difficult-to-access environment. For example, the lens may be positioned inside a confined space such as a patient's body cavity. In another example, the lens is positioned outside of the body but still in the surgical field, thereby requiring careful manipulation.
An exemplary method 100 for calibrating pixel sensitivity is described in relation to Figure 1. The method 100 for calibrating pixel sensitivity involves providing 101 a reference object (also referred to as a partial reference). The reference object is provided in the sense that it can be positioned within the environment of interest (that is, the environment for which the HSI imaging system is being white-balanced).
The reference object has a known reflectance spectrum, allowing calibration to be performed. In some embodiments, the pixel calibration method is sufficiently robust to allow a wide range of objects to be used as a reference object. In some embodiments, the reference object is white or close to white, ensuring that it reflects light equally well across at least the optical spectrum. In some examples, the reference object is diffusively reflective. This ensures that the reference object can be easily seen in low-light conditions, but does not direct bright, localised specular reflections into the lens of the HSI system. In other words, the reference object is matte rather than mirror-like or shiny. In some embodiments, the reference object has a consistent surface texture across the majority of its surface, ensuring that the reflectance spectrum of the reference object does not change substantially when viewed from different orientations. Similarly, in some embodiments the reference object has a primarily flat or convex geometry, to ensure that the reference object does not cast shadows or reflections on itself when lit, which would cause the reference object to have an inconsistent brightness. In some embodiments, a reference object is selected such that the spectral profile of the reference differs significantly from that of the background. This enables the reference object to be more easily distinguished from the background. The reference object is moved 102 relative to the lens of the HSI imaging system over a time period. This may be done by manually manipulating the reference object or the HSI imaging system. Alternatively, a mechanical actuator (such as a robotic arm or mechanical mount) may be used to manipulate the reference object or HSI imaging system. In some embodiments, the HSI imaging system is held steady relative to the environment. 
This means that only the reference object is moving relative to the HSI imaging system, avoiding motion blurring of the environment. In alternative embodiments, it may be easier to move the lens of the HSI system while the reference object is held steady relative to the environment.
The HSI imaging system is configured to capture a plurality of images of the environment. In some embodiments, the HSI imaging system captures a video of the environment over a given time period, with each frame of the video comprising an image.
The HSI imaging system captures images within a field of view, the field of view being defined by the lens, optical pathways, sensor, and other components of the HSI imaging system. In some embodiments, the field of view is a cone with a fixed internal angle typically between 5 and 65 degrees. It will be understood that the field of view does not necessarily refer to a specific viewpoint within the environment. For example, the viewpoint of the HSI imaging system can be changed without changing the field of view, by moving the lens of the HSI imaging system to point towards a different part of the environment.
The HSI imaging system captures 103 a plurality of reference images within the field of view, such that in each reference image the reference object covers a different partial portion of the field of view. That is, at the start of the acquisition, the reference object is positioned at a first location relative to the lens, such that the reference object partially covers a first portion of the field of view. During the acquisition, the reference object is moved to a second location relative to the lens, such that the reference object covers at least a second portion of the field of view. While the reference object is moved, the HSI imaging system captures a plurality of reference images. In capturing the reference images, the HSI imaging system may record spectral intensities at each pixel of the imaging sensor, and any other typical metadata such as time of acquisition. The intensities measured by the HSI imaging system represent the reflectance intensities of the objects within the field of view but are confounded by the sensitivity of the imaging setup which our calibration seeks to compensate.

In some embodiments, the HSI imaging system is configurable to perform a main imaging procedure, in which the HSI imaging system captures imagery of the environment or subject for analysis. In some embodiments, the pixel calibration method 100 is carried out prior to or after the main imaging procedure. Alternatively, the pixel calibration method 100 may be carried out alongside the main imaging procedure. That is, the reference images could be a sub-set taken from the images captured during the main imaging procedure. For example, the period of the main imaging procedure could begin with, or include as an intermediate step, passing a reference object around the field of view of the HSI imaging system.
In some examples, the pixel calibration method 100 comprises a method for white balancing the HSI system, but it will be understood that the pixel calibration method 100 is not limited only to white balancing.
The reference images may be a sub-set of the images captured by the HSI imaging system over a given time period. For example, the reference object may be moved through the field of view slowly so that the reference object covers substantially the same portion of the field of view in two successive frames. Only one of these two frames may be used as a reference image, with the other image discarded. Each reference image can be arranged as an array of spatiospectral pixels, that is, each pixel has an associated location (i,j) and at least one measured spectral intensity.
In some embodiments, the reference images are arranged to form a hypercube where an additional axis is included to index the collection of reference images. For clarity we refer to this additional dimension as time, but it should be clear that the reference images may not necessarily need to be acquired consecutively. For example, the image data in the reference images may be arranged to form an I x J x N x T hypercube, where T is the number of steps at which a reference image was captured during the time period. This hypercube has two spatial axes with lengths I and J, corresponding to the two-dimensional sensor array. The hypercube has one spectral wavelength axis with length N, corresponding to the wavelength bands captured. Finally, the hypercube has one temporal axis with length T, corresponding to the time period over which the data was captured. The intensity profile of each spatiospectral pixel or plurality of pixels can be easily generated from the hypercube, as each row of the hypercube along the temporal axis comprises the intensity profile for an individual spatiospectral pixel. That is, each intensity profile may be a temporal intensity profile for an individual spatiospectral pixel.

A composite reference is then generated 104 using the reference images. A plurality of pixel locations from the field of view are selected 1041. Since each pixel in the sensor array captures a section of the field of view, in some embodiments selecting a plurality of pixel locations in the field of view involves selecting a plurality of pixels corresponding with that location in the field of view from the reference images. For example, pixel (i,j) and spectral band n could be selected from each I x J x N reference image. Corresponding image data is selected 1042 corresponding to the location and spectral band.
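By way of non-limiting illustration, the hypercube arrangement and the extraction of a temporal intensity profile may be sketched in Python with NumPy; all dimensions and values below are hypothetical placeholders for real sensor data.

```python
import numpy as np

# Hypothetical stack of T reference images, each I x J x N
# (two spatial axes, one spectral axis).
I, J, N, T = 8, 8, 4, 50
rng = np.random.default_rng(0)
reference_images = [rng.random((I, J, N)) for _ in range(T)]

# Arrange the reference images into an I x J x N x T hypercube by
# stacking them along a new temporal axis.
hypercube = np.stack(reference_images, axis=-1)   # shape (I, J, N, T)

# The temporal intensity profile of spatiospectral pixel (i, j) in band n
# is simply a row of the hypercube along the temporal axis.
i, j, n = 2, 3, 1
profile = hypercube[i, j, n, :]                   # shape (T,)
```

Each such profile can then be analysed independently to locate the reference object in time, as described below.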
The selected 1042 image data and/or parameters are used to generate 1043 an intensity profile for the selected image data for a selected spectral band. The intensity profile represents the spectral intensity in a given spectral band over time at a particular location. That is, the intensity profile can be graphically displayed as a graph of spectral intensity against time (as will be discussed in relation to Figure 2).
Due to the relative motion of the reference object and the lens, and the different spectral profiles of the background environment and reference object, the intensity of the intensity profile changes as the reference object passes through the location. That is, for each spatiospectral pixel in the image plane, there is a temporal distribution of measured intensities, which alternates between background and reference intensities over time. This allows a segment of the intensity profile to be identified 1044, corresponding to the period of time in which the reference object was observed in a particular location in the field of view. In some embodiments, the segment is identified from one or more intensity profiles corresponding to one set of wavelength bands, and image data is extracted from intensity profiles corresponding to a second set of wavelength bands. For example, the reference intensity may differ from the background intensity most significantly at the band centred around 550 nm wavelength if this is the most sensitive spectral band for the selected imaging setup. The segment is identified from the intensity profile for this wavelength band at a location. Image data is then extracted at the corresponding segments from intensity profiles for other wavelength bands at that location. Exemplary methods for identifying the segments will be discussed in relation to Figure 2.
Once the segment is identified, the corresponding image data is extracted. The extracted image data for the segment at that spatiospectral location is combined with image data for segments at other locations. In the embodiment shown, the method comprises processing the locations in the field of view in parallel, generating 1043 an intensity profile for each location and identifying and extracting a segment. In other examples, the method may comprise looping iteratively through locations in the field of view. In some embodiments, an intensity profile is generated for every pixel of the HSI sensor array.
Once all available locations of interest have been processed 1046, a composite reference can be outputted. In some embodiments, an average intensity is calculated from the segments at each location, and the composite reference is an image which combines these average intensities. In other words, the composite reference represents the measured reflectance intensity of the reference object at each spatiospectral location within the field of view.
The composite reference is then compared against the a priori known reflectivity of the reference object. The reflectivity of an object is a measure of reflectance averaged over the spectral bands of interest. That is, we integrate the reflectance of the object over the spectral band of the imaging setup as disclosed in detail hereafter. The reflectance of an object, as given by its reflectance spectrum, is generally only a property of the surface material of the object, and therefore, the reflectivity only depends on the object properties and the spectral bands of the imaging system but does not otherwise depend on the imaging configuration (distance to target, vignetting, etc.). The composite reference can be corrected 105 using the known reflectivity. In some embodiments, the reflectivity is less than 100%, and varies across the spectral bands of interest. As such, the reflectivity correction 105 may be referred to as a bandwise reflectivity correction.
In some embodiments, variation in spectrum due to small angle changes is small, and the images of the reference object appear matte with few specular reflections. Thus, the reference object can be approximated as having a Lambertian surface. The ideal imaging condition is fronto-parallel imaging, so angle deviations from this are considered sufficiently small as to have negligible effect. In other words, the reflectivity does not vary significantly with small angle deviations. From there, an a priori reflectivity correction factor ρ_n^a is calculated per wavelength band n:

ρ_n^a = ∫ t(λ) b_n(λ) dλ / ∫ b_n(λ) dλ (1)

where t(λ) is the known reflectance spectrum of the reference object for a wavelength λ, and b_n(λ) is the band response as obtained 110 experimentally or from the HSI imaging sensor manufacturer. Each pixel of the initial composite reference W′c(i,j,n) may then be divided by the a priori correction factor ρ_n^a for the appropriate band to form the corrected composite reference: Wc(i,j,n) = W′c(i,j,n)/ρ_n^a.

The reflectance spectrum of the reference object is measured 111. That is, an a priori measurement 111 of the reference object's reflectance spectrum is made.
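As a non-limiting numerical sketch of equation (1), the per-band correction factor can be evaluated by discretising both integrals on a common wavelength grid; the reflectance spectrum and band response below are illustrative curves, not measured data.

```python
import numpy as np

# Hypothetical reflectance spectrum t(lambda) of the reference object and
# a hypothetical band response b_n(lambda) centred around 550 nm.
wavelengths = np.linspace(400.0, 1000.0, 601)                        # nm, uniform grid
t = 0.9 + 0.05 * np.sin(wavelengths / 100.0)                         # illustrative t(lambda)
band_response = np.exp(-0.5 * ((wavelengths - 550.0) / 15.0) ** 2)   # illustrative b_n(lambda)

# rho_n^a = integral(t * b_n) / integral(b_n); on a uniform grid the
# wavelength step cancels, so simple sums suffice.
rho_n = np.sum(t * band_response) / np.sum(band_response)

# The corrected composite reference would divide each pixel of band n
# by rho_n: W_c(i, j, n) = W'_c(i, j, n) / rho_n.
```

Because the correction is a ratio of integrals weighted by the band response, an overall scaling of the band response leaves ρ_n^a unchanged.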
The reflectivity correction factors per band can be calculated 112 separately to the main imaging procedure. In some embodiments, the reflectivity correction factors are calculated 112 a priori, that is, prior to capturing 103 the reference images. In other embodiments, the reference images are captured first 103, and then the reflectivity correction factors are calculated 112.
In some embodiments, the reflectance spectrum of the reference object is measured 111 using a calibrated measurement device such as a spectrophotometer. The reflectivity correction factors may then be calculated from the reflectance spectrum measured by the calibrated measurement device.
Alternatively, the reference object can be imaged with the HSI imaging system and compared with an HSI image of a calibration object with a known reflectance spectrum. One example of a calibration object is a Spectralon 95% reflective tile. Spectralon is a material which acts as a highly reflective Lambertian surface using a specific formulation, but it will be understood that other reflectance standards are also available and could be used. In some embodiments, the calibration object has a uniform reflectance spectrum across its whole surface. In some embodiments, the calibration object is positioned relative to the HSI imaging system such that the whole field of view of the HSI imaging system is covered by the calibration object.
The reflectivity correction factors may be calculated 112 by taking a reflectivity profile (i.e. a well calibrated reflectance spectrum t(2)) of the calibration object and using equation (1) with the band responses bn(A) of the HSI imaging system. In other examples, the reflectivity correction factors are calculated by using the HSI imaging system to capture HSI images of both the calibration object and of a known reference (such as a Spectralon) and dividing the former by the latter.
In some embodiments, the reflectance spectrum of the reference object and/or calibration object are measured under known lighting conditions, that is, when they are lit by a light source with a known spectral profile. In some embodiments, a dark balancing step is performed to account for any bias in sensor noise. A dark reference may be obtained by covering the lens with an opaque object (such as the lens cap), such that no light is transmitted into the sensor. One or more dark images are then captured in this configuration. The dark image (or an average of the dark images) is then subtracted from the reference and/or composite reference images to account for the sensor noise bias.
In some embodiments, a mathematical model may be used to estimate a synthetic dark image based on exposure time, temperature of the sensor or any other known parameter of influence.
The images may also be corrected for exposure time using methods known in the art.
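A minimal sketch of the dark balancing and exposure normalisation steps is given below; the frame dimensions, dark level, and exposure times are assumed values for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ten hypothetical dark frames (lens covered), each I x J x N, with a
# small positive sensor-noise bias around 5 counts.
dark_images = rng.normal(5.0, 0.5, size=(10, 8, 8, 4))
dark_reference = dark_images.mean(axis=0)          # averaged dark image

# A hypothetical reference image and the (assumed) exposure times in ms.
reference_image = rng.random((8, 8, 4)) * 100.0 + 5.0
exposure_ref, exposure_dark = 20.0, 20.0

# Subtract the exposure-normalised dark bias from the exposure-normalised
# reference image, mirroring the I/tau_I - D/tau_D form used later.
corrected = reference_image / exposure_ref - dark_reference / exposure_dark
```

Normalising each term by its own exposure time allows dark frames captured at a different exposure to be reused.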
A synthetic reference can then be constructed 108. In some embodiments, the synthetic reference is simply the composite reference or the reflectivity-corrected 105 composite reference. However, the composite reference may still retain imperfections from the input video. It may also be incomplete, in the sense that correction factors may not have been calculated for all spatiospectral locations in the field of view. These imperfections may be removed by applying a white reference mathematical model fit to the composite reference. The white reference model is applied to the composite reference to construct 108 a synthetic reference.
In some embodiments, the white reference is modelled as the separable product of a spatially dependent vignetting, spectrally dependent spectral sensitivities, and a scalar multiplicative factor with noise treated as negligible. That is, the white reference model is a product of a spectral sensitivity function and a spatial vignetting function. The white reference model may be modelled by the formula :
W(i,j,n) = M S(n) V(i,j) + N(i,j,n) (2)

where i and j are the spatial co-ordinates in the images, n is the spectral co-ordinate (i.e. the wavelength band index), V(i,j) is the spatial vignetting function, S(n) is the spectral sensitivity function, and M is a scalar factor to account for the overall light intensity. A noise function N(i,j,n) accounts for noise in both the spatial and spectral domains.
The spectral sensitivity function may be calculated 106 by calculating a spatial average of the reflectivity-corrected 105 composite reference for at least one spectral band of the composite reference. Alternatively, any other standard reference can be used instead of the reflectivity-corrected composite reference 105. A normalisation step may be applied to ensure that the spectral sensitivities sum up to one. As detailed earlier, if a mosaic HSI sensor is used, a demosaicking algorithm may be applied before calculating 106 the spatial average(s).
The spatial vignetting function may be calculated 107 by fitting a two-dimensional parametric function to a spectral average of the reflectivity-corrected composite reference. In some embodiments, a weighted spectral average may be used, for example by compensating for spectral sensitivities. In some embodiments, a single spectral band may be used for the fit. In some embodiments, the two-dimensional parametric function is an isotropic Gaussian function. In some embodiments, the Gaussian function is represented by the equation :
V(i,j) = exp(−((i − μ_i)² + (j − μ_j)²) / (2σ²))

where μ_i and μ_j represent the co-ordinates of the centre of the Gaussian in the i and j directions respectively and σ represents the standard deviation.
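The fit of this isotropic Gaussian to a spectral average of the composite reference may be sketched as follows; the synthetic "observed" vignetting stands in for real data, and the parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_vignetting(coords, mu_i, mu_j, sigma):
    """Isotropic Gaussian vignetting model V(i, j)."""
    i, j = coords
    return np.exp(-((i - mu_i) ** 2 + (j - mu_j) ** 2) / (2.0 * sigma ** 2))

# Pixel coordinate grid for a hypothetical 32 x 32 spectral average.
I, J = 32, 32
ii, jj = np.meshgrid(np.arange(I), np.arange(J), indexing="ij")
coords = np.stack([ii.ravel(), jj.ravel()])

# Noise-free synthetic vignetting with known centre (15, 17) and sigma 10.
observed = gaussian_vignetting(coords, 15.0, 17.0, 10.0)

# Least-squares fit of (mu_i, mu_j, sigma) from an initial guess.
params, _ = curve_fit(gaussian_vignetting, coords, observed, p0=(16.0, 16.0, 8.0))
```

On real data the spectral average would replace the synthetic `observed` array, with the fit otherwise unchanged.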
To achieve a well-posed decomposition, in some embodiments the constraints max_{i,j} V(i,j) = 1 and Σ_n S(n) = 1 are applied. These constraints ensure that: spatial vignetting only accounts for variations in light intensity and optical transmission across the image; spectral sensitivities only account for the relative efficiency of the different bands on the sensor and the spectral shape of the light source; and the scalar factor only accounts for electronic gain, exposure, and overall light intensity settings. Applying these constraints to the white reference model given above, and treating the noise term as negligible, the following equations for the spatial vignetting and spectral sensitivity functions can be obtained:

V(i,j) = Σ_n W(i,j,n) / max_{i,j} Σ_n W(i,j,n)

S(n) = Σ_{i,j} W(i,j,n) / Σ_{n′} Σ_{i,j} W(i,j,n′)
In some embodiments, the white reference model is fitted to the composite reference using a joint least squares approach. In other examples a different normalisation choice could be used.
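The constrained decomposition can be illustrated with simple averages in place of the joint least-squares fit; the ground-truth vignetting, sensitivities, and scalar factor below are hypothetical, and a noise-free white reference is assumed.

```python
import numpy as np

# Hypothetical ground truth: W(i,j,n) = M * S(n) * V(i,j), noise-free.
I, J, N = 16, 16, 8
V_true = np.exp(-((np.arange(I)[:, None] - 8) ** 2
                  + (np.arange(J)[None, :] - 8) ** 2) / 50.0)   # max V = 1 at (8, 8)
S_true = 1.0 + 0.2 * np.sin(np.arange(N))
S_true /= S_true.sum()                                          # sum-to-one constraint
M_true = 3.0
W = M_true * S_true[None, None, :] * V_true[:, :, None]

# Vignetting: band-summed image, normalised so its maximum is one.
spectral_mean = W.mean(axis=2)
V_est = spectral_mean / spectral_mean.max()

# Spectral sensitivity: pixel-averaged spectrum, normalised to sum to one.
spatial_mean = W.mean(axis=(0, 1))
S_est = spatial_mean / spatial_mean.sum()
```

With the separable model and these two normalisations, the averages recover V and S exactly in the noise-free case.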
In some embodiments, the HSI imaging system includes a circular field of view, as may result from coupling an exoscope or laparoscope to a rectangular HSI imaging sensor. Thus, the content area and field of view are circular while the sensor area is rectangular. This may be accounted for when constructing 108 the synthetic reference. The field of view is automatically detected 113 by a segmentation method for content area detection or may be known beforehand through a calibration step. In the embodiment, a content area disk is detected 113 from a single image of the reference images. This may be performed using any method known in the art, from simple methods such as thresholding to more advanced approaches such as a pre-trained neural network. To discount edge effects at the border of the content area, the radius of the content area disk is reduced 114 to a percentage of its original radius, such as a proportion between 90% and 98%. In some examples, the radius of the content area disk is reduced 114 to less than 90% of its original radius. The reduced-radius disk is then used to mask the content area as a step in constructing 108 the synthetic white reference. In other examples, the radius of the content area disk is not reduced, thereby utilizing all of the information from the content area disk.
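The reduced-radius masking of a circular content area may be sketched as follows; the disk centre and radius are assumed to have been detected already and are hypothetical values.

```python
import numpy as np

# Hypothetical detected content area disk on a 64 x 64 sensor.
I, J = 64, 64
centre_i, centre_j, radius = 32.0, 32.0, 28.0
shrink = 0.95                                  # within the 90%-98% range

# Boolean mask: True inside the disk after reducing its radius to
# discount edge effects at the content area border.
ii, jj = np.meshgrid(np.arange(I), np.arange(J), indexing="ij")
dist = np.sqrt((ii - centre_i) ** 2 + (jj - centre_j) ** 2)
mask = dist <= radius * shrink
```

Pixels outside the mask would simply be excluded when constructing the synthetic white reference.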
Figure 2 shows an unfiltered intensity profile 201. Note that since in this example the time interval between image frames is consistent, the frame number can be used as a substitute for the time on the x-axis. In some embodiments, the intensity profile is pre-processed before the segment is identified 1044, to remove noise and allow the reference and background intensities to be distinguished from each other more easily. The pre-processing may comprise a smoothing filter and/or a threshold filter. In the embodiment shown, the filtered temporal profile 202 is generated using a smoothing filter and a threshold filter, wherein the threshold is set well below the average reference intensity and well above the average background intensity. The threshold filter discards all intensity values whose intensities lie below a lower threshold. This suppresses the impact of background intensity values. In some embodiments, the lower threshold is calculated using the classical parameter-free Otsu method. In some embodiments, the threshold filter also discards all intensity values with an intensity above an upper threshold. This mitigates the impact of specular reflections, which can saturate the HSI imaging sensor. In some embodiments, the smoothing filter is a Savitzky-Golay filter, with window size 15 and order 2. Note that in some embodiments, the reference object may pass multiple times through the location. As such, multiple segments 206 can be identified 1044.
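By way of illustration only, the smoothing and lower-threshold steps may be sketched with SciPy's Savitzky-Golay filter and a hand-rolled Otsu threshold; the temporal profile below is synthetic, with a hypothetical background level around 10 and reference level around 100.

```python
import numpy as np
from scipy.signal import savgol_filter

def otsu_threshold(values, bins=64):
    """Parameter-free Otsu threshold separating two intensity populations."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centres = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_var = centres[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        m0 = (p[:k] * centres[:k]).sum() / w0
        m1 = (p[k:] * centres[k:]).sum() / w1
        between = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if between > best_var:
            best_var, best_t = between, centres[k]
    return best_t

# Synthetic noisy profile: background ~10, reference plateau ~100.
rng = np.random.default_rng(3)
profile = np.concatenate([np.full(40, 10.0), np.full(30, 100.0), np.full(40, 10.0)])
profile = profile + rng.normal(0.0, 2.0, profile.size)

# Savitzky-Golay smoothing with window 15 and order 2, as in the text.
smoothed = savgol_filter(profile, window_length=15, polyorder=2)

# Lower threshold via Otsu; background-level values are discarded.
lower = otsu_threshold(smoothed)
kept = smoothed[smoothed > lower]
```

An upper threshold against specular saturation would be applied analogously, discarding values above it.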
In some embodiments, a segment 206 is identified 1044 by calculating a temporal gradient of the intensity profile, that is, the time derivative of the intensity profile. In the embodiment shown, the temporal gradient 203 is the time derivative of the filtered intensity profile 202. As the reference object enters or leaves the location (i.e. comes into and out of view) the intensity in one or more wavelengths will change quickly over time. Specifically, if the reference object has a higher reflectance intensity than the background (as shown in Figure 2), the temporal gradient 203 will have a large positive peak 203p when the reference object comes into view, and a large negative peak 203n when the reference object goes out of view. A positive peak 203p followed by a negative peak 203n, which can be detected using a peak picking algorithm, identify a segment 206. In some embodiments, the positive and negative peak are only identified if they each have an absolute value greater than a threshold value. This ensures that small peaks in the temporal gradient (e.g. a gradual change in intensity caused by shadowing on the reference object) are not mis-identified as the start or end of a segment 206.
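The temporal-gradient approach may be sketched as follows on an idealised filtered profile; the threshold value and profile levels are hypothetical, and for simplicity only a single pass of the reference object is assumed.

```python
import numpy as np

# Idealised filtered profile: background 10, reference object visible
# from frame 40 to frame 69 at intensity 100.
profile = np.concatenate([np.full(40, 10.0), np.full(30, 100.0), np.full(40, 10.0)])

# Temporal gradient: large positive peak when the object enters the
# pixel location, large negative peak when it leaves.
gradient = np.gradient(profile)

threshold = 20.0                      # minimum absolute peak height (assumed)
start = int(np.argmax(gradient))      # positive peak: object enters
end = int(np.argmin(gradient))        # negative peak: object leaves

# Frames strictly between the two peaks form the segment.
segment = profile[start + 1:end]
```

With multiple passes of the reference object, a peak-picking routine would pair each positive peak with the following negative peak to yield multiple segments.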
In some embodiments, a segment 206 is identified 1044 by identifying a portion of the intensity profile in which the intensity is between an upper 207a and a lower 207b reference object threshold. The upper 207a and lower 207b reference object thresholds may be of higher intensity than the background threshold 208. The datapoints 205 with intensities between the upper 207a and lower 207b reference object thresholds are selected for the segment 206. The upper reference object threshold 207a may be of lower intensity than the expected or measured saturation/specular reflection intensity from the environment. This ensures that specular reflections from the environment are not mistaken for the reference object.
The highlighted datapoints 205 comprise image data in the segments 206. An average of the datapoints 205, such as a median, M-estimator-based average, or any other form of robust averaging (that is, any averaging method not easily swayed by outliers), can be utilised to give an intensity for the reference object at that location. In some embodiments, the segments 206 are located from the filtered (e.g. by smoothing and thresholding) intensity profile, but the image data for the segments 206 is taken from the unfiltered raw intensity profile. In the examples discussed above, a temporal intensity profile is generated for a plurality of pixel locations, and then the temporal location of the reference object is identified for each intensity profile. That is, the time at which a reference object passes through a particular pixel location is identified. In other examples, a segment 206 is identified 1044 by applying an image segmentation method for each image in the plurality of images to locate the reference object in the image. In other words, rather than finding the time a reference object passes through a location, the location of the reference object is found for a particular time. The pixel locations are selected if the reference object is detected within them using the image segmentation method, and the corresponding image data for those locations at the timestamp of the image is selected for the segments 206. In some embodiments, a method known in the art, such as one based on deep learning, is used to spatially segment the area corresponding to the reference object in the reference images. In some embodiments, the segmented spatial areas may be reduced to a percentage of their original size so as to discount edge effects.
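The threshold-window selection and robust averaging may be sketched as follows; the threshold values and raw profile are hypothetical, and the median serves as the robust average that is not easily swayed by outliers such as specular spikes.

```python
import numpy as np

# Hypothetical raw profile: background ~10, reference plateau ~100,
# with one saturated specular spike injected at frame 55.
raw = np.concatenate([np.full(40, 10.0), np.full(30, 100.0), np.full(40, 10.0)])
raw[55] = 255.0

# Assumed reference object thresholds: above background, below saturation.
lower_ref, upper_ref = 60.0, 150.0
selected = raw[(raw > lower_ref) & (raw < upper_ref)]

# Robust average of the selected datapoints for this pixel location.
reference_intensity = float(np.median(selected))
```

The spike at 255 falls above the upper threshold and is excluded, so the median reports the plateau intensity undisturbed.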
The synthetic white reference may be used to apply 109 pixel calibration on the images of the target object. Pixel calibration may be performed according to the following formula:
R(i,j,n) = ρ_n (I(i,j,n)/τ_I − D(i,j,n)/τ_D) / (W(i,j,n)/τ_W − D(i,j,n)/τ_D)

where I is the captured image of the target object, W is the synthetic white reference in any of the disclosed embodiments (e.g. composite white reference or white reference model), D is a dark image, and R is the white-balanced reflectance estimation of the target object of interest. τ_I, τ_W, and τ_D describe the exposure time used for acquisition of each of the above, and ρ_n is the bandwise reflectivity of the white reference, used when the white reference has not been pre-emptively compensated for it.
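A numerical sketch of this calibration step is given below; every array and exposure time is a hypothetical placeholder, and the formula follows the reconstructed dark-corrected, exposure-normalised ratio above.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical inputs: target image I, synthetic white reference W,
# dark image D, their exposure times, and bandwise reflectivities rho_n.
I_img = rng.random((8, 8, 4)) * 50.0 + 10.0
W = np.full((8, 8, 4), 200.0)
D = np.full((8, 8, 4), 5.0)
tau_I, tau_W, tau_D = 10.0, 20.0, 20.0
rho_n = np.array([0.93, 0.94, 0.95, 0.94])

# White-balanced reflectance estimate: dark-corrected, exposure-normalised
# target divided by dark-corrected, exposure-normalised white reference,
# scaled by the bandwise reflectivity of the white reference.
R = rho_n[None, None, :] * (I_img / tau_I - D / tau_D) / (W / tau_W - D / tau_D)
```

When the white reference has already been reflectivity-compensated, the ρ_n factor is simply omitted (set to one).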
In some embodiments, relative pixel calibration may be sufficient. In such instances, normalization is performed only based on the estimated spectral sensitivity function with no influence from the vignetting function.
In some examples, the composite reference is generated for only a partial portion of the field of view. For example, the reference object may only pass through a subset of the field of view, and the composite reference may be generated only for that subset. Computation of a synthetic white reference may be performed by fitting the white reference model to the composite reference, and then applying the white reference model to the whole field of view by means of extrapolation.
Figure 3 shows a sterile ruler 300, which may be used as a reference object in some embodiments. Sterile rulers are widely available in surgical theatres and can be placed on tissue in the surgical field. Furthermore, it has been found that the spectral profile of mass-produced sterile rulers is generally consistent across individual units. Thus, in some embodiments the spectral profile of a first sterile ruler can be measured as previously described. A second sterile ruler of the same design can be used as the reference object. This ensures that the second sterile ruler can remain in a sealed sterile container until just before insertion into a patient.
In some embodiments, the unmarked side 300b of the sterile ruler 300 is faced towards the lens, rather than the marked side 300a. The unmarked side 300b of many mass- produced sterile rulers has a more consistent spectral profile, because it has a more matte aspect than the marked side across its whole surface.
In some embodiments, the reflectivity of one or more reference objects are stored onboard the HSI system. An operator then selects the reflectivity corresponding with the reference object provided. For example, the reflectivity of various designs of commercially available sterile rulers may be measured. The user then selects the reflectivity corresponding to the particular design of sterile ruler which is provided. In some embodiments, the reflectivity of one reference object may be used to calculate 112 the reflectivity correction factors for a different reference object. This is because the same reflectance spectrum can generally be used for any reference object made of a particular material, so long as the geometry of the reference object is not so extreme as to distort the reflectance spectrum of the object.
Referring to Figure 4, the HSI imaging system 400 comprises an exoscope or laparoscope 401, operating as a lens located at a distal end of the HSI system. The exoscope or laparoscope is optically coupled via an adapter 404 to an HSI imaging sensor 402. The HSI imaging sensor 402 is connected to a processor 405, either directly through a cable 403 as shown, or through a wireless communications interface. The processor is configured to generate images for a display 409. The HSI imaging system 400 is configured to capture images within an environment 407. A reference object 408 is provided within the environment, so that a pixel calibration method as previously described can be carried out. The reference object 408 is moved relative to the field of view 410 of the HSI imaging system 400. The exoscope or laparoscope 401 is optically connected to a light source 406, providing illumination within the environment 407.
There are three broad categories of HSI image sensors 402: spatial scan including linescan, spectral scan, and snapshot. Linescan cameras acquire data for all wavelengths simultaneously for each line of pixels sequentially, whereas spectral scan cameras collect all pixels at each wavelength in turn using band pass filters. These methods both measure complete hypercubes. These HSI approaches are however not able to provide real-time data as their respective scanning mechanisms result in long acquisition times that are prone to motion artefacts. In contrast, snapshot mosaic cameras utilize sensors where each pixel has a dedicated band pass filter. This provides information on one band per pixel but captures all pixels in a single shot. This allows highly time-resolved data to be obtained, at the cost of lower spatial and spectral resolution. To obtain a full hypercube from a snapshot mosaic camera, the data must be demosaicked, with the remaining information inferred using classical or learning-based interpolation. The increased temporal resolution given by effective snapshot mosaic image demosaicking provides the opportunity for more accurate image analysis.
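A non-limiting sketch of a snapshot mosaic layout and a naive demosaicking step is given below; the 2x2 mosaic pattern and nearest-neighbour fill are illustrative simplifications of the classical or learning-based interpolation mentioned above.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical snapshot mosaic sensor: one measured value per pixel,
# with a repeating 2x2 pattern assigning one of four bands to each pixel.
H, width, B = 8, 8, 4
raw = rng.random((H, width))
band_index = (np.arange(H)[:, None] % 2) * 2 + (np.arange(width)[None, :] % 2)

# Naive demosaicking: for each band, take its 2x2-subsampled measurements
# and copy them to every pixel of the corresponding macro-pixel block
# (nearest-neighbour interpolation).
hypercube = np.zeros((H, width, B))
for b in range(B):
    sub = raw[band_index == b].reshape(H // 2, width // 2)
    hypercube[:, :, b] = np.repeat(np.repeat(sub, 2, axis=0), 2, axis=1)
```

Each pixel's measured band is preserved exactly, while the other bands are filled from the nearest pixel of that band.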
Referring to Figure 5, a reference image 500 has a field of view 501 defined by a circular lens and exoscope or laparoscope. The reference object 502 partially covers the field of view 501, in the sense that the reference object covers a partial portion of the field of view. The rest of the field of view 501 is taken up by the background environment 503. The reference object 502 is more reflective than the environment 503 across the visual spectrum, meaning that the reference object 502 appears generally whiter and brighter to the HSI imaging system than the environment 503 (excluding the specular reflections 504 from the background environment).
Figure 6 shows a computer readable medium 600 comprising computer program code configured to cause a controller comprising a processor and a memory to operate as described herein. It will be understood that in some embodiments, processing operations which have been described as being carried out sequentially may be carried out in parallel. Other optimisations for computational efficiency are also envisioned.

Claims

1. A method for calibrating pixel sensitivity for pixels of a hyperspectral image sensor, wherein the hyperspectral image sensor is configured to capture image data, the image data including spectral intensities measured within a plurality of spectral bands, and wherein the method comprises the steps of: providing a reference object; moving the reference object and hyperspectral image sensor relative to each other; capturing a plurality of reference images within a field of view of the hyperspectral image sensor, such that in each reference image the reference object covers a different partial portion of the field of view; selecting a plurality of pixel locations from the field of view; generating a composite reference; and calibrating the pixel sensitivity of the pixels of the hyperspectral image sensor using the composite reference; wherein generating the composite reference comprises, for each of the plurality of pixel locations: selecting, from each image in the plurality of images, image data corresponding to the pixel location; generating, from the selected image data, an intensity profile for the pixel location; and identifying within the intensity profile a segment corresponding to the reference object being observed at the pixel location; and combining the selected segments.
2. The method of claim 1, wherein a segment for a selected pixel location comprises a portion of the intensity profile in which an intensity value is between an upper and lower reference object threshold, the upper and lower reference object threshold being higher intensity than a background value.
3. The method of claim 2, wherein the upper reference object threshold is lower than a saturation or specular reflection value.
4. The method of any preceding claim, wherein generating a segment for a selected pixel location comprises: calculating a derivative profile of the intensity profile; and identifying a positive peak and a negative peak in the derivative profile, the positive and negative peak each having an absolute value greater than a derivative threshold, and the segment comprises a period defined between the positive and negative peaks.
5. The method of any preceding claim, wherein generating a segment for a selected pixel location comprises: applying an image segmentation method for each image in the plurality of images to locate the reference object in the image; and considering an image data point at the selected pixel location to be part of the segment if it falls within the detected reference object.
6. The method of any preceding claim, wherein identifying a segment further comprises calculating an average intensity of the intensity profile within the reference object observation period, and preferably wherein the average is a median, an M-estimator-based average, or any other form of robust averaging.
7. The method of any preceding claim, wherein the method further comprises correcting the generated composite reference using a reflectivity for the reference object, the reflectivity correction comprising a correction factor corresponding to at least one spectral band of the composite reference.
8. The method of any preceding claim, wherein combining the selected segments further comprises the steps of: generating a parametric white reference model; and applying the white reference model to the composite reference to generate a synthetic reference.
9. The method of claim 8, wherein the synthetic reference is generated by extrapolating the white reference model from the composite reference so as to cover substantially all pixel locations in the field of view.
10. The method of claims 8 or 9, wherein the white reference model comprises a product of a spectral sensitivity function and a spatial vignetting function.
11. The method of claim 10, wherein the spectral sensitivity function is parametrically modelled as a scalar value per spectral band and calculating the spectral sensitivity function includes calculating a spatial average of the composite reference for at least one spectral band of the composite reference.
12. The method of claim 10 or 11, wherein calculating the spatial vignetting function includes fitting a two-dimensional parametric function to a spectral average of composite reference.
13. The method of claim 12, wherein the two-dimensional parametric function is an isotropic Gaussian function.
14. The method of any preceding claim, wherein the method further comprises preprocessing each of the intensity profiles using a threshold or smoothing filter.
15. The method of any preceding claim, wherein calibrating the sensitivity of the pixels composing the hyperspectral image sensor is performed by using only the calculated spectral sensitivity function.
16. The method of any preceding claim, wherein the pixel locations selected from the field of view, and from which segments are selected to generate the composite reference, collectively cover between 10% and 100% of the total field of view.
17. The method of any preceding claim, wherein the field of view is automatically determined by a segmentation method for content area detection.
18. The method of any preceding claim, wherein the reference object has a reproducible reflectivity.
19. The method of any preceding claim, wherein the reference object is white.
20. The method of any preceding claim, wherein the reference object has a matte reflectance spectrum.
21. The method of any preceding claim, wherein calibrating pixel sensitivity involves calibrating the white balance of the pixels.
22. The method of any preceding claim, wherein each pixel of the hyperspectral image sensor is configured for measuring photometric intensity corresponding to a single spectral band.
23. An apparatus for calibrating an HSI image sensor using a method according to any preceding claim.
24. The apparatus of claim 23, wherein the HSI image sensor is configured to capture images in a snapshot mode.
25. The apparatus of claim 23 or 24, wherein pixels of the HSI image sensor are configured to capture images such that spectral bands are arranged in a mosaic pattern.
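The parametric white reference model recited in claims 8-13 can be sketched in code. The following is an illustrative Python sketch only, not the applicant's implementation: the function name `white_reference_model`, the `(H, W, bands)` array layout, and the use of NaN to mark pixel locations not covered by any selected segment are all assumptions made for this example. It computes the spectral sensitivity as a scalar per band via a spatial average (claim 11), fits an isotropic Gaussian vignetting function to the spectral average (claims 12-13), and evaluates their product at every pixel location to produce the synthetic reference of claim 9.

```python
import numpy as np

def white_reference_model(composite):
    """Estimate W(x, y, b) = S(b) * V(x, y) from a composite white reference.

    composite: (H, W, B) array; NaN marks pixel locations not covered by
    any selected segment. Returns (S, V, synthetic), where synthetic is
    the model evaluated over the full field of view.
    """
    # Spectral sensitivity: one scalar per band, taken as the spatial
    # average of the composite reference in that band (claim 11).
    S = np.nanmean(composite, axis=(0, 1))

    # Spectral average per pixel, normalised by S so that strong bands
    # do not dominate the vignetting fit.
    spatial = np.nanmean(composite / S, axis=2)

    # Isotropic Gaussian vignetting V = A * exp(-r^2 / (2 * sigma^2))
    # (claims 12-13), fitted in log space: log V is linear in the
    # features [1, x, y, x^2 + y^2], so ordinary least squares suffices.
    H, W = spatial.shape
    yy, xx = np.mgrid[0:H, 0:W]
    mask = np.isfinite(spatial) & (spatial > 0)
    X = np.stack([np.ones(mask.sum()), xx[mask], yy[mask],
                  xx[mask] ** 2 + yy[mask] ** 2], axis=1)
    c0, c1, c2, c3 = np.linalg.lstsq(X, np.log(spatial[mask]), rcond=None)[0]
    sigma2 = -1.0 / (2.0 * c3)
    x0, y0 = c1 * sigma2, c2 * sigma2
    A = np.exp(c0 + (x0 ** 2 + y0 ** 2) / (2.0 * sigma2))
    V = A * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2.0 * sigma2))

    # Synthetic reference: the fitted model extrapolated to substantially
    # all pixel locations in the field of view (claim 9).
    return S, V, V[:, :, None] * S[None, None, :]
```

Fitting the Gaussian in log space turns the problem into a small linear least-squares solve, which avoids the need for an iterative non-linear optimiser; with noisy data an iterative fit on the original intensities would be a natural alternative.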
PCT/GB2024/051534 2023-06-16 2024-06-14 A method and apparatus for calibrating pixel sensitivity for pixels of a hyperspectral image sensor Pending WO2024256838A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2309106.9 2023-06-16
GB2309106.9A GB2631097B (en) 2023-06-16 2023-06-16 Imaging

Publications (1)

Publication Number Publication Date
WO2024256838A1 true WO2024256838A1 (en) 2024-12-19

Family

ID=91670441

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2024/051534 Pending WO2024256838A1 (en) 2023-06-16 2024-06-14 A method and apparatus for calibrating pixel sensitivity for pixels of a hyperspectral image sensor

Country Status (2)

Country Link
GB (1) GB2631097B (en)
WO (1) WO2024256838A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120274652A (en) * 2025-06-03 2025-07-08 杭州高谱成像技术有限公司 A film thickness measurement method and system based on line scanning hyperspectral camera

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018085841A1 (en) * 2016-11-07 2018-05-11 BioSensing Systems, LLC Calibration method and apparatus for active pixel hyperspectral sensors and cameras
WO2021245374A1 (en) * 2020-06-03 2021-12-09 King's College London Method and system for joint demosaicking and spectral signature estimation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
VANDEBRIEL ROELAND ET AL: "Integrating hyperspectral imaging in an existing intra-operative environment for detection of intrinsic brain tumors", PROGRESS IN BIOMEDICAL OPTICS AND IMAGING, SPIE - INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING, BELLINGHAM, WA, US, vol. 12368, 6 March 2023 (2023-03-06), pages 1 - 10, XP060172459, ISSN: 1605-7422, ISBN: 978-1-5106-0027-0, DOI: 10.1117/12.2647690 *

Also Published As

Publication number Publication date
GB2631097A (en) 2024-12-25
GB2631097B (en) 2025-09-17

Similar Documents

Publication Publication Date Title
AU2023222877B2 (en) System and method for camera calibration
US7233693B2 (en) Methods and systems for computer analysis of skin image
US8155413B2 (en) Method and system for analyzing skin conditions using digital images
US8027533B2 (en) Method of automated image color calibration
AU2014293317A1 (en) Optical detection of skin disease
WO2021229984A1 (en) Image processing device, image processing method, and program
EP3563189B1 (en) System and method for 3d reconstruction
WO2024256838A1 (en) A method and apparatus for calibrating pixel sensitivity for pixels of a hyperspectral image sensor
WO2020090348A1 (en) Coefficient determination device, pigment concentration calculation device, coefficient determination method, and information processing program
WO2010066951A1 (en) Method and device for imaging a target
KR102864414B1 (en) Device and method for skin burn degree analysis by use of hyperspectral imaging
US20160210746A1 (en) Organ imaging device
JP7455716B2 (en) Endoscope processor and endoscope system
JP2020536221A (en) Equipment and methods for determining surface topology and associated colors
JP2006030014A (en) Color estimation system and color estimation method
US20240050026A1 (en) Multi-function device and a multi-function system for ergonomically and remotely monitoring a medical or a cosmetic skin condition
WO2024088858A1 (en) Improved white balance
Gomez et al. Collecting highly reproducible images to support dermatological medical diagnosis
Zenteno et al. Spatial and Spectral Calibration of a Multispectral-Augmented Endoscopic Prototype
Maglogiannis et al. A digital image acquisition system for skin lesions
Kamimura et al. Evaluation and analysis for spectral reflectance imaging of human skin
Benezeth et al. Spatial and Spectral Calibration of a Multispectral-Augmented Endoscopic Prototype

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24736506

Country of ref document: EP

Kind code of ref document: A1