US12307636B2 - Specular reflection reduction in endoscope visualization - Google Patents
- Publication number
- US12307636B2 (U.S. application Ser. No. 17/731,127)
- Authority
- US
- United States
- Prior art keywords
- image
- level
- pyramid
- specular reflection
- region
- Prior art date
- Legal status
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/77—Retouching; Inpainting; Scratch removal
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
Definitions
- Endoscopes are routinely used to illuminate and visualize internal anatomy in a wide range of diagnostic and surgical procedures, such as laparoscopy. Endoscopes typically serve both as an illumination source as well as an optical fiber camera, collecting light reflected off the anatomical scene. As a result of collocating the illumination source and the detector surface, specular reflections from anatomical and instrument surfaces are common. Such reflections are distracting, adversely affect the surgeon, and may impair the quality of care.
- FIG. 1 is a schematic diagram illustrating an example endoscopy system to image an inner space of a biological subject, in accordance with an embodiment of the disclosure.
- FIG. 2 is a schematic diagram illustrating an example procedure for detecting regions of specular reflection in an endoscope image, in accordance with an embodiment of the disclosure.
- FIG. 3 is a schematic diagram illustrating an example procedure for generating an image pyramid, in accordance with an embodiment of the disclosure.
- FIG. 4 is a schematic diagram illustrating an example approach for weighted extrapolation of image intensity information, in accordance with an embodiment of the disclosure.
- FIG. 5A is an example graph illustrating cumulative weights of levels in an image pyramid having “S” levels, in accordance with an embodiment of the disclosure.
- FIG. 5B is an example graph illustrating weights of levels in an image pyramid having “S” levels, in accordance with an embodiment of the disclosure.
- FIG. 5C illustrates an example reconstructed image exhibiting reduced specular reflection, in accordance with an embodiment of the disclosure.
- FIG. 6 is a block diagram illustrating an example process for reducing specular reflection in endoscopy images, in accordance with an embodiment of the disclosure.
- FIG. 7 is a block diagram illustrating an example process for reconstructing an output image on an image-wise basis, in accordance with an embodiment of the disclosure.
- FIG. 8 is a block diagram illustrating an example process for reconstructing an output image on a pixel-wise basis, in accordance with an embodiment of the disclosure.
- Embodiments of a system and method for specular reflection reduction in endoscope visualizations are described herein.
- In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments.
- One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc.
- In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
- Specular reflection is mirror-like reflection of electromagnetic radiation, such as visible light, by a reflecting surface. Specular reflection represents one portion of the light reflected from the reflecting surface, the other portion being diffuse reflection. Specular reflection generally occurs for a broad range of energies, for example, where the reflecting surface includes a population of electrons in a conduction band or otherwise reflects light that reproduces the energy of incident radiation. Importantly, specularly reflected light typically carries information about external sources, rather than describing the reflecting surface itself, as most familiarly seen in visible-light mirrors for cosmetic use. In an illustrative example, specular reflection from a surface can create a virtual image of the endoscope illumination source, appearing as a bright spot inside a body cavity. Such external information can cause confusion as to the nature of the reflecting surface and can impair diagnostic procedures, such as colonoscopy.
- For example, an endoscope used to guide a surgeon during laparoscopic surgery on an internal organ typically illuminates the inner space of a body cavity.
- The endoscope includes an optical fiber probe coupled with a camera that gives the surgeon a visible image of the body cavity. Wet surfaces of the body cavity can generate specular reflections of broad-spectrum white light from the endoscope that blind the camera and obscure portions of the body cavity from view.
- In an embodiment, an endoscopy system includes components that detect and replace specular reflections in the camera field of view with extrapolated intensity information.
- Extrapolated intensity information generally describes an estimate of image data that is obscured or absent from an image due to specular reflection.
- Extrapolation includes techniques for separating higher frequency components from lower frequency components, through defining an image pyramid that includes multiple images at multiple scales. Subsampling and filtering images reduces the prominence of higher frequency components and emphasizes lower frequency components. Examples of lower frequency components include, but are not limited to, background diffuse reflection from surfaces that is descriptive of the material (e.g., biological tissues). Higher frequency components, by contrast, include localized features, such as edges, textures, and/or fine structures (e.g., vascularization).
- Operations for detection, extrapolation, and replacement may be applied to a video stream generated by the endoscope camera in real time or near-real time with a limited latency, such that multiple frames of the video stream are processed per second.
- As a result, the images viewed by the surgeon may present fewer and/or reduced specular reflections, thereby improving the surgeon's ability to perform internal surgery and to conduct diagnostic assessments in the body cavity.
- Replacing specular reflection with extrapolated information may improve diagnostic and surgical outcomes, for example, by permitting a user to view and interact with internal structures free of specular reflection that may dazzle the user or saturate the camera.
- Specular reflection can also describe interactions across diverse ranges of the electromagnetic spectrum.
- In some embodiments, ultraviolet photons or infrared photons are used, for example, when imaging in low-light conditions or where the illumination source is intended to be invisible to humans.
- Such infrared or ultraviolet systems may provide improved image resolution and clarity by reducing the appearance of specular reflections, which cause bloom in images, can saturate an image sensor, or can obscure regions of interest in an image.
- Similarly, specular reflection reduction may be applied to systems for remote controlled visualization, such as mobile camera platforms configured to explore covered conduits or other low-light areas (e.g., sewer, electricity, or telecommunications conduits).
- FIG. 1 is a schematic diagram illustrating an example endoscopy system 100 for imaging an inner space 101 of a biological subject 105 , in accordance with an embodiment of the disclosure.
- Endoscopy system 100 includes an endoscope 110 , an image processing module 120 , and a computer system 130 .
- Inner space 101 includes organs 107 and surgical tools 109 with wet or reflective surfaces giving rise to specular reflection.
- Endoscope 110 includes an illumination source 111 , a camera 113 , and a gas supply 115 .
- Image processing module 120 includes constituent modules, including but not limited to input/output components 121 , video processing 123 , image binarization 125 , image extrapolation 127 , and image reconstruction 129 .
- Computer system 130 includes user interface components 131 that permit visualization of the inner space 101 to be presented to a user, which may be the surgeon performing an endoscopic procedure, such as a laparoscopy or a colonoscopy.
- endoscope 110 illuminates inner space 101 of biological subject 105 within a field of view 117 of camera 113 .
- Field of view 117 of camera 113 includes surgical tools 109 that may be metallic or otherwise reflective (e.g., due to having a wet surface) and organs 107 that also exhibit specular reflection.
- gas supply 115 may be used to inflate biological subject 105 near organs 107 , for example, where inner space 101 corresponds to the abdominal cavity or the colon of biological subject 105 .
- Gas from gas supply 115 , illumination from illumination source 111 , and reflected light from field of view 117 of the camera may each be carried by endoscope 110 through an entry point in biological subject 105 .
- endoscope 110 includes a central optical fiber, a concentric hollow optical fiber around the central optical fiber, and a concentric plenum around the hollow optical fiber to carry gas into inner space 101 from gas supply 115 .
- The concentric arrangement permits illumination from illumination source 111 to be carried into inner space 101 and light reflected from organs 107 to be carried back to camera 113, to be processed and visualized during the endoscopic procedure.
- image processing module 120 may be or include software, general purpose computer hardware, and/or specialized computer hardware to receive images, such as image frames making up a real time video stream, and to process the images to reduce and/or remove specular reflection in real time or near-real time.
- As used herein, real time and near-real time describe processing of the video stream that may introduce latency but does not perceptibly reduce frame rate or introduce delay in the video stream to the extent that user experience is impaired.
- For example, frame rate may be reduced by as much as 5%-10%, and a delay may be introduced of 100 msec or less, 90 msec or less, 80 msec or less, 70 msec or less, 60 msec or less, 50 msec or less, 40 msec or less, 30 msec or less, 20 msec or less, 10 msec or less, 5 msec or less, 1 msec or less, 500 microsec or less, or 100 microsec or less.
- reducing specular reflection describes multiple operations applied to the image, by which regions of specular reflection are identified and removed from the image. Once removed, the missing information from the image is estimated using peripheral information around the regions of specular reflection. Once the missing information is estimated, an output image is reconstructed using the estimated information in combination with the original image.
- Image processing module 120 may be incorporated as one or more electronic components into camera 113 and/or computer system 130 .
- Alternatively, image processing module 120 may be a separate electronic device including input/output components 121 to communicate with camera 113 and/or computer system 130.
- In some embodiments, image processing module 120 includes one or more processors, a non-transitory computer readable storage medium, and electronic components to facilitate communication between the one or more processors and the storage medium (e.g., a bus).
- The one or more processors may be or include central processing units (CPUs), graphics processing units (GPUs), or other architectures.
- A combination of processor architectures may be used as an approach to real time image processing. For example, operations for image binarization 125, image extrapolation 127, and/or image reconstruction 129 may be executed using GPUs to permit pixel-wise processing with RGBA formatted files, as described in more detail in reference to FIG. 8.
- Input/output components 121 may include, but are not limited to, electronic communication systems suitable for transfer of high-resolution video streams. For example, universal serial bus (USB), Thunderbolt, FireWire, HDMI, and/or display port components may carry video streams to/from image processing module 120.
- Video processing 123 , image binarization 125 , image extrapolation 127 , and image reconstruction 129 describe algorithms stored as software of image processing module 120 .
- Video processing 123 describes operations on the images generated by camera 113 to segment or otherwise isolate image frames for processing as part of specular reflection reduction.
- Video processing 123 includes parallelization operations to process individual image frames and reduce potential latency arising from serial operation. Parallelization also may be used to reduce processing time of computationally intensive operations. For example, extrapolation and reconstruction include operations on an image pyramid of multiple images for each image frame of the video stream. In this way, the images making up the image pyramid may be processed in parallel, along with multiple image frames being similarly processed in parallel.
- In some embodiments, video processing 123 includes reformatting images. For example, an image may be received from camera 113 in a first format, such as MP4, MOV, WMV, FLV, AVI, or the like.
- The images are reformatted into RGBA format, which includes the red-green-blue color triad and an alpha channel.
- RGBA formatted images may improve processing times for images, by permitting pixel-wise processing by GPUs, as described in more detail in reference to FIG. 8 .
- Image binarization 125 describes one or more techniques for detecting specular reflection in images. As described in more detail in reference to FIG. 2 , regions of specular reflection in an image are characterized by clusters of pixels having intensity near or exceeding a sensor limit of camera 113 . In some cases the specular reflection may saturate a region of the sensor, resulting in a solid white region in the image. Image binarization 125 includes algorithms in software for defining a luminance channel of an image and generating a binary mask by comparing luminance values against a luminance threshold. Pixels for which the luminance value satisfies the threshold are labeled with a “true” value, while pixels that do not satisfy the threshold are labeled with a “false” value in the binary mask.
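The binarization step described above can be sketched with NumPy. This is an illustrative sketch, not the patented implementation; the helper name, array layout, and default values (standard luma-style weights and a 0.9 threshold, example values from the embodiments herein) are assumptions:

```python
import numpy as np

def binarize_specular(image_rgb, weights=(0.299, 0.587, 0.114), threshold=0.9):
    """Build a binary mask H(i, j): True for pixels kept as valid image
    data, False for pixels classified as specular reflection.

    image_rgb: H x W x 3 array with channel values in [0, 1].
    """
    # Luminance channel as a weighted sum of the color channels.
    luminance = (weights[0] * image_rgb[..., 0]
                 + weights[1] * image_rgb[..., 1]
                 + weights[2] * image_rgb[..., 2])
    # Pixels at or above the threshold belong to specular regions ("false").
    return luminance < threshold
```

With weights summing to one and channels in [0, 1], the luminance channel also falls in [0, 1], so the threshold can be compared directly.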
- Image extrapolation 127 describes one or more algorithms in software for predicting image intensity information in regions of specular reflection in an image. As described in more detail in reference to FIGS. 3 - 4 , image extrapolation 127 includes two top-level processes. First, an image pyramid representation of the image is generated to represent the image at different spatial scales. Second, image intensity information is extrapolated from the peripheral region around specular reflection regions in the image, for each spatial scale of the image pyramid.
- Each subsequent higher level of the image pyramid may be a low-pass filtered image, generated from the preceding level.
- While the image at the base scale contains image information at both high and low spatial frequencies, the image at the top level contains only low frequency information.
- Extrapolation of image intensity information describes a multi-scale process for estimating missing image pixels using pixels for which the luminance channel value satisfied the threshold.
- Some embodiments include a weighted approach to extrapolating pixel values.
- Regions of specular reflection are extrapolated from the periphery of the regions toward the center of the regions.
- The multi-scale approach permits pixels nearer the center of the region to be extrapolated from higher pyramid levels, while pixels nearer the periphery of the region are extrapolated using lower pyramid levels. In this way, higher spatial frequency information, such as edges, smaller structures, or small reflections, is predicted nearer the periphery.
- Base information, such as material color or larger structures, is predicted with extrapolated data from surrounding larger structures.
- Reconstruction 129 describes algorithms in software for recombination of the pyramid levels in a manner reflecting the spatial frequency of the image intensity information described at each respective level.
- Image-wise recombination is described in more detail in reference to FIG. 5 and FIG. 7 .
- In some embodiments, recombination occurs on a pixel-wise basis, such that each pixel is recombined including information for only those pyramid levels for which a threshold alpha value is satisfied.
- Pixel-wise recombination, for example using RGBA formatted images in a GPU implementation of example system 100, is described in more detail in reference to FIG. 8.
- Computing device 130 may be or include a personal computer, server, edge device, or other electronic device that is configured to interface with endoscope 110 , camera 113 , and/or image processing module 120 .
- In some embodiments, image processing module 120 is incorporated into computing device 130.
- To that end, computing device 130 may include one or more processors, a non-transitory computer readable memory device, and specialized GPU hardware (e.g., a graphics accelerator). In this way, computing device 130 may be configured to execute the operations of image processing module 120 directly on a video stream received from camera 113.
- In the illustrated embodiment, the interface components 131 include a display 133, which permits visualization of the processed video feed.
- Operations by computing device 130 and/or image processing module 120 may include outputting reconstructed images to display 133.
- While computing device 130 is illustrated as a standalone device, it is contemplated that operations described in reference to reducing specular reflection may be executed on multiple machines, for example, coordinated over a network. While network communication may introduce some latency into the video stream, access to the greater computational resources of a networked system may reduce overall latency and provide near real-time processing of images from the video stream received from camera 113.
- Output operations of image processing module 120 and/or computing device 130 may include communicating processed images or video to multiple electronic devices in one or more physical locations.
- FIG. 2 is a schematic diagram illustrating an example procedure 200 for detecting regions of specular reflection in an endoscope image 210 , in accordance with an embodiment of the disclosure.
- Example procedure 200 includes operations for the definition of a binary mask 220 and for the generation of a masked image 230 .
- Example procedure 200 describes operations implemented by image processing module 120 of FIG. 1 , and may be executed by one or more processors in accordance with computer readable instructions, as described in more detail in reference to FIG. 1 . While operations of example procedure 200 are illustrated in a given order, operations may be omitted or reordered in some embodiments.
- Endoscope image 210 is received, for example, via input/output components 121 of FIG. 1, or by computing device 130 of FIG. 1.
- Endoscope image 210 may be generated by camera 113 and may describe field of view 117 of inner space 101 of biological subject 105, for example, including organs 107 and tools 109.
- Endoscope image 210 also includes one or more regions of specular reflection 211 , illustrated as white regions surrounded by a black outline. As shown, specular reflections 211 obscure the underlying details of the organs 107 and/or tools 109 , which may impair the ability of a surgeon to visualize the location of a structure, or the true position of a tool.
- Binary mask 220 is defined to identify regions of specular reflection 211 in the image.
- Operation 203 includes defining a luminance channel, "L," for the pixels making up the image 210, for example, as a weighted sum of the color channels:
- L(i,j) = α·R(i,j) + β·G(i,j) + γ·B(i,j)  (1)
- where (α, β, γ) represent proportionality factors that may be predefined or may be dynamically determined, for example, based on metering of light received from endoscope 110.
- In some embodiments, (α, β, γ) are equal, but they may also be unequal.
- For example, (α, β, γ) may be defined as (0.33, 0.33, 0.33).
- (α, β, γ) may also be defined as (0.299, 0.587, 0.114).
- The combination of (α, β, γ) sums to one in these examples. In this way, the luminance channel ranges from zero to one as well.
- In some embodiments, a luminance channel is defined using CIE-Lab color space or as the value channel in a hue-saturation-value (HSV) color space.
- The luminance channel is compared to a luminance threshold, "T."
- The luminance threshold may be predefined, for example, 0.5 or greater, 0.6 or greater, 0.7 or greater, 0.8 or greater, or 0.9 or greater. It is understood that setting a higher value for the luminance threshold will reduce the number of pixels classified as specular and the size of regions of specular reflection 211. Smaller regions improve the performance of image processing module 120 by limiting the number of pixels to be extrapolated. By contrast, however, a threshold value that is too high will incorrectly classify specular pixels as valid image data and undercount the reflection regions. The value for the luminance threshold, therefore, balances accurate identification of regions of specular reflection 211 against latency introduced into the video stream.
- In an embodiment, the luminance threshold is defined as 0.9, such that all pixels whose luminance channel values equal or exceed 0.9 are classified as belonging to a region of specular reflection 211.
- Binary mask 220, H(i,j), represents a first subset of pixels in the image corresponding to L &lt; T as "true," and represents a second subset of the pixels in the image corresponding to L ≥ T as "false."
- H(i,j) may be represented as a matrix of values where "true" values are represented by one and "false" values by zero.
- binary mask 220 depicts true pixels as white and false pixels as black.
- Pixels defined as belonging to regions of specular reflection 211 are removed from endoscope image 210.
- In some embodiments, removing the pixels includes element-wise multiplication of endoscope image 210 with binary mask 220.
- The resulting image retains the original image data from endoscope image 210 where binary mask 220 is "true" and is zero for all pixels where binary mask 220 is "false."
- Pixels in masked image 230 that correspond to a region of specular reflection 211 are candidates for reflection reduction involving estimation of image intensity information based on surrounding regions (e.g., peripheral pixels).
- In some embodiments, binary mask 220 is not combined with endoscope image 210, but rather is stored with endoscope image 210 in memory and used to generate an image pyramid.
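The removal step described above amounts to an element-wise product of the image with the binary mask. A minimal sketch, assuming floating-point channel values in [0, 1] and a hypothetical helper name:

```python
import numpy as np

def mask_image(image_rgb, mask):
    """Zero out specular pixels: original data survives where the
    binary mask is True; masked pixels become zero."""
    # Broadcast the H x W mask across the three color channels.
    return image_rgb * mask[..., np.newaxis].astype(image_rgb.dtype)
```

The same broadcast pattern applies whether the mask is stored as booleans or as a matrix of ones and zeros.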
- FIG. 3 is a schematic diagram illustrating an example procedure 300 for generating an image pyramid, in accordance with an embodiment of the disclosure.
- The image pyramid approach is implemented as a technique for improving the accuracy of estimation, for limiting the introduction of spurious but potentially medically significant-looking information through propagation of high-frequency information, and for limiting the latency introduced into video streams.
- Generating the pyramid representation of masked image 230 starts at operation 301, which includes receiving masked image 230. It is understood that operation 301 may be optional or omitted from example procedure 300 where binarization and generation of the image pyramid are executed in one process.
- For example, masked image 230 may be stored in memory and sent to a processor before proceeding with operations for generating the image pyramid.
- In some embodiments, endoscope image 210 is used for example procedure 300, rather than masked image 230.
- In such embodiments, both endoscope image 210 and binary mask 220 are subsampled and smoothed as part of example procedure 300.
- Masked image 230 is described by a size M×N, where "M" and "N" are integer values for the numbers of pixels in each dimension of masked image 230.
- Subsampling may use a factor of two in both dimensions, which results in scaled image 320 being of size M/2×N/2.
- Other sub-sampling approaches are contemplated, based on the initial size of endoscope image 210 and/or the number of levels in the image pyramid. With larger subsampling factors, fewer scales are used, which reduces latency.
- a smaller subsampling factor provides improved extrapolation at more frequency levels but increases resource demand and may slow processing.
- Masked image 230 is convolved with a two-dimensional gaussian filter 310, "G_σ," where "G" describes the gaussian kernel and "σ" represents the standard deviation of gaussian filter 310.
- Gaussian filter 310 acts as a low-pass filter, such that scaled image 320 contains low frequency information from masked image 230 .
- In some embodiments, subsampling is applied to a smoothed image, rather than smoothing a sub-sampled image.
- Scaled image 320 is similarly subsampled and smoothed using the same sampling factor and the same gaussian filter 310, resulting in a second scaled image 330 that has a size of M/4×N/4 and has been filtered to remove high frequency information from scaled image 320.
- The sampling factor may be higher or lower than 2.0, but 2.0 is used here as an illustrative example.
- Non-gaussian smoothing may also be used, rather than gaussian filter 310.
- Operations 307 and 309 describe subsampling and smoothing applied to the highest scaled image to generate one or more subsequent scaled images 340.
- The number of scales included in the image pyramid may be predefined, may be determined as part of initializing the system, or may be determined based, for example, on the density and/or number of regions of specular reflection 211 in endoscope image 210. While a higher number of scaled images ("S") provides improved extrapolation and improved reduction of specular reflection, larger image pyramids with more levels also increase computational resource demand for extrapolation and reconstruction operations, as described in more detail in reference to FIGS. 4-8.
- The number of scaled images may be 1 or more, 2 or more, 3 or more, 4 or more, 5 or more, 6 or more, 7 or more, 8 or more, 9 or more, or 10 or more.
- In an embodiment, S=5.
- Operations 303 , 305 , 307 , and 309 may be implemented on endoscope image 210 and binary mask 220 directly, rather than using masked image 230 .
- Subsampling and smoothing binary mask 220 and endoscope image 210 facilitates extrapolation using matrix techniques described in reference to FIG. 4.
- FIG. 4 is a schematic diagram illustrating an example approach for weighted extrapolation of image intensity information, in accordance with an embodiment of the disclosure.
- Weighted extrapolation is applied to masked image 230 and includes application of matrix operation 400 to masked image 230 and binary mask 220 to prepare a weighted extrapolation on a pixel-wise basis.
- Matrix operation 400 is applied to an array of pixels 420 surrounding pixel 410 and to a corresponding region 425 of binary mask 220 ("H_s") for level "s" of an image pyramid having "S" levels.
- Matrix operation 400 includes weight matrix 430 applied to both array of pixels 420 and corresponding region 425 . Weighting the results of matrix operation 400 improves the accordance of the extrapolated value for pixel 410 with neighboring pixels in the image.
- In some embodiments, extrapolating image intensity information includes generating a mapping of the region of specular reflection for each level of the image pyramid (e.g., binary mask 220), as described in reference to FIG. 3, defining a missing region for each level of the image pyramid using the respective mapping, and extrapolating image intensity information for each missing region.
- Matrix operation 400 describes extrapolating image intensity information for pixel 410 at a level "s" of the image pyramid. Generating an extrapolated intensity image, I_s^e(i,j), may be done for the image in accordance with the expression:
- I_s^e(i,j) = (I_s(i,j) * G_σ) / (H_s(i,j) * G_σ)  (2)
- where I_s^e represents the extrapolated intensity image at level "s" of the image pyramid, I_s represents the masked image at level "s," and "*" denotes convolution,
- G_σ represents a gaussian smoothing kernel of width "σ," and
- H_s(i,j) represents the mapping of the region of specular reflection at the level "s" of the image pyramid having "S" levels, wherein "i," "j," "s," and "σ" are numerical values.
- Matrix operation 400 is applied to pixels for which the masked image 230 has a value of zero, corresponding to regions of specular reflection 211 in endoscope image 210 . Regions of specular reflection correspond to pixels of binary mask 220 that are false, or zero. In this way, pixel 410 has a value equal to the weighted sum of intensity values for each pixel in array of pixels 420 , divided by the weighted sum of the true pixels in the corresponding region 425 of binary mask 220 . It can be understood from the numerator term of matrix operation 400 that values are contributed by those pixels in endoscope image 210 that are not within region of specular reflection 211 .
- Similarly, it can be understood from the denominator term of matrix operation 400 that the value of pixel 410 is scaled by the number of pixels in corresponding region 425 of binary mask 220 that are true. In this way, the extrapolated value of pixel 410 is normalized to a value between zero and one, as the range of values for each constituent intensity level (e.g., in RGB format) is from zero to one. As such, matrix operation 400 may be applied separately for each color channel.
- The gaussian smoothing kernel may be the same as gaussian filter 310 described in reference to FIG. 3.
- Applying the same smoothing kernel to image pyramid generation and to matrix operation 400 reduces computational complexity, because a single convolution operation can be used both for extrapolation at scale "s" and for generating the pyramid representation at scale "s+1" (e.g., extrapolation in scaled image 320 and generation of second scaled image 330).
- In this way, matrix operation 400 recreates missing regions of masked image 230, including both lower frequency information and higher frequency information such as long edges, corners, or other fine details.
- FIG. 5 A , FIG. 5 B , and FIG. 5 C describe reconstruction of extrapolated images on an image-wise basis using weighted reconstruction.
- procedures may be implemented using alpha-channel thresholding as described in reference to FIG. 8 .
- FIG. 5 A is an example graph 500 illustrating cumulative weights of levels in an image pyramid having “S” levels, in accordance with an embodiment of the disclosure.
- the image pyramid has “S” levels, for which a first cumulative weight image 501 , a second cumulative weight image 503 , a third cumulative weight image 505 , and a final cumulative weight image 507 are shown, relative to binary mask 220 , for a region of specular reflection 211 .
- Example graph 500 demonstrates the relative depth into region of specular reflection 211 at which a given pyramid level contributes extrapolated information.
- defining cumulative weight images 501 - 507 permits extrapolated image intensity information to be included only where it is reliable, rather than where it would be extrapolated entirely from information absent from endoscope image 210 .
- T is a threshold value representing a reliability of the extrapolated intensity value, below which the value of cw s (i,j) is set at zero.
- the reliability factor T limits inclusion of pixels for which extrapolated image intensity information is based too greatly on extrapolated information, rather than on pixels present in endoscope image 210 . In this way, artifacts or inaccurate information are reduced, and potentially incorrect diagnostic assessments by the surgeon may be avoided.
- a lower value of the reliability factor permits more pixels to be included in the cumulative weight image (because T is defined as a threshold below which a pixel is excluded), but also introduces higher-frequency information beyond spatial positions where it may no longer be accurate.
- a higher value, conversely, reduces the number of pixels that are included.
- the reliability factor T may be zero, 0.001, 0.01, 0.05, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, or greater, including fractions and interpolations thereof. At higher values, less information from lower levels is included during reconstruction near the center of region of specular reflection 211 .
- reliability factor T is set at 0.1, indicating that any pixel for which more than 90% of neighboring pixels include extrapolated information will not be included in reconstruction (e.g., at least 10% of neighboring pixels are outside region of specular reflection 211 ). Instead, extrapolated image intensity information from higher pyramid levels will be included.
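The cumulative weight definition of expression (3) (threshold the smoothed mask at reliability factor T, then smooth the resulting indicator again) can be sketched as follows; the separable `smooth` helper and the 3-tap kernel values are illustrative assumptions:

```python
import numpy as np

def smooth(img, g):
    """Separable smoothing: 1-D kernel g along rows, then columns (zero padding)."""
    tmp = np.apply_along_axis(lambda r: np.convolve(r, g, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, g, mode='same'), 0, tmp)

def cumulative_weight(mask, g, T=0.1):
    """Expression (3): cw = ((H * G) > T) * G.  Pixels whose smoothed mask
    value falls at or below the reliability factor T are considered too far
    inside the specular region to be reconstructed reliably at this level."""
    reliable = (smooth(mask.astype(float), g) > T).astype(float)
    return smooth(reliable, g)
```

With T = 0.1 as in the example above, a pixel survives only if at least 10% of its (kernel-weighted) neighborhood lies outside the region of specular reflection.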
- FIG. 5 B is an example graph 520 illustrating weight images of levels in an image pyramid having “S” levels, in accordance with an embodiment of the disclosure.
- a weight image, w s (i,j) may be defined for each level of the image pyramid.
- For an image pyramid having S levels, a first weight image 521 , a second weight image 523 , a third weight image 525 , and a final weight image 527 are shown for region of specular reflection 211 .
- FIG. 5 C illustrates an example reconstructed image 540 exhibiting reduced specular reflection, in accordance with an embodiment of the disclosure.
- Reconstructed image 540 reduces the appearance of regions of specular reflection 211 in endoscope image 210 and reproduces low frequency information and may include some high-frequency information.
- I R(i,j)=(w 0(i,j)×I 0 e(i,j)+ . . . +w S(i,j)×I S e(i,j))/(w 0(i,j)+ . . . +w S(i,j)) (5) where "w" is the weight image and I is the extrapolated image. It is understood that the denominator of the expression serves to normalize the intensity information between a value of zero and one for each pixel of reconstructed image 540 .
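Expression (5) can be sketched as a pixel-wise weighted average. The sketch assumes the extrapolated images and weight images have already been rescaled to the size of endoscope image 210; names are illustrative:

```python
import numpy as np

def reconstruct(extrapolated, weights, eps=1e-12):
    """Expression (5): weighted average of the extrapolated images from all
    pyramid levels, normalized by the sum of the weight images so each
    output pixel stays within the [0, 1] intensity range."""
    num = sum(w * ie for w, ie in zip(weights, extrapolated))
    den = sum(weights)
    return num / np.maximum(den, eps)
```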
- Expression (6) is similar to alpha blending, permitting the extent variable to determine, on an image level or on a pixel level, the extent of information being contributed in the output image “J” from each of reconstructed image 540 “I R ” and endoscope image 210 “I.”
- J is defined as a combination of reconstructed image 540 and endoscope image 210 .
- Image reconstruction may include dynamic or adaptive blending between reconstructed image 540 and endoscope image 210 .
- blending may include increasing the contribution from reconstructed image 540 near regions of specular reflection 211 while keeping information from endoscope image 210 in regions unaffected by specular reflection.
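The blending of expression (6) might be sketched as follows, with a scalar extent for image-level blending or an array of the same shape as the images for the adaptive, pixel-level blending described above (names are illustrative):

```python
import numpy as np

def blend(original, reconstructed, extent):
    """Expression (6): J = (1 - eps) * I + eps * I_R.  The extent variable
    may be a scalar (one blend factor for the whole image) or an array
    (e.g., larger near regions of specular reflection, zero elsewhere)."""
    extent = np.asarray(extent, dtype=float)
    return (1.0 - extent) * original + extent * reconstructed
```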
- FIG. 6 is a block diagram illustrating an example process 600 for reducing specular reflection in endoscopy images, in accordance with an embodiment of the disclosure.
- Example process 600 may describe operations executed by example system 100 of FIG. 1 , as part of reducing the appearance of specular reflections in endoscope images. In this way, example process 600 may include the operations described in reference to FIGS. 2 - 5 C and FIGS. 7 - 8 . Other applications of example process 600 are contemplated, including those where reduction of specular reflection in real time would be advantageous.
- the order in which some or all of the process blocks appear in each process should not be deemed limiting. Rather, one of ordinary skill in the art having the benefit of the present disclosure will understand that some of the process blocks may be executed in a variety of orders not illustrated, or even in parallel.
- example process 600 optionally includes receiving a video feed at process block 610 .
- images may be received from camera 113 as part of a video stream including multiple image frames generated according to a frame rate.
- endoscope camera 113 may generate 18 frames per second, 24 frames per second, 30 frames per second, 40 frames per second, 50 frames per second, 60 frames per second, or more.
- Process block 610 may include operations described in reference to input/output components 121 and video processing 123 modules in FIG. 1 .
- example process 600 includes, as part of process block 610 , operations for segmenting or parallelizing image processing operations to reduce the perception of latency.
- example process 600 includes receiving an image, such as endoscope image 210 of FIG. 2 .
- Endoscope image 210 includes one or more regions of specular reflection 211 .
- Receiving the image may include accessing the image from a data store.
- process block 620 may include communicating image information (e.g., a location of the image in memory) from one processor (e.g., a CPU) to another processor (e.g., a GPU) tasked with performing subsequent operations of example process 600 .
- example process 600 includes detecting specular reflection in the image received at block 620 .
- detecting specular reflection may include defining a luminance channel for the image, comparing luminance channel values on a pixel-wise basis against a luminance threshold, and defining binary mask 220 based on the comparison.
- the luminance threshold may be defined as a value between zero and one, corresponding to intensity values used to define colors in images (e.g., RGB, CMYK, etc.).
- Specular reflection may be defined by identifying pixels for which the luminance channel value exceeds the luminance threshold.
- the luminance threshold may be 0.9, such that pixels having a luminance value above 0.9 are attributed to region of specular reflection 211 .
- defining specular reflection by luminance thresholding may reduce latency introduced to video streams.
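The detection step might be sketched as below. The patent does not fix the luminance coefficients α, β, γ of expression (1), so the Rec. 709 weights here are an assumption, as is the convention that the returned mask is true (1) outside specular regions, matching binary mask 220:

```python
import numpy as np

def specular_mask(rgb, threshold=0.9, coeffs=(0.2126, 0.7152, 0.0722)):
    """Expression (1): define a luminance channel as a weighted sum of the
    R, G, B channels, then mark pixels whose luminance exceeds the
    threshold as specular (mask value 0); all other pixels are 1."""
    lum = (coeffs[0] * rgb[..., 0] + coeffs[1] * rgb[..., 1]
           + coeffs[2] * rgb[..., 2])
    return (lum <= threshold).astype(np.uint8)
```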
- estimation operations include defining an image pyramid of multiple levels, as described in more detail in reference to FIG. 3 .
- the image pyramid may be defined by a series of smoothing and sub-sampling operations, resulting in a pyramid of “S” levels, with “S” images of progressively decreasing size.
- gaussian smoothing kernel 310 serves to filter out higher frequency information and to retain lower frequency information in higher levels of the image pyramid. For example, masked image 230 is filtered to reduce higher frequency information in scaled image 320 , which is in turn filtered to remove higher frequency information from second scaled image 330 . Along with the images, binary mask 220 is similarly subsampled and smoothed to define a mask pyramid.
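The smoothing-and-subsampling loop can be sketched as follows; the 3-tap binomial kernel and factor-of-two subsampling are illustrative assumptions:

```python
import numpy as np

def smooth(img, g):
    """Separable smoothing: 1-D kernel g along rows, then columns (zero padding)."""
    tmp = np.apply_along_axis(lambda r: np.convolve(r, g, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, g, mode='same'), 0, tmp)

def build_pyramids(image, mask, levels, g=np.array([0.25, 0.5, 0.25])):
    """Define an image pyramid of 'levels' progressively smaller images and
    a matching mask pyramid by repeatedly smoothing and subsampling by two."""
    images, masks = [image.astype(float)], [mask.astype(float)]
    for _ in range(levels - 1):
        images.append(smooth(images[-1], g)[::2, ::2])
        masks.append(smooth(masks[-1], g)[::2, ::2])
    return images, masks
```

Smoothing before subsampling is what removes higher-frequency content so that each coarser level retains only lower-frequency information.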
- expression (2) describes an approach for estimating image intensity information, on a pixel-wise basis, for an image of an image pyramid.
- extrapolation involves a weighted combination of neighboring pixels outside region of specular reflection 211 that contain image intensity information, normalized for the number of pixels being added, such that the resulting extrapolated image intensity values fall between zero and one.
- each color channel may be extrapolated individually or together.
- a reconstructed image (e.g., reconstructed image 540 of FIG. 5 C ) is generated using extrapolated image intensity information from multiple levels of the image pyramid defined at process block 640 .
- reconstructing an image is done on an image-wise basis, by defining weight images for each level, and rescaling both the pyramid images and the weight images to the original size of endoscope image 210 .
- reconstructed image 540 is generated by weighted combination of pyramid images, with higher level images contributing extrapolated image information nearer the center of regions of specular reflection 211 and lower level images contributing extrapolated image information nearer the peripheries of regions of specular reflection 211 .
- reconstruction is done on a pixel-wise basis, as described in more detail in reference to FIG. 8 .
- Pixel-wise reconstruction may be optimized for use on GPUs, for example, through defining image pyramid levels as texture objects in RGBA format, incorporating binary mask 220 as an alpha channel in the images making up the image pyramid.
- example process 600 may optionally include outputting the reconstructed image at process block 660 .
- Outputting operations may include, but are not limited to, generating a visualization of reconstructed image 540 on a display, such as a monitor (e.g., display 133 of FIG. 1 ).
- Output may also include communicating reconstructed image 540 to an external system (e.g., over a network connection) for streaming to a remote location.
- example process 600 are performed repeatedly in parallel for multiple images from a video stream.
- processes and/or subprocesses of example process 600 may be parallelized.
- extrapolation operations may be parallelized, such that each level of the image pyramid is processed in parallel.
- multiple images from the video stream may be processed by parallel instances of example process 600 .
- outputting operations include storing reconstructed image 540 in a data store.
- subsequent operations may include reconstructing the video stream received at process block 610 , for example, by making reference to sequence indicators in image file metadata.
- FIG. 7 is a block diagram illustrating an example process 700 for reconstructing an output image on an image-wise basis, in accordance with an embodiment of the disclosure.
- Example process 700 is an example of subprocesses of example process 600 , described in reference to FIG. 6 .
- example process 700 describes operations of process block 650 , as described in more detail in reference to FIG. 5 A - FIG. 5 C .
- example process 700 may be applied to extrapolated images of an image pyramid to generate reconstructed image 540 of FIG. 5 C .
- the order in which some or all of the process blocks appear in each process should not be deemed limiting. Rather, one of ordinary skill in the art having the benefit of the present disclosure will understand that some of the process blocks may be executed in a variety of orders not illustrated, or even in parallel.
- cumulative weight images (e.g., cumulative weight images 501 - 507 of FIG. 5 A ) are defined for the image pyramid.
- Cumulative weight images describe reliability conditions for reconstruction, whereby information that is too far removed from image intensity information in endoscope image 210 is excluded.
- inclusion or exclusion of a pixel from a cumulative weight image may be determined in reference to a reliability factor, for example, excluding pixels for which fewer than 10% of neighboring pixels are outside region of specular reflection 211 .
- higher levels of the image pyramid include smaller regions of specular reflection (e.g., smaller by a factor of 2^s for subsampling by a factor of two).
- each cumulative weight image is rescaled to the size of endoscope image 210 at process block 720 .
- Rescaling permits cumulative weight images to be combined at process block 730 as part of defining a weight image for each level of the image pyramid.
- expression (4) describes an approach for determining weight images from scaled cumulative weight images.
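Expression (4) can be sketched as successive differences of the rescaled cumulative weight images. Treating the base level's weight as its cumulative weight (i.e., taking cw at level −1 as zero) is an assumption, since the document truncates the s=0 case:

```python
import numpy as np

def weight_images(cumulative):
    """Expression (4): w_s = cw_s - cw_(s-1) for consecutive pyramid levels,
    so each level contributes only in the band where its information first
    becomes reliable.  Base-level weight equals its cumulative weight
    (assumed cw_(-1) = 0)."""
    weights = [cumulative[0]]
    for s in range(1, len(cumulative)):
        weights.append(cumulative[s] - cumulative[s - 1])
    return weights
```

By construction the weights telescope: their sum equals the final cumulative weight image, which is what the denominator of expression (5) normalizes against.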
- weight images are combined with extrapolated image information to generate a reconstructed image (e.g., reconstructed image 540 of FIG. 5 C ).
- Reconstruction includes weighting the extrapolated image intensity information at each level and normalizing the resulting combination by weighted combination with binary mask 220 , on an image-wise basis.
- an output image is generated by weighted blending of the original endoscope image (e.g., endoscope image 210 of FIG. 2 ) and reconstructed image (e.g., reconstructed image 540 of FIG. 5 C ).
- blending may be dynamically defined by comparing a luminance value for a pixel from region of specular reflection 211 against a luminance threshold, as described in more detail in reference to FIG. 2 .
- the output image may be presented as part of a video stream after stream assembly operations, for example, via display 133 .
- FIG. 8 is a block diagram illustrating an example process 800 for reconstructing an output image on a pixel-wise basis, in accordance with an embodiment of the disclosure.
- Example process 800 is an example of subprocesses of example process 600 , described in reference to FIG. 6 .
- example process 800 describes operations of process block 650 , optimized for pixel-wise processing by a GPU, using texture objects (e.g., as RGBA objects).
- example process 800 may include one or more preliminary operations for generating a texture object for each image of the image pyramid being extrapolated and reconstructed.
- As with FIGS. 6 - 7 , the order in which some or all of the process blocks appear in each process should not be deemed limiting. Rather, one of ordinary skill in the art having the benefit of the present disclosure will understand that some of the process blocks may be executed in a variety of orders not illustrated, or even in parallel.
- Pixel-wise reconstruction describes operations whereby pixels (e.g., pixel 510 of FIG. 5 A ) are included in reference to paired threshold parameters for each layer of an image pyramid 810 .
- The paired threshold parameters include an upper threshold parameter T UPPER and a lower threshold parameter T LOWER , which are compared against the alpha channel encoding information from binary mask 220 .
- the alpha value may be defined by a product of binary mask 220 with gaussian smoothing kernel 310 . Based on the alpha channel value relative to the two thresholds, a pixel will be included in or excluded from reconstruction.
- Example process 800 includes decision block 830 , where the alpha value is compared to the upper threshold, T UPPER . Where the alpha value for the pixel is greater than T UPPER , the pixel is included directly in the reconstructed image with information from lower levels at process block 835 , without considering higher levels of image pyramid 810 . Where the output of decision block 830 is false, example process 800 includes decision block 840 , where the alpha value is compared to lower threshold T LOWER . Where the output of decision block 840 is false, example process 800 includes process block 845 , whereby the reconstructed image is generated excluding the pixel and corresponding pixels from higher levels of image pyramid 810 . For an alpha channel value between T LOWER and T UPPER , the pixel is included in reconstruction, weighted by a weight value.
- weight value w s (i,j) is determined in accordance with the expression:
- w s(i,j)=((H s(i,j)*G σ)−t lower)/(t upper−t lower) (7)
- G ⁇ represents a gaussian smoothing kernel of width ⁇
- H s (i, j) represents the mapping at the level “s” of the image pyramid, wherein “i,” “j,” “s,” and “ ⁇ ” are numerical values.
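The per-pixel decision and expression (7) can be sketched as a clamped linear ramp between the two thresholds; clamping to [0, 1] consolidates the include-directly and exclude branches of decision blocks 830 and 840 (an illustrative consolidation, not the patented control flow):

```python
import numpy as np

def pixel_weight(alpha, t_lower, t_upper):
    """alpha is the smoothed mask value (H_s * G_sigma) carried in the
    texture's alpha channel.  Above t_upper the pixel is fully reliable
    (weight 1); at or below t_lower it is excluded (weight 0); in between,
    expression (7) interpolates linearly."""
    w = (alpha - t_lower) / (t_upper - t_lower)
    return float(np.clip(w, 0.0, 1.0))
```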
- example process 800 includes incrementing the pyramid level to a higher level and repeating process blocks 825 - 855 .
- Where decision block 855 is true, example process 800 includes generating a reconstructed pixel using weighted pixel values for each level included in the reconstruction.
- a tangible machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a non-transitory form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.).
- a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
Abstract
Description
L(i,j)=α×R(i,j)+β×G(i,j)+γ×B(i,j) (1)
- where R(i,j), G(i,j), and B(i,j) represent red, green, and blue channels of endoscope image 210 .
where Is e represents the extrapolated intensity image at level “s” of the image pyramid, where Gσ represents a gaussian smoothing kernel of width “σ,” and where Hs(i, j) represents the mapping of the region of specular reflection at the level “s” of the image pyramid having “S” levels, wherein “i,” “j,” “s,” and “σ” are numerical values.
cw s(i,j)=((H s(i,j)*G σ)>T)*G σ (3)
- where T is a threshold value representing a reliability of the extrapolated intensity value, below which the value of cws(i,j) is set at zero. The reliability factor T limits inclusion of pixels for which extrapolated image intensity information is based too greatly on extrapolated information, rather than on pixels present in endoscope image 210 .
w s(i,j)=cw s(i,j)−cw s-1(i,j) (4)
- where "s" and "s−1" represent consecutive levels of the image pyramid from s=0 to s=S, where s=0 represents the base level of the image pyramid.
- where "w" is the weight image and I is the extrapolated image. It is understood that the denominator of the expression serves to normalize the intensity information between a value of zero and one for each pixel of reconstructed image 540 .
J(i,j)=(1−ε)×I(i,j)+ε×I R(i,j) (6)
- where I(i, j) represents the image and where ε represents an extent variable between zero and one. Expression (6) is similar to alpha blending, permitting the extent variable to determine, on an image level or on a pixel level, the extent of information being contributed in the output image "J" from each of reconstructed image 540 "I R " and endoscope image 210 "I."
where Gσ represents a gaussian smoothing kernel of width σ, and where Hs(i, j) represents the mapping at the level “s” of the image pyramid, wherein “i,” “j,” “s,” and “σ” are numerical values.
Claims (26)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/731,127 US12307636B2 (en) | 2021-05-24 | 2022-04-27 | Specular reflection reduction in endoscope visualization |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202163192497P | 2021-05-24 | 2021-05-24 | |
| US202163232089P | 2021-08-11 | 2021-08-11 | |
| US17/731,127 US12307636B2 (en) | 2021-05-24 | 2022-04-27 | Specular reflection reduction in endoscope visualization |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20220375043A1 US20220375043A1 (en) | 2022-11-24 |
| US12307636B2 true US12307636B2 (en) | 2025-05-20 |
Family
ID=84103915
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/731,127 Active 2043-09-23 US12307636B2 (en) | 2021-05-24 | 2022-04-27 | Specular reflection reduction in endoscope visualization |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US12307636B2 (en) |
| WO (1) | WO2022250905A1 (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113129391B (en) * | 2021-04-27 | 2023-01-31 | 西安邮电大学 | Multi-exposure fusion method based on multi-exposure image feature distribution weight |
| CN116309435B (en) * | 2023-03-13 | 2025-11-21 | 卓外(上海)医疗电子科技有限公司 | Medical image specular reflection restoration method and system |
Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001224549A (en) * | 2000-02-18 | 2001-08-21 | Fuji Photo Film Co Ltd | Endoscopic image-obtaining method and device |
| US20080205785A1 (en) * | 2005-11-23 | 2008-08-28 | Cedara Software Corp. | Method and system for enhancing digital images |
| US20110305388A1 (en) * | 2009-11-18 | 2011-12-15 | Thomas Wedi | Image processing method, and image processing device |
| US8145003B2 (en) | 2000-12-19 | 2012-03-27 | Altera Corporation | Adaptive transforms |
| US8605970B2 (en) | 2008-12-25 | 2013-12-10 | Medic Vision-Imaging Solutions Ltd. | Denoising medical images |
| US20150348239A1 (en) * | 2014-06-02 | 2015-12-03 | Oscar Nestares | Image refocusing for camera arrays |
| US20170046819A1 (en) | 2014-05-02 | 2017-02-16 | Olympus Corporation | Image processing apparatus and image acquisition apparatus |
| US20170084075A1 (en) * | 2015-09-17 | 2017-03-23 | Thomson Licensing | Reflectance parameter estimation in real scenes using an rgb-d sequence |
| WO2020025684A1 (en) | 2018-07-31 | 2020-02-06 | Deutsches Krebsforschungszentrum Stiftung des öffentlichen Rechts | Method and system for augmented imaging in open treatment using multispectral information |
| US20240346724A1 (en) * | 2021-08-13 | 2024-10-17 | Beijing Zitiao Network Technology Co., Ltd. | Image processing method, apparatus, device and storage medium |
- 2022
- 2022-04-27 US US17/731,127 patent/US12307636B2/en active Active
- 2022-05-04 WO PCT/US2022/027656 patent/WO2022250905A1/en not_active Ceased
Non-Patent Citations (6)
| Title |
|---|
| Hakamata et al., JP-2001-22459A English Translation, par 0074 (Year: 2001). * |
| International Search Report and Written Opinion mailed on Sep. 2, 2022, issued in corresponding International Application No. PCT/US2022/27656, filed on May 4, 2022, 8 pages. |
| Karapetyan et al., Automatic Detection and Concealment of Specular Reflections for Endoscopic Images, Ninth International Conference on Computer Science and Information Technologies Revised Selected Papers, IEEE, Sep. 23, 2013, 4 pages. |
| Stehle, Specular Reflection Removal in Endoscopic Images, Proceedings of the 10th International Student Conference on Electrical Engineering, May 11, 2006, 6 pages. |
| Van Raad et al., Active Contour Model based Segmentation of Colposcopy Images from Cervix Uteri using Gaussian Pyramids, 6th International Symposium on Digital Signal Processing for Communication Systems, Jan. 2002, 6 pages. |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2022250905A1 (en) | 2022-12-01 |
| US20220375043A1 (en) | 2022-11-24 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: VERILY LIFE SCIENCES LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VERMA, NISHANT;ANDERSON, BRIAN;REEL/FRAME:059749/0614 Effective date: 20220426 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |