WO2025235411A1 - Extended depth of field for high resolution images based on sub-pixel shifted images - Google Patents
Extended depth of field for high resolution images based on sub-pixel shifted images
- Publication number
- WO2025235411A1 (PCT/US2025/027815)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- specimen
- image
- light emitting
- computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/48—Increasing resolution by shifting the sensor relative to the scene
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B01—PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
- B01L—CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
- B01L3/00—Containers or dishes for laboratory use, e.g. laboratory glassware; Droppers
-
- C—CHEMISTRY; METALLURGY
- C07—ORGANIC CHEMISTRY
- C07K—PEPTIDES
- C07K1/00—General methods for the preparation of peptides, i.e. processes for the organic chemical preparation of peptides or proteins of any length
- C07K1/14—Extraction; Separation; Purification
- C07K1/16—Extraction; Separation; Purification by chromatography
-
- C—CHEMISTRY; METALLURGY
- C07—ORGANIC CHEMISTRY
- C07K—PEPTIDES
- C07K1/00—General methods for the preparation of peptides, i.e. processes for the organic chemical preparation of peptides or proteins of any length
- C07K1/14—Extraction; Separation; Purification
- C07K1/34—Extraction; Separation; Purification by filtration, ultrafiltration or reverse osmosis
-
- C—CHEMISTRY; METALLURGY
- C07—ORGANIC CHEMISTRY
- C07K—PEPTIDES
- C07K1/00—General methods for the preparation of peptides, i.e. processes for the organic chemical preparation of peptides or proteins of any length
- C07K1/14—Extraction; Separation; Purification
- C07K1/36—Extraction; Separation; Purification by a combination of two or more processes of different types
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1429—Signal processing
- G01N15/1433—Signal processing using image recognition
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1434—Optical arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/06—Means for illuminating specimens
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
- G03H1/08—Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
-
- C—CHEMISTRY; METALLURGY
- C12—BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
- C12M—APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
- C12M1/00—Apparatus for enzymology or microbiology
- C12M1/34—Measuring or testing with condition measuring or sensing means, e.g. colony counters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N35/00—Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
- G01N2035/00346—Heating or cooling arrangements
Definitions
- FIG. 2 illustrates a lighting element 205 in accordance with various embodiments.
- the lighting element 205 includes a number of light-emitting components arranged in a spatially non-uniform configuration.
- each of the light-emitting components is a light emitting diode (LED) and the non-uniform configuration is based on a random arrangement of the LEDs upon a substrate.
- a repetition of LEDs is avoided along any radial axis of the lighting element 205. Such an arrangement can improve performance of a Fourier transform as will be understood by those skilled in the art.
- the random or pseudo-random arrangement of LEDs does not conform to a square matrix array grid notwithstanding the position of some LEDs randomly coinciding with grid lines or vertices of the grid.
- in some embodiments, all the LEDs are identical; in other embodiments, some or all of the LEDs differ from each other in various ways such as, for example, emission wavelength, emission color, shape, and/or size.
- a lighting element controller 210 is coupled to the lighting element 205 for activating the LEDs in various ways.
- the lighting element controller 210 includes an LED activation sequence generator 220 coupled to an LED driver 215 in a configuration that activates the LEDs of the lighting element 205 in accordance with various activation sequences.
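As an illustration only, the following sketch shows how a controller like the lighting element controller 210 might generate pre-configured or random activation sequences and step an LED driver through them. The function names, dwell time, and driver callback are hypothetical; the text does not specify an implementation.

```python
import random
import time

# Hypothetical sketch in the spirit of the LED activation sequence generator 220
# and LED driver 215; all names and timing values are assumptions.
NUM_LEDS = 8            # LEDs in the lighting element (per the cutaway view)
DWELL_SECONDS = 0.05    # assumed per-LED activation period

def preconfigured_sequence():
    """A fixed activation order, e.g. LED 1 through LED 8 in index order."""
    return list(range(NUM_LEDS))

def random_sequence(seed=None):
    """A random activation order; each LED is still activated exactly once."""
    order = list(range(NUM_LEDS))
    random.Random(seed).shuffle(order)
    return order

def run_sequence(order, drive_led):
    """Step through one activation sequence via a driver callback."""
    for led_index in order:
        drive_led(led_index, on=True)   # driver asserts the LED channel
        time.sleep(DWELL_SECONDS)       # image sensor exposes during this window
        drive_led(led_index, on=False)
```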
- FIG. 3 illustrates an imaging system 300 that includes the lighting element 205 according to an embodiment.
- the imaging system 300 further includes an image sensor 315 containing a number of detection elements such as CMOS detectors or charge coupled detectors (CCD).
- the image sensor 315 is identical to the image sensor 115 described above.
- eight detection elements are shown arranged adjacent to each other in one row. Additional detection elements can be present in additional rows of the image sensor 315.
- a transparent layer 310 for placement of a specimen 318 or sample.
- the transparent layer 310 is identical to the transparent layer 110 described above.
- the transparent layer 310 is typically made of glass and may be referred to as a petri plate, petri dish, or a cell-culture dish.
- the transparent layer 310 can be either mounted directly upon a top surface of the image sensor 315 or upon an intermediate film (a plastic sheet, for example) that is placed upon the top surface of the image sensor 315. This arrangement effectively renders a separation distance “d2” between the transparent layer 310 and the top surface of the image sensor 315 as being close to zero.
- the distance “d2” is negligible in comparison to a separation distance “d1” between the transparent layer 310 and a light-emitting surface of the lighting element 205.
- capturing a single image or a series of images of the specimen 318 at a single focus distance results in an image that is out of focus in certain regions.
- such a non-uniform specimen 318 has a d2 distance that varies at different portions of the cross-sectional area of the specimen 318.
- examples of the disclosure therefore use a focus stacking process that combines multiple images to assemble a super-resolution image suitable for computer-aided analysis.
- the lighting element 205 emits light generated by the array of light-emitting components.
- a sequential activation operation is carried out upon the LEDs of the lighting element 205 in accordance with one or more pre-configured activation sequences.
- the LED activation sequence generator 220 shown in FIG. 2 and described above can be configured to generate the preconfigured activation sequences.
- a sequential activation operation is carried out upon the LEDs of the lighting element 205 in accordance with random activation sequences that can be generated by the LED activation sequence generator 220 shown in FIG. 2 and described above.
- the illustration includes a cutaway view of the lighting element 205 containing eight LEDs.
- the cutaway view includes eight LEDs placed in accordance with the random arrangement shown in FIG. 2.
- LED 1 can be at a first location on a 3D map 316, LED 2 can be at a second location on the 3D map 316, and so on for all eight LEDs.
- Each of the LEDs emits non-coherent light that radiates in multiple directions. Not shown are many other light beams that are emitted by each LED and produce a shadow of the transparent layer 310 and the specimen 318 placed thereon, upon the top surface of the image sensor 315.
- the illustrated set of light beams correspond to a focal point 319 on the transparent layer 310, which is negligibly close to the top surface of the image sensor 315.
- the dashed line oval 321 shows an expanded view of the light beams propagating through the specimen 318 at the focal point 319 and falling upon detection element (4) of the image sensor 315.
- the designation “p1” corresponds to a width of the detection element (4) and thus corresponds to one pixel. Unlike the equidistant sub-pixel spacing described above with reference to FIG. 1B, the sub-pixel spacing shown in dashed line oval 321 has a non-uniform characteristic and can encompass various sub-pixel widths as a result of the eight LEDs being arranged in a spatially non-uniform configuration.
- the distance between LED 1 and LED 2 in the 3D map 316 is different from a distance between LED 2 and LED 3 in the 3D map 316, and so on.
- a pixel (indicated by distance “p1”) includes eight sub-pixels having non-uniform widths.
- the image sensor 315 generates electrical signals in response to the incident light beams and provides the electrical signals to a computer (not shown) for generating eight sub-pixel shifted images.
- the eight sub-pixel shifted images can be operated upon by the computer (by executing a software algorithm, for example) to generate a high-resolution image of the specimen 318 in accordance with an embodiment.
- FIG. 4 illustrates an imaging system 400 that includes the lighting element 205 in accordance with an embodiment.
- the transparent layer 310 is located at a distance “d4” above the top surface of the image sensor 315, where d4 > d2 (d2 is shown in FIG. 1B and FIG. 3).
- the illustrated set of example light beams correspond to a focal point 419 on the transparent layer 310.
- the shadow created by the light beams shown in FIG. 4 is spread out over a larger area on the top surface of the image sensor 315.
- the illustrated light beam produced by LED 1 is incident upon detection element (8) of the image sensor 315 at an activation time “t1” of LED 1.
- the detection element (8) produces a first electrical signal in response.
- the first electrical signal can be provided to a computer which, in one embodiment, generates a first sub-pixel shifted image based on the first electrical signal and position information associated with LED 1.
- the computer can be pre-configured with position information of LED 1 and the other LEDs of the lighting element 205.
- the position of each LED is defined by Cartesian coordinates, such as, for example, a set of eight two-dimensional (2D) x-y Cartesian coordinates corresponding to the eight LEDs.
- the position information of the LEDs can be stored in a lookup table that is accessible to the computer.
- a computer generates the first sub-pixel shifted image based on the first electrical signal, position information associated with LED 1, and in at least some cases, based on additional information.
- the additional information can include, for example, a separation distance between the lighting element 205 and the transparent layer 310 (“d3” in the illustrated example), a separation distance between the transparent layer 310 and the image sensor 315 (“d4” in the illustrated example), a separation distance between the lighting element 205 and the image sensor 315, and/or an angle of incidence of light upon a respective detection element of the image sensor 315 (detection element (8), in this example).
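As a hedged illustration of how such position and distance information could be used, the sketch below assumes straight-ray (pinhole) projection, under which a feature on the transparent layer casts a shadow displaced from the feature by the LED's lateral offset scaled by d4/d3. The coordinates and distances are placeholders, and this is not asserted to be the patent's actual computation.

```python
def shadow_position(x_led, y_led, x_feat, y_feat, d3, d4):
    """Project the shadow of a specimen feature onto the sensor plane.

    Assumed geometry (straight rays, no refraction):
      - the LED sits at lateral position (x_led, y_led), a height d3 + d4
        above the sensor (d3 above the transparent layer),
      - the feature sits on the transparent layer, a height d4 above the sensor.
    By similar triangles, the shadow is displaced from the feature by
    (feature - LED) * d4 / d3.
    """
    scale = d4 / d3
    return (x_feat + (x_feat - x_led) * scale,
            y_feat + (y_feat - y_led) * scale)

# Illustrative placeholder values (millimeters); real positions would come
# from the lookup table of LED coordinates mentioned above.
led_positions = {1: (0.0, 0.0), 2: (1.3, 0.7)}
d3, d4 = 50.0, 0.5

x1, y1 = shadow_position(*led_positions[1], 0.0, 0.0, d3, d4)
x2, y2 = shadow_position(*led_positions[2], 0.0, 0.0, d3, d4)
shift = (x2 - x1, y2 - y1)   # small because d4 << d3, hence "sub-pixel"
```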
- the illustrated light beam produced by LED 2 is incident upon the detection element (7) of the image sensor 315 at an activation time “t2” of LED 2.
- the detection element (7) produces a second electrical signal in response.
- the second electrical signal is provided to the computer which generates a second sub-pixel shifted image based on the second electrical signal, position information associated with LED 2, and in at least some cases, based on additional information such as described above.
- the illustrated light beam produced by LED 3 is incident upon the detection element (6) of the image sensor 315 at an activation time “t3” of LED 3.
- the detection element (6) produces a third electrical signal in response.
- the third electrical signal is provided to the computer which generates a third sub-pixel shifted image based on the third electrical signal, position information associated with LED 3, and in at least some cases, based on additional information such as described above.
- the illustrated light beam produced by LED 4 is incident upon the detection element (6) of the image sensor 315 at an activation time “t4” of LED 4.
- the detection element (6) produces a fourth electrical signal in response.
- the fourth electrical signal is provided to the computer which generates a fourth sub-pixel shifted image based on the fourth electrical signal, position information associated with LED 4, and in at least some cases, based on additional information such as described above.
- the illustrated light beam produced by LED 5 is incident upon the detection element (5) of the image sensor 315 at an activation time “t5” of LED 5.
- the detection element (5) produces a fifth electrical signal in response.
- the fifth electrical signal is provided to the computer which generates a fifth sub-pixel shifted image based on the fifth electrical signal, position information associated with LED 5, and in at least some cases, based on additional information such as described above.
- the illustrated light beam produced by LED 6 is incident upon the detection element (3) of the image sensor 315 at an activation time “t6” of LED 6.
- the detection element (3) produces a sixth electrical signal in response.
- the sixth electrical signal is provided to the computer which generates a sixth sub-pixel shifted image based on the sixth electrical signal, position information associated with LED 6, and in at least some cases, based on additional information such as described above.
- the illustrated light beam produced by LED 7 is incident upon the detection element (2) of the image sensor 315 at an activation time “t7” of LED 7.
- the detection element (2) produces a seventh electrical signal in response.
- the seventh electrical signal is provided to the computer which generates a seventh sub-pixel shifted image based on the seventh electrical signal, position information associated with LED 7, and in at least some cases, based on additional information such as described above.
- the illustrated light beam produced by LED 8 is incident upon the detection element (1) of the image sensor 315 at an activation time “t8” of LED 8.
- the detection element (1) produces an eighth electrical signal in response.
- the eighth electrical signal is provided to the computer which generates an eighth sub-pixel shifted image based on the eighth electrical signal, position information associated with LED 8, and in at least some cases, based on additional information such as described above.
- the eight sub-pixel shifted images can be operated upon by the computer to generate a high-resolution image of the specimen 318 in accordance with an embodiment.
- the computer generates the high-resolution image of the specimen 318 based on executing a software application.
- the software application can include an algorithm that evaluates sub-pixel shifted images that are generated by use of the electrical signals provided by the eight LEDs at the times “t1” through “t8” (corresponding to one activation sequence) and at other times corresponding to additional activation sequences.
- One or more of the sub-pixel shifted images offer the best focus for viewing contaminants (if any) that may be present in the specimen 318.
- the high-resolution image of the specimen 318 is based on evaluating a number of sub-pixel shifted images generated over an extended period of time using multiple activation sequences. Evaluating the sub-pixel shifted images over the extended period of time can be carried out, for example, to evaluate a specimen culture that changes over time.
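The text leaves the reconstruction algorithm to the software. One common baseline for combining sub-pixel shifted frames is shift-and-add onto a finer grid, sketched below under the assumption that each frame's (dy, dx) shift is known from the LED positions and geometry and lies within one low-resolution pixel; the patent's actual algorithm may differ.

```python
import numpy as np

def shift_and_add(images, shifts_px, upsample=8):
    """Combine sub-pixel shifted low-resolution frames on a finer grid.

    images:     list of 2D float arrays (low-resolution frames)
    shifts_px:  per-frame (dy, dx) shifts in low-res pixel units, assumed
                known and within [0, 1)
    upsample:   super-resolution factor (8, matching 8 sub-pixels per pixel)
    """
    h, w = images[0].shape
    accum = np.zeros((h * upsample, w * upsample))
    count = np.zeros_like(accum)
    for img, (dy, dx) in zip(images, shifts_px):
        # Nearest sub-pixel bin for this frame's shift on the fine grid.
        oy = int(round(dy * upsample)) % upsample
        ox = int(round(dx * upsample)) % upsample
        ys = np.arange(h) * upsample + oy
        xs = np.arange(w) * upsample + ox
        accum[np.ix_(ys, xs)] += img
        count[np.ix_(ys, xs)] += 1
    # Fine-grid cells never hit by a sample stay zero; a real implementation
    # would interpolate them (or use a proper super-resolution solver).
    return accum / np.maximum(count, 1)
```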
- the sub-pixel shifted images obtained by use of the imaging system 400 incorporating the lighting element 205 typically do not contain artifacts such as can be present in sub-pixel shifted images obtained by use of the imaging system 155 described above.
- the description provided above with reference to FIG. 4 is equally applicable to any other imaging system that includes the lighting element 205.
- the transparent layer 310 can be at any location between the lighting element 205 and the image sensor 315 in such an imaging system. More particularly, the separation distance “d3” can have any value over a first range of separation distances, the separation distance “d4” can have any value over a second range of separation distances, and the separation distance between the lighting element 205 and the image sensor 315 can have any value over a third range of separation distances.
- FIG. 5 shows an evaluation system 500 that includes a specimen evaluation unit 505 communicatively coupled to a computer 515.
- the specimen evaluation unit 505 is a stand-alone unit that is communicatively coupled to the computer 515 either wirelessly or via a wired connection.
- An example wireless connection is a Bluetooth connection.
- An example wired connection is an Ethernet connection.
- the specimen evaluation unit 505 and the computer 515 are co-located in an integrated configuration inside an enclosure.
- the specimen evaluation unit 505 can include “n” sample modules 510 (n ≥ 1).
- Each of the sample modules 510 can include the components described above with reference to the imaging system 300 and the imaging system 400.
- each of the one or more sample modules 510 can be configured to evaluate the specimen 318 described above.
- the “n” sample modules 510 can be configured for evaluating “n” specimens over an extended period of time under different conditions. For example, a first specimen placed in sample module (1) 510-1 can be evaluated at a first ambient temperature, a second identical specimen placed in sample module (2) 510-2 can be evaluated at a second ambient temperature, and so on.
- evaluation of each of the specimens can involve detecting a contaminant such as, for example, detecting a contaminant in a drug specimen, detecting a contaminant in a fluid specimen, or detecting a pollutant in a liquid specimen.
- evaluation of a culture specimen can involve an automated enumeration of bacterial and fungal colonies.
- two or more sample modules 510 can be configured to evaluate more than one kind of specimen 318.
- Communication link 540 is selected to support transmission of electrical signals generated by one or more image sensors included in the “n” sample modules.
- the image sensor(s) can be identical to, or substantially similar to, the image sensor 315 described above.
- the computer 515 includes a processor 520 and a memory 525 and can further include components (not shown) such as associated with an input/output interface (keyboard, display, etc.) and associated with communications (wireless communications, connectivity to a communication network, etc.).
- the memory 525 includes high-resolution image generation software 530 and high-resolution image evaluation software 535.
- the processor 520 can execute the high-resolution image generation software 530 for generating one or more high-resolution images each based on a sequence of sub-pixel shifted images.
- the sub-pixel shifted images are generated based on electrical signals provided by sensing elements of an image sensor 315 and position information of the randomly placed LEDs in the lighting element 205 in the manner described above with reference to FIG. 3 and FIG. 4.
- the processor 520 obtains multiple images from a respective sample module 510.
- the images correspond to images captured at various focus depths of a specimen 318.
- the images are synthetically focused at different depths based on a depth parameter and a plurality of low-resolution images captured by the image sensor 315.
- the images correspond to a particular plane of interest of the specimen 318.
- the specimen 318 is often non-uniform. Accordingly, certain regions of a single synthetically focused image associated with a first depth parameter can be out of focus when a non-uniform specimen 318 is imaged.
- examples of the disclosure capture images that are synthetically focused at different focus depths and perform a focus stacking process where in-focus portions of multiple images are identified, extracted, and re-assembled into a single image that has a greater depth of field relative to any single image.
- the processor 520 performs a deconvolution process to generate a single super-resolution image from multiple images.
- the computer 515 is configured to generate a set of high-resolution images using the above-referenced focus-stacking process over an extended time period based on electrical signals provided by the specimen evaluation unit 505.
- the electrical signals can be provided by the specimen evaluation unit 505 to the computer 515 in accordance with one of a repetitive schedule, an intermittent schedule, or a random schedule.
- the processor 520 can execute the high-resolution image evaluation software 535 for evaluating one or more high-resolution images generated by the high-resolution image generation software 530.
- evaluating the high-resolution image(s) can involve detecting a contaminant, and/or evaluating growth characteristics of a culture.
- the high-resolution image evaluation software 535 is typically configured to detect and provide an indication of contaminants that may be invisible to the human eye or are not yet at a stage where they are visible to the human eye.
- the high-resolution image evaluation software 535 is configured to interact with a human who can visually inspect one or more images. The visual inspection may be carried out via a display screen of the computer 515.
- FIG. 6 illustrates an embodiment of a sample module 510 of the specimen evaluation unit 505 shown in FIG. 5.
- the sample module 510 includes the lighting element 205 and the image sensor 315 that have been described above.
- the sample module 510 further includes a nutrient cartridge 615 and a temperature control element 620.
- the nutrient cartridge 615 houses the specimen 318 on a transparent layer 310.
- the specimen 318 may be placed in contact with a nutrient such as agar that is filled into the nutrient cartridge 615 on top of the specimen 318.
- the nutrient cartridge 615 can be replaced by one or more other elements that provide support to the transparent layer 310.
- the lighting element 205, the nutrient cartridge 615, and the image sensor 315 of the sample module 510 function in the manner described above with reference to FIG. 3 and FIG. 4.
- a separation distance exists between the transparent layer 310 of the nutrient cartridge 615 and a top surface of the image sensor 315 upon which light is incident after propagating through the specimen 318 contained in the nutrient cartridge 615.
- the separation distance can vary in accordance with various factors such as, for example, the structure of the nutrient cartridge 615 and the structure of the sample module 510.
- the electrical signals generated by the image sensor 315 are provided to the computer 515 via an I/O 625 (an Ethernet card or Bluetooth circuit, for example).
- the computer 515 generates one or more high resolution images that can be evaluated for purposes such as detecting the presence of contaminants in the specimen 318 if any such contaminants are present.
- the nutrient cartridge 615 is disposable after use.
- An advantage offered by the disposable cartridge is savings in cost compared to an image sensor, such as, for example, the image sensor 115, which is typically discarded after a single use.
- the reason for discarding the image sensor after a single use is due to residual contamination of a top surface of the sensor upon which a specimen has been placed.
- the image sensor is typically more expensive than the disposable nutrient cartridge 615.
- the temperature control element 620 can be provided in the sample module 510 for incubation purposes of the specimen 318.
- in some conventional systems, heat generated by a CMOS image sensor was used to warm the specimen.
- Heating control was implemented in a relatively crude fashion by turning the CMOS image sensor on and off by use of a “bang-bang” control loop. Cooling was achieved by use of an additional component in the form of a forward-biased thermo-electric cooler (TEC).
- the temperature control element 620 is a TEC configured to operate in a dual-purpose role as both a heating element and a cooling element.
- the TEC is a part of an H-bridge circuit that is configured to place the TEC in a forward bias condition over a first period of time to operate as a heating element, and to place the TEC in a reverse bias condition over a second period of time to operate as a cooling element.
- the time periods, repetition rate, and other factors of the forward bias and reverse bias conditions can be controlled by a computer in order to achieve a desired ambient temperature in the sample module 510.
- a TEC intrinsically operates as a heat pump and thus achieves very high operating efficiency. The efficiency can be greater than 100% in some cases when used for heating.
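A minimal sketch of such dual-purpose TEC control follows, assuming a hypothetical `set_bridge` driver callback and a simple bang-bang loop with a deadband; the actual control law, thresholds, and polarity convention are not specified in the text.

```python
def tec_control_step(temp_c, setpoint_c, set_bridge, deadband_c=0.25):
    """One control step for a TEC driven through an H-bridge.

    set_bridge(polarity) is a hypothetical driver callback: "forward" biases
    the TEC to heat the sample chamber, "reverse" biases it to cool, and
    "off" disables drive. The deadband value is an assumption.
    """
    error = setpoint_c - temp_c
    if error > deadband_c:
        set_bridge("forward")   # too cold: forward bias, TEC heats
    elif error < -deadband_c:
        set_bridge("reverse")   # too warm: reverse bias, TEC cools
    else:
        set_bridge("off")       # inside the deadband: avoid chattering
```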
- FIG. 7 illustrates an example data flow of the process of generating a super-resolution image of a specimen 318 from multiple images captured using an image sensor 315, lighting element 205, and nutrient cartridge 615, according to various embodiments.
- images 701a, 701b, and 701c are generated based on a set of low- resolution images captured using image sensor 315.
- the physical position of a light emitting element within lighting element 205 that was activated when a low-resolution image was captured is recorded along with the low-resolution images.
- high-resolution image generation software 530 generates multiple high-resolution images, such as images 701a, 701b, and 701c.
- high-resolution image generation software 530 generates an in-focus super-resolution image 707 of the specimen 318 by extracting the in-focus regions 703a, 703b, and 703c of the respective images 701a, 701b, and 701c and reassembling the regions into a single super-resolution image 707 of the specimen 318.
- In-focus super-resolution image 707 represents a composite image generated based on the regions 703a, 703b, and 703c of the respective images 701a, 701b, and 701c.
- FIG. 8 is a flow diagram of an example method 800 to generate an image with an expanded depth of field according to various embodiments.
- the method 800 can be performed by processor 520 executing high-resolution image generation software 530 in some embodiments.
- method 800 begins at step 802, which involves sequentially activating light emitting elements, such as the light emitting elements in lighting element 205, to emit light toward a specimen 318 in a growth medium, such as a growth medium provided by a nutrient cartridge 615.
- the respective light emitting elements have a known physical position within lighting element 205.
- the method 800 includes capturing a plurality of low resolution images based on the sequential activation of the light emitting elements.
- the low-resolution images are captured using image sensor 315.
- Each low-resolution image is associated with a physical position of the light emitting element within lighting element 205 that was activated when a respective low-resolution image is captured.
- the low-resolution image along with a position indication for the respective light emitting element from lighting element 205 is recorded and used to later generate high-resolution images.
- method 800 includes synthesizing high-resolution images at multiple depth parameters.
- high-resolution image generation software 530 synthetically refocuses and super-resolves the low-resolution images to generate respective high-resolution images that are focused at a depth specified by a respective depth parameter.
- different depth parameters are used to account for the nonuniformity of specimen 318 in terms of the distance of the specimen from image sensor 315 across the cross-sectional area of the specimen 318.
- High-resolution image generation software 530 can select the depth parameters based on a pre-selected range of depth parameters.
- high-resolution image generation software 530 obtains high-resolution images using ten separate depth parameters. For example, a first slice can be obtained using a depth parameter of 13 microns, corresponding to a distance from the image sensor to the film associated with the cartridge. Subsequent depth parameters are chosen using a super-linear function with respect to depth. Therefore, in one example, a next slice is obtained at 14 microns, a next slice at 16 microns, a next slice at 19 microns, then 23 microns, and so on.
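The example depths above grow by increments of 1, 2, 3, and 4 microns, so one super-linear progression consistent with the text increases the step by one micron per slice. The sketch below reproduces those values; the exact function used is an assumption.

```python
def depth_schedule(start_um=13.0, num_slices=10):
    """Depth parameters growing super-linearly, matching the example values
    above (13, 14, 16, 19, 23, ... microns): the increment grows by one
    micron per slice. The exact function is inferred from those values."""
    depths = [start_um]
    step = 1.0
    while len(depths) < num_slices:
        depths.append(depths[-1] + step)
        step += 1.0
    return depths

# depth_schedule() -> [13, 14, 16, 19, 23, 28, 34, 41, 49, 58]
```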
- the method 800 includes generating an output image based on the high-resolution images that are focused at various depths based on the depth parameters. Generating the output image could involve performing a focus stacking process on multiple images of the specimen 318. For example, an image deconvolution process can be performed. Additionally, a contrast analysis of a depth map generated from the images can be performed to identify in-focus regions of the multiple images, from which an in-focus super-resolution image is generated.
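As a concrete, hedged illustration of contrast-based focus stacking, the sketch below scores local sharpness in each refocused slice with a smoothed squared-Laplacian measure, takes the per-pixel argmax as a depth map, and composites the output from the sharpest slice at each pixel. This is one standard approach consistent with the description, not necessarily the patent's exact procedure.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def focus_stack(slices, window=9):
    """Composite an extended depth-of-field image from refocused slices.

    slices: list of 2D float arrays, each synthetically focused at one depth.
    Sharpness is scored with a locally averaged squared-Laplacian response
    (one common contrast measure); the per-pixel argmax over slices serves
    as the depth map, and each output pixel comes from its sharpest slice.
    """
    stack = np.stack(slices)                                  # (D, H, W)
    sharpness = np.stack(
        [uniform_filter(laplace(s) ** 2, size=window) for s in slices]
    )
    depth_map = np.argmax(sharpness, axis=0)                  # (H, W)
    output = np.take_along_axis(stack, depth_map[None], axis=0)[0]
    return output, depth_map
```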
- a method described above for generating an in-focus super-resolution image of a sample includes activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium, capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is synthetically focused at a different depth based on a respective depth parameter, and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
- a computer-implemented method to generate an in-focus super-resolution image of a sample comprises activating a lighting element, wherein the lighting element emits light towards a specimen in a growth medium, capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is synthetically focused at a different depth, and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
- one or more non-transitory computer-readable media store instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium, capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is focused at a different depth, and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
- generating the output image comprises generating a composite image of the plurality of images based on the depth map.
- an imaging system comprises a light emitting element, an image sensor, and a processor executing an image generation application that causes the processor to perform the steps of activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium, capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is focused at a different depth, and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
- the imaging system of clauses 18 or 19, further comprises estimating a depth map based on a contrast analysis of the plurality of images, wherein the output image is based on the depth map.
- aspects of the present embodiments can be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure can take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that can all generally be referred to herein as a “module,” a “system,” or a “computer.” In addition, any hardware and/or software technique, process, function, component, engine, module, or system described in the present disclosure can be implemented as a circuit or set of circuits. Furthermore, aspects of the present disclosure can take the form of a computer program product embodied in one or more computer readable medium having computer readable program code embodied thereon.
- the computer readable medium can be a computer readable signal medium or a computer readable storage medium.
- a computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- a computer readable storage medium can be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- each block in the flowchart or block diagrams can represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function.
- the functions noted in the block can occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks can sometimes be executed in the reverse order, depending upon the functionality involved.
Abstract
One among various embodiments discloses a computer-implemented method for generating an in-focus super-resolution image of a sample. The method includes activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium, capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is focused at a different depth, and generating an output image by focus stacking the plurality of images.
Description
EXTENDED DEPTH OF FIELD FOR HIGH RESOLUTION IMAGES BASED ON SUB-PIXEL SHIFTED IMAGES
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefit of U.S. Provisional Application titled, “SYSTEM AND METHODS FOR IMPROVED BIOBURDEN AND/OR STERILITY TESTING,” filed on May 6, 2024, and having Serial No. 63/643,126. The present application claims the benefit of U.S. Provisional Application titled, “SYSTEM AND METHODS FOR IMPROVED TIMING FOR BIOBURDEN AND/OR STERILITY TESTING,” filed on May 6, 2024, and having Serial No. 63/643,140. The present application claims the benefit of U.S. Provisional Application titled, “SYSTEM AND METHODS FOR IMPROVED FILTRATION FOR BIOBURDEN AND/OR STERILITY TESTING,” filed on May 6, 2024, and having Serial No. 63/643,142. The present application claims the benefit of U.S. Provisional Application titled, “IMPROVED CARTRIDGES/CONSUMABLES FOR BIOBURDEN AND/OR STERILITY TESTING,” filed on May 6, 2024, and having Serial No. 63/643,132. The present application claims the benefit of U.S. Provisional Application titled, “SYSTEM AND METHODS FOR IMPROVED CONTACT AND IMAGING IN BIOBURDEN AND/OR STERILITY TESTING,” filed on May 6, 2024, and having Serial No. 63/643,135. The subject matter of these related applications is hereby incorporated herein by reference.
BACKGROUND
Field of the Various Embodiments
[0002] The various embodiments relate generally to generating high resolution images, and, more specifically, to extended depth of field for high resolution images based on sub-pixel shifted images.
Description of the Related Art
[0003] An image sensor is typically used to convert light into electrical signals that can be used for generating images. The quality of the images can be indicated by a resolution parameter in the form of pixel density. The ePetri technology, which, in some instances, refers to certain technology described in U.S. Patent Nos. 9,426,429, 9,643,184, 9,569,664, and 9,343,494, exhibits the problem of having a limited depth of field. As a result, images generated by such a system and the accompanying methods suffer from the drawback that specimens, or portions of specimens, that are imaged outside a plane of focus become blurred. Producing high-resolution images of the specimen is therefore a challenge because the film or filter on which a specimen is deposited is a non-uniform distance from an image sensor. Additionally, in some cases, the specimen itself is non-uniform such that portions of the specimen are a non-uniform distance from the image sensor. Therefore, computer-aided or manual analysis of images with a limited depth of field is impaired because the images that are captured can be partly out of focus.
[0004] As the foregoing illustrates, what is needed in the art are more effective ways to produce high resolution images in a petri dish environment to facilitate analysis of specimens.
SUMMARY
[0005] In an example embodiment, a method to generate sub-pixel shifted images includes generating light by sequentially activating each of a plurality of light emitting elements arranged in a spatially non-uniform configuration, propagating the generated light through a transparent layer supporting a specimen, and generating a plurality of output signals from an image sensor configured to detect the light propagated through the transparent layer supporting the specimen, the plurality of output signals indicative of a plurality of sub-pixel shifted images. The plurality of sub-pixel shifted images do not include artifacts that may be present in sub-pixel shifted images generated by use of at least some prior art imaging systems such as the prior art system described above with reference to FIGS. 1A-1C. The method can further include steps such as evaluating at least some of the plurality of sub-pixel shifted images to detect any anomaly if present in the specimen and/or using at least some of the plurality of sub-pixel shifted images for generating a high-resolution image of the specimen.
[0006] At least one technical advantage of the disclosed techniques herein relative to the prior art is that, with the disclosed techniques, the depth of field of images captured in an ePetri dish environment is improved. Another technical advantage of the disclosed techniques includes improved analysis of images of a sample that is captured in an ePetri dish environment.
[0007] These technical advantages provide one or more technological improvements over prior art approaches.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] So that the manner in which the above recited features of the various embodiments can be understood in detail, a more particular description of the inventive concepts, briefly summarized above, can be had by reference to various embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of the inventive concepts and are therefore not to be considered limiting of scope in any way, and that there are other equally effective embodiments.
[0009] FIG. 1A illustrates a prior art lighting element of an imaging system.
[0010] FIG. 1B illustrates an imaging system that includes the prior art lighting element shown in FIG. 1A.
[0011] FIG. 1C illustrates another imaging system that includes the prior art lighting element shown in FIG. 1A.
[0012] FIG. 2 illustrates a lighting element according to various embodiments.
[0013] FIG. 3 illustrates an example imaging system that includes the lighting element shown in FIG. 2.
[0014] FIG. 4 illustrates another example imaging system that includes the lighting element shown in FIG. 2.
[0015] FIG. 5 shows an example evaluation system that includes a specimen evaluation unit containing the lighting element shown in FIG. 2.
[0016] FIG. 6 illustrates an example sample module that can be a part of the specimen evaluation unit shown in FIG. 5.
[0017] FIG. 7 illustrates an example data flow of the process of generating a superresolution image of a specimen from multiple images captured using an image sensor, lighting element, and nutrient cartridge, according to various embodiments.
[0018] FIG. 8 is a flow diagram of method steps to generate sub-pixel shifted images according to an embodiment.
DETAILED DESCRIPTION
[0019] In the following description, numerous specific details are set forth to provide a more thorough understanding of the various embodiments. However, it will be apparent to one skilled in the art that the inventive concepts can be practiced without one or more of these specific details. For explanatory purposes, multiple instances of like objects are symbolized with reference numbers identifying the object and parenthetical numbers identifying the instance where needed. While the invention may be described within the context of a use case associated with a specific imaging system, the inventive concepts described herein are broader than that particular use case and can be applied in any appropriate context / use case. In certain instances, this application may refer, directly or indirectly, to ePetri technology which, in some instances, refers to certain technology described in U.S. Patent Nos. 9,426,429, 9,643,184, 9,569,664, and 9,343,494, the disclosures of all of which are incorporated herein by reference in their entireties.
[0020] FIG. 1A illustrates a lighting element 105 of an imaging system. The lighting element 105 includes a set of light-emitting components arranged in a spatially uniform configuration. The spatially uniform configuration corresponds to a square matrix array having eight rows and eight columns. In an example implementation, each of the light-emitting components is a light emitting diode (LED).
[0021] FIG. 1B illustrates an imaging system 150 that includes the lighting element 105. The imaging system 150 further includes an image sensor 115, such as a CMOS image sensor or a charge coupled detector (CCD). In the illustrated example, eight detection elements are shown arranged adjacent to each other in one row. Additional detection elements can be present in additional rows of the image sensor 115. Further included in the imaging system 150 is a transparent layer 110 for placement of a specimen 118. The transparent layer 110 is typically made of glass and may be referred to as a petri plate, petri dish, or a cell-culture dish. The transparent layer 110 can be either mounted directly upon a top surface of the image sensor 115 or upon an intermediate film (a plastic sheet, for example) that is placed upon the top surface of the image sensor 115. This arrangement effectively renders a separation distance “d2” between the transparent layer 110 and the top surface of the image sensor 115 as being close to zero. Furthermore, the distance “d2” is negligible in comparison to a separation distance “d1” between the transparent layer 110 and a light-emitting surface of the lighting element 105. In this example, the light-emitting surface of the lighting element 105 propagates light generated by the array of light-emitting components shown in FIG. 1A.
[0022] In one application, a raster scanning operation is applied to the array of light-emitting components of the lighting element 105, which can be an array of LEDs. The raster scanning operation employs a sequential activation procedure whereby each LED in a first row of LEDs is activated sequentially, followed by sequential activation of each LED in a second row of LEDs, and so on. Thus, in the illustration shown in FIG. 1B, LED 1 is activated over a preset time period, followed by activation of LED 2 for the preset time period, followed by activation of LED 3 for the preset time period, and so on. After activation of LED 8, the LEDs of an adjacent row (not shown) are similarly activated in a sequential manner, followed by sequential activation of LEDs of the next adjacent row, and so on. The activation is repeated cyclically from the first row after LED 8 of the last row is activated.
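The disclosure does not prescribe software for the raster scanning operation; the following Python sketch is illustrative only, with `activate` standing in for whatever hypothetical driver call energizes the LED at a given row and column.

```python
import itertools
import time

def raster_scan_order(rows: int = 8, cols: int = 8):
    """Yield (row, col) indices left to right within a row, row by row."""
    yield from itertools.product(range(rows), range(cols))

def run_raster_scan(activate, dwell_s: float = 0.05, rows: int = 8, cols: int = 8):
    """Cycle through the array once, holding each LED on for a preset period.

    `activate` is a hypothetical driver hook that turns on the LED at
    (row, col); the driver is assumed to turn the previous LED off.
    """
    for row, col in raster_scan_order(rows, cols):
        activate(row, col)
        time.sleep(dwell_s)

# Example: print the activation order instead of driving real hardware.
run_raster_scan(lambda r, c: print(f"LED ({r},{c}) on"), dwell_s=0.0)
```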
[0023] The illustration depicts a snapshot of a set of light beams corresponding to a row scanning sequence. The set of light beams is angularly emitted towards the image sensor 115. In the example illustration, the set of light beams is incident upon detection element (4) of the image sensor 115. As is known, an LED emits non-coherent light that radiates in multiple directions. Not shown are many other light beams that are emitted by each LED and produce a shadow of the transparent layer 110 and the specimen 118 placed thereon, upon the top surface of the image sensor 115.
[0024] The illustrated set of light beams corresponds to a focal point 119 on the transparent layer 110, which is negligibly close to the top surface of the image sensor 115. The dashed line oval 121 shows an expanded view of the light beams propagating through the specimen 118 at the focal point 119. The distance “p1” corresponds to a width of the detection element (4) and thus corresponds to one pixel. The distance “s1” corresponds to a separation distance between a beam emitted by the LED 1 and a beam emitted by adjacent LED 2 and is referred to as a sub-pixel distance. In this example, the lighting element 105 has eight LEDs in each row. Correspondingly, the distance “p1” is subdivided into eight sub-pixels (“s1” through “s8”). The width of each of the eight sub-pixel distances is identical as a result of the equidistant spacing between LEDs (LED 1 through LED 8).
[0025] As indicated above, each of the light beams is one among many beams produced by a corresponding LED, and a shadow created by the light beams is projected onto the top surface of the image sensor 115. The image sensor 115 generates output in the form of electrical signals in response to a first projected shadow corresponding to the light beams of the LED 1. The electrical signals are provided to a computer, which generates a first sub-pixel shifted image based on the electrical signals. Seven more sub-pixel shifted images that are sub-pixel shifted with reference to the first sub-pixel shifted image are generated corresponding to light beams produced by the other seven LEDs. The eight sub-pixel shifted images generated as a result of the light beams of the eight LEDs can be operated upon by the computer to generate a high-resolution image of the specimen 118 (by executing a software algorithm, for example).
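The disclosure leaves the reconstruction to “a software algorithm”; one common approach for combining sub-pixel shifted frames is shift-and-add pixel super-resolution. The sketch below is an illustration under the assumption that each frame's sub-pixel offset is known (for the uniform array of FIG. 1A, offsets of 0, 1/8, 2/8, ..., 7/8 of a pixel), not the disclosed algorithm itself.

```python
import numpy as np

def shift_and_add(images, shifts, factor=8):
    """Combine sub-pixel shifted low-resolution frames on a fine grid.

    images : list of (H, W) arrays, one frame per LED.
    shifts : list of (dy, dx) known sub-pixel offsets, in low-res pixels.
    factor : upsampling factor (eight sub-pixels per pixel here).
    """
    h, w = images[0].shape
    acc = np.zeros((h * factor, w * factor))
    weight = np.zeros_like(acc)
    for img, (dy, dx) in zip(images, shifts):
        # Deposit each low-res sample at its shifted spot on the fine grid.
        ys = (np.arange(h) * factor + round(dy * factor)) % (h * factor)
        xs = (np.arange(w) * factor + round(dx * factor)) % (w * factor)
        acc[np.ix_(ys, xs)] += img
        weight[np.ix_(ys, xs)] += 1.0
    # Average where samples landed; untouched fine-grid pixels stay zero.
    np.divide(acc, weight, out=acc, where=weight > 0)
    return acc

# Synthetic example: eight 4x4 frames shifted by k/8 of a pixel.
frames = [np.random.rand(4, 4) for k in range(8)]
hr = shift_and_add(frames, [(k / 8, k / 8) for k in range(8)])
```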
[0026] FIG. 1C illustrates an imaging system 155 that includes the lighting element 105. In this case, the transparent layer 110 is located at a distance “d4” above the top surface of the image sensor 115, where d4 > d2 (d2 is shown in FIG. 1B). The illustrated set of example light beams corresponds to a focal point 122 on the transparent layer 110. Unlike the shadow created by the light beams shown in FIG. 1B, the shadow created by the light beams shown in FIG. 1C is spread out over a larger area on the top surface of the image sensor 115. More particularly, the illustrated light beam produced by LED 1 is incident upon a spot between detection element (7) and detection element (8) of the image sensor 115. Consequently, the image sensor 115 fails to produce an electrical signal, or produces a weak electrical signal, when the light beam is incident upon the image sensor 115. If no electrical signal is produced, the computer fails to generate a portion of an image corresponding to the spot between detection element (7) and detection element (8) of the image sensor 115. If the electrical signal is weak, a portion of an image corresponding to the spot between detection element (7) and detection element (8) of the image sensor 115 can be an imaging artifact.
[0027] Due to the equidistant sub-pixel spacing (described above), each of the other light beams produced by the other LEDs is incident upon a spot between adjacent detection elements. Consequently, each of the sub-pixel shifted images generated at the time of incidence of the other light beams upon the image sensor 115 either lacks image information at these spots or includes imaging artifacts. As a result of this shortcoming, a high-resolution image produced by a computer based on operating upon these low-quality sub-pixel shifted images can contain image artifacts. In at least some cases, the high-resolution image appears as a pixelated image. It is therefore desirable to provide an improved imaging system that eliminates the low-quality sub-pixel shifted images.
[0028] FIG. 2 illustrates a lighting element 205 in accordance with various embodiments. The lighting element 205 includes a number of light-emitting components arranged in a spatially non-uniform configuration. In one embodiment, each of the light-emitting components is a light emitting diode (LED) and the non-uniform configuration is based on a random arrangement of the LEDs upon a substrate. In another embodiment, a repetition of LEDs is avoided along any radial axis of the lighting element 205. Such an arrangement can improve performance of a Fourier transform as will be understood by those skilled in the art. The random or pseudo-random arrangement of LEDs (indicated by dark circles) does not conform to a square matrix array grid notwithstanding the position of some LEDs randomly coinciding with grid lines or vertices of the grid. In one implementation, all the LEDs are identical. In another implementation, some or all of the LEDs differ from each other in various ways such as, for example, emission wavelength, emission color, shape, and/or size.
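For illustration only, a pseudo-random layout with the “no repetition along any radial axis” property can be produced by rejection sampling, as in this sketch; the disclosure does not give a placement algorithm, and all numeric parameters here are hypothetical.

```python
import math
import random

def place_leds(n=64, radius=1.0, min_sep=0.05, angle_tol=1e-3, seed=0):
    """Pseudo-randomly place n LEDs on a disc so that no two LEDs share a
    radial axis (within angle_tol radians) and none are closer than min_sep."""
    rng = random.Random(seed)
    positions, axes = [], []
    while len(positions) < n:
        r = radius * math.sqrt(rng.random())        # uniform over disc area
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x, y = r * math.cos(theta), r * math.sin(theta)
        axis = theta % math.pi                      # fold opposite directions
        if any(abs(axis - a) < angle_tol for a in axes):
            continue                                # same radial axis: reject
        if any(math.hypot(x - px, y - py) < min_sep for px, py in positions):
            continue                                # too crowded: reject
        positions.append((x, y))
        axes.append(axis)
    return positions

leds = place_leds(n=8)   # eight positions, as in the cutaway of FIG. 3
```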
[0029] In the illustrated embodiment, a lighting element controller 210 is coupled to the lighting element 205 for activating the LEDs in various ways. In the illustrated example, the lighting element controller 210 includes an LED activation sequence generator 220 coupled to an LED driver 215 in a configuration that activates the LEDs of the lighting element 205 in accordance with various activation sequences.
[0030] FIG. 3 illustrates an imaging system 300 that includes the lighting element 205 according to an embodiment. The imaging system 300 further includes an image sensor 315 containing a number of detection elements such as CMOS detectors or charge-coupled devices (CCDs). In one embodiment, the image sensor 315 is identical to the image sensor 115 described above. In the illustrated example, eight detection elements are shown arranged adjacent to each other in one row. Additional detection elements can be present in additional rows of the image sensor 315.
[0031] Further included in the imaging system 300 is a transparent layer 310 for placement of a specimen 318 or sample. In one embodiment, the transparent layer 310 is identical to the transparent layer 110 described above. The transparent layer 310 is typically made of glass and may be referred to as a petri plate, petri dish, or a cell-culture dish. The transparent layer 310 can be either mounted directly upon a top surface of the image sensor 315 or upon an intermediate film (a plastic sheet, for example) that is placed upon the top surface of the image sensor 315. This arrangement effectively renders a separation distance “d2” between the transparent layer 310 and the top surface of the image sensor 315 as being close to zero. Furthermore, the distance “d2” is negligible in comparison to a separation distance “d1” between the transparent layer 310 and a light-emitting surface of the lighting element 205. In some cases, due to the non-uniform nature of the specimen 318, capturing a single image or a series of images of the specimen 318 at a single focus distance results in an image that is out of focus in certain regions. Such a specimen 318 has a d2 distance that varies across different portions of the cross-sectional area of the specimen 318. Accordingly, examples of the disclosure employ a focus stacking process that uses multiple images to assemble a super-resolution image suitable for computer-aided analysis.
[0032] Returning to the configuration shown in FIG. 2, the lighting element 205 emits light generated by the array of light-emitting components. In one implementation, a sequential activation operation is carried out upon the LEDs of the lighting element 205 in accordance with one or more pre-configured activation sequences. The LED activation sequence generator 220 shown in FIG. 2 and described above can be configured to generate the pre-configured activation sequences. In another implementation, a sequential activation operation is carried out upon the LEDs of the lighting element 205 in accordance with random activation sequences that can likewise be generated by the LED activation sequence generator 220.
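The disclosure does not specify how the sequence generator is implemented; a minimal sketch supporting both pre-configured and random modes might look as follows, with the class and method names being hypothetical.

```python
import random

class LedActivationSequenceGenerator:
    """Sketch of a generator supplying either pre-configured or random
    activation orders over n LEDs (indices 0 .. n-1)."""

    def __init__(self, n_leds, preconfigured=None, seed=None):
        self.n_leds = n_leds
        self.preconfigured = preconfigured or []    # list of index lists
        self.rng = random.Random(seed)

    def next_sequence(self, mode="preconfigured", cycle=0):
        if mode == "preconfigured" and self.preconfigured:
            return self.preconfigured[cycle % len(self.preconfigured)]
        order = list(range(self.n_leds))            # random mode: a fresh
        self.rng.shuffle(order)                     # permutation, so every
        return order                                # LED fires exactly once

gen = LedActivationSequenceGenerator(8, preconfigured=[[0, 1, 2, 3, 4, 5, 6, 7]])
print(gen.next_sequence())            # pre-configured order
print(gen.next_sequence("random"))    # random permutation
```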
[0033] The illustration includes a cutaway view of the lighting element 205 containing eight LEDs. In this example, the cutaway view includes eight LEDs placed in accordance with the random arrangement shown in FIG. 2. LED 1 can be at a first location on a 3D map 316, LED 2 can be at a second location on the 3D map 316, and so on for all eight LEDs. Each of the LEDs emits non-coherent light that radiates in multiple directions. Not shown are many other light beams that are emitted by each LED and produce a shadow of the transparent layer 310 and the specimen 318 placed thereon, upon the top surface of the image sensor 315.
[0034] The illustrated set of light beams corresponds to a focal point 319 on the transparent layer 310, which is negligibly close to the top surface of the image sensor 315. The dashed line oval 321 shows an expanded view of the light beams propagating through the specimen 318 at the focal point 319 and falling upon detection element (4) of the image sensor 315. The designation “p1” corresponds to a width of the detection element (4) and thus corresponds to one pixel. Unlike the equidistant sub-pixel spacing described above with reference to FIG. 1B and shown in dashed line oval 121, the sub-pixel spacing shown in dashed line oval 321 has a non-uniform characteristic and can encompass various sub-pixel widths as a result of the eight LEDs being arranged in a spatially non-uniform configuration. For example, the distance between LED 1 and LED 2 in the 3D map 316 is different than a distance between LED 2 and LED 3 in the 3D map 316, and so on. In this example, a pixel (indicated by distance “p1”) includes eight sub-pixels having non-uniform widths.
[0035] The image sensor 315 generates electrical signals in response to the incident light beams and provides the electrical signals to a computer (not shown) for generating eight sub-pixel shifted images. The eight sub-pixel shifted images can be operated upon by the computer (by executing a software algorithm, for example) to generate a high-resolution image of the specimen 318 in accordance with an embodiment.
[0036] FIG. 4 illustrates an imaging system 400 that includes the lighting element 205 in accordance with an embodiment. In this case, the transparent layer 310 is located at a distance “d4” above the top surface of the image sensor 315, where d4 > d2 (d2 is shown in FIG. 1B and FIG. 3). The illustrated set of example light beams corresponds to a focal point 419 on the transparent layer 310. Unlike the shadow created by the light beams shown in FIG. 3 upon the detection element (4) of the image sensor 315, the shadow created by the light beams shown in FIG. 4 is spread out over a larger area on the top surface of the image sensor 315. More particularly, the illustrated light beam produced by LED 1 is incident upon the detection element (8) of the image sensor 315 at an activation time “t1” of LED 1. The detection element (8) produces a first electrical signal in response.
[0037] The first electrical signal can be provided to a computer which, in one embodiment, generates a first sub-pixel shifted image based on the first electrical signal and position information associated with LED 1. The computer can be pre-configured with position information of LED 1 and the other LEDs of the lighting element 205. In one implementation, the position of each LED is defined by Cartesian coordinates, such as, for example, a set of eight two-dimensional (2D) x-y Cartesian coordinates corresponding to the eight LEDs. The position information of the LEDs can be stored in a lookup table that is accessible to the computer.
[0038] In another embodiment, a computer generates the first sub-pixel shifted image based on the first electrical signal, position information associated with LED 1, and in at least some cases, based on additional information. The additional information can include, for example, a separation distance between the lighting element 205 and the transparent layer 310 (“d3” in the illustrated example), a separation distance between the transparent layer 310 and the image sensor 315 (“d4” in the illustrated example), a separation distance between the lighting element 205 and the image sensor 315, and/or an angle of incidence of light upon a respective detection element of the image sensor 315 (detection element (8), in this example).
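One plausible geometric model relating an LED's position and the distances “d3” and “d4” to the lateral displacement of a shadow is straight-line ray projection through the specimen plane. The sketch below, including the example LED coordinate lookup table and pixel pitch, is an assumption for illustration and not the disclosed algorithm.

```python
import numpy as np

def shadow_shift(led_xy, point_xy, d3, d4):
    """Lateral displacement of a specimen point's shadow on the sensor,
    relative to the point itself, for a point-like LED.

    By similar triangles, a ray from the LED through a specimen point lands
    on the sensor at point_xy + (point_xy - led_xy) * d4 / d3, so the
    displacement grows with d4 and shrinks as d3 increases.
    """
    led = np.asarray(led_xy, dtype=float)
    pt = np.asarray(point_xy, dtype=float)
    return (pt - led) * (d4 / d3)

# Hypothetical lookup table of LED positions in millimetres, keyed by LED
# number (real values would come from the lighting element design).
LED_POSITIONS = {1: (-3.2, 1.1), 2: (0.4, -2.7), 3: (2.9, 3.0)}

# Displacement of LED 2's shadow for a point at the lateral origin,
# converted to pixels for an assumed 1.1 micron detection element pitch.
shift_mm = shadow_shift(LED_POSITIONS[2], (0.0, 0.0), d3=50.0, d4=1.0)
print(shift_mm / 1.1e-3)
```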
[0039] Furthermore, the illustrated light beam produced by LED 2 is incident upon the detection element (7) of the image sensor 315 at an activation time “t2” of LED 2. The detection element (7) produces a second electrical signal in response. The second electrical signal is provided to the computer which generates a second sub-pixel shifted image based on the second electrical signal, position information associated with LED 2, and in at least some cases, based on additional information such as described above.
[0040] The illustrated light beam produced by LED 3 is incident upon the detection element (6) of the image sensor 315 at an activation time “t3” of LED 3. The detection element (6) produces a third electrical signal in response. The third electrical signal is provided to the computer which generates a third sub-pixel shifted image based on the third electrical signal, position information associated with LED 3, and in at least some cases, based on additional information such as described above.
[0041] The illustrated light beam produced by LED 4 is incident upon the detection element (6) of the image sensor 315 at an activation time “t4” of LED 4. The detection
element (6) produces a fourth electrical signal in response. The fourth electrical signal is provided to the computer which generates a fourth sub-pixel shifted image based on the fourth electrical signal, position information associated with LED 4, and in at least some cases, based on additional information such as described above.
[0042] The illustrated light beam produced by LED 5 is incident upon the detection element (5) of the image sensor 315 at an activation time “t5” of LED 5. The detection element (5) produces a fifth electrical signal in response. The fifth electrical signal is provided to the computer which generates a fifth sub-pixel shifted image based on the fifth electrical signal, position information associated with LED 5, and in at least some cases, based on additional information such as described above.
[0043] The illustrated light beam produced by LED 6 is incident upon the detection element (3) of the image sensor 315 at an activation time “t6” of LED 6. The detection element (3) produces a sixth electrical signal in response. The sixth electrical signal is provided to the computer which generates a sixth sub-pixel shifted image based on the sixth electrical signal, position information associated with LED 6, and in at least some cases, based on additional information such as described above.
[0044] The illustrated light beam produced by LED 7 is incident upon the detection element (2) of the image sensor 315 at an activation time “t7” of LED 7. The detection element (2) produces a seventh electrical signal in response. The seventh electrical signal is provided to the computer which generates a seventh sub-pixel shifted image based on the seventh electrical signal, position information associated with LED 7, and in at least some cases, based on additional information such as described above.
[0045] The illustrated light beam produced by LED 8 is incident upon the detection element (1) of the image sensor 315 at an activation time “t8” of LED 8. The detection element (1) produces an eighth electrical signal in response. The eighth electrical signal is provided to the computer which generates an eighth sub-pixel shifted image based on the eighth electrical signal, position information associated with LED 8, and in at least some cases, based on additional information such as described above.
[0046] The eight sub-pixel shifted images can be operated upon by the computer to generate a high-resolution image of the specimen 318 in accordance with an embodiment. In an example implementation, the computer generates the high-resolution image of the specimen 318 based on executing a software application. The software application can include an algorithm that evaluates sub-pixel shifted images that are generated by use of the electrical signals provided by the eight LEDs at the times “t1” through “t8” (corresponding to one activation sequence) and at other times corresponding to additional activation sequences. One or more of the sub-pixel shifted images offer the best focus for viewing contaminants (if any) that may be present in the specimen 318.
[0047] In one implementation, the high-resolution image of the specimen 318 is based on evaluating a number of sub-pixel shifted images generated over an extended period of time using multiple activation sequences. Evaluating the sub-pixel shifted images over the extended period of time can be carried out, for example, to evaluate a specimen culture that changes over time. The sub-pixel shifted images obtained by use of the imaging system 400 incorporating the lighting element 205 typically do not contain artifacts such as can be present in sub-pixel shifted images obtained by use of the imaging system 155 described above.
[0048] The description provided above with reference to FIG. 4 is equally applicable to any other imaging system that includes the lighting element 205. The transparent layer 310 can be at any location between the lighting element 205 and the image sensor 315 in such an imaging system. More particularly, the separation distance “d3” can have any value over a first range of separation distances, the separation distance “d4” can have any value over a second range of separation distances, and the separation distance between the lighting element 205 and the image sensor 315 can have any value over a third range of separation distances.
[0049] FIG. 5 shows an evaluation system 500 that includes a specimen evaluation unit 505 communicatively coupled to a computer 515. In one embodiment, the specimen evaluation unit 505 is a stand-alone unit that is communicatively coupled to the computer 515 either wirelessly or via a wired connection. An example wireless connection is a Bluetooth connection. An example wired connection is an Ethernet connection. In another embodiment, the specimen evaluation unit 505 and the computer 515 are co-located in an integrated configuration inside an enclosure.
[0050] The specimen evaluation unit 505 can include “n” sample modules 510 (n ≥ 1). Each of the sample modules 510 can include the components described above with reference to the imaging system 300 and the imaging system 400. In an embodiment, each of the one or more sample modules 510 can be configured to evaluate the specimen 318 described above. Thus, for example, the “n” sample modules 510 can be configured for evaluating “n” specimens over an extended period of time under different conditions. For example, a first specimen placed in sample module (1) 510-1 can be evaluated at a first ambient temperature, a second identical specimen placed in sample module (2) 510-2 can be evaluated at a second
ambient temperature, and so on. In one implementation, evaluation of each of the specimens can involve detecting a contaminant such as, for example, detecting a contaminant in a drug specimen, detecting a contaminant in a fluid specimen, or detecting a pollutant in a liquid specimen. In another implementation, evaluation of a culture specimen can involve automated enumeration of bacterial and fungal colonies. In another implementation, two or more sample modules 510 can be configured to evaluate more than one kind of specimen 318.
[0051] Communication link 540 is selected to support transmission of electrical signals generated by one or more image sensors included in the “n” sample modules. The image sensor(s) can be identical to, or substantially similar to, the image sensor 315 described above. The computer 515 includes a processor 520 and a memory 525 and can further include components (not shown) such as those associated with an input/output interface (keyboard, display, etc.) and with communications (wireless communications, connectivity to a communication network, etc.).
[0052] In this example embodiment, the memory 525 includes high-resolution image generation software 530 and high-resolution image evaluation software 535. The processor 520 can execute the high-resolution image generation software 530 for generating one or more high-resolution images, each based on a sequence of sub-pixel shifted images. The sub-pixel shifted images are generated based on electrical signals provided by sensing elements of an image sensor 315 and position information of the randomly placed LEDs in the lighting element 205 in the manner described above with reference to FIG. 3 and FIG. 4.
[0053] In an example embodiment, the processor 520 obtains multiple images from a respective sample module 510. The images correspond to images captured at various focus depths of a specimen 318. The images are synthetically focused at different depths based on a depth parameter and a plurality of low-resolution images captured by the image sensor 315. The images correspond to a particular plane of interest of the specimen 318. However, the specimen 318 is often non-uniform. Accordingly, certain regions of a single synthetically focused image associated with a first depth parameter can be out of focus when a non-uniform specimen 318 is imaged. Accordingly, examples of the disclosure capture images that are synthetically focused at different focus depths and perform a focus stacking process in which in-focus portions of multiple images are identified, extracted, and re-assembled into a single image that has a greater depth of field than any single image. In one example, the processor 520 performs a deconvolution process to generate a super-resolution single image from multiple images.
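The disclosure mentions a deconvolution process without naming an algorithm. As a minimal sketch, a Richardson-Lucy iteration is shown below, under the assumption of a known, normalized point-spread function (in practice the PSF would come from system calibration, which the disclosure does not detail).

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30):
    """Minimal Richardson-Lucy deconvolution.

    image : observed (blurred) image, non-negative.
    psf   : known point-spread function, normalized to sum to 1.
    """
    est = np.full(image.shape, image.mean(), dtype=float)
    psf_flipped = psf[::-1, ::-1]
    eps = 1e-12                       # guards against division by zero
    for _ in range(n_iter):
        blurred = fftconvolve(est, psf, mode="same")
        ratio = image / (blurred + eps)
        est *= fftconvolve(ratio, psf_flipped, mode="same")
    return est
```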
[0054] In an embodiment, the computer 515 is configured to generate a set of high-resolution images using the above-referenced focus-stacking process over an extended time period based on electrical signals provided by the specimen evaluation unit 505. In various implementations, the electrical signals can be provided by the specimen evaluation unit 505 to the computer 515 in accordance with one of a repetitive schedule, an intermittent schedule, or a random schedule.
[0055] The processor 520 can execute the high-resolution image evaluation software 535 for evaluating one or more high-resolution images generated by the high-resolution image generation software 530. In some cases, evaluating the high-resolution image(s) can involve detecting a contaminant and/or evaluating growth characteristics of a culture. The high-resolution image evaluation software 535 is typically configured to detect and provide an indication of contaminants that may be invisible to the human eye or are not yet at a stage where they are visible to the human eye. In some cases, the high-resolution image evaluation software 535 is configured to interact with a human who can visually inspect one or more images. The visual inspection may be carried out via a display screen of the computer 515.
[0056] FIG. 6 illustrates an embodiment of a sample module 510 of the specimen evaluation unit 505 shown in FIG. 5. In this example, the sample module 510 includes the lighting element 205 and the image sensor 315 that have been described above. The sample module 510 further includes a nutrient cartridge 615 and a temperature control element 620. The nutrient cartridge 615 houses the specimen 318 on a transparent layer 310. The specimen 318 may be placed in contact with a nutrient, such as agar, that is filled into the nutrient cartridge 615 on top of the specimen 318. In other embodiments, the nutrient cartridge 615 can be replaced by one or more other elements that provide support to the transparent layer 310. The lighting element 205, the nutrient cartridge 615, and the image sensor 315 of the sample module 510 function in the manner described above with reference to FIG. 3 and FIG. 4.
[0057] As shown, a separation distance exists between the transparent layer 310 of the nutrient cartridge 615 and a top surface of the image sensor 315 upon which light is incident after propagating through the specimen 318 contained in the nutrient cartridge 615. The separation distance can vary in accordance with various factors such as, for example, the structure of the nutrient cartridge 615 and the structure of the sample module 510. The electrical signals generated by the image sensor 315 are provided to the computer 515 via an I/O 625 (an Ethernet card or Bluetooth circuit, for example). The computer 515 generates one or more high-resolution images that can be evaluated for purposes such as detecting the presence of contaminants in the specimen 318, if any such contaminants are present.
[0058] In various embodiments, the nutrient cartridge 615 is disposable after use. An advantage offered by the disposable cartridge is a cost saving compared to an arrangement in which the image sensor itself, such as, for example, the image sensor 115, is typically discarded after a single use. The reason for discarding the image sensor after a single use is residual contamination of the top surface of the sensor upon which a specimen has been placed. The image sensor is typically more expensive than the disposable nutrient cartridge 615.
[0059] The temperature control element 620 can be provided in the sample module 510 for incubation of the specimen 318. In one prior scenario, heat generated by a CMOS image sensor was used to warm the specimen. Heating control was implemented in a relatively crude fashion by turning the CMOS image sensor on and off by use of a “bang-bang” control loop. Cooling was achieved by use of an additional component in the form of a forward-biased thermo-electric cooler (TEC).
[0060] An improvement provided in accordance with one or more embodiments involves the use of a single component to achieve both heating and cooling. In one such embodiment, the temperature control element 620 is a TEC configured to operate in a dual-purpose role as both a heating element and a cooling element. In an example implementation, the TEC is a part of an H-bridge circuit that is configured to place the TEC in a forward bias condition over a first period of time to operate as a heating element, and to place the TEC in a reverse bias condition over a second period of time to operate as a cooling element. The time periods, repetition rate, and other factors of the forward bias and reverse bias conditions can be controlled by a computer in order to achieve a desired ambient temperature in the sample module 510. In terms of heating operations, a TEC intrinsically operates as a heat pump and thus achieves very high operating efficiency; the coefficient of performance can exceed 1 (that is, greater than 100%) in some cases when used for heating.
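A sketch of such dual-purpose control follows. `set_bridge` and `read_temp_c` are hypothetical hardware hooks (the disclosure does not define a software interface), and the hysteresis band is an assumed parameter.

```python
import time

class TecController:
    """Sketch of dual-purpose TEC control through an H-bridge.

    set_bridge(+1) forward-biases the TEC (heating), set_bridge(-1)
    reverse-biases it (cooling), and set_bridge(0) opens the bridge.
    Both hooks are hypothetical stand-ins for real driver calls.
    """

    def __init__(self, set_bridge, read_temp_c, setpoint_c, hysteresis_c=0.25):
        self.set_bridge = set_bridge
        self.read_temp_c = read_temp_c
        self.setpoint_c = setpoint_c
        self.hysteresis_c = hysteresis_c

    def step(self):
        t = self.read_temp_c()
        if t < self.setpoint_c - self.hysteresis_c:
            self.set_bridge(+1)       # forward bias: heat
        elif t > self.setpoint_c + self.hysteresis_c:
            self.set_bridge(-1)       # reverse bias: cool
        else:
            self.set_bridge(0)        # inside the band: idle

    def run(self, period_s=1.0, n_steps=10):
        for _ in range(n_steps):
            self.step()
            time.sleep(period_s)
```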
[0061] FIG. 7 illustrates an example data flow of the process of generating a super-resolution image of a specimen 318 from multiple images captured using an image sensor 315, lighting element 205, and nutrient cartridge 615, according to various embodiments. In the depicted example, images 701a, 701b, and 701c are generated based on a set of low-resolution images captured using image sensor 315. The physical position of a light emitting element within lighting element 205 that was activated when a low-resolution image was captured is recorded along with the low-resolution images. Next, high-resolution image generation software 530 generates multiple high-resolution images, such as images 701a, 701b, and 701c. The images 701a, 701b, and 701c are synthetically focused by high-resolution image generation software 530 at different depths based on a depth parameter. Due to the non-uniformity of a specimen 318 in nutrient cartridge 615, the images 701a, 701b, and 701c have different regions 703a, 703b, and 703c that are in focus. Accordingly, high-resolution image generation software 530 generates or estimates a depth map 705 based on a contrast analysis of the images 701a, 701b, and 701c. Based on the depth map 705, high-resolution image generation software 530 identifies the in-focus regions 703a, 703b, and 703c of the respective images 701a, 701b, and 701c. Additionally, based on the depth map 705, high-resolution image generation software 530 generates an in-focus super-resolution image 707 of the specimen 318 by extracting the regions 703a, 703b, and 703c and reassembling the regions into a single super-resolution image 707 of the specimen 318. In-focus super-resolution image 707 represents a composite image generated based on the regions 703a, 703b, and 703c of the respective images 701a, 701b, and 701c.
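As an illustrative sketch of the depth-map-and-composite step (the window size and the variance-based contrast measure are assumptions; the disclosure specifies only “a contrast analysis”):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def focus_stack(images, window=9):
    """Depth-map-based focus stacking of same-size grayscale slices.

    Local contrast (windowed intensity variance) is computed per slice, the
    per-pixel argmax over slices becomes the depth map, and each output
    pixel is pulled from the slice that is sharpest at that location.
    """
    stack = np.stack([im.astype(float) for im in images])        # (D, H, W)
    mean = uniform_filter(stack, size=(1, window, window))
    sq_mean = uniform_filter(stack ** 2, size=(1, window, window))
    contrast = sq_mean - mean ** 2
    depth_map = np.argmax(contrast, axis=0)                      # (H, W)
    composite = np.take_along_axis(stack, depth_map[None], axis=0)[0]
    return composite, depth_map
```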
[0062] FIG. 8 is a flow diagram of an example method 800 to generate an image with an expanded depth of field according to various embodiments. The method 800 can be performed by processor 520 executing high-resolution image generation software 530 in some embodiments. Although the method steps are described with respect to the systems of FIGS. 2-7, persons skilled in the art will understand that any system configured to perform the method steps, in any order, is within the scope of the various embodiments.
[0063] As shown, method 800 begins at step 802, which involves sequentially activating light emitting elements, such as one or more light emitting elements in lighting element 205, toward a specimen 318 in a growth medium, such as a growth medium provided by a nutrient cartridge 615. The respective light emitting elements have a known physical position within lighting element 205.
[0064] At step 804, the method 800 includes capturing a plurality of low-resolution images based on the sequential activation of the light emitting elements. The low-resolution images are captured using image sensor 315. Each low-resolution image is associated with the physical position of the light emitting element within lighting element 205 that was activated when the respective low-resolution image was captured. The low-resolution image, along with a position indication for the respective light emitting element of lighting element 205, is recorded and used to later generate high-resolution images.
[0065] At step 806, method 800 includes synthesizing high-resolution images at multiple depth parameters. For multiple depth parameters, high-resolution image generation software 530 synthetically refocuses and super-resolves the low-resolution images to generate respective high-resolution images that are focused at a depth specified by a respective depth parameter. In one embodiment, different depth parameters are used to account for the non-uniformity of specimen 318 in terms of the distance of the specimen from image sensor 315 across the cross-sectional area of the specimen 318.
[0066] High-resolution image generation software 530 can select the depth parameters based on a pre-selected range of depth parameters. In one example, high-resolution image generation software 530 obtains high-resolution images using ten separate depth parameters. For example, a first slice can be obtained using a depth parameter of 13 microns, corresponding to a distance from the image sensor to the film associated with the cartridge. Subsequent depth parameters are chosen using a super-linear function with respect to depth. Therefore, in one example, a next slice is obtained at 14 microns, a next slice at 16 microns, a next slice at 19 microns, then 23 microns, and so on. A super-linear function, rather than a linear function, is used for selecting the plurality of depth parameters because resolution decreases with distance from the sensor.
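One quadratic (hence super-linear) schedule consistent with the example values is sketched below; the disclosure does not pin down the exact function, so this is a single possibility.

```python
def depth_parameters(start_um=13.0, n_slices=10):
    """Super-linear depth schedule whose step grows by one micron per slice,
    reproducing the example sequence 13, 14, 16, 19, 23, ... microns."""
    depths, step = [start_um], 1.0
    for _ in range(n_slices - 1):
        depths.append(depths[-1] + step)
        step += 1.0    # coarser spacing farther from the sensor, where the
                       # achievable resolution is lower
    return depths

print(depth_parameters())   # [13.0, 14.0, 16.0, 19.0, 23.0, 28.0, ...]
```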
[0067] At step 808, the method 800 includes generating an output image based on the high-resolution images that are focused at various depths based on the depth parameters. Generating the output image can involve performing a focus stacking process on multiple images of the specimen 318. For example, an image deconvolution process can be performed. Additionally, a depth map generated from a contrast analysis of the images can be used to identify in-focus regions of the multiple images, from which an in-focus super-resolution image is generated.
[0068] In sum, a method described above for generating an in-focus super-resolution image of a sample includes activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium; capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is synthetically focused at a different depth based on a respective depth parameter; and generating an output image by focus stacking the plurality of images.
[0069] At least one technical advantage of the techniques disclosed herein relative to the prior art is that, with the disclosed techniques, the depth of field of images captured in an ePetri dish environment is improved. Another technical advantage of the disclosed techniques is improved analysis of images of a sample captured in an ePetri dish environment.
[0070] These technical advantages provide one or more technological improvements over prior art approaches.
[0071] 1. In some embodiments, a computer-implemented method to generate an in-focus super-resolution image of a sample comprises activating a lighting element, wherein the lighting element emits light towards a specimen in a growth medium, capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is synthetically focused at a different depth, and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
[0072] 2. The computer-implemented method of clause 1, wherein the light emitting element comprises a spatially non-uniform array of a plurality of lighting elements.
[0073] 3. The computer-implemented method of clauses 1 or 2, wherein the plurality of light emitting elements are arranged with non-uniform spacing between light emitting elements.
[0074] 4. The computer-implemented method of any of clauses 1-3, further comprising estimating a depth map based on a contrast analysis of the plurality of images, wherein the output image is based on the depth map.
[0075] 5. The computer-implemented method of any of clauses 1-4, wherein generating the output image comprises generating a composite image of the plurality of images based on the depth map.
[0076] 6. The computer-implemented method of any of clauses 1-5, wherein a first image of the plurality of images is associated with a first region of the specimen, and a second image of the plurality of images is associated with a second region of the specimen.
[0077] 7. The computer-implemented method of any of clauses 1-6, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
[0078] 8. The computer-implemented method of any of clauses 1-7, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
[0079] 9. The computer-implemented method of any of clauses 1-8, wherein the plurality of images are generated based on a plurality of low-resolution images generated based on sequentially activating a plurality of light emitting elements associated with the lighting element.
[0080] 10. In some embodiments, one or more non-transitory computer-readable media store instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium, capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is focused at a different depth, and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
[0081] 11. The one or more non-transitory computer-readable media of clause 10, wherein the light emitting element comprises a spatially non-uniform array of a plurality of lighting elements.
[0082] 12. The one or more non-transitory computer-readable media of clauses 10 or 11, wherein the plurality of light emitting elements are arranged with non-uniform spacing between light emitting elements.
[0083] 13. The one or more non-transitory computer-readable media of any of clauses 10-12, further comprising estimating a depth map based on a contrast analysis of the plurality of images, wherein the output image is based on the depth map.
[0084] 14. The one or more non-transitory computer-readable media of any of clauses 10-13, wherein generating the output image comprises generating a composite image of the plurality of images based on the depth map.
[0085] 15. The one or more non-transitory computer-readable media of any of clauses 10-14, wherein a first image of the plurality of images is associated with a first region of the specimen, and a second image of the plurality of images is associated with a second region of the specimen.
[0086] 16. The one or more non-transitory computer-readable media of any of clauses 10-15, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
[0087] 17. The one or more non-transitory computer-readable media of any of clauses 10-16, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
[0088] 18. In some embodiments, an imaging system comprises a light emitting element, an image sensor, and a processor executing an image generation application that causes the processor to perform the steps of activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium, capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is focused at a different depth, and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
[0089] 19. The imaging system of clause 18, wherein the light emitting element comprises a spatially non-uniform array of a plurality of lighting elements.
[0090] 20. The imaging system of clauses 18 or 19, further comprising estimating a depth map based on a contrast analysis of the plurality of images, wherein the output image is based on the depth map.
[0091] 21. The imaging system of any of clauses 18-20, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
[0092] Any and all combinations of any of the claim elements recited in any of the claims and/or any elements described in this application, in any fashion, fall within the contemplated scope of the present invention and protection.
[0093] The descriptions of the various embodiments have been presented for purposes of illustration but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments.
[0094] Aspects of the present embodiments can be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure can take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that can all generally be referred to herein as a “module,” a “system,” or a “computer.” In addition, any hardware and/or software technique, process, function, component, engine, module, or system described in the present disclosure can be implemented as a circuit or set of circuits. Furthermore, aspects of the present disclosure can take the form of a computer program product embodied in one or more computer readable medium having computer readable program code embodied thereon.
[0095] Any combination of one or more computer readable medium can be utilized. The computer readable medium can be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable readonly memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium can be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0096] Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine. The instructions, when executed via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such processors can be, without limitation, general purpose processors, special-purpose processors, application-specific processors, or field-programmable gate arrays.
[0097] The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams can represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the block can occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks can sometimes be executed in the reverse order, depending upon the functionality involved. It
will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
[0098] While the preceding is directed to embodiments of the present disclosure, other and further embodiments of the disclosure can be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.
Claims
1. A computer-implemented method to generate an in-focus super-resolution image of a sample, comprising: activating a lighting element, wherein the lighting element emits light towards a specimen in a growth medium; capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is synthetically focused at a different depth; and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
2. The computer-implemented method of claim 1, wherein the light emitting element comprises a spatially non-uniform array of a plurality of lighting elements.
3. The computer-implemented method of claim 2, wherein the plurality of light emitting elements are arranged with non-uniform spacing between light emitting elements.
4. The computer-implemented method of claim 1, further comprising estimating a depth map based on a contrast analysis of the plurality of images, wherein the output image is based on the depth map.
5. The computer-implemented method of claim 4, wherein generating the output image comprises generating a composite image of the plurality of images based on the depth map.
6. The computer-implemented method of claim 1, wherein a first image of the plurality of images is associated with a first region of the specimen, and a second image of the plurality of images is associated with a second region of the specimen.
7. The computer-implemented method of claim 1, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
8. The computer-implemented method of claim 1, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
9. The computer-implemented method of claim 1, wherein the plurality of images are generated based on a plurality of low-resolution images generated based on sequentially activating a plurality of light emitting elements associated with the lighting element.
10. One or more non-transitory computer-readable media storing instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of: activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium; capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is focused at a different depth; and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
11. The one or more non-transitory computer-readable media of claim 10, wherein the light emitting element comprises a spatially non-uniform array of a plurality of lighting elements.
12. The one or more non-transitory computer-readable media of claim 11, wherein the plurality of light emitting elements are arranged with non-uniform spacing between light emitting elements.
13. The one or more non-transitory computer-readable media of claim 10, further comprising estimating a depth map based on a contrast analysis of the plurality of images, wherein the output image is based on the depth map.
14. The one or more non-transitory computer-readable media of claim 11, wherein generating the output image comprises generating a composite image of the plurality of images based on the depth map.
15. The one or more non-transitory computer-readable media of claim 10, wherein a first image of the plurality of images is associated with a first region of the specimen, and a second image of the plurality of images is associated with a second region of the specimen.
16. The one or more non-transitory computer-readable media of claim 10, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
17. The one or more non-transitory computer-readable media of claim 10, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
18. An imaging system comprising: a light emitting element; an image sensor; and a processor executing an image generation application that causes the processor to perform the steps of: activating a light emitting element, wherein the light emitting element emits light towards a specimen in a growth medium; capturing a plurality of images of the specimen using an image sensor positioned on an opposing side of the specimen relative to the light emitting element, wherein each of the plurality of images is focused at a different depth; and generating an output image based on the plurality of images based on focus stacking the plurality of images to generate the output image.
19. The imaging system of claim 18, wherein the light emitting element comprises a spatially non-uniform array of a plurality of lighting elements.
20. The imaging system of claim 18, wherein the specimen is positioned at a non-uniform depth relative to the image sensor.
Applications Claiming Priority (10)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463643126P | 2024-05-06 | 2024-05-06 | |
| US202463643140P | 2024-05-06 | 2024-05-06 | |
| US202463643132P | 2024-05-06 | 2024-05-06 | |
| US202463643142P | 2024-05-06 | 2024-05-06 | |
| US202463643135P | 2024-05-06 | 2024-05-06 | |
| US63/643,142 | 2024-05-06 | ||
| US63/643,132 | 2024-05-06 | ||
| US63/643,140 | 2024-05-06 | ||
| US63/643,126 | 2024-05-06 | ||
| US63/643,135 | 2024-05-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025235411A1 true WO2025235411A1 (en) | 2025-11-13 |
Family
ID=97675482
Family Applications (4)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/027815 Pending WO2025235411A1 (en) | 2024-05-06 | 2025-05-05 | Extended depth of field for high resolution images based on sub-pixel shifted images |
| PCT/US2025/027814 Pending WO2025235410A1 (en) | 2024-05-06 | 2025-05-05 | Techniques for maintaining contact or separation distance between a sealed nutrient cartridge and an image sensor |
| PCT/US2025/027812 Pending WO2025235409A1 (en) | 2024-05-06 | 2025-05-05 | Systems and methods of testing for product contamination |
| PCT/US2025/027816 Pending WO2025235412A1 (en) | 2024-05-06 | 2025-05-05 | High resolution image generation based on sub-pixel imaging |
Family Applications After (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/027814 Pending WO2025235410A1 (en) | 2024-05-06 | 2025-05-05 | Techniques for maintaining contact or separation distance between a sealed nutrient cartridge and an image sensor |
| PCT/US2025/027812 Pending WO2025235409A1 (en) | 2024-05-06 | 2025-05-05 | Systems and methods of testing for product contamination |
| PCT/US2025/027816 Pending WO2025235412A1 (en) | 2024-05-06 | 2025-05-05 | High resolution image generation based on sub-pixel imaging |
Country Status (1)
| Country | Link |
|---|---|
| WO (4) | WO2025235411A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080266655A1 (en) * | 2005-10-07 | 2008-10-30 | Levoy Marc S | Microscopy Arrangements and Approaches |
| US20090180111A1 (en) * | 2008-01-15 | 2009-07-16 | Xerox Corporation | Illuminator for specular measurements |
| WO2011056658A1 (en) * | 2009-10-27 | 2011-05-12 | Duke University | Multi-photon microscopy via air interface objective lens |
| US20170094243A1 (en) * | 2013-03-13 | 2017-03-30 | Pelican Imaging Corporation | Systems and Methods for Synthesizing Images from Image Data Captured by an Array Camera Using Restricted Depth of Field Depth Maps in which Depth Estimation Precision Varies |
| US20180088309A1 (en) * | 2012-10-30 | 2018-03-29 | California Institute Of Technology | Embedded pupil function recovery for fourier ptychographic imaging devices |
| US20210051315A1 (en) * | 2018-03-07 | 2021-02-18 | Everysight Ltd. | Optical display, image capturing device and methods with variable depth of field |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9115919B2 (en) * | 2009-01-28 | 2015-08-25 | Micro Q Technologies | Thermo-electric heat pump systems |
| US9643184B2 (en) * | 2010-10-26 | 2017-05-09 | California Institute Of Technology | e-Petri dishes, devices, and systems having a light detector for sampling a sequence of sub-pixel shifted projection images |
| US8866063B2 (en) * | 2011-03-31 | 2014-10-21 | The Regents Of The University Of California | Lens-free wide-field super-resolution imaging device |
| WO2016019324A2 (en) * | 2014-08-01 | 2016-02-04 | The Regents Of The University Of California | Device and method for iterative phase recovery based on pixel super-resolved on-chip holography |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025235412A1 (en) | 2025-11-13 |
| WO2025235409A1 (en) | 2025-11-13 |
| WO2025235410A1 (en) | 2025-11-13 |