
WO2025117061A1 - System and method for water immersion objective bubble detection - Google Patents


Info

Publication number
WO2025117061A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
liquid
immersion objective
signal
imaging system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/051384
Other languages
English (en)
Inventor
Matthew Arthur STILES
Benjamin Knight
Christopher Houle
James Donald PIETTE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Agilent Technologies Inc
Original Assignee
Agilent Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agilent Technologies Inc filed Critical Agilent Technologies Inc
Publication of WO2025117061A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/33 Immersion oils, or microscope systems or objectives for use with immersion fluids
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/002 Scanning microscopes
    • G02B21/0024 Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0032 Optical details of illumination, e.g. light-sources, pinholes, beam splitters, slits, fibers
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/0088 Inverse microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/16 Microscopes adapted for ultraviolet illumination; Fluorescence microscopes

Definitions

  • Embodiments of the present disclosure relate to imaging systems and methods thereof, and more particularly to automated imaging systems and methods thereof in which bubbles in a liquid between a liquid immersion objective and a sample of the automated imaging systems may be detected.
  • liquid (e.g., water) immersion objectives may utilize the liquid between the objective and the sample.
  • if air bubbles are present in the liquid between the objective and the sample, the images may be blurred and degraded.
  • when air bubbles are present in the liquid, the focal height of the sample to be imaged is larger than if there are no air bubbles in the liquid between the objective and the sample. Due to the increased focal height of the sample, there is an increased risk of the objective contacting the labware that holds the sample and damaging the objective.
  • in conventional operation, the operator manually observes the application of immersion liquid to the immersion objective. If the operator visually observes air bubbles in the immersion liquid between the objective and the sample, the operator may manually modify the operation by adding more liquid. This is cumbersome for the operator and slows the operation. Additionally, automated microscopes do not benefit from this manually guided liquid application to the objective.
  • Embodiments of the present disclosure may solve the above problems and other problems.
  • an automated imaging system may be provided and include a liquid immersion objective, a light source that may be configured to emit a first light to a body via the liquid immersion objective in a state in which an immersion liquid is on the liquid immersion objective between the liquid immersion objective and the body, wherein the body is configured to hold a sample, a first sensor that may be configured to receive a first signal based on the first light, after the first light arrives at the body via the liquid immersion objective, and a controller that may be configured to detect whether an air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the first signal received by the first sensor.
  • the automated imaging system may further include an air immersion objective, wherein the light source may be further configured to emit a second light to the body via the air immersion objective, the first sensor may be further configured to receive a second signal based on the second light, after the second light arrives at the body via the air immersion objective, and the controller may be further configured to detect whether the air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the first signal and the second signal received by the first sensor.
  • the controller may be further configured to detect whether the air bubble is present in the immersion liquid between the liquid immersion objective and the body by comparing a ratio of the second signal to the first signal to a predetermined threshold value.
  • the controller may be further configured to detect that the air bubble is not present in the immersion liquid between the liquid immersion objective and the body based on the ratio of the second signal to the first signal being greater than the predetermined threshold value, and analyze the sample based on detecting that the air bubble is not present in the immersion liquid between the liquid immersion objective and the body.
  • the controller may be further configured to detect that the air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the ratio of the second signal to the first signal being less than or equal to the predetermined threshold value.
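As a concrete illustration of the comparison described in the preceding paragraphs, the short sketch below assumes two already-recorded signal levels; the function and argument names are hypothetical and not taken from the disclosure:

```python
def bubble_detected(air_signal: float, liquid_signal: float, threshold: float) -> bool:
    """Infer whether an air bubble is present in the immersion liquid.

    air_signal    : second signal, recorded via the air immersion objective
    liquid_signal : first signal, recorded via the liquid immersion objective
    threshold     : predetermined threshold value for the ratio
    """
    ratio = air_signal / liquid_signal
    # Ratio greater than the threshold -> no bubble detected (imaging may proceed);
    # ratio less than or equal to the threshold -> a bubble is considered present.
    return ratio <= threshold
```

With the example reflectance values given later in the description (a ratio of roughly 11.5 with a clean liquid column versus roughly 1 when a bubble is present), any threshold placed between those two values would separate the cases.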
  • the first sensor may be configured to receive the first signal based on the portion of the first light that is reflected by the body, and receive the second signal based on the portion of the second light that is reflected by the body.
  • the first light may be first excitation light configured to cause the sample to emit a third light via fluorescence
  • the second light may be second excitation light configured to cause the sample to emit a fourth light via fluorescence
  • the first sensor may be configured to receive the first signal based on the third light emitted by the sample, and receive the second signal based on the fourth light emitted by the sample
  • the light source may be a narrow beam light source including a laser diode.
  • the first sensor may be an image sensor.
  • the sensor may be adjacent to the light source, and the automated imaging system may include a second sensor that is an image sensor which may be configured to image the sample.
  • the light source may be a wide beam light source including a lamp.
  • the automated imaging system may further include a spinning disk confocal microscope
  • the spinning disk confocal microscope may include the liquid immersion objective, the light source, the first sensor, a confocal spinning disk, at least one first optical body defining a first infinity corrected optics region on a first side of the confocal spinning disk, toward the body that is configured to hold the sample, and at least one second optical body defining a second infinity corrected optics region on a second side of the confocal spinning disk, away from the body that is configured to hold the sample.
  • the light source and the first sensor are in the first infinity corrected optics region of the spinning disk confocal microscope.
  • the light source and the first sensor are in the second infinity corrected optics region of the spinning disk confocal microscope.
  • the light source may be further configured to emit a second light to the body via the liquid immersion objective in a state where no immersion liquid is present on the liquid immersion objective
  • the first sensor may be further configured to receive a second signal based on the second light, after the second light arrives at the body via the liquid immersion objective
  • the controller may be further configured to detect whether the air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the first signal and the second signal received by the first sensor.
  • a method performed by an automated imaging system may include emitting, by a light source of the automated imaging system, a first light to a body of the automated imaging system via a liquid immersion objective of the automated imaging system in a state in which an immersion liquid is on the liquid immersion objective, between the liquid immersion objective and the body, wherein the body may be configured to hold a sample, receiving, by a first sensor of the automated imaging system, a first signal based on the first light, after the first light arrives at the body via the liquid immersion objective, and detecting whether an air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the first signal received by the first sensor.
  • a method performed by an automated imaging system may further include emitting, by the light source, a second light to the body via an air immersion objective of the automated imaging system, receiving, by the first sensor, a second signal based on the second light, after the second light arrives at the body via the air immersion objective, the detecting may include detecting whether the air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the first signal and the second signal received by the first sensor.
  • the detecting may include detecting whether the air bubble is present in the immersion liquid between the liquid immersion objective and the body by comparing a ratio of the second signal to the first signal to a predetermined threshold value.
  • the detecting may include detecting that the air bubble is not present in the immersion liquid between the liquid immersion objective and the body based on the ratio of the second signal to the first signal being greater than the predetermined threshold value, and the method may further include analyzing the sample based on the detecting that the air bubble is not present in the immersion liquid between the liquid immersion objective and the body.
  • the detecting may include detecting that the air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the ratio of the second signal to the first signal being less than or equal to the predetermined threshold value.
  • the method may further include reflecting, by the body, at least a portion of the first light and at least a portion of the second light that are emitted from the light source, the receiving the first signal may include receiving the first signal based on the portion of the first light that is reflected by the body, and the receiving the second signal may include receiving the second signal based on the portion of the second light that is reflected by the body.
  • the emitting the first light causes the sample to emit a third light via fluorescence
  • the emitting the second light causes the sample to emit a fourth light via fluorescence
  • the receiving the first signal includes receiving the first signal based on the third light emitted by the sample
  • the receiving the second signal includes receiving the second signal based on the fourth light emitted by the sample.
  • the method may further include emitting, by the light source, a second light to the body via the liquid immersion objective in a state where no immersion liquid is present on the liquid immersion objective, receiving, by the first sensor, a second signal based on the second light, after the second light arrives at the body via the liquid immersion objective, the detecting may include detecting whether the air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the first signal and the second signal received by the first sensor.
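For the single-objective variant just described (a dry measurement through the liquid immersion objective before the liquid is dispensed, followed by a wet measurement after dispensing), a hypothetical sketch might look like the following; the microscope interface and its method names are assumptions made for the example:

```python
def detect_bubbles_single_objective(microscope, threshold: float) -> bool:
    """Dry/wet comparison through the same liquid immersion objective."""
    microscope.select_objective("liquid")

    # Dry state: no immersion liquid on the objective yet.
    dry_signal = microscope.measure_reflection()

    # Wet state: dispense immersion liquid, then measure again.
    microscope.dispense_immersion_liquid()
    wet_signal = microscope.measure_reflection()

    # By analogy with the air/liquid ratio used elsewhere in the description,
    # a wet signal close to the dry signal suggests bubbles remain in the path.
    return (dry_signal / wet_signal) <= threshold
```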
  • a non-transitory computer-readable medium storing computer instructions may be provided.
  • the computer instructions when executed by at least one processor, may cause the at least one processor to: emit, by a light source of an automated imaging system, a first light to a body of the automated imaging system via a liquid immersion objective of the automated imaging system in a state in which an immersion liquid is on the liquid immersion objective, between the liquid immersion objective and the body, wherein the body is configured to hold a sample; receive, by a first sensor of the automated imaging system, a first signal based on the first light, after the first light arrives at the body via the liquid immersion objective; and detect whether an air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the first signal received by the first sensor.
  • detection of air bubbles in a liquid between the objective and the sample may be efficiently achieved through automation.
  • imaging of samples may be done without human intervention to determine whether there are air bubbles in a liquid between the objective and the sample, thus saving countless hours of error-prone manual labor.
  • FIG. 1 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure
  • FIG. 3 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure
  • FIG. 4 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure
  • FIG. 5 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure.
  • FIG. 7A is a diagram illustrating detection of an air bubble within an immersion liquid, according to an embodiment of the present disclosure.
  • FIG. 7B is a diagram illustrating detection of no air bubble(s) within an immersion liquid, according to an embodiment of the present disclosure
  • FIG. 8A is a flow chart illustrating a method of operation of an automated imaging system according to embodiments of the present disclosure
  • FIG. 8B is a flow chart illustrating a method of operation of an automated imaging system according to embodiments of the present disclosure
  • FIG. 9 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure.
  • FIG. 10 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure.
  • FIG. 11 is a diagram illustrating an automated imaging system according to an embodiment of the present disclosure.
  • FIG. 12 is a block diagram illustrating an automated imaging system according to an embodiment of the present disclosure
  • FIG. 13 is a diagram illustrating a portion of an automated imaging system that implements a microplate according to an embodiment of the present disclosure.
  • FIG. 14 is a diagram illustrating a portion of an automated imaging system that implements a slide with a coverslip according to an embodiment of the present disclosure.
  • the term "lens" here and throughout the description may refer to a single lens or a group of lenses depending on the embodiment and function, as appreciated by a person of ordinary skill in the art.
  • an automated imaging system may include a microscope that utilizes a liquid (e.g., water) immersion objective, and the automated imaging system may be configured to perform processes in which air bubbles within an immersion liquid (e.g., water) of the liquid immersion objective may be detected.
  • the processes may include a plurality of steps, including but not limited to: (1) emitting first light from a light source to a body holding a sample via an air immersion objective; (2) recording a signal level (referred to as an air signal level) based on the intensity of a reflection of the first light from the body holding the sample; (3) switching the air immersion objective with a liquid immersion objective; (4) dispensing immersion liquid on the liquid immersion objective; (5) emitting second light from a light source to the body holding the sample via the liquid immersion objective; (6) recording a signal level (referred to as a liquid (e.g., water) signal level) based on an intensity of a reflection of the second light from the body holding the sample; and (7) comparing the ratio of the air signal level to the liquid signal level to a predetermined threshold value to determine whether the immersion liquid between the liquid immersion objective and the body contains air bubbles.
  • the level of the reflected signals may be directly related to the change of the refractive index in the immersion liquid and air (e.g., air signal/liquid signal). For example, when a bubble(s) is present, there may be a higher delta in refractive indexes of the two measurements (e.g., air signal/liquid signal) than in a case where no bubbles are present.
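Taken together, steps (1) through (7) could be orchestrated roughly as in the sketch below. The hardware interface (the `microscope` object and its `select_objective`, `dispense_immersion_liquid`, and `measure_reflection` methods) is invented purely for illustration and is not part of the disclosure:

```python
def detect_bubbles(microscope, threshold: float) -> bool:
    """Return True if air bubbles are likely present in the immersion liquid."""
    # (1)-(2): emit light via the air immersion objective and record the air signal.
    microscope.select_objective("air")
    air_signal = microscope.measure_reflection()

    # (3)-(4): switch to the liquid immersion objective and dispense immersion liquid.
    microscope.select_objective("liquid")
    microscope.dispense_immersion_liquid()

    # (5)-(6): emit light via the liquid immersion objective and record the liquid signal.
    liquid_signal = microscope.measure_reflection()

    # (7): compare the air/liquid ratio to the predetermined threshold.
    # A large ratio (strong air reflection, weak liquid reflection) indicates a
    # clean liquid column; a ratio near 1 indicates bubbles in the light path.
    return (air_signal / liquid_signal) <= threshold
```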
  • Embodiments of the present disclosure may implement the structure and functions of the systems of U.S. Pat. No. 9,772,540B2, to Norris et al., U.S. Pat. No. 10,072,982B2, to Zimenkov et al., and International Patent Application Publication No. WO2022120047A1, to Piette et al., the entire contents of which are herein incorporated by reference.
  • FIG. 1 is a diagram illustrating an automated imaging system according to a first embodiment of the present disclosure.
  • the automated imaging system 3000 may include a first cube 3532, a mirror 3503, one or more immersion objectives 3522, a body 3523, a first lens 3524, a second lens 3525, a third lens 3526, a confocal spinning disk 3514, a second cube 3511, and a first sensor 3515.
  • the automated imaging system 3000 may be configured as a spinning disk confocal microscope.
  • the first cube 3532 may be a structure (e.g., a body) that includes, for example, a light source 3510 and a dichroic mirror 3531.
  • the light source 3510 may be configured to emit a light 3501 towards the dichroic mirror 3531.
  • the light source 3510, and its light 3501, may be utilized in bubble detection as described in detail further below.
  • the dichroic mirror 3531 may reflect the light 3501 towards the mirror 3503, and the mirror 3503 may reflect the light 3501 to the immersion objective 3522, which is directed towards the body 3523. That is, the light source 3510 may be configured to emit light 3501 to the body 3523 via the immersion objective 3522.
  • the first cube 3532 may be used, for example, in a confocal mode relative to the confocal spinning disk 3514.
  • the light source 3510 may be, for example, a narrow beam light source such as, but not limited to, a laser diode (e.g., a solid state laser or semiconductor-based laser) or one or more light emitting diodes (LEDs) capable of producing a narrow beam of light.
  • a narrow beam light source of embodiments of the present disclosure, such as the light source 3510, may include a laser diode with a wavelength of 635 nm for optimizing camera efficiency and laser line availability. Increasing the wavelength of the laser or laser diode may reduce photo-toxicity but may decrease the efficiency of a camera. In order to combat a decrease in camera efficiency, the laser power can be increased.
  • An additional benefit of using a longer wavelength is the possibility of in situ bubble detection by using a dichroic to “steer” the laser signal while leaving the microscope’s imaging light path unaffected.
  • the dichroic mirror 3531 may be partially reflective and partially transmissive. For example, some light may reflect off a surface of the dichroic mirror 3531 and other light may pass through the surface of the dichroic mirror 3531.
  • the one or more objectives 3522 may be configured to be located under the body 3523 that holds the sample, and to view the sample from underneath.
  • the one or more objectives 3522 may include, for example, one or more liquid immersion objectives and one or more air immersion objectives.
  • the liquid immersion objective may be a specially designed objective lens used to increase the resolution of a microscope.
  • to use the liquid immersion objective, a droplet of an immersion liquid (e.g., water or other liquids, such as oil or glycerol, that may increase the refractive index) may be placed on the objective 3522. The objective 3522 may then be brought to the sample, where the droplet is sandwiched between the sample and the objective 3522. In this way, the light passing between the sample and the objective 3522 does not go through air.
  • the higher refractive index of the liquid over air results in an increased numerical aperture for the objective 3522, and therefore provides greater light gathering.
  • the increased numerical aperture increases resolution and increases the signal level when imaging a sample.
  • reflection is reduced with a higher refractive index fluid (compared to air) because the more closely the refractive indices are matched when light moves from one material to another, the less light is reflected.
  • the objective 3522, which is the liquid immersion objective, may be brought to the sample, and then the drop of fluid may be (manually or automatically) put on the objective 3522.
  • the air immersion objective (also known as a dry objective) is configured to operate without an immersion liquid thereon. That is, when using an objective 3522 that is an air immersion objective, air (instead of an immersion liquid) may be sandwiched between the sample and the objective 3522.
  • the one or more objectives 3522 may be moved by the movement mechanism 3530.
  • the movement mechanism 3530 may include, for example, at least one actuator.
  • the movement mechanism 3530 may be configured to move the one or more objectives 3522 in at least one of vertical directions, horizontal directions, and/or rotational directions to interchange the objectives 3522 to be in a viewing position below the body 3523 for, for example, imaging a sample held by the body 3523.
  • the movement mechanism 3530 may be configured to move one or more of the objectives 3522 that is in the viewing position in vertical directions to change a sharpness of an image obtained using the one or more objectives 3522.
  • the movement mechanism 3530 may include an objective turret that holds the objectives 3522 and may be configured to rotate, due to control of the actuators of the movement mechanism 3530 such that the objectives 3522 may be selectively provided in the viewing position.
  • the body 3523 may be configured to hold the sample so that the sample is maintained in a fixed position within the automated imaging system 3000 for imaging using the objective 3522.
  • the body 3523 may be, for example, a microplate 310
  • the microplate 310 may hold the sample S within a well.
  • the microplate 310 may include a bottom plate 312 and side walls 314 that define one or more of the wells 316.
  • the bottom plate 312 and/or the side walls 314 may be made of a transparent material.
  • the bottom plate 312 and/or the side walls 314 may be made of glass or a plastic.
  • the objective 3522 may be underneath the microplate 310.
  • when the objective 3522 is a liquid immersion objective, an immersion liquid L (e.g., water) may be provided between the objective 3522 and the microplate 310.
  • the immersion liquid L may be in direct contact with a lens of the objective 3522 and a bottom surface of the bottom plate 312 of the microplate 310.
  • when the objective 3522 is an air immersion objective, instead of an immersion liquid, air may be provided between the objective 3522 and the microplate 310.
  • the air may be in direct contact with a lens of the objective 3522 and a bottom surface of the bottom plate 312 of the microplate 310.
  • input light 301 of the automated imaging system 3000 may reach the sample S in the well 316 by passing upwards through the objective 3522, the immersion liquid L or air, and then the bottom plate 312 of the microplate 310.
  • output light 302 may be provided from the sample S and may pass downwards through the bottom plate 312, the immersion liquid L or air, and then the objective 3522 so as to be received by a sensor (e.g., the first sensor 3515) of an automated imaging system (e.g., automated imaging system 3000).
  • the output light 302 may be a portion of the input light 301 that is reflected from the sample S, or may be light that is fluorescently emitted by the sample S in response to the sample S receiving the input light 301.
  • the input light 301 may be light output by any light source of automated imaging systems (e.g., the automated imaging system 3000) of embodiments of the present disclosure.
  • the input light may be the light 3501 that is emitted by the light source 3510 and/or light that is emitted by a light source 3512 of the second cube 3511, but embodiments of the present disclosure are not limited thereto.
  • when performing bubble detection (explained further below), the sample S may be omitted and at least a portion of the input light 301 (e.g., light such as the light 3501 in FIG. 1) may be reflected by the bottom plate 312 of the microplate 310. That is, the output light 302 may be the portion of the input light 301 that is reflected by the bottom plate 312. For example, the output light 302 may be the reflection light 3502 shown in FIG. 1. The output light 302 may then be used by automated imaging systems of embodiments of the present disclosure to determine whether an air bubble is present within the immersion liquid L.
  • the bubble detection may be performed while the sample S is present.
  • the input light 301 may be reflected by the sample S, or light may be fluorescently emitted by the sample S based on the input light 301, as the output light 302, and the output light 302 may be used by automated imaging systems of embodiments of the present disclosure to determine whether an air bubble is present within the immersion liquid L.
  • the sample S may be placed on a slide 330 (e.g., a microscope slide), and the cover slip 320 may be on the slide 330, such that the slide 330 and the cover slip 320 hold the sample S therebetween.
  • the slide 330 and/or the cover slip 320 may be made of a transparent material.
  • the slide 330 and/or the cover slip 320 may be made of glass or a plastic.
  • the output light 302 may be provided from the sample S and may pass downwards through the slide 330, the immersion liquid L or air, and then the objective 3522 so as to be received by a sensor (e.g., the first sensor 3515) of an automated imaging system (e.g., automated imaging system 3000).
  • the output light 302 may be a portion of the input light 301 that is reflected from the sample S, or may be light that is fluorescently emitted by the sample S in response to the sample S receiving the input light 301.
  • the input light 301 may be light output by any light source of automated imaging systems (e.g., automated imaging system 3000) of embodiments of the present disclosure.
  • the input light may be the light 3501 that is emitted by the light source 3510 and/or light that is emitted by a light source 3512 of the second cube 3511.
  • when performing bubble detection (explained further below), the sample S may be omitted such that the cover slip 320 is in direct contact with the slide 330. In that case, at least a portion of the input light 301 (e.g., light such as the light 3501 in FIG. 1) may be reflected by the cover slip 320; that is, the output light 302 may be the portion of the input light 301 that is reflected by the cover slip 320.
  • the output light 302 may be the reflection light 3502 shown in FIG. 1.
  • the output light 302 may then be used by automated imaging systems of embodiments of the present disclosure to determine whether an air bubble is present within the immersion liquid L.
  • the bubble detection may be performed while the sample S is present.
  • the input light 301 may be reflected by the sample S, or light may be fluorescently emitted by the sample S based on the input light 301, as the output light 302, and the output light 302 may be used by automated imaging systems of embodiments of the present disclosure to determine whether an air bubble is present within the immersion liquid L.
  • the sensed signal level of the output light 302 may depend on whether the input light 301 is reflected by the body 3523 (or the sample S) or is emitted by the sample S via fluorescence. For example, when sensing reflected light (e.g., light from a narrow beam), the sensed liquid signal when no bubble is present in the immersion liquid may be lower than when a bubble(s) is present in the immersion liquid. In contrast, when sensing light emitted by the sample via fluorescence, the sensed liquid signal when no bubble is present in the immersion liquid may be higher than when a bubble(s) is present in the immersion liquid.
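Note that the comparison direction depends on which signal is being sensed. The following sketch, with assumed names and an assumed no-bubble reference level, only illustrates that inversion between the reflection and fluorescence cases:

```python
def bubble_from_wet_signal(wet_signal: float, no_bubble_reference: float, mode: str) -> bool:
    """Infer a bubble from the wet-state signal, given a no-bubble reference level."""
    if mode == "reflection":
        # A bubble increases reflection at the interfaces, raising the liquid signal.
        return wet_signal > no_bubble_reference
    if mode == "fluorescence":
        # A bubble degrades excitation/collection, lowering the emitted-light signal.
        return wet_signal < no_bubble_reference
    raise ValueError(f"unknown sensing mode: {mode}")
```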
  • the reflection light 3502 may pass downwards through the immersion objective 3522 so as to be received by the first sensor 3515.
  • the reflection light 3502 may be received by the first sensor 3515 by being reflected by the mirror 3503 and passing through the dichroic mirror 3531, the first lens 3524, the confocal spinning disk 3514, the second lens 3525, a dichroic mirror 3516 of the second cube 3511, and then the third lens 3526.
  • the first lens 3524, the second lens 3525, and/or the third lens 3526 may be a tube lens.
  • the first lens 3524 and the back of the immersion objective 3522 may define a first infinity corrected optics region 3504 therebetween, and the second lens 3525 and the third lens 3526 may define a second infinity corrected optics region 3505 therebetween. That is, the first infinity corrected optics region 3504 may be on a side of the confocal spinning disk 3514 towards the body 3523 that holds the sample, and the second infinity corrected optics region 3505 may be on a second side of the confocal spinning disk 3514 towards the first sensor 3515.
  • the infinity corrected optics regions may be regions in which light rays travel in parallel (i.e., the light rays would meet only at an infinite distance).
  • the confocal spinning disk 3514 may be configured to provide light paths (e.g., excitation light path, reflection light path, and/or emission light path) of the automated imaging system 3000.
  • the confocal spinning disk 3514 may include a pattern of pin holes or slits that provide the light paths. The pin holes or slits may be spaced far away from each other to act optically independently, and may be arranged in several spirals along the confocal spinning disk 3514.
  • the confocal spinning disk 3514 may be around 2 mm thick and made from glass or quartz, in an example embodiment.
  • the confocal spinning disk 3514 may be coated to be non-transparent, or have a given transparency or opacity, except for clear areas left as a pattern of the pin holes or slits.
  • the surface of the confocal spinning disk 3514 may be configured to not reflect oncoming light.
  • the confocal spinning disk 3514 may be configured to spin based on actuation by at least one actuator of the automated imaging system 3000.
  • the confocal spinning disk 3514 may be controlled to continuously spin, thus scanning the sample.
  • the sample may be illuminated one spot at a time and the complete sample image may be detected on a sensor (e.g., first sensor 3515) for reconstruction as a complete image of the sample.
  • the confocal spinning disk 3514 may have a configuration of a spinning disk described in International Patent Application Publication No. WO2022120047A1, to Piette et al.
  • the second cube 3511 may be a structure (e.g., a body) that includes, for example, a light source 3512, a lens 3513 (e.g., an emission filter), and the dichroic mirror 3516.
  • the light source 3512 may be configured to emit light towards the dichroic mirror 3516, via the lens 3513.
  • the lens 3513 may be configured to block light of certain wavelengths from the light source 3512 while allowing other light of other certain wavelengths from the light source 3512 to pass therethrough.
  • the lens 3513 may be configured to form a bandpass for excitation of a sample held by the body 3523.
  • the dichroic mirror 3516 may be partially reflective and partially transmissive. For example, some light may reflect off a surface of the dichroic mirror 3516 and other light may pass through the surface of the dichroic mirror 3516.
  • the light source 3512 may be configured to emit light towards a sample held by the body 3523 in order to image the sample using the light.
  • the light of the light source 3512 may reach the sample (e.g., as input light 301 in FIGS. 13-14) by passing through the lens 3513, reflecting from the dichroic mirror 3516, passing through the second lens 3525, the confocal spinning disk 3514, the first lens 3524, and the dichroic mirror 3531, reflecting from the mirror 3503, and then passing through the immersion objective 3522.
  • output light (e.g., output light 302 in FIGS. 13-14) may then be reflected from or generated by the sample (e.g., via fluorescence), and may pass back through the immersion objective 3522 so as to be received by the first sensor 3515 such that an image of the sample is obtained.
  • the light source 3512 may be a confocal excitation light source.
  • the confocal excitation light source may be any light source suitable for confocal microscopy.
  • the light source 3512 may be a solid state light source (e.g., one or more LEDs) or a solid state laser or semiconductor-based laser (e.g., a laser diode).
  • the first sensor 3515 may be configured to receive light in response to light being emitted towards the body 3523 by the light source 3510 and/or by the light source 3512.
  • the first sensor 3515 may be configured to obtain an image of a sample, as a part of analysis by a controller 70 (refer to FIG. 12) of the automated imaging system 3000, by receiving reflection light or light generated by the sample (e.g., via fluorescence) that is caused by the light emitted towards the body 3523 by a light source (e.g., the light source 3512).
  • the first sensor 3515 may be further configured to obtain signals for detecting whether an air bubble is present in the immersion liquid between the immersion objective 3522 (that is a liquid immersion objective) and the body 3523 by receiving the reflection light 3502 that is the reflected portion of the light 3501 emitted by the light source 3510.
  • the first sensor 3515 may be, for example, an imaging camera, charge coupled device (CCD), complementary metal oxide semiconductor (CMOS) based detector, a line sensor, or a single element photo diode.
  • the imaging camera may have over one million pixels with a pixel size of 3-6 µm.
  • the controller 70 (refer to FIG. 12) of the automated imaging system 3000 may be configured to detect whether an air bubble is present in the immersion liquid between the immersion objective 3522 (that is a liquid immersion objective) and the body 3523 based on the reflection light 3502 received by the first sensor 3515 in two different states of the automated imaging system 3000.
  • the first state (hereinafter referred to as a dry state) may be when the immersion objective 3522, through which the light 3501 and the reflection light 3502 passes, does not have the immersion liquid thereon.
  • the second state (hereinafter referred to as wet state) may be when the same or different immersion objective 3522, through which the light 3501 and the reflection light 3502 passes, does have the immersion liquid thereon.
  • the immersion objective 3522 that is used may be an air immersion objective or a liquid immersion objective that presently does not have an immersion liquid thereon.
  • the immersion objective 3522 that is used may be a liquid immersion objective that has the immersion liquid thereon, and may be a different objective from the air or liquid immersion objective used in the dry state, or may be the same liquid immersion objective used in the dry state.
  • in the dry state, the reflection light 3502 obtained by the first sensor 3515 may be referred to as an air signal.
  • in the wet state, the reflection light 3502 obtained by the first sensor 3515 (or another sensor) may be referred to as a liquid signal.
  • the controller 70 may obtain the air signal and the liquid signal from the first sensor 3515, and record the air signal and the liquid signal.
  • the controller 70 may be configured to compare the ratio of the air signal to the liquid signal to a predetermined value (e.g., a predetermined threshold value) in order to detect whether an air bubble is present in the immersion liquid between the immersion objective 3522 and the body 3523, during the wet state. Details of how the controller 70 may detect whether the air bubble is present, according to embodiments of the present disclosure, are provided below.
  • the intensity of the liquid signal varies based on whether there is an air bubble in the immersion liquid on the immersion objective 3522.
  • the percentage of reflectance of the light 3501 from the body 3523 is affected by whether the air bubble is present.
  • a high reflectance may indicate a significant change in a refractive index, thereby indicating a poor quality immersion fluid application that includes air bubbles.
  • a low reflectance may indicate a uniform and high quality application of the immersion fluid on the immersion objective 3522 (e.g., no air bubbles are present).
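The quantities defined in the next few paragraphs, and the 0.35% and 4% figures that follow, are consistent with the normal-incidence Fresnel reflectance at the interface between the immersion medium and the body; the equation itself does not survive in this text and is reconstructed here under that assumption:

```latex
R = \left( \frac{n_t - n_i}{n_t + n_i} \right)^{2}
```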
  • n_t is the refractive index of the medium (e.g., liquid or air) between the immersion objective 3522 and the body 3523 that holds the sample, n_i is the refractive index of the body 3523, and R is the reflectance of the light 3501.
  • n_t may be 1.333 when the medium is water and 1 when the medium is air (e.g., when the immersion objective has no immersion fluid thereon), and n_i may be 1.5 when the body 3523 is glass.
  • when the medium is water, the expected reflectance (R) may be 0.35%.
  • when the medium is air, the expected reflectance (R) may be 4%. Accordingly, the ratio of air-medium reflection to water-medium reflection (4/0.35) is about 11.5.
  • when no air bubble is present in the immersion liquid during the wet state, the intensity of the liquid signal obtained by the controller 70 is therefore about 11.5 times weaker than the intensity of the air signal obtained by the controller 70 during the dry state.
  • when an air bubble(s) is present in the immersion liquid, the intensity of the liquid signal obtained by the controller 70 is closer to the intensity of the air signal obtained by the controller 70 during the dry state.
  • in that case, the ratio of the air signal to the liquid signal may be much less than 11.5, and may be about 1.
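A quick numeric check of the example above, using the reconstructed Fresnel relation (the indices 1.333, 1.0, and 1.5 and the resulting ratio of about 11.5 come from the text; the helper function itself is illustrative):

```python
def fresnel_reflectance(n_medium: float, n_body: float) -> float:
    """Normal-incidence reflectance at the medium/body interface."""
    return ((n_medium - n_body) / (n_medium + n_body)) ** 2

r_water = fresnel_reflectance(1.333, 1.5)  # ~0.0035, i.e. ~0.35 %
r_air = fresnel_reflectance(1.0, 1.5)      # 0.04, i.e. 4 %
print(f"ratio of air to water reflection: {r_air / r_water:.1f}")  # ~11.5
```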
  • the controller 70 may determine whether there is an air bubble present in the immersion liquid on the immersion objective 3522 based on the ratio of the air signal to the liquid signal. For example, the controller 70 may compare the ratio of the air signal to the liquid signal to a predetermined threshold value to make the determination. According to embodiments, the controller 70 may determine (or detect) that there is no air bubble present in the immersion liquid between the immersion objective 3522 (the liquid immersion objective) and the body 3523 based on the ratio of the air signal to the liquid signal being greater than the predetermined threshold value. The controller 70 may then automatically cause the automated imaging system 3000 to perform analysis, including imaging of the sample (e.g., using the light source 3512 and the immersion objective 3522), based on detecting that no air bubble is present on the immersion objective 3522.
  • the controller 70 may determine (or detect) that there is an air bubble(s) present in the immersion liquid between the immersion objective 3522 (the liquid immersion objective) and the body 3523 based on the ratio of the air signal to the liquid signal being less than or equal to the predetermined threshold value. Based on detecting the air bubble, the controller 70 may not automatically perform the analysis (e.g., the imaging of the sample). For example, the controller 70 may cause an output device 20 (refer to FIG. 12) to output an alert, indicating an error, to a user, so that the user may manually check to see if there is an air bubble present in the immersion liquid between the liquid immersion objective and the body 3523.
  • the user may manually re-apply immersion liquid on the immersion objective 3522 or instruct the controller 70, via input on an input device 10 (refer to FIG. 12), to automatically re-apply immersion liquid on the immersion objective 3522. Thereafter, the controller 70 may re-perform bubble detection.
  • the user may cause, via input on an input device 10 (refer to FIG. 12), the error to be overridden such that the controller 70 causes the analysis (e.g., capturing a sample image using the light source 3512 and the immersion objective 3522) to be performed.
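The resulting decision flow (proceed with imaging when the ratio clears the threshold; otherwise alert the user, allow an override, or re-apply the immersion liquid and re-test) might be organized as below. The controller, output-device, and input-device method names are assumptions made for the sake of the sketch:

```python
def bubble_check_then_image(controller, threshold: float, max_retries: int = 1) -> None:
    """Image the sample only once the immersion liquid passes the bubble check."""
    for _attempt in range(max_retries + 1):
        ratio = controller.air_signal() / controller.liquid_signal()
        if ratio > threshold:
            controller.image_sample()          # no bubble detected: analyze the sample
            return
        # Bubble detected: alert the user via the output device.
        controller.output_device.alert("Air bubble suspected in the immersion liquid")
        if controller.input_device.override_requested():
            controller.image_sample()          # user overrides the error
            return
        # Otherwise re-apply immersion liquid and repeat the detection.
        controller.reapply_immersion_liquid()
    controller.output_device.alert("Bubble check still failing; manual intervention needed")
```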
  • the predetermined threshold value may be determined or obtained (e.g., by the controller 70) in various ways.
  • the controller 70 may determine the predetermined threshold value based on its image analysis results and/or bubble detection results, or the controller 70 may obtain the predetermined threshold value by, for example, a user inputting the predetermined threshold value to the controller 70 using an input device 10 (refer to FIG. 12).
  • FIG. 2 is a diagram illustrating an automated imaging system 3000A according to a second embodiment of the present disclosure.
  • the automated imaging system 3000A may be the same or similar to the automated imaging system 3000 illustrated in FIG. 1, except that a second sensor 3521 may be located within the first cube 3532, adjacent to the light source 3510. For example, the second sensor 3521 may be located in the first infinity corrected optics region 3504.
  • the second sensor 3521 may be used to perform bubble detection.
  • the second sensor 3521 may be configured to receive the air signal and the liquid signal (e.g., the reflection light 3502 during both the dry state and the wet state) for bubble detection.
  • the reflection light 3502 may be received by the second sensor 3521 by the reflection light 3502 being reflected by the mirror 3503 and the dichroic mirror 3531.
  • the controller 70 may obtain the air signal and the liquid signal for bubble detection from the second sensor 3521, and separately use the first sensor 3515 for sample analysis, including imaging the sample.
  • the second sensor 3521 may be a dedicated sensor for bubble detection.
  • the second sensor 3521 may have a same or different configuration from the first sensor 3515, and may be, for example, an imaging camera, charge coupled device (CCD), complementary metal oxide semiconductor (CMOS) based detector, a line sensor, or a single element photo diode.
  • the controller 70 may be configured to implement auto focusing for sample imaging as described in Norris et al., in addition to bubble detection.
  • the first cube 3532 may have the configuration of an autofocus module of Norris et al., and the light source (e.g., laser) and sensor of such autofocus module may be controlled by the controller 70 to perform both bubble detection and autofocus.
  • FIG. 3 is a diagram illustrating an automated imaging system 3000B according to a third embodiment of the present disclosure.
  • the automated imaging system 3000B may be the same or similar to the automated imaging system 3000 illustrated in FIG. 1, except that the first cube 3532 is replaced with the first cube 3532A.
  • while the first cube 3532 of FIG. 1 includes the light source 3510 that may be a narrow beam light source, the first cube 3532A of FIG. 3 includes a light source 3510A that may be a wide beam light source.
  • the wide beam light source may be, for example, a lamp
  • the first cube 3532A may be used, for example, in a wide field mode for imaging the sample. Additionally, a lens 3541A may be located in the first cube 3532A. The lens 3541A may be configured to collimate the light 3501 into the back of the immersion objective 3522. As described with respect to the light source 3510 of the first embodiment, the light source 3510A, and its light 3501, may likewise be utilized in bubble detection.
  • FIG. 4 is a diagram illustrating an automated imaging system 3000C according to a fourth embodiment of the present disclosure.
  • the automated imaging system 3000C may be the same or similar to the automated imaging system 3000 illustrated in FIG. 1, except that the first cube 3532 is located between the second lens 3525 and the third lens 3526, within the second infinity corrected optics region 3505, instead of within the first infinity corrected optics region 3504. Additionally, a third cube 3517 may be provided within the first infinity corrected optics region 3504, between the mirror 3503 and the first lens 3524.
  • the functions of the first cube 3532 in FIGS. 1 and 4 may be similar to each other. That is, for example, light 3501 from the light source 3510 may still be used to perform bubble detection as described in embodiments of the present disclosure.
  • the third cube 3517 may be a structure (e.g., a body) that includes, for example, a light source 3518, a lens 3520 (e.g., a filter) and a dichroic mirror 3519. According to embodiments of the present disclosure, the third cube 3517 may be similar to the second cube 3511.
  • the light source 3512 of the second cube 3511 may be a laser wide beam light source for wide field imaging, positioned to utilize confocal optics (e.g., the confocal spinning disk 3514), while the light source 3518 of the third cube 3517 may be an LED wide beam light source, not positioned to utilize confocal optics (e.g., the confocal spinning disk 3514).
  • the light source 3518 may be configured to emit a light towards the dichroic mirror 3519 via the lens 3520.
  • the light source 3518 may be a confocal excitation light source.
  • the confocal excitation light source may be any light source suitable for confocal microscopy.
  • the light source 3518 may be a solid state light source (e.g., one or more LEDS) or a solid state laser or semiconductor-based laser (e.g., a laser diode).
  • the lens 3520 may be configured to block light of certain wavelengths from the light source 3518 while allowing other light of other certain wavelengths from the light source 3518 to pass therethrough.
  • the lens 3520 may be configured to form a bandpass for excitation of a sample held by the body 3523.
  • the dichroic mirror 3519 may be partially reflective and partially transmissive. For example, some light may reflect off a surface of the dichroic mirror 3519 and other light may pass through the surface of the dichroic mirror 3519.
  • the dichroic mirror 3519 may reflect the light towards the mirror 3503, and the mirror 3503 may reflect the light to the immersion objective 3522, which is directed towards the body 3523. That is, the light source 3518 may be configured to emit the light to the body via the immersion objective 3522.
  • the first sensor 3515 may be configured to perform analysis of a sample (e.g., image a sample) based on receiving light reflected or generated by the sample in response to the light of the light source 3518.
  • the automated imaging system 3000C may include at least one actuator that is configured to move one or both of the first cube 3532 and the second cube 3511 into and out of position between the second lens 3525 and the third lens 3526.
  • the at least one actuator (based on control by the controller 70) may be configured to interchange the first cube 3532 and the second cube 3511 into the position between the second lens 3525 and the third lens 3526 so that the automated imaging system 3000C may selectively perform bubble detection using the first cube 3532 or sample analysis using the second cube 3511.
  • the at least one actuator (based on control by the controller 70) may be configured to move one or both of the first cube 3532 and the second cube 3511 so that the first cube 3532 and the second cube 3511 are coaxially positioned, one after another, between the second lens 3525 and the third lens 3526.
  • the first cube 3532 and the second cube 3511 may be coaxially fixed, one after another, between the second lens 3525 and the third lens 3526 with partial transmission optics therebetween.
  • the controller 70 may be configured to selectively perform sample analysis using a sample image(s) obtained by the first sensor 3515 based on light emitted by the light source 3512 of the second cube 3511 and/or light emitted by the light source 3518 of the third cube 3517.
  • FIG. 5 is a diagram illustrating an automated imaging system 3000D according to a fifth embodiment of the present disclosure.
  • the automated imaging system 3000D may be the same or similar to the automated imaging system 3000C illustrated in FIG. 4, except that the second sensor 3521 may be located within the first cube 3532, adjacent to the light source 3510. For example, the second sensor 3521 may be located in the second infinity corrected optics region 3505.
  • the second sensor 3521 may be used to perform bubble detection.
  • the second sensor 3521 may be configured to receive the reflection light 3502 that constitutes the air signal and the liquid signal for bubble detection.
  • the reflection light 3502 may be received by the second sensor 3521 by the reflection light 3502 reflecting from the mirror 3503, passing through the dichroic mirror 3519, the first lens 3524, and the confocal spinning disk 3514, and reflecting from the dichroic mirror 3531.
  • the controller 70 may obtain the air signal and the liquid signal for bubble detection from the second sensor 3521, and separately use the first sensor 3515 for sample analysis, including imaging the sample.
  • the second sensor 3521 may be a dedicated sensor for bubble detection.
  • the second sensor 3521 may have a same or different configuration from the first sensor 3515, and may be, for example, an imaging camera, charge coupled device (CCD), complementary metal oxide semiconductor (CMOS) based detector, a line sensor, or a single element photo diode.
  • the controller 70 may be configured to implement auto focusing for sample imaging as described in Norris et al., in addition to bubble detection.
  • the first cube 3532 may have the configuration of an autofocus module of Norris et al., and the light source (e.g., laser) and sensor of such autofocus module may be controlled by the controller 70 to perform both bubble detection and autofocus.
  • the light source 3510, within the first cube 3532, that is a narrow beam light source may be replaced with a light source 3510A (refer to FIG. 3), within the first cube 3532, that is a wide beam light source, similar to as described above with respect to the third embodiment (refer to FIG. 3).
  • FIG. 6 is a diagram illustrating an automated imaging system 3000E according to a sixth embodiment of the present disclosure.
  • the automated imaging system 3000E may be the same or similar to the automated imaging system 3000C illustrated in FIG. 4, except that the second cube 3511 is omitted, and the first cube 3532 is replaced with a first cube 3532B within the second infinity corrected optics region 3505.
  • while the first cube 3532 of FIG. 4 includes the light source 3510 that is a narrow beam light source, the first cube 3532B of FIG. 6 includes a light source 3510B that is a wide beam light source.
  • the wide beam light source may be, for example, a lamp (e.g., an LED, a tungsten-halogen arc-lamp, a mercury arc-lamp, a xenon arc-lamp, a laser diode, etc.).
  • a lens 3541B may be located in the first cube 3532B in order to focus the wide beam light source to infinity. As described with respect to the light source 3510 of the first embodiment (refer to FIG. 1), the light source 3510B, and its light 3501, may likewise be utilized in bubble detection.
  • FIG. 9 is a diagram illustrating an automated imaging system 3000F according to a seventh embodiment of the present disclosure.
  • the automated imaging system 3000F may be the same or similar to the automated imaging system 3000 illustrated in FIG. 1, except that the confocal spinning disk 3514, the second lens 3525, the second cube 3511, and the third lens 3526 may be omitted.
  • the automated imaging system 3000F may be configured as a digital microscope, instead of a spinning disk confocal microscope, in which the first cube 3532 is provided in an infinity corrected optics region of the digital microscope.
  • a second cube 3511A may be provided instead of the second cube 3511 (refer to FIG. 1).
  • the second cube 3511A may be a structure (e.g., a body) that includes, for example, a light source 3512A, a lens 3513A (e.g., an emission filter), and a dichroic mirror 3516A.
  • the light source 3512A may be configured to emit light towards the dichroic mirror 3516A, via the lens 3513A.
  • the lens 3513A may be configured to block light of certain wavelengths from the light source 3512A while allowing other light of other certain wavelengths from the light source 3512A to pass therethrough.
  • the lens 3513A may be configured to form a bandpass for excitation of a sample held by the body 3523.
  • the dichroic mirror 3516A may be partially reflective and partially transmissive. For example, some light may reflect off a surface of the dichroic mirror 3516A and other light may pass through the surface of the dichroic mirror 3516A.
  • the light source 3512A may be configured to emit light towards a sample held by the body 3523 in order to image the sample using the light. For example, the light of the light source 3512A may reach the sample (e.g., as input light 301 in FIGS. 13-14) via the immersion objective 3522.
  • output light (e.g., output light 302 in FIGS. 13-14) may be reflected from or generated (e.g., via fluorescence) by the sample, and the output light may pass downwards through the immersion objective 3522 so as to be received by the first sensor 3515 such that an image of the sample is obtained.
  • the output light may be received by the first sensor 3515 by being reflected by the mirror 3503 and passing through the dichroic mirror 3516A and the first lens 3524.
  • the light source 3512A may be a confocal excitation light source.
  • the confocal excitation light source may be any light source suitable for confocal microscopy.
  • the light source 3512A may be a solid state light source (e.g., one or more LEDs), a solid state laser, or a semiconductor-based laser (e.g., a laser diode).
  • the light source 3512A may be a widefield fluorescence excitation light source (e.g. a lamp).
  • the automated imaging system 3000F may include at least one actuator that is configured to move one or both of the first cube 3532 and the second cube 3511A into and out of position between the mirror 3503 and the first lens 3524.
  • the at least one actuator (based on control by the controller 70) may be configured to interchange the first cube 3532 and the second cube 3511A into the position between the mirror 3503 and the first lens 3524 so that the automated imaging system 3000F may selectively perform bubble detection using the first cube 3532 or sample analysis using the second cube 3511A.
  • the at least one actuator (based on control by the controller 70) may be configured to move one or both of the first cube 3532 and the second cube 3511A so that the first cube 3532 and the second cube 3511A are coaxially positioned, one after another, between the mirror 3503 and the first lens 3524.
  • the first cube 3532 and the second cube 3511A may be coaxially fixed, one after another, between the mirror 3503 and the first lens 3524 with partial transmission optics therebetween.
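  • By way of illustration only, the actuator-based interchange described above can be pictured with the following minimal Python sketch; the Mode enum, the CubeActuator class, and its move_into_path/move_out_of_path methods are hypothetical placeholders for whatever actuator interface a real system exposes and are not part of the disclosed embodiments.

```python
from enum import Enum, auto


class Mode(Enum):
    BUBBLE_DETECTION = auto()   # first cube 3532 in the optical path
    SAMPLE_ANALYSIS = auto()    # second cube 3511A in the optical path


class CubeActuator:
    """Hypothetical stand-in for an actuator that moves one cube."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.in_path = False

    def move_into_path(self) -> None:
        self.in_path = True    # placeholder for a real hardware move command

    def move_out_of_path(self) -> None:
        self.in_path = False   # placeholder for a real hardware move command


def select_mode(mode: Mode, first_cube: CubeActuator, second_cube: CubeActuator) -> None:
    """Place exactly one cube between the mirror 3503 and the first lens 3524."""
    if mode is Mode.BUBBLE_DETECTION:
        second_cube.move_out_of_path()
        first_cube.move_into_path()
    else:
        first_cube.move_out_of_path()
        second_cube.move_into_path()
```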
  • FIG. 10 is a diagram illustrating an automated imaging system 3000G according to an eighth embodiment of the present disclosure.
  • the automated imaging system 3000G may be the same or similar to the automated imaging system 3000F illustrated in FIG. 9, except that a second sensor 3521 may be located within the first cube 3532, adjacent to the light source 3510, similar to the second embodiment illustrated in FIG. 2.
  • the second sensor 3521 may be used to perform bubble detection.
  • the second sensor 3521 may be configured to receive the air signal and the liquid signal (e.g., the reflection light 3502 during both the dry state and the wet state) for air bubble detection.
  • the reflection light 3502 may be received by the second sensor 3521 by the reflection light 3502 being reflected by the mirror 3503 and the dichroic mirror 3531.
  • the controller 70 may obtain the air signal and the liquid signal for bubble detection from the second sensor 3521, and separately use the first sensor 3515 for sample analysis, including imaging the sample.
  • the second sensor 3521 may be a dedicated sensor for bubble detection.
  • the second sensor 3521 may have a same or different configuration from the first sensor 3515, and may be, for example, an imaging camera, a charge coupled device (CCD) based detector, a complementary metal oxide semiconductor (CMOS) based detector, a line sensor, or a single element photodiode.
  • the controller 70 may be configured to implement auto focusing for sample imaging as described in Norris et al., in addition to bubble detection.
  • the first cube 3532 may have the configuration of an autofocus module of Norris et al., and the light source (e.g., laser) and sensor of such autofocus module may be controlled by the controller 70 to perform both bubble detection and autofocus.
  • FIG. 11 is a diagram illustrating an automated imaging system 3000H according to a ninth embodiment of the present disclosure.
  • the automated imaging system 3000H may be the same or similar to the automated imaging system 3000F illustrated in FIG. 9, except that the first cube 3532 is replaced with the first cube 3532A of FIG. 3.
  • whereas the first cube 3532 of FIG. 9 includes the light source 3510, which is a narrow beam light source, the first cube 3532A of FIG. 3 includes a light source 3510A, which is a wide beam light source.
  • the wide beam light source may be, for example, a lamp (e.g. an LED, tungsten-halogen arc-lamp, mercury arc-lamp, xenon arc-lamp, laser diode, etc.).
  • a lens 3541A may be located in first cube 3532A.
  • the lens 3541A may be configured to collimate the light 3501 into the back of the immersion objective 3522. Similar to as described with respect to the light source 3510 of the first embodiment (refer to FIG. 1), the light source 3510A, and its light 3501, may be utilized in bubble detection.
  • the controller 70 may be configured to determine whether an air bubble is present in the immersion liquid between the immersion objective 3522 and the body 3523, during the wet state, based on a ratio of an air signal to a liquid signal, which indicates a ratio of reflectance of light with respect to the body 3523 during the dry state to reflectance of light with respect to the body 3523 during the wet state.
  • embodiments of the present disclosure are not limited thereto.
  • the controller 70 may alternatively use an “imaging” approach (e.g., sensing light fluorescently emitted by a sample due to excitation light) to determine whether a bubble is present in an immersion liquid on the immersion objective 3522.
  • the controller 70 may compare the signal from a dry liquid immersion objective (or a dry air immersion objective) to that from a wetted liquid immersion objective and, based on the signal not increasing by an expected amount, the controller 70 may determine that the bubble(s) is present.
  • the sensors may obtain an air signal that is a first in-focus fluorescent image (or a feature(s) thereof) of the sample held by the body 3523 during the dry state (hereinafter referred to as an air image signal), and a liquid signal that is a second in-focus fluorescent image (or a feature(s) thereof) of the sample held by the body 3523 during the wet state (hereinafter referred to as a liquid image signal), wherein the two images have the same feature(s) therein.
  • the fluorescent images may be obtained based on an excitation light from a light source (e.g., the light source 3510, the light source 3510A, or the light source 3510B) causing the sample to emit light via fluorescence.
  • the controller 70 may obtain the air image signal and the liquid image signal from the sensor (e.g., first sensor 3515 or second sensor 3521) and record the air image signal and the liquid image signal.
  • the controller 70 may be configured to compare the ratio of the air image signal to the liquid image signal to a predetermined value (e.g., a predetermined threshold value) in order to detect whether an air bubble is present in the immersion liquid between the immersion objective 3522 and the body 3523, during the wet state.
  • the ratio of the air image signal to the liquid image signal may be much less than 1 in a case that no air bubbles are present during the wet state, and the ratio of the air image signal to the liquid image signal may be about 1 in a case that an air bubble(s) is present during the wet state.
  • the controller 70 may determine whether there is an air bubble present in the immersion liquid on the immersion objective 3522 based on the ratio of the air image signal to the liquid image signal. For example, the controller 70 may compare the ratio of the air image signal to the liquid image signal to a predetermined threshold value to make the determination.
  • the controller 70 may determine (or detect) that there is no air bubble present in the immersion liquid between the immersion objective 3522 (the liquid immersion objective) and the body 3523 based on the ratio of the air image signal to the liquid image signal being less than the predetermined threshold value. The controller 70 may then automatically cause the automated imaging system 3000 to perform imaging of the sample (e.g., using the light source 3512 and the immersion objective 3522) based on detecting that no air bubble is present on the immersion objective 3522.
  • the controller 70 may determine (or detect) that there is an air bubble(s) present in the immersion liquid between the immersion objective 3522 (the liquid immersion objective) and the body 3523 based on the ratio of the air image signal to the liquid image signal being greater than or equal to the predetermined threshold value. Based on detecting the air bubble, the controller 70 may not automatically perform the analysis. For example, the controller 70 may cause an output device 20 (refer to FIG. 12) to output an alert, indicating an error, to a user, so that the user may manually check to see if there is an air bubble present in the immersion liquid between the liquid immersion objective and the body 3523.
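  • As an illustrative (non-limiting) Python sketch of this decision rule, the comparison of the air image signal to the liquid image signal may be written as follows; the function name, the scalar signal values, and the threshold of 0.5 are hypothetical and would be replaced by calibrated values in practice.

```python
def bubble_present_from_images(air_image_signal: float,
                               liquid_image_signal: float,
                               threshold: float = 0.5) -> bool:
    """Imaging approach: with no bubble, the wet-state fluorescent signal is much
    stronger than the dry-state signal, so air/liquid << 1; with a bubble, the two
    signals are comparable, so air/liquid is about 1."""
    ratio = air_image_signal / liquid_image_signal
    return ratio >= threshold  # bubble detected when the ratio fails to drop


# Hypothetical readings for illustration only.
print(bubble_present_from_images(120.0, 900.0))  # False -> proceed with imaging
print(bubble_present_from_images(120.0, 130.0))  # True  -> raise an alert
```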
  • the user may manually re-apply immersion liquid on the immersion objective 3522 or instruct the controller 70, via input on an input device 10 (refer to FIG. 12), to automatically re-apply immersion liquid on the immersion objective 3522. Thereafter, the controller 70 may re-perform bubble detection.
  • the user may cause, via input on an input device 10 (refer to FIG. 12), the error to be overridden such that the controller 70 controls the analysis (e.g., capturing a sample image using the light source 3512 and the immersion objective 3522) to be performed.
  • the predetermined threshold value may be determined or obtained (e.g., by the controller 70) in various ways.
  • the controller 70 may determine the predetermined threshold value based on its image analysis results and/or bubble detection results, or the controller 70 may obtain the predetermined threshold value by, for example, a user inputting the predetermined threshold value to the controller 70 using an input device 10 (refer to FIG. 12).
  • the controller 70 may be configured to determine whether a bubble(s) is present in the immersion liquid between the immersion objective 3522 (the liquid immersion objective) and the body 3523 based on the liquid signal (or the liquid image signal) without obtaining the air signal (or the air image signal). For example, the controller 70 may be configured to determine that no bubbles are present based on the liquid signal being less than a predetermined threshold value, or that a bubble(s) is present based on the liquid signal being greater than or equal to the predetermined threshold value.
  • the controller 70 may be configured to determine that no bubbles are present based on the liquid image signal being greater than a predetermined threshold value, or that a bubble(s) is present based on the liquid image signal being less than or equal to the predetermined threshold value.
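  • The single-signal variants can be sketched in the same way; the function names and thresholds below are hypothetical and, as noted above, would be determined by the controller 70 or supplied by a user in practice.

```python
def bubble_present_from_liquid_reflection(liquid_signal: float, threshold: float) -> bool:
    """Reflection approach using only the wet-state signal: a bubble keeps the
    reflected signal high, so a high liquid signal indicates a bubble."""
    return liquid_signal >= threshold


def bubble_present_from_liquid_image(liquid_image_signal: float, threshold: float) -> bool:
    """Imaging approach using only the wet-state signal: a bubble keeps the
    in-focus fluorescent signal low, so a low liquid image signal indicates a bubble."""
    return liquid_image_signal <= threshold
```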
  • the predetermined threshold value may be determined or obtained (e.g., by the controller 70) in various ways.
  • FIGS. 8A-B are flowcharts illustrating methods of operation of an automated imaging system according to embodiments of the present disclosure.
  • For purposes of clarity, the methods of FIGS. 8A-B will be described with reference to the first embodiment shown in FIG. 1. However, a person of ordinary skill in the art would understand that the methods may also be performed by other embodiments of the present disclosure (including the above-described second through ninth embodiments).
  • the controller 70 may control the light source 3510 to emit light 3501 such that the light 3501 travels to the body 3523 via a dry objective (operation 3401).
  • the dry objective may be an air immersion objective or a liquid immersion objective that does not presently have an immersion liquid thereon.
  • the controller 70 may measure and record the corresponding air signal level (or the air image signal level) received by the first sensor 3515 due to the light (e.g., reflection light 3502) reflecting from the body 3523 or due to the sample generating light due to the light 3501 (operation 3402).
  • the controller 70 may then control the movement mechanism 3530 to move a liquid immersion objective to replace the dry objective to be in the viewing position below the body 3523 (operation 3403).
  • the moved liquid immersion objective may be a different objective from the air or liquid immersion objective used in the dry state, or may be the same liquid immersion objective used in the dry state but with liquid immersion applied thereon.
  • the operation 3403 may further include the controller 70 (or a user) causing the immersion liquid to be applied to the liquid immersion objective before, after, or in lieu of moving the liquid immersion objective to the viewing position.
  • the controller 70 may control the light source 3510 to emit light 3501 such that the light 3501 travels to the body 3523 via the liquid immersion objective that has the immersion liquid thereon (operation 3404). During such wet state, the controller 70 may measure and record the corresponding water signal level (or the water image signal level) received by the first sensor 3515 due to the light (e.g., reflection light 3502) reflecting from the body 3523 or due to the sample generating light due to the light 3501 (operation 3405).
  • the controller 70 may then compare the ratio of the air signal (or the air image signal) to the liquid signal (or the liquid image signal) to a predetermined threshold value (operation 3406). For example, in the case that the ratio of the air signal to the liquid signal is obtained, which indicates a ratio of reflectance of light with respect to the body 3523 during the dry state and during the wet state, the controller 70 may determine that no air bubble is present (operation 3407) in the immersion fluid on the liquid immersion objective based on the ratio being greater than the predetermined threshold value. Alternatively, the controller 70 may determine that an air bubble(s) is present (operation 3408) in the immersion fluid on the liquid immersion objective based on the ratio being less than or equal to the predetermined threshold value.
  • the controller 70 may also cause an output device 20 (refer to FIG. 12) to output an alert, indicating an error, to a user, so that the user may manually check to see if there is an air bubble present in the immersion liquid between the liquid immersion objective and the body 3523.
  • operation 3410 may include the controller 70 (and/or a user) retracting the liquid immersion objective, dispensing the immersion liquid on the liquid immersion objective, and recontacting the body 3523 on the liquid immersion objective.
  • the user may cause, via input on an input device 10 (refer to FIG. 12), the error to be overridden such that the controller 70 causes the analysis (e.g., capturing a sample image using the light source 3512 and the water immersion objective) to be performed (operation 3409).
  • the controller 70 may automatically cause the analysis (e.g., capturing a sample image using the light source 3512 and the water immersion objective) to be performed (operation 3407). Additionally, in operation 3407, the controller 70 may also cause an output device 20 (refer to FIG. 12) to output an alert, indicating that no air bubble was detected.
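  • Purely as an illustrative sketch of the FIG. 8A flow in Python, the sequence of operations 3401-3408 might be organized as follows; the controller methods used here (emit_light, read_signal, swap_in_wet_objective, image_sample, alert) are hypothetical placeholders for the hardware control described above, not an actual API of the disclosed system.

```python
def run_bubble_check(controller, threshold: float) -> bool:
    """Returns True when no bubble was detected and imaging was performed."""
    # Dry state (operations 3401-3402): measure the signal through the dry objective.
    controller.emit_light()
    air_signal = controller.read_signal()

    # Wet state (operations 3403-3405): move the wetted liquid immersion objective
    # into the viewing position and measure again.
    controller.swap_in_wet_objective()
    controller.emit_light()
    liquid_signal = controller.read_signal()

    # Operation 3406: compare the dry/wet ratio to the predetermined threshold.
    ratio = air_signal / liquid_signal
    if ratio > threshold:
        # Operation 3407: no bubble detected; proceed with imaging automatically.
        controller.alert("no air bubble detected")
        controller.image_sample()
        return True

    # Operation 3408: a bubble may be present; alert so the user can re-apply
    # immersion liquid (operation 3410) or override the error (operation 3409).
    controller.alert("possible air bubble in the immersion liquid")
    return False
```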
  • operations 3401 and 3402 may be performed by the controller 70 in a first phase (e.g., a calibration phase), and the subsequent operations (e.g., operations 3404, 3403, etc.) may be performed by the controller 70 in a second phase (e.g., an operation phase).
  • the second phase may be performed by the controller 70 multiple times after performance of a single first phase. In other words, the second phase may be performed multiple times based on a single first phase.
  • the controller 70 may compare a same recorded air signal (or air image signal) to multiple recorded liquid signals (or liquid image signals) in order to determine whether an air bubble is present in immersion liquid on respective liquid immersion objectives.
  • the method may further include the controller 70 causing one of the samples to move to a capture position (operation 3411) after operation 3403.
  • the controller 70 may control at least one actuator of the automated imaging system to move the microplate such that one of the samples is directly above the water immersion objective, so as to be viewable by the water immersion objective.
  • operations 3404, 3405, etc. may be performed.
  • the controller 70 may determine whether another sample in the microplate should be imaged (operation 3415). For example, in a case where there is a sample in the microplate that has not previously been imaged, the controller 70 may determine that there is another sample to be imaged. According to embodiments, the controller 70 may determine whether there is another sample to be imaged based on counting a number of imaged samples in relation to a total number of samples, or by sensing a position of the wells of the microplate relative to the water immersion objective based on output of a sensor, but embodiments of the present disclosure are not limited thereto.
  • the controller 70 may cause the next sample to move to the capture position (operation 3413).
  • the controller 70 may control at least one actuator of the automated imaging system to move the microplate such that the next sample is directly above the water immersion objective, so as to be viewable by the water immersion objective.
  • operations 3405, 3406 may be repeated.
  • the controller 70 may determine that the imaging process is done and eject the sample(s) (operation 3414). For example, the controller 70 may control the at least one actuator of the automated imaging system (or enable a user) to move the microplate away from the water immersion objective.
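  • As an illustrative (non-limiting) Python sketch of the FIG. 8B flow, the per-well loop might look as follows; the controller and well objects and their methods are hypothetical placeholders, and the single dry calibration reused across wells reflects the single first phase described above.

```python
def image_microplate(controller, wells, threshold: float) -> None:
    # First phase (calibration): record the air signal once (operations 3401-3402).
    controller.emit_light()
    air_signal = controller.read_signal()

    # Move the wetted liquid immersion objective into the viewing position (operation 3403).
    controller.swap_in_wet_objective()

    # Second phase (operation): repeat the wet-state check for each sample/well.
    for well in wells:
        controller.move_to_capture_position(well)    # operations 3411 / 3413
        controller.emit_light()
        liquid_signal = controller.read_signal()     # operations 3404-3405

        if air_signal / liquid_signal > threshold:   # operation 3406
            controller.image_sample(well)            # operation 3407: no bubble, image
        else:
            controller.alert(f"possible bubble before imaging well {well}")  # operation 3408

    # Operation 3414: all samples handled; eject the microplate.
    controller.eject_samples()
```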
  • an automated imaging system 1 of the present disclosure may include input devices 10, output devices 20, light sources 30, sensors 40, actuators 50, and a controller 70.
  • the input devices 10 may include, for example, a microphone, a keyboard, a mouse, a switch, a button, and/or a touchscreen of a display.
  • the output devices 20 may include, for example, a speaker, a display, and/or a piezoelectric buzzer.
  • the light sources 30 may include, for example, the light sources (e.g., light source 3510, light source 3510A, light source 3510B, light source 3512, light source 3512A, light source 3518, etc.) described above with respect to embodiments of the present disclosure.
  • the sensors 40 may include, for example, the sensors (e.g., the first sensor 3515, the second sensor 3521, etc.) described above with respect to embodiments of the present disclosure.
  • the actuators 50 may include, for example, the actuators described above that move the various components of embodiments of the present disclosure.
  • the controller 70 may include hardware and/or software components that enable automatic control of any number (e.g., all or some) of the components of the automated imaging system 3000 to perform their respective functions, wherein the automatic control may be based on one or more user inputs to the automated imaging system 3000.
  • controller 70 may include at least one processor 4501 and a memory 74.
  • the controller 70 may be connected to the input devices 10 for receiving inputs therefrom, the output devices 20 for sending outputs thereto, the light sources 30 for controlling emission of the light sources 30, the sensors 40 for receiving information (e.g., air and water signals) therefrom, and the actuators 50 for controlling the components.
  • the memory 74 may store computer instructions.
  • the computer instructions, when executed by the at least one processor 4501, may cause the controller 70 to perform its functions.
  • the controller 70 may be connected via wires and/or wirelessly to one or more (e.g., some or all) of the components of the automated imaging system 3000 that are configured to be controlled (e.g. the output devices 20, the light sources 30, the sensors 40), in order to perform control of such components.
  • the controller 70 may be connected via wires and/or wirelessly to one or more (e.g., some or all) of the components of the automated imaging system 3000 that sense light/images (e.g., the sensors 40), in order to receive signals concerning the light/images.
  • the controller 70 may include a hardware communication interface that is configured to connect the automated imaging system 3000 via wires and/or wirelessly to external devices (e.g., computers, mobile devices, etc.).
  • the hardware communication interface may be configured to connect the at least one processor 72 and memory 74 of the automated imaging system to the external devices via a local network and/or a wide-area network (e.g., the Internet).
  • the at least one processor 72 and memory 74 may be configured to report information to the external devices concerning the status, results, and/or schedule of sample analysis of the automated imaging system.
  • the at least one processor 72 and memory 74 may be configured to receive commands from the external components for controlling the components of the automated imaging system, including scheduling of sample analysis.
  • the at least one processor 72 and memory 74 may be configured to perform the control based on the commands.
  • the methods and systems discussed above may be implemented in hardware, software, or any combination of hardware and software.
  • systems described above may be implemented to include one or more processors (e.g., CPU, microcontroller, microprocessor, etc.) that execute computer-readable instructions stored in a computer-readable memory (e.g., ROM, RAM, flash, etc.).
  • the computer-readable instructions may be provided as machine-readable algorithms that cause execution of the algorithms discussed above with respect to the systems and flowcharts.
  • Embodiments of the present disclosure may allow an automated imaging system to easily detect if there is an air bubble in an immersion liquid between an objective and a body that is configured to hold a sample, before the automated imaging system takes an image of the sample.

Landscapes

  • Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oil, Petroleum & Natural Gas (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Microscopes, Condenser (AREA)

Abstract

The disclosure provides systems and methods for efficiently detecting whether an air bubble is present in an immersion liquid between a liquid immersion objective and a body. For example, an automated imaging system includes a liquid immersion objective; a light source configured to emit first light towards a body via the liquid immersion objective in a state in which an immersion liquid is on the liquid immersion objective, between the liquid immersion objective and the body, the body being configured to hold a sample; a first sensor configured to receive a first signal based on the first light, after the first light arrives at the body via the liquid immersion objective; and a controller configured to detect whether an air bubble is present in the immersion liquid between the liquid immersion objective and the body based on the first signal received by the first sensor.
PCT/US2024/051384 2023-11-30 2024-10-15 Système et procédé de détection de bulles objectives par immersion dans l'eau Pending WO2025117061A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363604493P 2023-11-30 2023-11-30
US63/604,493 2023-11-30

Publications (1)

Publication Number Publication Date
WO2025117061A1 true WO2025117061A1 (fr) 2025-06-05

Family

ID=95897664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/051384 Pending WO2025117061A1 (fr) 2023-11-30 2024-10-15 Système et procédé de détection de bulles objectives par immersion dans l'eau

Country Status (1)

Country Link
WO (1) WO2025117061A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050179997A1 (en) * 2003-12-24 2005-08-18 Nikon Corporation Microscope and immersion objective lens
US20180251833A1 (en) * 2015-11-03 2018-09-06 President And Fellows Of Harvard College Method and Apparatus for Volumetric Imaging of a Three-Dimensional Nucleic Acid Containing Matrix
US20210405337A1 (en) * 2020-06-29 2021-12-30 Mgi Tech Co., Ltd. Systems and methods for optical scanning and imaging through a fluid medium for nucleic acid sequencing
US20220390367A1 (en) * 2019-10-29 2022-12-08 Oxford NanoImaging Limited An optical imaging system
US20230138764A1 (en) * 2020-03-13 2023-05-04 University Of Southern California Optimized photon collection for light-sheet microscopy

Similar Documents

Publication Publication Date Title
US8643946B2 (en) Autofocus device for microscopy
US10921234B2 (en) Image forming cytometer
US8304704B2 (en) Method and apparatus for autofocus using a light source pattern and means for masking the light source pattern
CN205958834U (zh) Image-based laser automatic focusing system
CN113795778B (zh) Self-calibrating and directed focusing systems and methods for infinity-corrected microscopes
US7761257B2 (en) Apparatus and method for evaluating optical system
US6823079B1 (en) Device for examining samples
JP6126693B2 (ja) Container and system for optical analysis of a sample without using an optical lens
US8809809B1 (en) Apparatus and method for focusing in fluorescence microscope
EP2110696A1 (fr) Method and apparatus for autofocus
JP4854880B2 (ja) Laser microscope
KR20220104821A (ko) Reflective Fourier ptychography imaging of large surfaces
CN112867918A (zh) Method and microscope for determining the refractive index of an optical medium
JP2006522948A (ja) Microscope arrangement
JP4932162B2 (ja) Focus detection device and fluorescence observation device using the same
CN101583895B (zh) Focus detection device and microscope
NL2005902C2 (en) Method and apparatus for surface plasmon resonance angle scanning.
JP2019508746A (ja) Imaging systems and methods with scattering for reducing source autofluorescence and improving uniformity
JP2010156557A (ja) Incident optical system and Raman scattered light measuring device
JP2003270524A (ja) Focus detection device, microscope including the same, and focus detection method
JP2006201465A5 (fr)
KR20220084147A (ko) Virtual reference
WO2025117061A1 (fr) Système et procédé de détection de bulles objectives par immersion dans l'eau
JP2005062515A (ja) Fluorescence microscope
JP5070995B2 (ja) Confocal microscope device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24898444

Country of ref document: EP

Kind code of ref document: A1