WO2020049383A1 - Duplex imaging microscope (Microscope d'imagerie double) - Google Patents
Duplex imaging microscope (Microscope d'imagerie double)
- Publication number
- WO2020049383A1 (PCT/IB2019/056832)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- responses
- image
- sample
- version
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/06—Means for illuminating specimens
- G02B21/08—Condensers
- G02B21/086—Condensers for transillumination only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/02—Details
- G01J3/0294—Multi-channel spectroscopy
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/02—Details
- G01J3/08—Beam switching arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
- G02B21/367—Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/28—Investigating the spectrum
- G01J3/2803—Investigating the spectrum using photoelectric array detector
- G01J2003/2806—Array and filter array
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/28—Investigating the spectrum
- G01J3/2823—Imaging spectrometer
- G01J2003/2826—Multispectral imaging, e.g. filter imaging
Definitions
- Microscopes are well known in the art, and generally refer to instruments for producing magnified views of objects/scenes (or samples, in general), especially of samples too small to be seen by the unaided eye.
- an optical microscope is a type of microscope that typically uses visible light and a system of lenses to magnify images of small objects.
- imaging refers to the process of capturing images of desired samples.
- An imaging microscope thus refers to a microscope that is equipped for capturing (generating) images of samples.
- the microscope may be fitted with one or more cameras (image capture devices, in general) for generating the images.
- Magnified images of scenes/objects may thus be captured by the imaging microscope, and be used for later analysis.
- Embodiments of the present disclosure are directed to extending such microscopes for additional purposes.
- Figure 1 is a diagram of a microscope-based imaging system in which several aspects of the present disclosure can be implemented.
- Figures 2A and 2B are diagrams illustrating the functional details of a microscope-based imaging system in an embodiment of the present disclosure.
- Figure 3 is a diagram illustrating the functional details of a microscope-based imaging system in another embodiment of the present disclosure.
- Figure 4 is a diagram illustrating the functional details of a microscope-based imaging system in yet another embodiment of the present disclosure.
- Figure 5 is a diagram of the implementation details of a beam splitter used in a microscope-based imaging system, in an embodiment of the present disclosure.
- Figure 6 is a block diagram illustrating the details of a computing system of a microscope-based imaging system in which several aspects of the present disclosure are operative by execution of appropriate executable modules.
- a system provided according to aspects of the present disclosure includes a magnifier, a filter and one or more image capture devices.
- the magnifier is operable to generate a magnified version of a source scene representing a sample.
- the filter in combination with the magnifier, is operable to provide a distorted version of the magnified version.
- the one or more image capture devices is operable to generate six or more responses from the distorted version and the magnified version.
- the magnifier is an optical microscope.
- the magnifier is an optical microscope containing a source of illumination to illuminate the sample to cause generation of the magnified version of the source scene.
- the optical microscope includes a motor operable to place the filter in an optical path traversed by the magnified version to cause the optical microscope to generate the distorted version.
- a camera is coupled to the optical microscope via a C-mount connector of the optical microscope to be in the optical path.
- the camera contains an RGB color sensor.
- the camera is operable to be exposed at a first time instance to the magnified version and to generate an RGB image, the RGB image containing three responses in the six or more responses.
- the camera is operable to be exposed at a second time instance to the distorted version and to generate an R'G'B' image, the R'G'B' image containing another three responses in the six or more responses.
- the system includes a computing device to process the six or more responses to construct a hypercube.
- the system includes a first camera and a second camera.
- the first camera is coupled to the optical microscope via a first eyepiece of the optical microscope.
- the second camera is coupled to the optical microscope via a second eyepiece of the optical microscope.
- each of the first camera and the second camera contains a respective RGB color sensor.
- the filter is coupled between the second camera and the second eyepiece to cause generation of the distorted version.
- the first camera and the second camera are operable to simultaneously capture the magnified version and the distorted version respectively, and to respectively generate an RGB image and an R ' G ' B ' image.
- the RGB image contains three responses in the six or more responses.
- the R'G'B' image contains another three responses in the six or more responses.
- the system includes a computing device to process the six or more responses to construct the hypercube.
- the system includes a first camera and a second camera.
- the optical microscope includes a beam splitter to receive the illumination as a first beam of light.
- the beam splitter is designed to generate a second beam of light and a third beam of light from the first beam of light.
- the first camera is coupled to a first output port of the beam splitter via a first connector to receive the second beam of light.
- the second camera is coupled to a second output port of the beam splitter via a second connector to receive the third beam of light.
- the filter is coupled between the second connector and the second output port of the beam splitter to generate the distorted version.
- each of the first camera and the second camera contains a respective RGB color sensor.
- the first camera and the second camera are operable to simultaneously capture the magnified version and the distorted version respectively, and to respectively generate an RGB image and an R'G'B' image.
- the RGB image contains three responses in the six or more responses.
- the R'G'B' image contains another three responses in the six or more responses.
- the system includes a computing device to process the six or more responses to construct the hypercube.
- FIG 1 is a diagram of an example system in which several aspects of the present disclosure can be implemented.
- Duplex imaging microscope 100 is shown containing microscope 140 and camera 150. Additionally, computing device 160 is also shown in Figure 1. Each part is described below in further detail.
- Microscope 140 is shown containing C-mount 101, eyepiece 102, tube 103, objective lenses 104A and 104B, stage 105, condenser 106, filter 108, motor 109, illumination (lamp) 110, base 111 and arm 112.
- microscope 140 (as well as the microscopes of other figures) is realized by improvements (in accordance with the present disclosure) to Vulcan201 or Vulcan302 manufactured by the assignee of this application - Spectral Insights Pvt Ltd.
- The parts and components of microscope 140 are shown merely by way of illustration; microscope 140 may be built with more, fewer or different parts and components, and using other technologies, as would be apparent to one skilled in the relevant arts upon reading the disclosure herein.
- While only one C-mount (101) is shown, more than one C-mount can be present to accommodate multiple cameras.
- only two objective lenses are shown in the interest of conciseness.
- line Z is an imaginary line passing through (or close to) the center of parts 101, 103, 106 and 110, and represents an 'optical path' that a beam of light from illuminator 110 traverses in microscope 140.
- the sample on stage 105 is within the focal length of the objective lens being used.
- Eyepiece 102 enables a user to view the magnified version of the sample placed on stage 105.
- Example values of magnification provided by eyepiece 102 are 10x (10 times magnification), 20x, 40x, 100x, etc.
- C-mount 101 is a connector for attaching a camera (here camera 150) to microscope 140.
- C-mount 101 may internally contain a magnifying lens to provide a magnification equal to that provided by eyepiece 102.
- The C-mount is an industry-standard attachment for connecting digital imaging devices to microscopes.
- Tube 103 connects eyepiece 102 and C-mount 101 to the structure containing objective lenses 104A and 104B.
- Arm 112 supports tube 103, and connects tube 103 to the base 111 of the microscope.
- Base 111 is the bottom portion of the microscope, and is used for support.
- Stage 105 is a platform on which a sample (e.g., a slide containing human tissue) to be magnified and analyzed is placed. Stage 105 may be movable along two mutually perpendicular axes, to enable the desired portion of the sample to be precisely positioned directly underneath the objective lens being used (and in line with the vertical line 'Z', along which tube 103, eyepiece 102 and the objective lens (when correctly positioned) are aligned).
- Illuminator 110 represents a source of broadband light, and may be implemented for example as one or more multi-wavelength LEDs (Light Emitting Diodes). Alternatively, illuminator 110 may be replaced by a reflecting mirror, with the mirror being adjustable to reflect an external broadband light source (e.g., sunlight) onto the sample. In either case, the source of light is referred to generally herein as "illumination". Condenser lens (condenser) 106 focuses the light from illuminator 110 (or alternatively the mirror in its place, as noted above) onto the sample placed on stage 105.
- Filter 108 operates to filter/change (distort) the spectral content of the light beam emanating from illuminator 110 by removing or altering the intensity of one or more wavelengths from the beam of light generated by illuminator 110.
- the filtered light passes on through the sample, and via the corresponding parts of microscope 140, and impinges on sensors in camera 150.
- Filter 108 thus operates to produce a distorted version of a magnified image produced by microscope 140.
- filter 108 is implemented as a fluorescence light distorter (FLD).
- filter 108 can be implemented as a low-pass filter, band-pass filter, high-pass filter, color correction filter, etc.
- Motor 109 is fixed to base 111, and is operable to position filter 108 to be aligned with (and perpendicular to) vertical line 'Z'.
- the motor is operated to swing filter 108 along a horizontal arc to bring filter 108 to be aligned with vertical line 'Z', and thus in the optical path.
- Filter 108 can be moved out of the optical path by motor 109, when not needed.
- motor 109 and filter 108 enable camera 150 to capture filtered or non-filtered image(s) of the sample. It is to be understood in general, that filter 108 can be positioned anywhere on the optical path Z (and between camera 150 and illuminator 110), and that the position of motor 109 and filter 108 as noted with respect to Figure 1 is merely illustrative.
- Objective lenses 104A and 104B provide magnification to the sample placed on stage 105. While only two objective lenses 104A and 104B are shown in the interest of conciseness, more than two objective lenses can be present in other embodiments.
- the term 'objective 104' refers to either one of objective lenses 104A and 104B.
- Example values of magnification that may be provided by either of lenses 104A and 104B are 4x (i.e., four times magnification), 20x, 40x, 100x, etc.
- Overall (total) magnification provided by microscope 140 for a given combination of objective 104 and C-mount 101 is the product of the individual magnifications provided by the objective lens and the lens within C-mount 101. For example, if the magnification provided by C-mount 101 is 4x, and the magnification provided by objective 104 is 10x, the total magnification provided by microscope 140 is 40x.
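The magnification arithmetic above lends itself to a one-line helper. The following Python snippet is purely illustrative (not part of the disclosure); the function name and values are examples only.

```python
# Illustrative only: overall magnification is the product of the individual
# magnifications in the optical path (e.g., C-mount lens x objective lens).
def total_magnification(c_mount_mag: float, objective_mag: float) -> float:
    return c_mount_mag * objective_mag

print(total_magnification(4, 10))  # 4x C-mount with a 10x objective -> 40x
```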
- Camera 150 is attached to microscope 140 via camera mount (C-mount) 101, and represents an image capture device.
- camera 150 contains an RGB sensor (built according to Complementary Symmetry Metal Oxide Semiconductor, or CMOS, technology) having a Bayer pattern.
- Bayer color filter array is a popular format for digital acquisition of color images, and follows a GRGR, BGBG, GRGR, BGBG pattern, as is well known.
- each pixel of the RGB sensor is covered with either a red, a green, or a blue filter, in the periodic pattern noted above, and the sensor generates three streams of output voltages (or charge) (which can be rendered as three separate monochrome images after processing in computing device 160) corresponding to the red (R), green (G) and blue (B) components (or wavelength bands) of the light that impinges on the sensor.
- different types of sensors (e.g., organic sensors) can be used in alternative embodiments.
- the sensor in camera 150 can be built to have specific filters such as A, B, C (instead of a standard Bayer pattern), with A, B, and C representing corresponding desired colors (or wavelengths or wavelength bands). Further, more than three color bands (e.g., A, B, C and D) could be generated. The number of color bands depends on the number of color filters (e.g., 4, 5, etc.) used in conjunction with the CMOS sensor in camera 150.
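As an aside, the way a Bayer-patterned readout yields three separate response planes (as described above) can be sketched in a few lines of Python. This is a generic illustration under stated assumptions (GRGR/BGBG layout, no demosaicing), not the processing actually performed in camera 150 or computing device 160.

```python
import numpy as np

# Generic sketch, not the patent's pipeline: split a raw Bayer readout
# (GRGR / BGBG rows) into three sparse colour planes. A real pipeline would
# also demosaic (interpolate) the missing samples in each plane.
def split_bayer_grbg(raw: np.ndarray):
    r, g, b = (np.zeros_like(raw) for _ in range(3))
    g[0::2, 0::2] = raw[0::2, 0::2]   # G: even rows, even columns
    r[0::2, 1::2] = raw[0::2, 1::2]   # R: even rows, odd columns
    b[1::2, 0::2] = raw[1::2, 0::2]   # B: odd rows, even columns
    g[1::2, 1::2] = raw[1::2, 1::2]   # G: odd rows, odd columns
    return r, g, b

raw = np.random.randint(0, 4096, size=(4, 4))   # toy 4x4 sensor readout
r, g, b = split_bayer_grbg(raw)
```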
- Camera 150 (and thus duplex imaging microscope 100) may be connected to a computing device (160) by a wired or wireless path 156.
- While computing device 160 is shown separate from duplex imaging microscope 100, in an alternative embodiment, computing device 160 may be integrated with camera 150 in a single package.
- Computing device 160 represents a processor-based system and may correspond to a desktop computer (PC), a portable digital assistant (PDA), a mobile phone, notebook computer, and the like.
- Computing device 160 receives one or more images from camera 150, and may perform various analyses on the received images for corresponding applications.
- One such application is hyperspectral reconstruction (generation of a hypercube), well known in the relevant arts.
- Computing device 160 may also be used to simply display the captured image(s) on a display screen contained in computing device 160.
- a user places a sample to be magnified and analyzed on stage 105, such that the sample (or the desired portion of the sample) is aligned with vertical line 'Z'.
- the user views the sample through eyepiece 102, and may adjust stage 105 (e.g., by moving stage 105 up or down) to bring the sample in focus.
- the user may select the desired magnification by selecting the corresponding one of objective lenses 104A and 104B.
- the user may either position filter 108 in the optical path (i.e., aligned with line 'Z'), or remove the filter from the optical path, by operating motor 109 (e.g., via a switch, not shown, or automatically under program control via computing device 160), depending on whether distortion by filter 108 is needed or not.
- the user 'clicks' (snaps) camera 150 to capture a single image (which is thus obtained from a single exposure of the sensor in camera 150).
- the user may capture a desired number of images using camera 150.
- the user may then operate camera 150 to forward the captured images to computing device 160 for analysis and/or display.
- a duplex imaging microscope such as 100 is used to generate a pair of images of a sample, with each image containing three or more responses.
- the two images together may contain six or more responses (images) of the (same) sample, with each response representing the sample in a corresponding band of wavelengths.
- the six or more responses may be suitably processed to generate a hypercube representing the sample.
- the pair of images may respectively be RGB images of the same sample, with one RGB image captured without a distorter placed in the optical path between camera 150 and the source of illumination 110, and the other RGB image (termed R'G' B' image) with the distorter placed in the optical path.
- the pair of images provide six responses of the sample, namely images R, G, B, R', G' and B'. Computing device 160 may process the six responses to generate a hypercube representing the sample.
- a hypercube contains multiple values for each pixel of a scene (e.g., the magnified version of the sample noted above with respect to Figure 1), with each pixel value representing the magnitude of transmittance of the portion of the sample corresponding to the pixel at a corresponding wavelength contained in the illumination that is used in capturing the image.
- for example, assuming the magnified version has 300x100 pixels and the illumination contains 400 wavelengths of interest, the hypercube contains 300x100x400 values, or 400 slices or 'images' (one corresponding to each of the 400 wavelengths), each with 300x100 pixels.
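For readers who think in code, the example hypercube above maps naturally onto a three-dimensional array. The snippet below illustrates only the data layout (the array shape and indexing match the 300x100x400 example and are otherwise assumptions), not an implementation from the disclosure.

```python
import numpy as np

# Layout sketch for the 300x100-pixel, 400-wavelength example above;
# the values here are placeholders, not measured transmittances.
hypercube = np.zeros((300, 100, 400), dtype=np.float32)

slice_at_wavelength = hypercube[:, :, 200]   # one 300x100 'image' at one wavelength
pixel_spectrum      = hypercube[10, 20, :]   # 400-value spectrum for pixel (10, 20)
```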
- the combination of the duplex imaging microscope and computing device 160 can be made to operate as a spectral imaging system.
- the process of generating the hypercube from the pair of images is described in detail in PCT publication number WO/2018/029544 noted above, and the description is not repeated here in the interest of conciseness.
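Purely to convey the flavour of turning six responses into a per-pixel spectrum, the following sketch assumes a simple linear model (each response is a known weighted sum of the unknown spectrum) solved by regularized least squares. The sensitivity matrix, regularization and function names are all assumptions for illustration; the actual reconstruction is the one described in WO/2018/029544.

```python
import numpy as np

# Hedged sketch only -- NOT the method of WO/2018/029544. Assume each of the
# six per-pixel responses [R, G, B, R', G', B'] is a linear projection of the
# unknown spectrum, with projection weights given by the sensor-band
# sensitivities (with and without the distorter) stacked as rows of S.
def reconstruct_spectrum(responses: np.ndarray, S: np.ndarray, lam: float = 1e-3):
    """responses: shape (6,); S: shape (6, n_wavelengths) assumed sensitivities."""
    n = S.shape[1]
    A = S.T @ S + lam * np.eye(n)            # Tikhonov-regularized normal equations
    return np.linalg.solve(A, S.T @ responses)

# Toy usage with made-up sensitivities over 40 wavelength samples.
S = np.abs(np.random.rand(6, 40))
true_spectrum = np.random.rand(40)
estimate = reconstruct_spectrum(S @ true_spectrum, S)
```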
- the pair of images used to generate the hypercube of the sample may be obtained from duplex imaging microscope 100 using multiple exposures of camera 150, or a single exposure of camera 150, to the magnified sample, as described next with respect to example embodiments of the present disclosure.
- Figure 2A and Figure 2B are diagrams showing relevant details of duplex imaging microscope 100 of Figure 1, and are used to illustrate the manner in which six distinct responses (e.g., in the form of two dissimilar RGB images of the same scene/sample) are obtained from a sample in an embodiment of the present disclosure.
- arrow 260 represents illumination (light beam) from illuminator 110 of Figure 1
- vertical line ‘Z’ represents the optical path from illuminator to camera, and has the same meaning as in Figure 1.
- computing device 160 is not shown in either of Figures 2A and 2B.
- Figure 2A illustrates the details of duplex imaging microscope 100 for capturing a first image containing three responses, for example R, G and B.
- the user operates motor 109 (not shown in Figure 2, but same as that in Figure 1) to remove filter 108 from the optical path.
- filter 108 is shown in Figure 2A as being removed from the optical path represented by line 'Z'.
- With the filter removed from the optical path, light 260 from illuminator 110 falling on the sample does not pass through filter 108.
- the user places the sample to be analyzed on stage 105, in the optical path, i.e., with the portion of the sample to be viewed in line with vertical line 'Z'. The user adjusts the focus by viewing the sample via eyepiece 102, and then clicks/snaps camera 150 to capture an image (the first image).
- the image contains three responses, namely red (R), green (G) and blue (B), with the R, G and B responses being distinct images respectively representing the scene/sample in Red, Green and Blue wavelength bands.
- Figure 2B illustrates the details of duplex imaging microscope 100 for capturing a second image containing three responses R' (R-prime), G' (G-prime) and B' (B-prime).
- the user operates motor 109 (not shown in Figure 2, but same as that in Figure 1) to bring filter 108 into the optical path, i.e., in line with line 'Z', as shown in the figure. Since the filter is in the optical path, light 260 from illuminator 110 passes through filter 108 before falling on the sample. With the sample undisturbed, the user clicks/snaps camera 150 to capture an image (the second image).
- the second image contains another three responses, namely red-prime (R'), green-prime (G') and blue-prime (B'), with the R', G' and B' responses being another three distinct images respectively representing the scene/sample in red (R), green (G) and blue (B), but now with the intensity in the R, G, and B bands being altered by filter 108.
- the R', G' and B' responses are distinct from the R, G and B responses, even though the frequency bands in which they fall (namely red, green and blue) are the same. Therefore the six images obtained in bands R, G, B, R', G' and B' represent six distinct responses obtained from the same sample.
- the terms 'first image' and 'second image' are used only to distinguish between the two images captured by camera 150, and do not indicate any sequence for their acquisition.
- the user can also first capture an image with the filter in the optical path, and then capture another image without the filter in the optical path.
- the user causes camera 150 to forward the six responses in the form of the two images RGB and R'G'B' to computing device 160 via path 156.
- Computing device 160 may process the six responses to generate the hypercube representing the sample.
- the user may send commands via computing device 160 to camera 150 to sequentially capture the RGB and R'G'B' images (with motor 109 also remotely controlled by computing device 160 via means not shown), and then may retrieve the images from camera 150 via computing device 160.
- a duplex imaging microscope is used to generate six responses from a single exposure (single click/snap) of a camera, as described next with respect to Figure 3 and Figure 4.
- FIG. 3 is a block diagram that functionally depicts a duplex imaging microscope (300) in another embodiment of the present disclosure.
- Duplex imaging microscope 300 is shown there containing microscope 340 and cameras 350A and 350B. Additionally, a computing device 160 is also shown in the Figure. Only the differences from duplex imaging microscope 100 of Figure 1 are noted below, with all other details being similar to that of duplex imaging microscope 100.
- all components excluding cameras 350A and 350B and computing device 160 together represent a microscope 340.
- Microscope 340 has a binocular head containing two eyepieces, and does not contain a motor for selectively moving a filter into and out of the optical path. Instead, a filter is permanently fixed to one of the eyepieces. Further, system 300 employs two cameras, instead of one.
- objective 304, stage 305 and condenser 306 are implemented similar respectively to objective 104, stage 105 and condenser 106 of Figure 1, and their description is not repeated here in the interest of conciseness.
- the magnified version of the sample/scene (placed on stage 305) created by objective 304 is provided (simultaneously) to each of eyepieces 302A and 302B.
- Such simultaneous provision is an inbuilt feature of microscope 340, and may be achieved for example using mirrors/beam splitters.
- Eyepieces 302A and 302B may provide additional magnification to the scene/sample, and are contained in a binocular head of microscope 340. The magnification provided by each of the two eyepieces 302A and 302B is the same.
- Connector 352A is designed to allow camera 350A to be firmly attached to eyepiece 302A.
- Filter 308 is attached to the viewing end of eyepiece 302B, and may be implemented for example as a fluorescence light distorter (FLD).
- Connector 352B is designed to allow camera 350B to be attached to filter 308.
- Arrow 360 represents illumination (light beam) from an illumination source (not shown, but similar to illuminator 110 of Figure 1).
- Each of cameras 350A and 350B may be implemented similar to camera 150 of Figure 1.
- the user places the sample to be analyzed on stage 305 beneath (and aligned with) objective 304. Once the sample is in focus (the user may view the sample via the cameras), the user clicks/snaps both cameras simultaneously to obtain two images. Accordingly, a common snap mechanism may be implemented in a known way. Alternatively, the user can capture a first image by clicking on one of cameras 350A and 350B, and then without disturbing the sample, capture a second image by clicking on the other one of cameras 350A and 350B. Assuming the sample does not change in visual characteristics (as done with Figures 2A/2B as well), both the images represent the same scene.
- the image captured by camera 350A contains three responses, namely red (R), green (G) and blue (B), with the R, G and B responses being distinct images respectively representing the scene/sample in Red, Green and Blue wavelength bands.
- the image captured by camera 350B contains another three responses of the same scene/sample, namely red-prime (R'), green-prime (G') and blue-prime (B'), with the R', G' and B' responses being another three distinct images respectively representing the scene/sample in red (R), green (G) and blue (B), but now with the intensity in the R, G, and B bands being filtered/distorted by filter 308.
- the R', G' and B' responses are different and distinct from the R, G and B responses obtained in the first image. Therefore the six images R, G, B, R', G' and B' represent six distinct responses obtained from the same sample.
- the user may instruct cameras 350A and 350B, via computing device 160 (over respective paths 356A and 356B), to capture the RGB and R'G'B' images, and thereafter retrieve the images.
- the user causes cameras 350A and 350B to forward the six responses in the form of two images RGB and R'G'B' to computing device 160 via paths 356A and 356B respectively.
- Computing device 160 processes the six responses to generate the hypercube representing the sample according to the techniques described above.
- the user may send commands via computing device 160 to cameras 350A and 350B to capture the RGB and R'G'B' images, and then may retrieve the images from the cameras via computing device 160.
- FIG. 4 is a block diagram that functionally depicts a duplex imaging microscope (400) in another embodiment of the present disclosure, and is shown containing microscope 440 and cameras 450A and 450B. Only the differences from duplex imaging microscope 100 of Figure 1 are noted below, with all other details being similar to that of duplex imaging microscope 100.
- Microscope 440 is a trinocular microscope, which is modified to include a beam splitter (470) as described below. Being a trinocular microscope, microscope 440 has two eyepieces, only one of which, 402, is shown in Figure 4. Instead of the single C-mount that is provided in a conventional trinocular microscope, microscope 440 is implemented with a beam splitter and a pair of C-mounts to accommodate attachment of two cameras to the microscope. Microscope 440 does not contain a motor for selectively moving a filter into and out of the optical path. Instead, a filter is permanently fixed to the microscope as noted below.
- objective 404, stage 405 and condenser 406 are implemented similar respectively to objective 104, stage 105 and condenser 106 of Figure 1, and their description is not repeated here in the interest of conciseness.
- Each of cameras 450A and 450B may be implemented similar to camera 150 of Figure 1.
- Camera 450A is attached to beam splitter 470 via C-mount 401 A.
- Filter 408 is attached to C-mount 401B.
- Camera 450B is attached to filter 408 via C-mount 401B.
- filter 408 is first attached to C-mount 401B, and the combination then attached to beam splitter 470. Then, camera 450B is connected to C-mount 401B.
- Each of C-mounts 401A and 401B may contain lenses internally to provide a (same) magnification as that provided by eyepiece 402.
- Arrow 460 represents illumination (light beam) from an illumination source (not shown, but similar to illuminator 110 of Figure 1).
- Objective 404 generates a magnified version of the scene/sample (placed on stage 405).
- Vertical line‘A’ in Figure 4 indicates the direction traversed by the magnified version of the sample before entering beam splitter 470.
- the magnified version is split into two beams (or two images) by beam splitter 470.
- the two beams traverse respective paths B1 and B2.
- the beam on path B1 impinges on camera 450A.
- the beam on path B2 traverses through filter 408, and then impinges on camera 450B.
- the implementation details of beam splitter 470 are described below with respect to the example of Figure 5.
- the user may view the sample via eyepiece 402, and bring the sample in focus. Then, the user clicks/snaps both cameras 450A and 450B simultaneously to obtain two images.
- a common snap mechanism may be implemented in a known way, just as noted above with respect to Figure 3.
- the user can capture a first image by clicking on one of cameras 450A and 450B, and then without disturbing the sample, capture a second image by clicking on the other one of cameras 450A and 450B.
- the image captured by camera 450A contains three responses, namely red (R), green (G) and blue (B), with the R, G and B responses being distinct images respectively representing the scene/sample in Red, Green and Blue wavelength bands, each of which bands may include multiple wavelengths.
- the image captured by camera 450B contains another three responses of the same scene, namely red-prime (R'), green-prime (G') and blue-prime (B'), with the R', G' and B' representing the scene/sample in red (R), green (G) and blue (B) bands, but now with the intensity in the R, G, and B bands being altered (filtered/distorted) by filter 408.
- the R', G' and B' responses are distinct from the R, G and B responses, even though the frequency bands in which they fall (namely red, green and blue) are the same. Therefore the six images obtained in bands R, G, B, R', G' and B' represent six distinct responses obtained from the same sample.
- FIG. 5 is a block diagram illustrating the implementation details of beam splitter 470 in an embodiment of the present disclosure. Beam splitter 470 is shown containing optical components 550, 560 and 570, which may be enclosed in an aluminum enclosure, not shown.
- Each of the three components 550, 560 and 570 may be made of a kind of glass called BK7, which has good transmission properties in the visible light wavelength region.
- the vertical lines A, B1 and B2 in Figure 5 correspond respectively to lines A, B1 and B2 of Figure 4.
- Component 550 is made by fusing a right-angled prism and a square glass block.
- Component 560 is made of two right-angled prisms 560A and 560B fixed to each other as shown in the Figure.
- Component 570 is implemented as a right-angled prism.
- Component 560 is designed to split an incoming light beam (along path A) into two beams, and to send the two beams in opposite directions.
- an incoming light beam 501 impinges on edge 561, and is split into beams 502 and 503.
- Beam 503 is reflected at ninety degrees by edge 562, and continues as beam 504.
- Edge 551 of component 550 is designed to reflect beam 502 in the direction B1 towards camera 450A.
- Edge 571 of component 570 is designed to reflect beam 504 in the direction of B2 towards camera 450B.
- the length (distance) traversed by the two beams inside beam splitter 470 may need to be equal to ensure that the respective images captured by cameras 450A and 450B have the same magnification. To ensure such equal path lengths, length L1 of part 550 may be accordingly provided.
- Edge 552 of component 550 and edge 572 of component 570 may be viewed as first output port and second output port respectively of beam splitter 470.
- While the cameras there are noted as containing RGB color sensors, in general the sensors in the cameras can be implemented to generate A, B, and C responses simultaneously, in which A, B and C represent corresponding desired colors (or wavelengths or wavelength bands). Further, more than three color bands (e.g., A, B, C and D) could be generated. The number of color bands depends on the number of color filters (e.g., 4, 5, etc.) used in conjunction with the CMOS sensor in the cameras.
- the filters of Figures 1 through 4 may be viewed, in combination with the microscope there, as operating to generate a distorted version of a magnified image of a sample produced by the microscope.
- While six responses are described as being generated and processed to form the hypercube, more than six responses can also be generated instead, and processed to form the hypercube.
- a duplex imaging microscope as described herein may be used to identify certain undesirable conditions or defects in a sample.
- the system can be used to compare the data in the hypercube representing the sample against spectral signatures (stored, for example, in computing device 160) representing base characteristics of human cells.
- the base characteristics can be indicative of disease or health, for example.
- Computing device 160 can attempt to match portions/spectra in the hypercube with the spectral signatures (stored in computing device 160, and obtained in a known way) of diseased and healthy human cells.
- Computing device 160 can then classify the sample as either diseased or healthy.
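As one possible illustration of such matching (the disclosure does not specify the comparison metric), the sketch below scores a pixel spectrum from the hypercube against stored reference signatures using the spectral angle and picks the closest label; the metric choice, labels and names are assumptions, not details from the disclosure.

```python
import numpy as np

# Illustrative classification sketch; the spectral-angle metric and the
# labels are assumptions, not taken from the disclosure.
def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_pixel(spectrum: np.ndarray, signatures: dict) -> str:
    """signatures: label -> reference spectrum (same length as `spectrum`)."""
    return min(signatures, key=lambda label: spectral_angle(spectrum, signatures[label]))

signatures = {"healthy": np.random.rand(400), "diseased": np.random.rand(400)}
label = classify_pixel(np.random.rand(400), signatures)
```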
- Computing device 160 can be implemented in various embodiments as a desired combination of one or more of hardware, executable modules, and firmware. The description is continued with respect to an embodiment in which various features are operative when the software instructions described above are executed.
- FIG. 6 is a block diagram illustrating the details of computing device 160, in an embodiment of the present disclosure.
- Computing device 160 may contain one or more processors such as a central processing unit (CPU) 610, random access memory (RAM) 620, secondary memory 630, graphics controller 660, display unit 670, camera interface 680, and input interface 690. All the components except display unit 670 may communicate with each other over communication path 650, which may contain several buses as is well known in the relevant arts. The components of Figure 6 are described below in further detail.
- CPU 610 may execute instructions stored in RAM 620 to provide several features of the present disclosure.
- CPU 610 operates to generate a hypercube from the six responses received from camera(s) via camera interface 680.
- CPU 610 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 610 may contain only a single general-purpose processing unit.
- RAM 620 may receive instructions from secondary memory 630 using communication path 650.
- RAM 620 may store data representing a hypercube computed as noted above.
- RAM 620 is shown currently containing software instructions constituting shared environment 625 and applications 626.
- Shared environment 625 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 626.
- Graphics controller 660 generates display signals (e.g., in RGB format) to display unit 670 based on data/instructions received from CPU 610.
- Display unit 670 contains a display screen to display the images defined by the display signals.
- display unit 670 may display the RGB and R'G'B' images received from camera(s) as noted above.
- Input interface 690 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) that may be used to provide appropriate inputs.
- Camera interface 680 receives images from a camera(s) in the corresponding duplex imaging microscope, and forwards the images to CPU 610 for processing. Camera interface 680 may also be used by CPU 610 to control various operations by the camera(s), such as remotely issuing commands to capture images.
- Secondary memory 630 represents a non-transitory computer readable medium and may contain hard drive 635, flash memory 636, and removable storage drive 637. Secondary memory 630 contains instructions (for example, instructions representing the application that generates a hypercube from six responses, as described above), which enable computing device 160 to provide several features in accordance with the present disclosure.
- the code/instructions stored in secondary memory 630 either may be copied to RAM 620 prior to execution by CPU 610 for higher execution speeds, or may be directly executed by CPU 610.
- removable storage unit 640 may be implemented using medium and storage format compatible with removable storage drive 637 such that removable storage drive 637 can read the data and instructions.
- removable storage unit 640 includes a computer readable (storage) medium having stored therein computer software and/or data.
- the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).
- Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 630.
- Volatile media includes dynamic memory, such as RAM 620.
- storage media include, for example, a floppy disk, a flexible disk, hard disk, solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
- Storage media is distinct from but may be used in conjunction with transmission media.
- Transmission media participates in transferring information between storage media.
- transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 650.
- transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
Landscapes
- Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Microscopes, Condenser (AREA)
Abstract
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IN201841033006 | 2018-09-03 | | |
| IN201841033006 | 2018-09-03 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020049383A1 (fr) | 2020-03-12 |
Family
ID=69722274
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2019/056832 (WO2020049383A1, Ceased) | Duplex imaging microscope (Microscope d'imagerie double) | 2018-09-03 | 2019-08-12 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2020049383A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20090195646A1 (en) * | 2006-05-11 | 2009-08-06 | Michael Ganser | Microscope comprising a camera connection and a camera adapter |
| US9736365B2 (en) * | 2013-10-26 | 2017-08-15 | Light Labs Inc. | Zoom related methods and apparatus |
| US20180045569A1 (en) * | 2016-08-12 | 2018-02-15 | Spectral Insights Private Limited | Spectral imaging system |
- 2019-08-12: WO PCT/IB2019/056832 patent/WO2020049383A1/fr not_active Ceased
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19858022; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19858022; Country of ref document: EP; Kind code of ref document: A1 |
| | 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC DATED 02.11.2022 (EPO FORM 1205A) |