WO2024238875A2 - System and method for swept-angle interferometry - Google Patents
- Publication number
- WO2024238875A2 (PCT/US2024/029827)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target
- lens
- beams
- steering device
- wavelengths
- Prior art date
- Legal status: Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B9/00—Measuring instruments characterised by the use of optical techniques
- G01B9/02—Interferometers
- G01B9/02001—Interferometers characterised by controlling or generating intrinsic radiation properties
- G01B9/02007—Two or more frequencies or sources used for interferometric measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B9/00—Measuring instruments characterised by the use of optical techniques
- G01B9/02—Interferometers
- G01B9/02001—Interferometers characterised by controlling or generating intrinsic radiation properties
- G01B9/0201—Interferometers characterised by controlling or generating intrinsic radiation properties using temporal phase variation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B9/00—Measuring instruments characterised by the use of optical techniques
- G01B9/02—Interferometers
- G01B9/02041—Interferometers characterised by particular imaging or detection techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B9/00—Measuring instruments characterised by the use of optical techniques
- G01B9/02—Interferometers
- G01B9/02083—Interferometers characterised by particular signal processing and presentation
- G01B9/02087—Combining two or more images of the same region
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B2290/00—Aspects of interferometers not specifically covered by any group under G01B9/02
- G01B2290/65—Spatial scanning object beam
Definitions
- This disclosure relates generally to interferometry and, in non-limiting embodiments, to a system and method for swept-angle interferometry.
- Depth sensing is among the core problems of computer vision and computational imaging, with widespread applications in medicine, industry, and robotics.
- An array of techniques is available for acquiring depth maps of three-dimensional (3D) scenes at different scales.
- Micrometer-resolution depth sensing is important in biomedical imaging, where biological features are often micron-scale; in industrial fabrication and inspection, where critical parts must conform to their specifications; and in robotics, where fine objects must be handled.
- Active illumination depth sensing techniques such as lidar, structured light, and correlation time-of-flight (ToF) cannot provide micrometer axial resolution.
- a system for swept-angle interferometry comprising: a light source configured to emit at least one beam having at least two wavelengths; a beam steering device configured to steer the at least one beam across at least one axis of motion; a lens arranged in an optical path between the beam steering device and a target of the at least one beam, the lens having a focal plane; and a relay lens arranged in the optical path between the beam steering device and the lens.
- the system further comprises: a collimator arranged in the optical path between the light source and the beam steering device, the collimator configured to narrow the at least one beam.
- the system further comprises: at least one computing device programmed or configured to: control the beam steering device to steer the at least one beam across two axes of motion through the relay lens.
- steering the at least one beam across the two axes of motion through the relay lens comprises steering the at least one beam to be incident upon the relay lens at different angles such that the at least one beam is focused on different portions of the focal plane.
- the system further comprises: a beam splitter arranged between the lens and the target, the beam splitter configured to: (i) split the at least one beam into a first plurality of beams and a second plurality of beams, (ii) direct the first plurality of beams to a reference mirror, (iii) direct the second plurality of beams to the target, and (iv) direct illumination reflected from the reference mirror and illumination reflected from the target to a sensor.
- the system further comprises: at least one computing device programmed or configured to: receive sensor data from the sensor, the sensor data representing the illumination reflected by the target and the illumination reflected by the reference mirror; and determine, based on the sensor data, a depth of at least one surface feature of the target.
- the at least one beam comprises: a plurality of beams having different wavelengths comprising the at least two wavelengths, a beam having a continuum of wavelengths comprising the at least two wavelengths, or any combination thereof.
- a method for swept-angle interferometry comprising: controlling a light source to emit at least one beam having at least two wavelengths toward a target; controlling a beam steering device to steer the at least one beam across at least one axis of motion within a focal plane of a lens arranged in an optical path between the beam steering device and the target; and determining, with at least one processor, a depth of at least one surface feature of the target based on illumination reflected by the target from the at least one beam.
- the method further comprises arranging a relay lens in the optical path between the beam steering device and the lens.
- controlling the beam steering device comprises steering the at least one beam across two axes of motion through the relay lens.
- the at least one beam comprises: a plurality of beams having different wavelengths comprising the at least two wavelengths, a beam having a continuum of wavelengths comprising the at least two wavelengths, or any combination thereof.
- the method further comprises arranging a collimator in the optical path between the light source and the beam steering device, the collimator configured to narrow the at least one beam.
- the method further comprises arranging a beam splitter between the lens and the target, the beam splitter configured to: (i) split the at least one beam into a first plurality of beams and a second plurality of beams, (ii) direct the first plurality of beams to a reference mirror, (iii) direct the second plurality of beams to the target, and (iv) direct illumination reflected from the reference mirror and the illumination reflected from the target to a sensor.
- a system for swept-angle interferometry comprising: a light source configured to emit at least one beam having at least two wavelengths toward a target; a beam steering device; and at least one processor configured to control the beam steering device to steer the at least one beam across at least two axes of motion through at least one lens arranged in an optical path between the light source and the target such that the at least one beam is incident upon the at least one lens at angles that differ during the motion.
- the system further comprises: at least one sensor configured to capture an illumination reflected by the target from the at least one beam, wherein the at least one processor is configured to determine a depth of at least one surface feature of the target based on the illumination captured by the at least one sensor.
- the at least one processor comprises a first processor configured to steer the beam steering device and a second processor configured to determine the depth of the at least one surface feature.
- the system further comprises the at least one lens, the at least one lens comprising: a first lens arranged in an optical path between the beam steering device and the target of the at least one beam, the lens having a focal plane; and a relay lens arranged in the optical path between the beam steering device and the first lens, the at least one beam is incident upon the relay lens at angles that differ during the motion such that the at least one beam is focused on different portions of the focal plane of the first lens.
- the system further comprises: a collimator arranged in an optical path between the light source and the beam steering device, the collimator configured to narrow the at least one beam.
- the system further comprises: a beam splitter arranged between the at least one lens and the target, the beam splitter configured to: (i) split a plurality of beams into a first plurality of beams and a second plurality of beams, (ii) direct the first plurality of beams to a reference mirror, (iii) direct the second plurality of beams to the target, and (iv) direct illumination reflected from the reference mirror and the illumination reflected from the target to at least one sensor.
- Clause 1 A system for swept-angle interferometry comprising: a light source configured to emit at least one beam having at least two wavelengths; a beam steering device configured to steer the at least one beam across at least one axis of motion; a lens arranged in an optical path between the beam steering device and a target of the at least one beam, the lens having a focal plane; and a relay lens arranged in the optical path between the beam steering device and the lens.
- Clause 2 The system of clause 1, further comprising: a collimator arranged in the optical path between the light source and the beam steering device, the collimator configured to narrow the at least one beam.
- Clause 3 The system of clause 1 or 2, further comprising: at least one computing device programmed or configured to: control the beam steering device to steer the at least one beam across two axes of motion through the relay lens.
- Clause 4 The system of any of clauses 1-3, wherein steering the at least one beam across the two axes of motion through the relay lens comprises steering the at least one beam to be incident upon the relay lens at different angles such that the at least one beam is focused on different portions of the focal plane.
- Clause 5 The system of any of clauses 1-4, further comprising: a beam splitter arranged between the lens and the target, the beam splitter configured to: (i) split the at least one beam into a first plurality of beams and a second plurality of beams, (ii) direct the first plurality of beams to a reference mirror, (iii) direct the second plurality of beams to the target, and (iv) direct illumination reflected from the reference mirror and illumination reflected from the target to a sensor.
- Clause 6 The system of any of clauses 1-5, further comprising: at least one computing device programmed or configured to: receive sensor data from the sensor, the sensor data representing the illumination reflected by the target and the illumination reflected by the reference mirror; and determine, based on the sensor data, a depth of at least one surface feature of the target.
- Clause 7 The system of any of clauses 1-6, wherein the at least one beam comprises: a plurality of beams having different wavelengths comprising the at least two wavelengths, a beam having a continuum of wavelengths comprising the at least two wavelengths, or any combination thereof.
- Clause 8 A method for swept-angle interferometry comprising: controlling a light source to emit at least one beam having at least two wavelengths toward a target; controlling a beam steering device to steer the at least one beam across at least one axis of motion within a focal plane of a lens arranged in an optical path between the beam steering device and the target; and determining, with at least one processor, a depth of at least one surface feature of the target based on illumination reflected by the target from the at least one beam.
- Clause 9 The method of clause 8, further comprising arranging a relay lens in the optical path between the beam steering device and the lens.
- Clause 10 The method of clause 8 or 9, wherein controlling the beam steering device comprises steering the at least one beam across two axes of motion through the relay lens.
- Clause 11 The method of any of clauses 8-10, wherein the at least one beam comprises: a plurality of beams having different wavelengths comprising the at least two wavelengths, a beam having a continuum of wavelengths comprising the at least two wavelengths, or any combination thereof.
- Clause 12 The method of any of clauses 8-11, further comprising arranging a collimator in the optical path between the light source and the beam steering device, the collimator configured to narrow the at least one beam.
- Clause 13 The method of any of clauses 8-12, further comprising arranging a beam splitter between the lens and the target, the beam splitter configured to: (i) split the at least one beam into a first plurality of beams and a second plurality of beams, (ii) direct the first plurality of beams to a reference mirror, (iii) direct the second plurality of beams to the target, and (iv) direct illumination reflected from the reference mirror and the illumination reflected from the target to a sensor.
- Clause 14 The method of any of clauses 8-13, further comprising receiving sensor data from the sensor, the sensor data representing the illumination reflected by the target and the illumination reflected by the reference mirror, wherein the depth of the at least one surface feature is determined based on the sensor data.
- Clause 16 The system of clause 15, further comprising: at least one sensor configured to capture an illumination reflected by the target from the at least one beam, wherein the at least one processor is configured to determine a depth of at least one surface feature of the target based on the illumination captured by the at least one sensor.
- Clause 17 The system of clause 15 or 16, wherein the at least one processor comprises a first processor configured to steer the beam steering device and a second processor configured to determine the depth of the at least one surface feature.
- Clause 18 The system of any of clauses 15-17, further comprising the at least one lens, the at least one lens comprising: a first lens arranged in an optical path between the beam steering device and the target of the at least one beam, the lens having a focal plane; and a relay lens arranged in the optical path between the beam steering device and the first lens, wherein the at least one beam is incident upon the relay lens at angles that differ during the motion such that the at least one beam is focused on different portions of the focal plane of the first lens.
- Clause 19 The system of any of clauses 15-18, further comprising: a collimator arranged in an optical path between the light source and the beam steering device, the collimator configured to narrow the at least one beam.
- Clause 20 The system of any of clauses 15-19, further comprising: a beam splitter arranged between the at least one lens and the target, the beam splitter configured to: (i) split a plurality of beams into a first plurality of beams and a second plurality of beams, (ii) direct the first plurality of beams to a reference mirror, (iii) direct the second plurality of beams to the target, and (iv) direct illumination reflected from the reference mirror and the illumination reflected from the target to at least one sensor.
- FIG. 1 is a schematic diagram of a system for swept-angle interferometry according to non-limiting embodiments.
- FIG. 2 is a schematic diagram of an image processing pipeline in a system for swept-angle interferometry according to non-limiting embodiments.
- FIG. 3 is a flow diagram of a method for swept-angle interferometry according to non-limiting embodiments.
- FIG. 4 illustrates example components of a computing device used in connection with non-limiting embodiments.
- the terms “communication” and “communicate” refer to the receipt or transfer of one or more signals, messages, commands, or other types of data.
- For one unit (e.g., any device, system, or component thereof) to be in communication with another unit means that the one unit is able to directly or indirectly receive data from and/or transmit data to the other unit. This may refer to a direct or indirect connection that is wired and/or wireless in nature.
- two units may be in communication with each other even though the data transmitted may be modified, processed, relayed, and/or routed between the first and second unit.
- a first unit may be in communication with a second unit even though the first unit passively receives data and does not actively transmit data to the second unit.
- a first unit may be in communication with a second unit if an intermediary unit processes data from one unit and transmits processed data to the second unit. It will be appreciated that numerous other arrangements are possible.
- Non-limiting embodiments described herein provide for a new and innovative imaging system and method that may be referred to as swept-angle interferometry or swept-angle synthetic wavelength interferometry.
- Non-limiting embodiments provide for full-field, micron-scale 3D sensing.
- Non-limiting embodiments provide an unconventional light source that, by emulating spatially- incoherent illumination, results in interferometric measurements insensitive to global illumination.
- Non-limiting embodiments advantageously provide a faster, more efficient way to retrieve robust, detailed depth results that can recover full-frame depth at a spatial and axial resolution of a few micrometers using a limited number of measurements (e.g., 16 measurements in some examples), resulting in fast acquisition at frame rates of 1 Hz.
- Non-limiting embodiments may produce detailed measurements from data captured orders of magnitude faster, and while operating under strong ambient light, compared with a resource-intensive full-field optical coherence tomography system.
- Non-limiting embodiments described herein utilize components of a Michelson interferometer in combination with a novel arrangement of devices and processes.
- In a Michelson interferometer, a beam splitter divides collimated input illumination into two beams: one propagates toward the scene arm, and the other toward the reference arm (e.g., a planar mirror mounted on a translation stage that can vary the mirror’s distance from the beam splitter). After reflection, the two light beams recombine at the beam splitter and propagate toward the sensor.
- The variables l and d(x) represent the distances from the beam splitter of the reference mirror and of the scene point that pixel x images, respectively. Since l is a controllable parameter, it may be denoted explicitly.
- Let u_r(x, l) and u_s(x) represent the complex fields arriving at sensor pixel x from the reference and scene arms, respectively. Then, the sensor measures: I(x, l) = |u_s(x)|^2 + |u_r(x, l)|^2 + 2 Re{C(x, l)} (Equation 1)
- The first two terms in Equation 1 correspond to the intensities the sensor would measure if it were observing each of the two arms separately. Their sum may be referred to as the interference-free image.
- The third term, called interference, relates to the complex correlation C(x, l) = u_s(x) · u_r*(x, l) between the reflected scene and reference fields.
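As a numerical illustration of Equation 1 (a minimal single-wavelength sketch with hypothetical field amplitudes and distances, not the patent's exact configuration), the measured intensity decomposes into the interference-free image plus twice the real part of the correlation:

```python
import numpy as np

# Hypothetical values: one wavelength, unit-scale complex fields.
wavelength = 780e-9            # metres (a wavelength mentioned in the disclosure)
l = 0.10                       # reference-mirror distance from the beam splitter
d = 0.10 + 5e-6                # scene-point distance imaged by pixel x

u_r = 1.0 * np.exp(2j * np.pi * 2 * l / wavelength)   # reference field (round trip 2l)
u_s = 0.8 * np.exp(2j * np.pi * 2 * d / wavelength)   # scene field (round trip 2d)
C = u_s * np.conj(u_r)                                # complex correlation

interference_free = abs(u_s) ** 2 + abs(u_r) ** 2     # first two terms of Equation 1
intensity = interference_free + 2 * C.real            # full Equation 1

# Sanity check: Equation 1 is just the expansion of |u_s + u_r|^2.
assert np.isclose(intensity, abs(u_s + u_r) ** 2)
```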
- This illumination is assumed to be injected into the interferometer as a collimated beam (e.g., in some examples, created by placing the outputs of two fiber-coupled single-frequency lasers at the focal plane of a lens).
- For dichromatic illumination with wavelengths λ1 and λ2 (wavenumbers k_i = 2π/λ_i), the correlation equals: C(x, l) ∝ exp(j 2k_1 (d(x) − l)) + exp(j 2k_2 (d(x) − l)) (Equation 2)
- The interference component of the camera intensity measurements in Equation 1 is then: 2 Re{C(x, l)} ∝ cos(2κ_c (d(x) − l)) · cos(κ_Λ (d(x) − l)) (Equations 3 and 4)
- The interference is the product of two sinusoids: first, a carrier sinusoid with carrier wavelength λ_c = 2λ1λ2/(λ1 + λ2) and corresponding carrier wavenumber κ_c = 2π/λ_c; second, an envelope sinusoid with synthetic wavelength Λ = λ1λ2/|λ1 − λ2| and corresponding synthetic wavenumber κ_Λ = 2π/Λ (Equation 5)
- SWI encodes scene depth d(x) in the phase of the envelope sinusoid.
- SWI provides depth measurements at intervals of the synthetic wavelength Λ and may not disambiguate between depths differing by an integer multiple of Λ.
- The use of two wavelengths makes it possible to control the unambiguous depth range: decreasing the separation between the two emitted wavelengths increases the unambiguous depth range, at the cost of decreased depth resolution.
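The range/resolution trade-off can be made concrete with a short sketch. The formula Λ = λ1λ2/|λ1 − λ2| is the standard synthetic-wavelength definition, and the 780 nm / 781 nm pair comes from the example lasers later in the disclosure:

```python
def synthetic_wavelength(lam1: float, lam2: float) -> float:
    """Synthetic wavelength for dichromatic illumination (standard formula)."""
    return lam1 * lam2 / abs(lam1 - lam2)

# Example: the 780 nm and 781 nm lasers mentioned in the disclosure.
lam = synthetic_wavelength(780e-9, 781e-9)
print(f"synthetic wavelength: {lam * 1e6:.0f} um")   # prints: synthetic wavelength: 609 um

# Decreasing the wavelength separation increases the unambiguous range
# (larger synthetic wavelength), at the cost of depth resolution.
assert synthetic_wavelength(780e-9, 780.5e-9) > lam
```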
- Full-field interferometers use free-space optics (e.g., lenses, beam splitters, and/or the like) to illuminate and image the entire field of view in the scene and reference arms.
- Full-field interferometers also use a two-dimensional sensor to measure interference at all locations x. This enables fast interference measurements for all scene points at once and at lateral resolutions as high as the pixel pitch of the sensor.
- However, full-field interferometers are susceptible to phase corruption effects. Equation 5 assumes that the scene field is due to only the direct light path, which produces a sinusoidal envelope with phase delay d(x). In practice, the scene field will include contributions from additional paths: first, indirect paths due to subsurface scattering; second, stray paths due to optical aberrations. These paths have different lengths, and thus contribute envelope sinusoidal terms of different phases. The summation of these terms produces an overall sinusoidal envelope with a corrupted phase, resulting in incorrect depth estimation.
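The phase-corruption effect described above can be illustrated numerically. In this sketch (all path depths, amplitudes, and the synthetic wavelength are hypothetical), the envelope contribution of each path is modeled as a complex phasor; summing a direct and an indirect path yields a phase that no longer corresponds to the direct-path depth:

```python
import numpy as np

LAM = 609e-6                      # synthetic wavelength (assumed value)
d_direct = 100e-6                 # direct-path depth
d_indirect = 180e-6               # e.g., a longer subsurface-scattering path
a_direct, a_indirect = 1.0, 0.4   # relative path amplitudes (hypothetical)

# Complex envelope phasors for each path; the sensor sees their sum.
p = (a_direct * np.exp(2j * np.pi * d_direct / LAM)
     + a_indirect * np.exp(2j * np.pi * d_indirect / LAM))

# Depth implied by the summed phase: biased away from the direct path.
d_est = np.angle(p) * LAM / (2 * np.pi)
print(f"direct path: {d_direct * 1e6:.1f} um, estimated: {d_est * 1e6:.1f} um")
```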
- Scanning interferometers use fiber optics (e.g., couplers, circulators, collimators, and/or the like) to generate a focused beam that illuminates only one point in the scene and reference arms. Additionally, scanning interferometers use a single-pixel sensor focused at the same point. Scanning interferometers also use steering optics (e.g., micro-electro-mechanical systems (MEMS) mirrors) to scan the focus point and capture interference measurements for the entire scene. Scanning interferometers may effectively mitigate some phase corruption effects because, at any given time, they only illuminate and image one point in the scene. Such scanners eliminate contributions from indirect paths.
- scanning interferometers may eliminate stray paths. This robustness comes at the cost of having to use beam steering to scan the entire scene, resulting in long acquisition times, especially when it is necessary to measure depth at pixel-level lateral resolutions and at a sensor-equivalent field of view.
- FIG. 1 depicts a system 1000 for swept-angle interferometry according to non-limiting embodiments or aspects.
- the system 1000 may be used for determining the depth of surface features of a target 116 (e.g., scene), such as one or more objects or portions thereof.
- the system 1000 includes a light source 100, which may be a plurality of lasers or any other light-emitting device capable of outputting at least one light beam having at least two wavelengths.
- the light source 100 may emit one or more beams of light having different wavelengths such that they can be differentiated.
- the light source 100 may emit at least one beam having at least two wavelengths.
- the emitted at least one beam may be a plurality of beams having different wavelengths.
- the emitted at least one beam may include a beam or multiple beams having a continuum of wavelengths along a spectrum.
- a coupler 110 is arranged in an optical path between the light source 100 and the target 116.
- a collimator 112 is arranged in the optical path between the coupler 110 and the target 116 and may narrow the plurality of beams into a collimated beam.
- The outputs of two fiber-coupled single-frequency lasers may be placed at the focal plane of the lens 106. It will be appreciated that various arrangements may be used to direct at least one beam from the light source 100 to the beam steering device 104.
- Extending an emission area of a light source also broadens its emission spectrum. This can make spatially-incoherent light sources incompatible with SWI that may rely on narrow-linewidth dichromatic illumination.
- The light source 100 and beam steering device 104 are combined to result in an area emitter, as opposed to a point emitter, that changes the illumination from a single collimated beam parallel to the optical axis to a superposition of beams traveling along directions offset from the optical axis by angles θ, where the range of θ depends on the emission area and the lens focal length.
- The complex fields resulting from each such beam reflect off the target as u_s(x, θ). Because different points on the area emitter are incoherent with each other, the measured image is a sum over emission angles: I(x, l) = Σ_θ I(x, l, θ) (Equation 6)
- That is, the image I measured using an area emitter equals the sum of the images that would be measured using independent point emitters spanning the emission area, each producing illumination at an angle θ (and likewise for the correlation C).
- the system 1000 may be configured to emulate a spatially-incoherent source suitable for SWI through time-division multiplexing through the use of the beam steering device 104.
- the system 1000 shown in FIG. 1 includes a beam steering device 104 configured to move along at least two axes.
- the beam steering device 104 may be moved by one or more motors controlled by one or more computing devices (not shown in FIG. 1).
- the beam steering device 104 may be arranged in the optical path between the collimator 112 and the target 116.
- the beam steering device 104 may include a galvo mirror (e.g., mirror galvanometer) that steers a plurality of beams that are formed into a collimated beam of narrow-linewidth dichromatic illumination.
- a relay lens 102 is arranged in the optical path between the beam steering device 104 and the target 116.
- Another lens 106 (e.g., an illumination lens) is arranged in the optical path between the relay lens 102 and the target 116.
- the beam steering device 104 may steer the light beams through different portions of the relay lens 102 such that the plurality of beams are incident upon the relay lens 102 at differing angles. For example, as the beam steering device 104 is controlled to move in a first axis and/or a second axis, the beams of light being reflected by the beam steering device move with respect to the stationary relay lens 102 and therefore the angles at which the beams hit the relay lens 102 also change.
- the lenses 102, 106 may be arranged such that the beams exiting the relay lens 102 are focused on a focal plane 105 of the lens 106 and the beams are moved over an area of the focal plane 105.
- the beam steering device 104 may move the direction of the collimated beam and, as the beam moves, the focus point of the beam may scan an area on the focal plane 105 in x- and y-directions.
- the system 1000 further includes a beam splitter 112 arranged in an optical path between the lens 106 and the target 116.
- the beam splitter 112 may be arranged to split the plurality of beams into a first plurality of beams and a second plurality of beams.
- the first plurality of beams may be directed by the beam splitter 112 to a reference mirror 118, which redirects the first plurality of beams to a sensor 114.
- the second plurality of beams may be directed by the beam splitter 112 to the target 116.
- the illumination reflected by the target 116 as a result of the second plurality of light beams may be further redirected by the beam splitter 112 to the sensor 114.
- the sensor 114 thus receives the illumination reflected from the target 116 as a result of the second plurality of beams, in addition to receiving the first plurality of beams reflected from the reference mirror 118.
- the beam splitter 112 may include a thin 50:50 plate beam splitter, which may reduce fringes as compared to pellicle and cube beam splitters.
- the beam splitter 112 may be misaligned by design to avoid interreflections that would result in strong fringes.
- the reference mirror 118 may have a guaranteed λ/4 flatness to ensure a uniform phase reference throughout the field of view of the sensor 114.
- the reference mirror 118 may be configured on a translation stage.
- the translation stage may be a Newport ultra-precision motorized linear translation stage with a positioning accuracy of up to 10 nm and a minimum incremental motion of 1 nm.
- the sensor 114 includes a camera and a camera lens. Because the scenes are sized on the order of 1 inch in some non-limiting embodiments, using a camera lens that achieves high magnifications (e.g., a 1:1 reproduction ratio) provides improved performance and allows for better contrast due to lower averaging of speckle (e.g., where the interference signal is convolved with the pixel box when captured with the camera).
- the camera lens may include, for example, a 180 mm Canon prime macro lens arranged in front of the camera.
- the camera may be a machine vision camera (e.g., from Allied Vision, as an example) with a high-sensitivity CCD sensor of 8 MP resolution and a pixel size of 3.5 µm.
- a sensor 114 with a small pixel pitch averages interference speckle over a smaller spatial area, therefore allowing the system to resolve finer lateral detail.
- a camera may be modified by removing the protective glass to avoid spurious interreflections caused by the protective glass.
- one or more computing devices 120 may receive the sensor data captured by the sensor 114 and process the sensor data from both the first plurality of light beams (e.g., illumination from reference light beams, representative of the light beams before being reflected by the target 116) and the illumination resulting from the second plurality of light beams reflecting from the target 116.
- the computing device 120 may process the sensor data with one or more interferometric algorithms to determine a depth of one or more surface features of the target 116.
- the computing device 120 may generate a full or partial three-dimensional model of the target or a portion thereof.
- a nanometer-accuracy translation process (e.g., using a translation stage) may be implemented to vary the location l of the reference mirror 118.
- the envelope phase is estimated using an N-shift phase retrieval algorithm. It is assumed that estimates of the envelope ℰ(x, lₙ) are established at reference locations lₙ corresponding to shifts by fractions of the synthetic wavelength.
- the envelope phase may be estimated as: (Equation 7) and the depth may be estimated (up to an integer multiple of half the synthetic wavelength) as: (Equation 8)
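The retrieval step can be sketched in Python. This is a minimal illustration, not a reproduction of the patent's Equations 7 and 8 (which are not extracted above); the function names, the cosine envelope model, and the round-trip phase-to-depth convention d = φΛ/(4π) are assumptions:

```python
import numpy as np

def n_shift_phase(envelope_samples):
    """Standard N-step phase-shifting estimator: given envelope samples
    E_n taken at reference shifts of n * Lambda / N (n = 0..N-1), where
    E_n ~ cos(phi - 2*pi*n/N), recover the wrapped envelope phase phi."""
    N = len(envelope_samples)
    num = sum(E * np.sin(2 * np.pi * n / N) for n, E in enumerate(envelope_samples))
    den = sum(E * np.cos(2 * np.pi * n / N) for n, E in enumerate(envelope_samples))
    return np.arctan2(num, den)

def phase_to_depth(phase, synthetic_wavelength):
    """Assumed round-trip convention phi = 4*pi*d / Lambda, so depth is
    recovered up to an integer multiple of Lambda / 2."""
    return (phase % (2 * np.pi)) * synthetic_wavelength / (4 * np.pi)
```

For N = 4 this estimator reduces to the classic four-bucket formula atan2(E₁ − E₃, E₀ − E₂).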
- Equation 11 for determining the envelope may estimate the envelope only up to sign, although it will be appreciated that other variations are possible in non-limiting embodiments.
- the light source 100 may include two distributed Bragg-reflector lasers (having wavelengths 780 nm and 781 nm, power 45 mW, linewidth 1 MHz).
- the lenses 102, 106 may be two compound macro lenses (having a focal length of 200 mm).
- the sensor 114 may be a charge-coupled device (CCD) sensor (having a pixel pitch of 3.7 μm and a resolution of 3400 x 2700 pixels).
- the reproduction ratio may be 1:1
- the field of view may be 12.5 mm x 10 mm
- the working distance may be 400 mm.
- the unambiguous depth range may be approximately 500 μm.
- a minimum per-image exposure time may be 10 ms, resulting in a frame rate of 5 Hz.
- the beam steering device 104 may include two fast-rotating mirrors to scan the laser beam in a 1° x 1° angular pattern at slightly separated kHz frequencies.
- the relay lens 102 may be, e.g., a 35 mm Nikon prime lens in non-limiting examples.
- the created light source traces a dense Lissajous curve that approximates a square.
- the illumination lens 106 may include a 200 mm Nikon prime lens.
- the illumination lens 106 may be a photographic lens, which may outperform AR-coated achromatic doublets in terms of spherical and chromatic aberration, therefore resulting in significantly less distortion in the generated wavefront.
- near-infrared single frequency tunable laser diodes may be used as the light source 100. Such laser diodes may be tunable in wavelength by adjusting either operating current or temperature of the diode.
- the operating current of one laser diode may be modulated with a square waveform, thus creating two time-multiplexed wavelengths.
- two different laser diodes may be used that are selected at the appropriate central wavelengths.
- the light source 100 may be monochromatic (single longitudinal mode), stable in wavelength and power, and accurately tunable.
- the synthetic wavelength resulting from this illumination is sensitive to the separation between the two wavelengths, especially at microscopic scales.
- the actual synthetic wavelength may be estimated by measuring the envelope sinusoid for a planar diffuser scene at a dense collection of reference arm positions and then fitting a sinusoid to these measured magnitudes and using the fit wavelength as the synthetic wavelength.
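The calibration step above can be sketched as follows. This is a sketch under assumptions: the sinusoid model and function names are illustrative, and SciPy's `curve_fit` stands in for whatever fitting routine an implementation uses:

```python
import numpy as np
from scipy.optimize import curve_fit

def calibrate_synthetic_wavelength(positions, magnitudes, wavelength_guess):
    """Fit a sinusoid to envelope magnitudes measured at a dense set of
    reference-arm positions; the fitted period is then used as the
    synthetic wavelength in later depth estimation."""
    def model(l, amplitude, wavelength, phase0, offset):
        return amplitude * np.cos(2 * np.pi * l / wavelength + phase0) + offset

    # initial guesses: half peak-to-peak amplitude, nominal wavelength, mean offset
    p0 = [np.ptp(magnitudes) / 2, wavelength_guess, 0.0, np.mean(magnitudes)]
    popt, _ = curve_fit(model, positions, magnitudes, p0=p0, maxfev=20000)
    return abs(popt[1])  # fitted synthetic wavelength
```

A reasonable nominal wavelength guess (from the laser wavelength separation) is needed, since the frequency fit has many local minima.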
- the phase estimation pipeline 200 may represent an algorithm that can be referred to as an {M, N}-shift phase algorithm, where the parameters M and N are design parameters that can be fine-tuned and/or adjusted to balance the competing objectives of acquisition time and depth accuracy (e.g., adjusting the trade-off between speed and accuracy).
- the example shown in FIG. 2 is a {4, 4}-shift phase algorithm, where the values of M and N are both 4. As the values of M and N are increased, the number M * N of total images to capture also increases. At the same time, the final depth estimate becomes more robust to noise.
- the theoretical minimum number of images may be achieved using {3, 3} shifts, corresponding to 9 images.
- different shifts may be used. For example, performing {4, 4} shifts, corresponding to 16 images, as shown in the example phase estimation pipeline 200, may provide robust performance across a variety of target objects (e.g., scenes).
- input images 202 may represent 16 reference positions in a {4, 4}-shift phase configuration, where the reference positions correspond to 4 synthetic by 4 carrier subwavelength shifts.
- an interference-free image 204 may be estimated for each synthetic subwavelength shift.
- an envelope image 206 may be estimated for each synthetic subwavelength shift.
- each envelope image 206 may be denoised using, for example, joint bilateral filtering, resulting in denoised envelope images 208.
- a 4-shift phase retrieval is used to estimate the envelope phase 210 and depth 212.
- interference in non-specular scenes may be in the form of speckle, a high-frequency pseudo-random pattern. This may be shown in input images (e.g., images 202 from FIG. 2). Speckle may result in a noisy envelope, which results in noise and inaccuracy in phase and depth estimates (e.g., 210, 212).
- the use of swept-angle illumination, as discussed with respect to FIG. 1, mitigates the effects of speckle. Speckle may be further reduced by denoising the estimated quantities with a low-pass filter (e.g., Gaussian). Additionally or alternatively, bilateral filtering with a guide image of the scanned scene captured under ambient light may be used to avoid blurring image details.
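The guide-image (joint) bilateral filter mentioned above can be sketched in plain NumPy. This is illustrative only; a production implementation might instead use a library routine such as OpenCV's `jointBilateralFilter` (in the `ximgproc` contrib module), and the parameter values here are assumptions:

```python
import numpy as np

def joint_bilateral_filter(src, guide, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Denoise `src` (e.g., a noisy envelope image) while preserving the
    edges of `guide` (e.g., an ambient-light image of the scene): each
    neighbor is weighted by spatial distance AND by guide-intensity
    difference, so smoothing does not cross guide edges."""
    H, W = src.shape
    pad = radius
    src_p = np.pad(src, pad, mode='reflect')
    gd_p = np.pad(guide, pad, mode='reflect')
    out = np.zeros((H, W), dtype=float)
    wsum = np.zeros((H, W), dtype=float)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted_src = src_p[pad + dy:pad + dy + H, pad + dx:pad + dx + W]
            shifted_gd = gd_p[pad + dy:pad + dy + H, pad + dx:pad + dx + W]
            # spatial Gaussian times range Gaussian on the GUIDE image
            w = np.exp(-(dy * dy + dx * dx) / (2 * sigma_s ** 2)
                       - (shifted_gd - guide) ** 2 / (2 * sigma_r ** 2))
            out += w * shifted_src
            wsum += w
    return out / wsum
```

With a constant guide the range term is uniform and the filter degenerates to a plain Gaussian blur, which is the low-pass alternative also mentioned above.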
- the envelope estimates are blurred before being processed to determine the final phase and depth estimates, which results in improved results compared to blurring the final phase and depth estimates.
- the optical set-up of the system 1000 can be aligned to optimize the depth estimation accuracy.
- the light source 100, collimator 112, beam steering device 104, and/or relay lens 102 may be arranged on one or more rigid cage systems.
- the mirrors of the beam steering device may be tuned electronically by adjusting their driving waveform's direct current (DC) offset, ensuring a mean direction of light propagation that is parallel to the optical axis of the interferometer.
- one or more software functions may be executed to reconstruct depth from {4, 4}-shift swept-angle synthetic wavelength interferometry frames.
- interference-free images may be obtained at each four-bucket position by averaging images captured with sub-wavelength shifts.
- interference images at each four-bucket position may be obtained by subtracting interference-free images from the full images.
- the absolute values of the envelope may be estimated by squaring and adding interference images.
- the estimated envelope may be filtered with bilateral filtering using an ambient light image of the scene as the guide image.
- the four-bucket phase retrieval algorithm may then be applied to estimate phase, which may be converted to depth.
- the following algorithm may be used for acquisition using an {M, N}-shift phase retrieval algorithm:
- the following algorithm may be used for reconstruction using an {M, N}-shift phase retrieval algorithm:
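The reconstruction steps described above can be sketched as follows. This is a schematic Python sketch of the described pipeline, not the patent's exact algorithm listing or Equations 7 through 11 (which are not reproduced here); the array layout, the envelope normalization, and the phase-to-depth convention are assumptions:

```python
import numpy as np

def reconstruct_depth(frames, synthetic_wavelength):
    """Reconstruct depth from {M, N}-shift frames.

    frames: array of shape (M, N, H, W), i.e. M synthetic-subwavelength
    ("bucket") positions, each captured at N carrier-subwavelength shifts.
    """
    M, N = frames.shape[:2]
    # 1) interference-free image at each bucket: average over carrier shifts
    interference_free = frames.mean(axis=1)                  # (M, H, W)
    # 2) interference component: subtract the interference-free image
    interference = frames - interference_free[:, None]       # (M, N, H, W)
    # 3) envelope magnitude: square and add the interference images
    envelope = np.sqrt((interference ** 2).mean(axis=1))     # (M, H, W)
    # 4) (optionally denoise `envelope` here, e.g. joint bilateral filtering)
    # 5) M-shift phase retrieval on the per-bucket envelope samples
    k = np.arange(M).reshape(M, 1, 1)
    num = (envelope * np.sin(2 * np.pi * k / M)).sum(axis=0)
    den = (envelope * np.cos(2 * np.pi * k / M)).sum(axis=0)
    phase = np.arctan2(num, den)
    # 6) phase -> depth, valid up to multiples of the unambiguous range
    return (phase % (2 * np.pi)) * synthetic_wavelength / (4 * np.pi)
```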
- a method for swept-angle interferometry is shown according to non-limiting embodiments or aspects. It will be appreciated that the order of the steps shown in FIG. 3 is for illustrative purposes only and that non-limiting embodiments may involve more steps, fewer steps, different steps, and/or a different order of steps.
- a light source is controlled to emit at least one beam having at least two wavelengths. For example, a plurality of lasers may be emitted where each laser has a different wavelength. In other examples, one or more laser beams may be emitted that output a continuum of wavelengths along a spectrum.
- a beam steering device may be controlled to steer the at least one beam across two axes of motion within a focal plane of a lens (e.g., an illumination lens in the optical path of the light beams).
- the beam steering device may include a mirror galvanometer controlled with a computing device.
- the beam steering device may move the at least one beam through a relay lens arranged in the optical path such that the beam is incident upon the relay lens at different angles. For example, with a stationary relay lens and a moving light beam, the angle that the beam hits the relay lens will change as the position of the beam moves across the relay lens.
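The position-to-angle relationship underlying this arrangement can be illustrated with a line of thin-lens trigonometry (a simplified sketch; the 200 mm focal length below is taken from the example illumination lens mentioned elsewhere in this document, and the offset value is illustrative):

```python
import math

def illumination_angle_deg(spot_offset_mm, focal_length_mm):
    """A point source displaced by `spot_offset_mm` within the focal plane
    of a lens emerges on the far side as a collimated beam tilted by
    atan(offset / f) relative to the optical axis."""
    return math.degrees(math.atan(spot_offset_mm / focal_length_mm))

# e.g., with a 200 mm illumination lens, sweeping the focused spot over
# about +/- 1.75 mm steers the collimated output over roughly +/- 0.5 degrees,
# i.e. on the order of the 1 degree x 1 degree scan pattern described above
print(illumination_angle_deg(1.75, 200.0))  # ~0.5
```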
- a computing device may receive sensor data from a sensor arranged to receive illumination reflected by the target (e.g., illumination from the light beams reflecting off of the target) and a reference mirror that reflects one or more split beams (e.g., reference light beams).
- the computing device may then execute a phase retrieval pipeline (e.g., phase retrieval algorithm) at steps 306-314 based on the sensor data.
- intensity measurements may be captured for multiple reference positions. For example, for M synthetic subwavelength shifts and N carrier subwavelength shifts, intensity measurements may be taken at M*N positions.
- estimated interference-free images and estimated envelope images are determined.
- the interference-free images may be determined using Equation 10 and the envelope images may be determined using Equation 11.
- the envelope images may be denoised.
- denoising may be performed using joint bilateral filtering, although it will be appreciated that other denoising techniques may be used.
- the estimated envelope phase may be determined using Equation 7 and the estimated depth may be determined using Equation 8. It will be appreciated that variations of the example equations may be used to estimate the phase and depth.
- phase unwrapping algorithms may be used in connection with the systems and methods described herein. Such phase unwrapping may include capturing measurements at multiple synthetic wavelengths, and using the captured measurements to unwrap the phase estimate.
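One common form of such unwrapping can be sketched as follows. This is a sketch under assumptions (standard hierarchical two-wavelength unwrapping, where a coarse estimate from a larger synthetic wavelength resolves the integer ambiguity of a finer wrapped estimate; not necessarily the exact procedure intended here):

```python
import numpy as np

def unwrap_with_coarse(depth_fine_wrapped, depth_coarse, lambda_fine):
    """Resolve the integer ambiguity of a fine wrapped depth estimate
    using a coarse (but unambiguous) estimate from a larger synthetic
    wavelength. `lambda_fine` is the finer synthetic wavelength, whose
    unambiguous range is lambda_fine / 2."""
    range_fine = lambda_fine / 2.0
    # integer number of unambiguous ranges suggested by the coarse estimate
    k = np.round((depth_coarse - depth_fine_wrapped) / range_fine)
    return depth_fine_wrapped + k * range_fine
```

The coarse estimate only needs to be accurate to within half of the fine unambiguous range for the rounding to pick the correct integer.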
- non-limiting embodiments are robust to strong ambient lighting through the use of near-monochromatic illumination, even where the ambient lighting presents 10% signal-to-background ratios. Further, nonlimiting embodiments may be adjusted per implementation and/or in an ad hoc manner by adjusting the values of M and N parameters, thereby adjusting acquisition time and reconstruction quality.
- this translates to the square root of 1875, about 43 points.
- the captured images may have the approximate dimension of 1600 x 1300. Distributing these points equally along the larger dimension yields a downsampling factor of 1600/43 ≈ 37. This calculation is beneficial for two reasons. First, the typical scan rate will be lower than the nominal scan rate of 30 kHz because of the need to scan a larger field of view and/or the inability to drive both axes at resonant mode. Second, for scenes with high reflectivity (e.g., metallic scenes), some non-limiting embodiments may operate at 10 Hz, and thus the number of scanned points should be 10x fewer.
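Working the numbers above through (the 1875-point budget and the 1600-pixel image width are the example values from the text):

```python
import math

points_per_exposure = 1875                          # scannable points in one exposure
points_per_side = math.isqrt(points_per_exposure)   # integer square root: 43 per side
image_width = 1600                                  # larger dimension of the image
downsampling_factor = image_width / points_per_side # pixels per scanned point
print(points_per_side, round(downsampling_factor))  # 43 37
```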
- a laser beam that can be collimated or focused at a few micrometers may be used to achieve micrometer lateral resolutions.
- some non-limiting implementations may use a beam that is focused at each point on the scene. However, focusing the laser beam onto the scanned scene points sharply decreases the depth of field of the imaging system. Thus, such implementations may utilize another axial scan to ensure that the scanned point is within the depth of field, which adds to acquisition time.
- SWI does not need to perform lateral scanning. Instead, it accomplishes direct-only (e.g., coaxial) imaging by scanning an area in the focal plane of the collimating lens, an operation that can be done in the resonant mode of a MEMS mirror within a single exposure.
- using higher values of M and N in an {M, N}-shift algorithm allows for a reduction in the per-image acquisition time by requiring a lower scanned source density for equal visual depth quality. For example, a 100 ms scan with the {4, 5}-shift algorithm performs as well as a 10 ms scan with a {5, 5}-shift algorithm, reducing the total acquisition time from 2 s to 250 ms.
- the depth range may be tuned based on the current used in an implementation.
- the use of two wavelengths in synthetic wavelength interferometry makes it possible to control the unambiguous depth range: by decreasing the separation Δλ between the two laser wavelengths, the unambiguous depth range is increased at the cost of decreased depth resolution.
- picometer separations in wavelengths result in synthetic wavelengths of centimeters.
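The relationship behind these numbers is the standard two-wavelength identity Λ = λ₁λ₂/|λ₁ − λ₂| (the 20 pm separation below is an illustrative value, not one stated in the text):

```python
def synthetic_wavelength(lambda1, lambda2):
    """Synthetic wavelength of two-wavelength interferometry:
    Lambda = lambda1 * lambda2 / |lambda1 - lambda2| (same units in/out)."""
    return lambda1 * lambda2 / abs(lambda1 - lambda2)

# the 780 nm / 781 nm DBR pair mentioned above -> Lambda of about 0.61 mm
print(synthetic_wavelength(780e-9, 781e-9))     # ~6.09e-4 m
# a separation of tens of picometers pushes Lambda to centimeters
print(synthetic_wavelength(780e-9, 780.02e-9))  # ~3.0e-2 m (20 pm apart)
```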
- the use of swept-angle illumination greatly improves reconstruction quality.
- picometer-scale wavelength separation may be achieved using current-based tuning of distributed Bragg reflector (DBR) lasers, which have a linear response of wavelength to current near their operating point.
- the current may be tuned by, for example, 50 mA to provide picometer-scale wavelength separations.
- ambient light that may degrade a depth calculation may be rejected by using an ultra-narrow spectral filter centered at the average illumination wavelength.
- Non-limiting embodiments described herein may be used for medical OCT imaging, surface detection, and/or any other application in which interferometry can be used to sense features of an object, scene, person, or the like.
- device 900 may include additional components, fewer components, different components, or differently arranged components than those shown in FIG. 4.
- Device 900 may include a bus 902, a processor 904, memory 906, a storage component 908, an input component 910, an output component 912, and a communication interface 914.
- Bus 902 may include a component that permits communication among the components of device 900.
- processor 904 may be implemented in hardware, firmware, or a combination of hardware and software.
- processor 904 may include a processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), etc.), a microprocessor, a digital signal processor (DSP), and/or any processing component (e.g., a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), etc.) that can be programmed to perform a function.
- Memory 906 may include random access memory (RAM), read only memory (ROM), and/or another type of dynamic or static storage device (e.g., flash memory, magnetic memory, optical memory, etc.) that stores information and/or instructions for use by processor 904.
- storage component 908 may store information and/or software related to the operation and use of device 900.
- storage component 908 may include a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, etc.) and/or another type of computer-readable medium.
- Input component 910 may include a component that permits device 900 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a keypad, a mouse, a button, a switch, a microphone, etc.).
- input component 910 may include a sensor for sensing information (e.g., a photo-sensor, a thermal sensor, an electromagnetic field sensor, a global positioning system (GPS) component, an accelerometer, a gyroscope, an actuator, etc.).
- Output component 912 may include a component that provides output information from device 900 (e.g., a display, a speaker, one or more light-emitting diodes (LEDs), etc.).
- Communication interface 914 may include a transceiver-like component (e.g., a transceiver, a separate receiver and transmitter, etc.) that enables device 900 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections.
- Communication interface 914 may permit device 900 to receive information from another device and/or provide information to another device.
- communication interface 914 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a Wi-Fi® interface, a cellular network interface, and/or the like.
- Device 900 may perform one or more processes described herein. Device 900 may perform these processes based on processor 904 executing software instructions stored by a computer-readable medium, such as memory 906 and/or storage component 908.
- a computer-readable medium may include any non- transitory memory device.
- a memory device includes memory space located inside of a single physical storage device or memory space spread across multiple physical storage devices.
- Software instructions may be read into memory 906 and/or storage component 908 from another computer-readable medium or from another device via communication interface 914. When executed, software instructions stored in memory 906 and/or storage component 908 may cause processor 904 to perform one or more processes described herein.
- hardwired circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein.
- embodiments described herein are not limited to any specific combination of hardware circuitry and software.
- the term “programmed or configured,” as used herein, refers to an arrangement of software, hardware circuitry, or any combination thereof on one or more devices.
Abstract
Disclosed are systems and methods for swept-angle interferometry. A system includes a light source configured to emit at least one beam having at least two wavelengths, a beam steering device configured to steer the at least one beam across at least one axis of motion, a lens arranged in an optical path between the beam steering device and a target of the at least one beam, the lens having a focal plane, and a relay lens arranged in the optical path between the beam steering device and the lens.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363467338P | 2023-05-18 | 2023-05-18 | |
| US63/467,338 | 2023-05-18 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2024238875A2 true WO2024238875A2 (fr) | 2024-11-21 |
| WO2024238875A3 WO2024238875A3 (fr) | 2025-05-01 |
Family
ID=93520234
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/029827 Pending WO2024238875A2 (fr) | 2023-05-18 | 2024-05-17 | Système et procédé d'interférométrie à angle balayé |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024238875A2 (fr) |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4452815B2 (ja) * | 2007-07-31 | 2010-04-21 | レーザーテック株式会社 | 深さ測定装置 |
| KR100939537B1 (ko) * | 2007-12-14 | 2010-02-03 | (주) 인텍플러스 | 표면 형상 측정 시스템 및 그를 이용한 측정 방법 |
| WO2021065582A1 (fr) * | 2019-09-30 | 2021-04-08 | 株式会社ニコン | Dispositif ophtalmique et système optique ophtalmique |
| US12292558B2 (en) * | 2021-04-08 | 2025-05-06 | LighTopTech Corp. | Dual-mode optical coherence tomography and optical coherence microscopy imaging systems and methods |
- 2024-05-17: WO application PCT/US2024/029827 filed as WO2024238875A2; status: active, pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2024238875A3 (fr) | 2025-05-01 |