WO2024191523A2 - Rapid detection of hidden defects and objects using a single-pixel diffractive terahertz sensor - Google Patents


Info

Publication number
WO2024191523A2
WO2024191523A2 (application PCT/US2024/013612)
Authority
WO
WIPO (PCT)
Prior art keywords
diffractive
defect
sensor
output
substrate layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/013612
Other languages
English (en)
Other versions
WO2024191523A3 (fr)
Inventor
Aydogan Ozcan
Jingxi LI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of California Berkeley
University of California San Diego UCSD
Original Assignee
University of California Berkeley
University of California San Diego UCSD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of California Berkeley, University of California San Diego UCSD filed Critical University of California Berkeley
Publication of WO2024191523A2 publication Critical patent/WO2024191523A2/fr
Publication of WO2024191523A3 publication Critical patent/WO2024191523A3/fr
Anticipated expiration legal-status Critical
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N21/35Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light
    • G01N21/3581Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light using far infrared light; using Terahertz radiation
    • G01N21/3586Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light using far infrared light; using Terahertz radiation by Terahertz time domain spectroscopy [THz-TDS]
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/9501Semiconductor wafers
    • G01N21/9505Wafer internal defects, e.g. microcracks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143Sensing or illuminating at different wavelengths
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/58Extraction of image or video features relating to hyperspectral data
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Definitions

  • the technical field relates to a diffractive sensor that rapidly detects hidden defects/objects within a target sample using a single-pixel spectroscopic terahertz detector, without scanning the sample or forming/processing its image.
  • Non-invasive detection systems based on terahertz technology offer unique opportunities for this purpose due to the ability of terahertz waves to penetrate most optically-opaque materials and capture the molecular fingerprint of the sample through the rich spectral signatures of different materials in the terahertz band.
  • terahertz time-domain spectroscopy (THz-TDS) systems have been extensively used in various non-destructive quality control applications since they can provide frequency-resolved and time-resolved responses of hidden objects.
  • existing THz-TDS systems are single-pixel and require raster scanning to acquire the image of the hidden features, resulting in relatively low-speed/low-throughput systems.
  • Nonlinear optical processes can also be utilized to convert the terahertz information of the illuminated sample to the near-infrared regime to visualize the hidden structural information of the sample through an optical camera without raster scanning.
  • these imaging systems offer relatively low signal-to-noise ratio (SNR) levels and require bulky and expensive high-energy lasers to offer acceptable nonlinear conversion efficiencies.
  • SNR signal-to-noise ratio
  • terahertz information of the illuminated sample can be encoded using spatial light modulators and the image data can be resolved using computational methods without raster scanning.
  • the physical constraints of spatial light modulators operating at terahertz wavelengths limit the speed, and increase the size, cost, and complexity of these imaging systems.
  • terahertz focal-plane arrays based on field-effect transistors and microbolometers do not provide time-resolved and frequency- resolved image data, limiting the types of structural information that can be detected. Due to these limitations, the space-bandwidth products (SBPs) of existing terahertz imaging systems are orders of magnitude lower than their counterparts operating in the visible band, thereby constraining the system's overall information throughput and its capacity to adequately capture the desired details of the hidden structures of interest.
  • SBPs space-bandwidth products
  • a diffractive sensor that can rapidly detect hidden defects or target features (e.g., objects) within a target sample volume or target object (e.g., an object to be tested) using a single-pixel spectroscopic terahertz detector.
  • this single-pixel diffractive sensor rapidly inspects the volume of the test sample or object illuminated with terahertz radiation, without the formation or digital processing of an image of the sample.
  • the detection system is treated as a coherent diffractive sensor of terahertz waves that can all-optically search for and classify undesired or unexpected sources of secondary waves generated by diffraction through hidden defects or structures.
  • the diffractive sensor can be considered an all-optical sensor for unexpected or hidden sources of secondary waves within a test volume or object, which are detected through a single-pixel spectroscopic detector.
  • the design includes a series of diffractive layers (also referred to as substrate layers), optimized to modify the spectrum of the terahertz radiation scattered from the test sample volume or object according to the absence or presence of target features (e.g., hidden structures) or defects.
  • the diffractive layers are jointly optimized using deep learning, and contain tens of thousands of subwavelength phase features. Once their deep learning-based training is complete, the resulting diffractive layers are physically fabricated using 3D printing or additive manufacturing, which forms an optical neural network portion of the diffractive sensor.
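The joint optimization described above can be illustrated with a toy numerical sketch. This is not the patent's actual deep-learning training pipeline (which jointly optimizes multiple layers with tens of thousands of subwavelength phase features); it is a minimal, assumed stand-in: a single 8×8 phase-only layer, a scalar angular-spectrum propagation model, and greedy finite-difference ascent that maximizes the intensity reaching a single detector pixel. All dimensions, the wavelength, and the optimizer are illustrative assumptions.

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Free-space propagation of a complex field over distance z."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * z), 0.0)  # drop evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * H)

def detector_intensity(phase, wavelength=1.0e-3, dx=0.5e-3, z=20e-3):
    """Plane wave -> phase-only layer -> propagation -> centre-pixel intensity."""
    out = angular_spectrum(np.exp(1j * phase), wavelength, dx, z)
    c = out.shape[0] // 2
    return np.abs(out[c, c])**2

# Toy "training": greedy finite-difference ascent on an 8x8 phase map,
# maximizing the intensity reaching the single detector pixel.
rng = np.random.default_rng(0)
phase = rng.uniform(0.0, 2.0 * np.pi, (8, 8))
score0 = detector_intensity(phase)
score = score0
eps, lr = 1e-4, 1.0
for _ in range(20):
    grad = np.zeros_like(phase)
    for i in range(8):
        for j in range(8):
            p = phase.copy()
            p[i, j] += eps
            grad[i, j] = (detector_intensity(p) - score) / eps
    cand = phase + lr * grad
    if detector_intensity(cand) > score:   # keep only improving steps
        phase, score = cand, detector_intensity(cand)

print(score0, score)
```

In a realistic design, automatic differentiation replaces the finite differences and the loss encodes the desired spectral response rather than a single-wavelength intensity.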
  • the scattered terahertz waves from the object are all-optically processed by the diffractive network and sampled by a single-pixel spectroscopic detector at the output aperture of the system.
  • the measured spectrum reveals the existence of hidden defects/structures within the object all-optically, without the need for raster scanning or any image reconstruction or processing steps. Since these target features or defects of interest are hidden within a solid volume that forms the object, traditional machine vision approaches that operate at visible wavelengths cannot provide an alternative approach for these tasks.
  • a diffractive sensor device was created and tested that detected hidden defects in silicon samples, which were prepared by stacking two wafers: one wafer containing etched defects and the other covering the defective regions.
  • the diffractive layers were designed to introduce a differential variation in the peak spectral intensity near two predetermined terahertz wavelengths.
  • This diffractive defect sensor was realized using a single-pixel THz-TDS system with a plasmonic nanoantenna-based source generating pulsed terahertz illumination and a plasmonic nanoantenna-based detector sampling the terahertz spectrum at the output aperture.
  • the performance of the diffractive defect sensor was numerically analyzed by evaluating its detection sensitivity as a function of the size and the position of the hidden defects within the detection field-of-view (FOV), also covering subwavelength feature sizes.
  • the optimized diffractive layers were fabricated using a 3D printer, and experimental tests were conducted for hidden defect detection.
  • the experimental results on silicon wafers with various unknown defect sizes and positions showed good agreement with the numerical analysis, successfully revealing the presence of unknown hidden defects.
  • the reported approach offers distinct advantages compared to the existing terahertz imaging and sensing systems.
  • the hidden target feature/defect detection is accomplished using, in some embodiments, a single-pixel spectroscopic detector, eliminating the need for a focal plane array or raster scanning, thus greatly simplifying and accelerating the target feature/defect detection process.
  • the diffractive layers employed are passive optical components, enabling the diffractive sensor to analyze the test object volume without requiring any external power source except for the terahertz illumination and single-pixel detector.
  • the all-optical end-to-end detection process negates the need for memory, data/image transmission or digital processing using e.g., a graphics processing unit (GPU), resulting in a high-throughput target feature/defect detection scheme.
  • a graphics processing unit GPU
  • the diffractive target feature/defect sensors reported herein were primarily designed for the terahertz band, the underlying concept and design approaches are also applicable for defect detection in other parts of the spectrum, including infrared, visible, and X-ray.
  • a diffractive sensor for sensing a defect or a target feature in an object includes a multi-wavelength illumination system including a broadband light source or individual light sources centered at different wavelengths of electromagnetic radiation and one or more output detectors arranged along an optical path; one or more optically transmissive and/or reflective substrate layer(s) arranged in the optical path and having the object interposed between or outside of the one or more optically transmissive and/or reflective substrate layer(s) along the optical path, each of the optically transmissive and/or reflective substrate layer(s) comprising a plurality of physical features formed on or within the one or more optically transmissive and/or reflective substrate layer(s) and having different transmission and/or reflection properties as a function of the lateral coordinates across each substrate layer, wherein the one or more optically transmissive and/or reflective substrate layer(s) and the object collectively modulate an input optical signal from the light source(s) to generate output optical signals at different wavelengths at the one or more output detectors.
  • a method of detecting the presence of a defect or a target feature in an object includes providing a diffractive sensor for sensing the defect or the target feature in the object including: (1) a multi-wavelength illumination system comprising a broadband light source or individual light sources centered at different wavelengths of electromagnetic radiation and one or more output detectors arranged along an optical path; and (2) one or more optically transmissive and/or reflective substrate layer(s) arranged in the optical path and having the object interposed between or outside of the one or more optically transmissive and/or reflective substrate layer(s) along the optical path, each of the optically transmissive and/or reflective substrate layer(s) comprising a plurality of physical features formed on or within the one or more optically transmissive or reflective substrate layer(s) and having different transmission and/or reflection properties as a function of the lateral coordinates across each substrate layer, wherein the one or more optically transmissive and/or reflective substrate layer(s) and the object collectively modulate an input optical signal from the light source(s) to generate output optical signals at different wavelengths at the one or more output detectors.
  • FIG.1A illustrates a schematic representation of a diffractive sensor according to one embodiment.
  • This embodiment illustrates a diffractive sensor that operates in transmission mode.
  • a multi-wavelength illumination system illuminates the diffractive sensor, which includes a plurality of substrate layers and an object or sample volume that is being tested for a defect or the presence of a target feature.
  • the object is interposed between two substrate layers on either side.
  • One or more output detectors positioned along the optical path at the output of the diffractive sensor capture the optical output signals at a plurality of wavelengths.
  • the one or more output detectors are used to obtain the power spectrum signal from the diffractive sensor.
  • a differential spectral score is calculated and compared to a threshold.
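As a rough illustration of this step, the sketch below computes a hedged, assumed form of the differential spectral score, s = (I(λ1) − I(λ2)) / (I(λ1) + I(λ2)), from band-averaged intensities at two predetermined wavelengths, and compares it to a threshold. The exact score definition, the wavelengths, the bandwidth, and the example spectra are hypothetical, not taken from the patent.

```python
import numpy as np

def differential_spectral_score(wavelengths, spectrum, lam1, lam2, bandwidth=0.01):
    """Assumed score form: normalized difference of the spectral intensity
    averaged in a small relative band around two predetermined wavelengths."""
    w = np.asarray(wavelengths)
    s = np.asarray(spectrum)
    def band_mean(lam):
        mask = np.abs(w - lam) <= bandwidth * lam
        return s[mask].mean()
    i1, i2 = band_mean(lam1), band_mean(lam2)
    return (i1 - i2) / (i1 + i2)

def detect_defect(score, threshold=0.0):
    return bool(score > threshold)

# Hypothetical spectra: a defect boosts intensity near lam1 and suppresses it near lam2.
w = np.linspace(0.3e-3, 1.2e-3, 500)            # wavelengths in metres
clean = np.exp(-((w - 0.8e-3) / 0.2e-3)**2)     # smooth defect-free spectrum
defect = clean * (1 + 0.5 * np.exp(-((w - 0.6e-3) / 0.02e-3)**2)) \
               * (1 - 0.5 * np.exp(-((w - 0.9e-3) / 0.02e-3)**2))

s_clean = differential_spectral_score(w, clean, 0.6e-3, 0.9e-3)
s_defect = differential_spectral_score(w, defect, 0.6e-3, 0.9e-3)
print(detect_defect(s_clean), detect_defect(s_defect))
```

The threshold would in practice be set from the score distributions of known defective and defect-free reference samples.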
  • FIG.1B illustrates a schematic representation of a diffractive sensor according to another embodiment.
  • This embodiment illustrates a diffractive sensor that operates in reflection mode.
  • a multi-wavelength illumination system illuminates the diffractive sensor, which includes a plurality of substrate layers and an object or sample volume that is being tested for a defect or target feature.
  • the object is interposed between two reflective substrate layers along a folded optical path.
  • One or more output detectors positioned along the optical path at the output of the diffractive sensor capture the optical output signals at a plurality of wavelengths.
  • FIG.2 illustrates a single substrate layer of the diffractive sensor.
  • the substrate layer may be made from a material that is optically transmissive (for transmission mode) or an optically reflective material (for reflective mode).
  • the substrate layer which may be formed as a substrate or plate in some embodiments, has surface features formed across the substrate layer.
  • the surface features form a patterned surface (e.g., an array) having different valued transmission (or reflection) coefficients as a function of lateral coordinates across each substrate layer. These surface features act as artificial “neurons” that connect to other “neurons” of other substrate layers of the optical neural network portion of the diffractive sensor through optical diffraction (or reflection) and alter the phase and/or amplitude of the light wave.
  • FIG.3 schematically illustrates a cross-sectional view of a single substrate layer of a diffractive sensor according to one embodiment. In this embodiment, the surface features are formed by adjusting the thickness of the substrate layer that forms the optical neural network.
  • FIG.4 schematically illustrates a cross-sectional view of a single substrate layer of a diffractive sensor according to another embodiment.
  • the different surface features are formed by altering the material composition or material properties of the single substrate layer at different lateral locations across the substrate layer. This may be accomplished by doping the substrate layer with a dopant or incorporating other optical materials into the substrate layer. Metamaterials or plasmonic structures may also be incorporated into the substrate layer.
  • FIG.5 schematically illustrates a cross-sectional view of a single substrate layer of a diffractive sensor according to another embodiment.
  • the substrate layer is reconfigurable in that the optical properties of the various artificial neurons may be changed, for example, by application of a stimulus (e.g., electrical current or field).
  • a stimulus e.g., electrical current or field
  • An example includes spatial light modulators (SLMs) which can change their optical properties.
  • the neuronal structure is not fixed and can be dynamically changed or tuned as appropriate.
  • This embodiment, for example, can provide a learning diffractive sensor or a changeable diffractive sensor that can be altered on-the-fly (e.g., over time) to improve performance, compensate for aberrations, or even perform a different task.
  • FIG.6 is a flowchart illustrating how a digital model of the diffractive sensor is first digitally trained to sense a defect in a sample volume/object based on captured output optical signals at different wavelengths. Once the digital model of the diffractive sensor is trained, a physical embodiment of the diffractive sensor is then manufactured and used.
  • FIG.7A illustrates a schematic of a diffractive terahertz sensor for rapid sensing of hidden objects or defects using a single pixel spectroscopic detector.
  • FIG.7B illustrates the working principle of the all-optical hidden object/defect detection scheme.
  • the spectral intensity values, I(λ1) and I(λ2), sampled at two predetermined wavelengths, λ1 and λ2, by the single-pixel detector are used to compute the output score for indicating the existence of defects/structures within the sample volume.
  • FIG.8A is a side view schematic of the sample or object under test that includes two silicon wafers stacked with a hidden defect structure fabricated on the surface of one of the wafers through photolithography and etching. The opaque regions are covered with aluminum to block the terahertz wave transmission, leaving a square-shaped opening of 2 × 2 cm that serves as the detection FOV.
  • the photos showing cross-sections of a sample structure at planes F, D and B are provided in FIG.12.
  • the direction of terahertz wave propagation is defined as the z direction, while the x and y directions represent the lateral directions.
  • FIG.8B illustrates the physical layout of the single-pixel diffractive terahertz sensor set-up, with the sizes of input/output apertures, the size of the detection FOV, and the axial distances between the adjacent components annotated. Note that dimensions and sizes are not to be construed as limitations on the diffractive sensor.
  • FIG.8C illustrates the thickness profiles of the designed diffractive substrate layers (D1 through D4).
  • FIG.9A illustrates a cross-section of the sample or object under test that includes two silicon wafers stacked with a hidden defect structure fabricated on the surface of one of the wafers. The depth direction Dz is illustrated.
  • FIG.9B illustrates the lateral dimensions (Dx and Dy) of the defect in the sample or object under test. These dimensions are used for analyzing the impact of the shape and size of the hidden defect, defined by the lateral sizes, Dx and Dy, and the depth, Dz, on the detection sensitivity of the diffractive terahertz sensor design.
  • Dx and Dy the lateral dimensions of the defect in the sample or object under test.
  • FIG.9C illustrates defect detection accuracies as a function of the defect dimensions (Dx, Dy and Dz) that are defined in (FIG.9A) and (FIG.9B).
  • FIG.10A is a schematic of the THz-TDS experimental set-up. Dark lines represent the propagation path of the femtosecond pulses generated from a Ti:Sapphire laser operating at 780 nm to pump/probe the terahertz source/detector.
  • FIG.10B are photographs of the 3D-printed diffractive layers.
  • FIG.10C is a photograph of the experimental set-up.
  • FIG.11A is a cross-sectional illustration of the sample under test showing planes F, D, B.
  • FIG.11B shows photograph images of the exemplary test samples used for the experimental blind testing, which reveal the hidden structures at the cross-sectional plane D (not visible from outside).
  • the first nine of the test samples (i.e., samples No.1-9) contain etched defects that have different shapes and are positioned at different locations within the detection FOV, while the last sample (i.e., sample No.10) has no defects. These photos were captured by removing the smaller silicon wafer at the front, i.e., the left wafer in (FIG.11A).
  • FIG.11C illustrates normalized experimental spectral scores for the test samples shown in (FIG.11B).
  • FIG.11D illustrates a histogram showing the distribution of the 252 experimental differential detection scores, which were obtained by measuring a defect-free (negative) sample 10 times through repeated experiments and combining these 10 spectral measurements in unique groups of 5, each resulting in an experimental differential detection score based on the average spectrum. Note that C(10, 5) = 252, where C refers to the combination operation.
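The grouping described in this caption can be reproduced directly with itertools.combinations: choosing unique groups of 5 out of 10 repeated measurements yields C(10, 5) = 252 averaged spectra, each of which is then mapped to a score. The noise model and the placeholder score below are illustrative assumptions, not the patent's actual data or score definition.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: 10 repeated spectral measurements of the same
# defect-free sample (64 spectral bins, unit mean, additive noise).
measurements = 1.0 + 0.05 * rng.standard_normal((10, 64))

scores = []
for group in itertools.combinations(range(10), 5):   # unique groups of 5
    avg_spectrum = measurements[list(group)].mean(axis=0)
    # placeholder score: here simply the mean intensity of the averaged spectrum
    scores.append(avg_spectrum.mean())

print(len(scores))   # C(10, 5) = 252
```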
  • FIG.11E illustrates normalized experimental spectral intensity (solid lines) for the different test samples shown in (FIG.11B), compared with their numerically simulated counterparts (dashed lines). Each test sample was measured 5 times and the results of all 5 measurements are shown in the same graph.
  • FIG.12 illustrates photographs of the cross-sectional planes F, D and B of the test sample structure illustrated in FIG.11A.
  • FIGS.13A and 13B illustrate the dispersion curves of the material used for the diffractive layers: the refractive index n(λ) (FIG.13A) and the extinction coefficient κ(λ) (FIG.13B).
  • FIG.14 illustrates a graph showing the impact of different averaging factors (Navg) used for experimental spectral measurements on the false positive rate (FPR) in the defect detection results.
  • FIG.15A shows amplitude and phase distributions of the optical field at two predetermined wavelengths, λ1 and λ2.
  • FIG.15B is an illustration used for analyzing the impact of the position of the hidden defect within the detection FOV.
  • FIGS.15C and 15D illustrate the output spectral intensities, I(λ1) and I(λ2), at the two operational wavelengths as a function of the hidden defect’s position within the detection FOV.
  • FIG.15E illustrates the final detection score as a function of the hidden defect’s position within the detection FOV.
  • FIG.15F illustrates the final detection results as a function of the hidden defect’s position within the detection FOV.
  • TP True Positive
  • FN False Negative.
  • the diffractive sensor 10 contains a plurality of diffractive layers or substrate layers 20 that are physical layers which may be formed as a physical substrate or matrix of optically transmissive material (for transmission mode as seen in FIG.1A) or optically reflective material (for reflective mode as seen in FIG.1B).
  • transmission mode light or electromagnetic radiation passes through the substrate layer(s) 20.
  • reflective mode light or radiation reflects off the substrate layer(s) 20.
  • Exemplary materials that may be used for the substrate layers 20 include polymers and plastics (e.g., those used in additive manufacturing techniques such as 3D printing) as well as semiconductor-based materials (e.g., silicon and oxides thereof, gallium arsenide and oxides thereof), crystalline materials or amorphous materials such as glass and combinations of the same.
  • Metal coated materials may be used for reflective substrate layers 20. It should be appreciated that other embodiments may combine aspects of both transmission mode (FIG.1A) and reflection mode (FIG.1B) into the diffractive sensor 10.
  • the diffractive sensor 10 includes a multi- wavelength illumination system 30 that includes a broadband source or individual sources centered at different wavelengths of electromagnetic radiation.
  • the light from the multi- wavelength illumination system 30 travels along an optical path 12 that passes through the object 100 and the substrate layer(s) 20 and ends at one or more output detectors 14.
  • the light from the wavelength illumination system 30 may reflect off the object 100 and/or the substrate layer(s) 20 in reflection-based embodiments.
  • the optical path 12 may be substantially linear as seen in FIG.1A or it may be folded as is seen in FIG.1B.
  • the one or more output detectors 14 may include optical sensors or detectors.
  • the one or more output detectors 14 generate output optical signals for different detected wavelengths of light. This may include a single output detector 14, for example, the single-pixel spectroscopic detector (THz detector) as described herein.
  • the one or more output detectors 14 may also include different optical sensors or detectors that capture different bands or ranges of wavelengths.
  • a plurality of optical sensors or detectors may be used that capture light at different bands or ranges of wavelengths. This may include optical sensors or detectors associated with filters to filter particular wavelength ranges or bands.
  • the multi-wavelength illumination system 30 is a time-domain spectroscopy (TDS) system that is used along with the plurality of substrate layers 20.
  • TDS time-domain spectroscopy
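In a TDS system, the frequency-resolved spectrum is obtained by Fourier-transforming the time-domain field trace recorded as a function of optical delay. The sketch below illustrates this with a hypothetical 0.5 THz pulse; the carrier frequency, pulse width, and delay step are assumed values, not parameters of the described set-up.

```python
import numpy as np

# Hypothetical single-pixel THz-TDS trace: the detector records the terahertz
# field E(t) versus optical delay; the power spectrum follows from an FFT.
dt = 0.05e-12                       # 50 fs delay step (assumed)
t = np.arange(0, 100e-12, dt)       # 100 ps scan window
f0 = 0.5e12                         # 0.5 THz carrier (assumed)
pulse = np.exp(-((t - 10e-12) / 1e-12)**2) * np.cos(2 * np.pi * f0 * (t - 10e-12))

spectrum = np.abs(np.fft.rfft(pulse))**2
freqs = np.fft.rfftfreq(len(pulse), d=dt)

peak = freqs[np.argmax(spectrum)]
print(peak / 1e12)   # peak frequency in THz, near the 0.5 THz carrier
```

The scan window sets the frequency resolution (here 1/100 ps = 10 GHz), while the delay step sets the maximum resolvable frequency.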
  • a sample volume or object 100 that is to be examined or interrogated is interposed between the plurality of substrate layers 20 as described herein.
  • the sample volume or object 100 to be examined or interrogated is located outside of the plurality of substrate layers 20 (i.e., before a first substrate layer 20 encountered by the light from the multi-wavelength illumination system 30 or after a last substrate layer 20 encountered by the light from the multi-wavelength illumination system 30).
  • the object 100 may include a semiconductor in one embodiment (e.g., silicon wafer).
  • the object 100 may also include a circuit, a commercial product, an art work, or the like. It should be understood, however, that in other embodiments, the object 100 may include other types of objects. This includes inorganic and/or organic materials.
  • a biological sample or tissues may include examples of objects 100 in some embodiments.
  • the multi-wavelength illumination system 30 is a terahertz time-domain spectroscopy (THz-TDS) system as best seen in FIGS.10A-10D.
  • the THz-TDS system 30 includes a femtosecond laser 32 and a beam splitter 34 that divides the laser beam between a THz source 36 in one arm and a delay stage 38 in the other arm.
  • the THz source 36 generates THz pulses that are input via an off-axis parabolic mirror 40 and separate mirror 42 through an input aperture 44 that defines the input plane to the diffractive sensor 10 (FIGS.10C-10D) and into the plurality of substrate layers 20 and the object 100.
  • the input THz pulses (e.g., input optical signals) are modulated by the substrate layers 20 to inspect target features such as hidden objects/structures or defects within the object 100 that is interposed in the optical path of the diffractive sensor 10.
  • the modulated signal passes through an outlet aperture 46 where the output optical signal is captured by one or more output detectors 14.
  • the one or more output detectors 14 is a single-pixel spectroscopic detector (THz detector).
  • the single-pixel spectroscopic detector samples the terahertz spectrum at the output aperture 46.
  • each substrate layer 20 of the diffractive sensor 10 has a plurality of physical features 22 formed on the surface of the substrate layer 20 or within the substrate layer 20 itself that collectively define a pattern of physical locations along the length and width of each substrate layer 20 that have varied transmission properties (or varied reflection properties).
  • the physical features 22 formed on or in the substrate layer(s) 20 thus create a pattern of physical locations within the substrate layer(s) 20 that have different valued transmission (or reflection) properties as a function of lateral coordinates (e.g., length and width and in some embodiments depth) across each substrate layer 20.
  • each separate physical feature 22 may define a discrete physical location on the substrate layer 20 while in other embodiments, multiple physical features 22 may combine or collectively define a physical region with a particular transmission (or reflection) property. These physical regions act as the “neurons” within the physical optical network of the diffractive sensor 10.
  • the pattern of physical locations formed by the physical features 22 may define, in some embodiments, an array located across the surface of the substrate layer 20.
  • the substrate layer 20 in one embodiment is a two-dimensional generally planar substrate having a length (L), width (W), and thickness (t) that all may vary depending on the particular application. In other embodiments, the substrate layer 20 may be non-planar such as, for example, curved.
  • While FIG.2 illustrates a rectangular or square-shaped substrate layer 20, different geometries are contemplated.
  • the physical features 22 and the physical regions formed thereby act as artificial “neurons” that connect to other “neurons” of other substrate layers 20 of the diffractive sensor 10 through optical diffraction (or reflection) and alter the phase and/or amplitude of the light wave that passes through or reflects off the layer 20.
  • the particular number and density of the physical features 22 or artificial neurons that are formed in each substrate layer 20 may vary depending on the type of application. In some embodiments, the total number of artificial neurons may only need to be in the hundreds or thousands while in other embodiments, hundreds of thousands or millions of neurons or more may be used.
  • FIG.3 illustrates one embodiment of how different physical features 22 are formed in the substrate layer 20.
  • a substrate layer 20 has different thicknesses (t) of material at different lateral locations along the substrate layer 20.
  • the different thicknesses (t) modulates the phase of the light passing through the substrate layer 20.
  • This type of physical feature 22 may be used, for instance, in the transmission mode embodiment of FIGS.1A, 7A-7B, 8B, 10A-10D.
  • the different thicknesses of material in the substrate layer 20 form a plurality of discrete “peaks” and “valleys” that control the transmission coefficient of the neurons formed in the substrate layer 20.
  • the different thicknesses of the substrate layer 20 may be formed using additive manufacturing techniques (e.g., 3D printing) or lithographic methods utilized in semiconductor processing.
  • the design of the substrate layers 20 may be stored in a stereolithographic file format (e.g., .stl file format) which is then used to 3D print the substrate layers 20.
  • Other manufacturing techniques include well-known wet and dry etching processes that can form very small lithographic features on a substrate layer 20.
  • FIG.4 illustrates another embodiment in which the physical features 22 are created or formed within the substrate layer 20.
  • the substrate layer 20 may have a substantially uniform thickness but have different regions of the substrate layer 20 have different optical properties.
  • the refractive (or reflective) index of the substrate layers 20 may be altered by doping the substrate layers 20 with a dopant (e.g., ions or the like) to form the regions of neurons in the substrate layers 20 with controlled transmission properties (or absorption and/or spectral features).
  • optical nonlinearity can be incorporated into the diffractive sensor 10 design using various optical non-linear materials (e.g., crystals, polymers, semiconductor materials, doped glasses, organic materials, graphene, quantum dots, carbon nanotubes, and the like) that are incorporated into the substrate layer 20.
  • a masking layer or coating that partially transmits or partially blocks light in different lateral locations on the substrate layer 20 may also be used to form the neurons on the substrate layers 20.
  • the transmission function of the physical features 22 or neurons can also be engineered by using metamaterial or plasmonic structures. Combinations of all these techniques may also be used.
  • non-passive components may be incorporated into the substrate layers 20 such as spatial light modulators (SLMs).
  • SLMs are devices that impose spatially varying modulation of the phase, amplitude, or polarization of light.
  • SLMs may include optically addressed SLMs and electrically addressed SLMs.
  • Electric SLMs include liquid crystal-based technologies that are switched by using thin-film transistors (for transmission applications) or silicon backplanes (for reflective applications).
  • Another example of an electric SLM includes magneto-optic devices that use pixelated crystals of aluminum garnet switched by an array of magnetic coils using the magneto-optical effect.
  • Additional electronic SLMs include devices that use nanofabricated deformable or moveable mirrors that are electrostatically controlled to selectively deflect light.
  • FIG.5 schematically illustrates a cross-sectional view of a single substrate layer 20 of a diffractive sensor 10 according to another embodiment.
  • the substrate layer 20 is reconfigurable in that the optical properties of the various physical features 22 that form the artificial neurons may be changed, for example, by application of a stimulus (e.g., electrical current or field).
  • An example includes spatial light modulators (SLMs) discussed above which can change their optical properties.
  • the layers may use the DC electro-optic effect to introduce optical nonlinearity into the substrate layers 20 of the diffractive sensor 10 and require a DC electric-field for each substrate layer 20 of the diffractive sensor 10. This electric-field (or electric current) can be externally applied to each substrate layer 20 of diffractive sensor 10.
  • Alternatively, poled materials with very strong built-in electric fields as part of the material (e.g., poled crystals or glasses) may be used.
  • the neuronal structure is not fixed and can be dynamically changed or tuned as appropriate (i.e., changed on demand).
  • This embodiment, for example, can provide a learning diffractive sensor 10 or a changeable diffractive sensor 10 that can be altered on-the-fly to improve performance, compensate for aberrations, or even change to another task.
  • a computerized model of the diffractive sensor 10 is first digitally trained.
  • the model of the diffractive sensor 10 is trained to sense hidden defects or target features within the object.
  • at least one computing device 100 having one or more processors 102 executes software 104 thereon to digitally train a model or mathematical representation of diffractive sensor 10.
  • training objects are used to define or optimize the plurality of physical features 22 formed on or within the one or more optically transmissive and/or reflective substrate layer(s) 20 to sense the defects or target features in the object 100 based on the output optical signals at different wavelengths detected at the one or more output detectors 14.
  • the design or model has been established that encodes a physical layout for the different physical features 22 that form the artificial neurons in each of the plurality of substrate layers 20 which are present in the diffractive sensor 10
  • the actual physical embodiment of the diffractive sensor 10 is then manufactured or fabricated that reflects the computer-derived design. This is illustrated in operation 210 of FIG.6.
  • the design, in some embodiments, may be embodied in a software format (e.g., SolidWorks, AutoCAD, Inventor, or other computer-aided design (CAD) program or lithographic software program) and may then be manufactured into a physical embodiment that includes the substrate layers 20.
  • the substrate layers 20, once manufactured may be mounted or disposed in a holder or housing 16 as explained herein (e.g., illustrated in FIGS.1A and 10D).
  • the holder or housing 16 may include a number of slots formed therein to hold the substrate layers 20 in the required sequence and with the required spacing between adjacent substrate layers 20 (if needed).
  • the diffractive sensor 10 is then used to analyze objects 100 for potential defects or target features.
  • While the diffractive sensors 10 reported herein were primarily designed for the terahertz band of the electromagnetic spectrum, the underlying concept and design approaches are also applicable for defect/target feature detection in other parts of the electromagnetic spectrum, including infrared, visible, and X-ray. These unique capabilities of performing computational sensing without a digital computer or the need for creating a digital 3D image will inspire the development of new task-specific all-optical detection systems and smart sensors.
  • Such diffractive sensors 10 and systems employing the same can find diverse applications, such as industrial manufacturing and quality control, material inspection, detection/classification of hidden objects, security screening, and anti- counterfeiting measures.
  • the non-destructive and non-invasive nature of this technology platform also makes it a valuable tool for sensitive applications, e.g., cultural heritage preservation and biomedical sensing.
  • This framework can deliver transformative advances in various fields, where defect/target feature detection and materials diagnosis are of utmost importance.
  • the object 100 is located along the optical path 12 of the diffractive sensor 10. As explained herein, this may be located between one or more front-end diffractive/substrate layers 20 and one or more back-end diffractive/substrate layers 20.
  • the illumination source 30 then illuminates the object 100 via an optical path that transmits through and/or reflects off of the diffractive/substrate layers 20.
  • the object 100 along with the diffraction/substrate layers 20 collectively modulate the input optical signal from the illumination source to generate output optical signals at different wavelengths at one or more output detectors 14.
  • the output optical signals at different wavelengths encode for the presence or absence of the defect or target feature in the object 100.
  • the one or more output detectors 14 capture the power spectrum signal at a plurality of wavelengths (FIGS.1A, 1B).
  • the power spectrum of the output optical signal includes, in one embodiment, a plurality of wavelength peaks detected at the one or more output detectors 14.
  • the presence or absence of the defect or target feature in the object 100 is based on the relative intensities of the plurality of wavelength peaks detected at the one or more output detectors 14. According to another embodiment, the presence or absence of the defect or target feature in the object 100 is based on the signal intensities of one or more wavelengths detected at the one or more output detectors 14. [0059] The power spectrum is used, in one particular embodiment, to calculate a differential spectral score s_det which is then compared to a threshold s_th that may be empirically derived to determine the presence or absence of a defect or target feature within the object 100.
  • a differential spectral score s_det that is at or above the threshold s_th may indicate that a defect or target feature is detected in the object 100.
  • a differential spectral score s_det that is below the threshold s_th may indicate that a defect or target feature is not detected in the object 100.
  • the differential spectral score s_det may be calculated and compared to the threshold s_th by a computing device, microcontroller, or circuitry 110 that interfaces with or receives the output(s) from the one or more output detectors.
  • Optional software or instructions executed by the computing device, microcontroller, or circuitry 110 may automatically perform the differential spectral score calculations and threshold comparisons.
  • the computing device, microcontroller, or circuitry 110 may be part of the diffractive sensor 10 or, alternatively, may be separate from the diffractive sensor 10.
  • the computing device, microcontroller, or circuitry 110 may optionally be associated with a display or the like for communicating the sensing results of the diffractive sensor 10.
  • the user may be presented with the output of the diffractive sensor 10 (e.g., “Defect Detected” or “No Defect”) and/or the power spectrum signal, differential spectral score s_det, and threshold s_th.
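The score-and-threshold readout described above can be sketched numerically. This is a minimal illustration, not the patent's implementation: the function name `detect_defect` and the toy Gaussian spectra are hypothetical, while the 0.8 mm / 1.1 mm operational wavelengths and the differential score form follow the disclosure.

```python
import numpy as np

def detect_defect(power_spectrum, wavelengths, lam1=0.8, lam2=1.1, s_th=0.5):
    """Compute the differential spectral score s_det from a single-pixel
    power spectrum and compare it against the threshold s_th. The score
    uses the differential form s_det = I(lam1) / (I(lam1) + I(lam2)), so
    s_det >= s_th flags a defect."""
    spectrum = np.asarray(power_spectrum, dtype=float)
    wl = np.asarray(wavelengths, dtype=float)
    i1 = spectrum[np.argmin(np.abs(wl - lam1))]   # intensity nearest lam1
    i2 = spectrum[np.argmin(np.abs(wl - lam2))]   # intensity nearest lam2
    s_det = i1 / (i1 + i2)
    return s_det, ("Defect Detected" if s_det >= s_th else "No Defect")

# Toy spectrum on a wavelength axis in mm, with peaks at 0.8 and 1.1 mm.
wl = np.linspace(0.5, 1.5, 201)
spectrum = np.exp(-((wl - 0.8) / 0.02) ** 2) + 0.3 * np.exp(-((wl - 1.1) / 0.02) ** 2)
s, verdict = detect_defect(spectrum, wl)
print(round(s, 3), verdict)
```

The same comparison could equally run on a microcontroller attached to the single-pixel detector, since only two spectral samples are needed.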
  • the reported approach demonstrates all-optical detection of hidden structures within 3D objects 100, enabled by a single-pixel spectroscopic terahertz detector 14, entirely eliminating the need to scan the samples or create, store and digitally process their images.
  • the design employs an optical architecture featuring a passive diffractive encoder that generates structured illumination impinging onto the 3D sample or object of interest 100, coupled with a diffractive decoder that performs space-to-spectrum transformation, achieving defect or target feature detection based on the optical fields scattered from the sample volume.
  • this single-pixel defect/target feature sensor 10 offers distinct advantages compared to the existing terahertz imaging and sensing systems used for the same purpose.
  • the hidden defect or target feature detection is accomplished using a single- pixel spectroscopic detector 14, eliminating the need for a focal plane array or raster scanning, thus greatly simplifying and accelerating the defect or target feature detection process.
  • the diffractive or substrate layers 20 that are employed are passive optical components, enabling the diffractive sensor 10 to analyze the test object 100 volume without requiring any external power source except for the terahertz illumination system 30 and single-pixel detector 14.
  • the all-optical end-to-end detection process negates the need for memory, data/image transmission or digital processing using e.g., a graphics processing unit (GPU), resulting in a high-throughput defect or target feature detection scheme.
  • these characteristics render the single-pixel diffractive terahertz sensors 10 particularly well- suited for high-throughput screening applications such as in industrial settings, e.g., manufacturing and security. These applications require high-throughput defect or target feature detection, where the hidden defects or objects of interest are often rare, but critically important to catch.
  • FIGS.1A, 1B, 7A and 7B illustrate the basic principles of the proof-of-concept for the single-pixel diffractive terahertz sensor 10 design.
  • the forward model of this design can be treated as a coherent optical system that processes spatially coherent terahertz waves at a predetermined set of two wavelengths (λ1 and λ2), where the resulting diffraction and interference processes are used for the detection task.
  • a set of diffractive layers or substrates 20 is positioned before the sample or object 100 under test to provide spatially coherent, structured broadband illumination within a given detection FOV, acting as an all-optical front-end network that is trainable.
  • This output spectrum is measured at two predetermined wavelengths, λ1 and λ2, producing the spectral intensity values I(λ1) and I(λ2) that yield a normalized detection score s_det (s_det ∈ (0,1)); see FIG.7B.
  • Based on s_det, defect/target feature inference is performed to predict the presence of hidden defects or target features within the target sample volume, i.e., s_det ≥ s_th indicates a defect/target feature, while s_det < s_th indicates no defect/target feature.
  • an unbiased threshold of s_th = 0.5 was selected in the numerical analysis and experimental validation; therefore, a simple differential decision rule applies: s_det ≥ 0.5 indicates the existence of hidden defects/target features, and s_det < 0.5 indicates a defect/target feature-free (negative) sample.
  • test samples or objects 100 were created with hidden defects by forming a stack of two silicon wafers that are in close contact with each other, where the surface of one of the wafers contained defect structures fabricated using photolithography and etching (see Methods for details).
  • the inspection FOV of each test object 100 was chosen as 2 × 2 cm.
  • an exemplary defect is located somewhere inside the detection FOV at the interface between the two wafers.
  • FIG.12 includes additional photographs of a silicon test sample/object 100, showing its structure across three cross-sectional planes: planes F and B for the front and back surfaces of the stacked wafers, respectively, and plane D for the contact interface of the two silicon wafers (planes illustrated in FIG.11A).
  • the single-pixel diffractive sensor 10 design consists of four diffractive/substrate layers 20, with two layers 20 positioned before the target sample and two layers 20 positioned after the target sample or object 100, i.e., forming the front-end and back-end diffractive sensors, respectively.
  • the layout of this diffractive design is provided in the accompanying figures. [0068] For the diffractive sensor to effectively detect the hidden defects of unknown shapes and sizes that may appear anywhere in the target sample/object 100, a data-driven approach was adopted by simulating a total of 20,000 silicon test samples with hidden defects of varying sizes and shapes for training the diffractive sensor model.
  • the defects within these simulated test samples were set to be rectangular, with their lateral sizes (D x and Dy) randomly generated within a range of 1 to 3 mm and a depth (Dz) randomly chosen between 0.23 and 0.27 mm.
  • the positions of these defects (x d , y d ) were also randomly set within the detection FOV.
  • a test sample/object 100 was modeled that has no defects in the numerical simulations, which forms the negative sample in the training data.
  • 20,000 replicas of the defect-free sample were created and mixed with the 20,000 samples with defects, such that the final training set had a balanced ratio of positive and negative samples.
  • a blind testing set composed of 2,000 samples or objects 100 with various defects following the same approach was generated; all these defective test samples/objects 100 were created using different combinations of parameters (D x , D y , D z , x d , y d ) that are uniquely different than any of those used by the training samples.
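The training-set statistics described above (rectangular defects with lateral sizes of 1–3 mm, depths of 0.23–0.27 mm, and random positions, balanced against defect-free replicas) can be sketched as a small data generator. The 200 × 200 sampling grid and the function name are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_defect_sample(fov_mm=20.0, grid=200):
    """One simulated test sample as a depth map over the detection FOV:
    a single rectangular defect with lateral sizes of 1-3 mm, a depth of
    0.23-0.27 mm, and a random position, mirroring the stated training-set
    statistics. The 200 x 200 sampling grid is an assumed discretization."""
    dx, dy = rng.uniform(1.0, 3.0, size=2)       # lateral sizes (mm)
    dz = rng.uniform(0.23, 0.27)                 # defect depth (mm)
    xd = rng.uniform(dx / 2, fov_mm - dx / 2)    # defect center (mm)
    yd = rng.uniform(dy / 2, fov_mm - dy / 2)
    coords = (np.arange(grid) + 0.5) * (fov_mm / grid)
    xx, yy = np.meshgrid(coords, coords)
    mask = (np.abs(xx - xd) <= dx / 2) & (np.abs(yy - yd) <= dy / 2)
    return np.where(mask, dz, 0.0), (dx, dy, dz, xd, yd)

# Balanced training set: defective samples mixed with defect-free replicas.
positives = [random_defect_sample() for _ in range(4)]
negatives = [np.zeros((200, 200)) for _ in range(4)]
depth_map, params = positives[0]
print(bool(depth_map.max() > 0), bool(negatives[0].max() == 0))
```

A disjoint blind-testing set would be produced the same way, rejecting any parameter combination (Dx, Dy, Dz, xd, yd) already present in the training set.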
  • the data-driven training process is a one-time effort, similar to the training effort that a digital defect analyzer based on a THz camera would need to go through (using e.g., supervised learning) in an industrial or security setting.
  • a focal cross-entropy loss was used; see the Methods section.
  • This type of loss function can effectively reduce the penalization from samples that can be easily classified, such as those containing large hidden defects, thereby providing better detection sensitivity for more challenging samples, such as those with smaller-sized hidden defects.
  • a term was incorporated to impose constraints on the energy distribution of the output power spectrum (see the Methods section). This loss term aimed to maximize the output diffraction efficiency at λ1 and λ2, while minimizing it at other neighboring wavelengths, which helped enhance the single-pixel SNR at the desired operational wavelengths (λ1 and λ2). This design reduced the single-pixel output at other wavelengths, increasing the design's experimental robustness.
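The two training-loss ingredients described above can be sketched as follows. This is a simplified stand-in, not the patent's exact loss: the focal exponent gamma, the band half-width, and the 0.1 weighting of the spectral-energy term are hypothetical values, and `efficiency_penalty` is an assumed name for the energy-distribution constraint.

```python
import numpy as np

def focal_bce(s_det, label, gamma=2.0, eps=1e-9):
    """Focal binary cross-entropy on the detection score: easily classified
    samples (p_t near 1) are down-weighted by the (1 - p_t)**gamma factor."""
    p_t = s_det if label == 1 else 1.0 - s_det
    return -((1.0 - p_t) ** gamma) * np.log(p_t + eps)

def efficiency_penalty(spectrum, wl, lam_targets=(0.8, 1.1), band=0.05):
    """Fraction of output power falling outside narrow bands around the two
    operational wavelengths; minimizing it pushes diffraction efficiency
    toward lam_targets and suppresses neighboring wavelengths."""
    wl = np.asarray(wl)
    in_band = np.zeros(wl.shape, dtype=bool)
    for lam in lam_targets:
        in_band |= np.abs(wl - lam) <= band
    return spectrum[~in_band].sum() / (spectrum.sum() + 1e-9)

wl = np.linspace(0.5, 1.5, 201)
spectrum = np.exp(-((wl - 0.8) / 0.02) ** 2) + np.exp(-((wl - 1.1) / 0.02) ** 2)
# Total loss: focal term for a confidently classified defect (s_det = 0.9)
# plus a small spectral-energy penalty (weight 0.1 chosen arbitrarily).
loss = focal_bce(0.9, 1) + 0.1 * efficiency_penalty(spectrum, wl)
print(bool(loss < focal_bce(0.6, 1)))
```

The printed comparison illustrates the focal behavior: the confidently classified (easy) sample contributes far less to the loss than a borderline one.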
  • FIG.8C shows the resulting diffractive/substrate layers 20 after the training was complete.
  • This diffractive sensor design was numerically tested using the testing set containing 2,000 defective samples/objects 100 that were generated without any overlap with the training defective samples, as well as a test sample or object 100 without any defects.
  • the overall detection performance also shows a degradation trend as the defect depth Dz is reduced.
  • the detection sensitivity reaches ~99.7% when Dz is 0.3 mm, but drops to ~81.2% as Dz reduces to 0.15 mm, which is much smaller than λ1 and λ2.
  • the diffractive sensor design can achieve accurate detection of hidden defects that are Dx, Dy ≥ ~1.25 mm (~1.32 λm) and Dz ≥ ~0.21 mm (~0.22 λm) within a FOV of 2 × 2 cm (~21 λm × 21 λm), where λm denotes the mean of the two operational wavelengths. It should be noted that the smallest defect used in this analysis has a size close to the diffraction limit of light in air (~0.5 λm).
  • the medium between the sample or object 100 under test and the detector is air, which sets an upper limit of 1 on the effective numerical aperture (NA) of the detection system.
  • the entire detection FOV of 2 × 2 cm was divided into a series of concentric circles with equally spaced radii, forming ring-like regions denoted as R1 to R6 from the center to the edges.
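The ring-wise partition of the FOV described above can be expressed compactly; the sampling grid size and the function name `ring_labels` are assumed for illustration, while the 2 × 2 cm FOV and six regions R1–R6 follow the disclosure.

```python
import numpy as np

def ring_labels(fov_mm=20.0, grid=200, n_rings=6):
    """Partition the square detection FOV into n_rings ring-like regions
    (R1..R6) bounded by concentric circles with equally spaced radii,
    from the center to the edges. The sampling grid size is an assumed
    discretization."""
    coords = (np.arange(grid) + 0.5) * (fov_mm / grid) - fov_mm / 2
    xx, yy = np.meshgrid(coords, coords)
    r = np.sqrt(xx ** 2 + yy ** 2)
    r_max = (fov_mm / 2) * np.sqrt(2)            # distance to a corner
    edges = np.linspace(0.0, r_max, n_rings + 1)
    return np.digitize(r, edges[1:-1]) + 1       # labels 1..n_rings

labels = ring_labels()
print(labels[100, 100], labels.max())            # center in R1, corners in R6
```

Per-ring sensitivity statistics can then be computed by masking the detection results with `labels == k` for each region k.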
  • test samples/objects 100 were prepared for this experimental testing, where the hidden structures of these samples at plane D are shown in FIG.11B.
  • Samples No. 1-9 contained a hidden defect that cannot be visibly seen, as the defect is located at plane D (between the two wafers); these defects possess different sizes and shapes (defined by Dx, Dy and Dz).
  • the combinations of the parameters (Dx, Dy, Dz, xd, yd) for these test samples/objects 100 were never used in the training set, i.e., these 9 test samples containing defects were new to the trained diffractive model.
  • the defect samples No.8 and No.9 had unique characteristics in their defect structures: the defect in test sample No.8 is a rectangle of 1 × 5 mm, a shape never included in the training set; and the defect in test sample or object 100 No.9 is a 1 × 3 mm rectangle rotated by 45°, where such a rotation was never seen by the diffractive model in the training stage.
  • the specific geometric parameters (Dx, Dy, D z , x d , y d ) of each test object are provided in FIG.11E.
  • the other test sample/object 100 i.e., sample No.10, contains no defects, i.e., represents the negative test sample.
  • each test sample/object 100 was measured 5 times, producing output power spectra shown in FIG.11E (solid lines), which are compared to their numerically generated counterparts using the trained forward model (dashed lines).
  • the measured spectral curves shown in FIG.11E exhibit minor random fluctuations and some small shifts towards longer wavelengths.
  • An all-optical, end-to-end diffractive sensor 10 is disclosed for the rapid detection of hidden structures.
  • This diffractive THz sensor 10 features a distinctive architecture composed of a pair of encoder and decoder diffractive networks formed using layers/substrates 20, taking the unique responsibilities of structured illumination and spatial-spectral encoding, respectively. A proof-of-concept defect/target feature detection sensor 10 based on this framework was demonstrated.
  • the success of the experimental results and analyses confirmed the feasibility of the single-pixel diffractive terahertz detector 10 using pulsed illumination to identify various hidden defects/target features with unknown shapes and locations inside test sample volumes or objects 100, with minimal false positives and without any image formation, acquisition or digital processing steps.
  • the method signifies a unique paradigm, eliminating the image capture/reconstruction, storage, and transmission steps needed by GPU-based digital processing systems.
  • the volumetric defect or target feature detection rate can be elevated to align with the exceptional speed of single-pixel sensors 14, which can have a response time of ~1 µs. Therefore, with the presented framework the efficiency of terahertz defect/target feature inspection/detection can be dramatically enhanced, and concurrently the cost per inspection can be substantially reduced using the presented diffractive sensors.
  • the false positive rate (FPR) was calculated as a function of the averaging factor Navg.
  • FIG.14 reports the impact of Navg on the FPR in the detection results.
  • Another aspect that needs to be addressed pertains to the variation of the diffractive defect sensor’s performance at different sub-regions within the detection FOV (FIG.15A).
  • the terahertz wave fields at λ1 and λ2 were simulated within the detection FOV at plane D, i.e., the plane where the incoming terahertz field interacts with the potential hidden defects.
  • the amplitude of the field at λ1 presents a fairly uniform distribution (excluding the edges); in contrast, the overall distribution of the field at λ2 shows a significant negative correlation with it, and both exhibit substantial correspondence with the amplitude distributions shown in FIG.15A.
  • This finding suggests that the fields at λ1 and λ2 essentially play roles akin to reference light and probe light, respectively.
  • the intensity structure of this “probe” light at λ2 notably contributes to the distribution of the detection sensitivity, as both exhibit higher values at the center and lower values at the periphery, correlating with the observations in FIG.9E.
  • Different approaches can be explored to further enhance the performance of the diffractive sensor 10. To achieve a larger detection FOV, one will need to enlarge the diffractive layers 20 so that the possible defect or target feature information can be effectively processed by the diffractive layers 20 using a larger NA. Additionally, the number of trainable diffractive layers 20 in both the front-end and back-end optical networks can be increased to improve the approximation power of the diffractive network by creating a deeper diffractive sensor 10.
  • a diffractive sensor 10 utilizes more wavelengths, encoding additional information regarding the hidden defects or target features such as e.g., the size and material type of the defect or target feature, which may lead to more comprehensive defect or target feature detection and classification capabilities.
  • a larger number of trainable diffractive features per design would, in general, be required; this increase would be approximately proportional to the number of wavelengths used to encode independent channels of information.
  • the presented single-pixel terahertz sensor 10 enabled high-throughput detection of defects with feature sizes close to the diffraction limit of light
  • the maximum thickness of the test sample or object 100 that can be probed in transmission mode would be limited by the terahertz absorption or scattering inside the sample volume.
  • the proposed transmission system will present limitations to probe deeper into the test sample volume. However, this limitation is not specific to the diffractive sensor design, and is in fact commonly shared by all terahertz-based imaging and sensing systems.
  • the presented diffractive sensor designs can be modified to work in reflection mode so that a partial volume of the highly absorbing and/or reflecting test objects 100 can be rapidly probed and analyzed by the single-pixel diffractive sensor.
  • the whole deep learning- based training strategy outlined in this work will remain the same, except that between the test sample or object 100 and the encoder diffractive network there will be a beam splitter (e.g., a mylar film) that communicates with an orthogonally placed diffractive decoder that will be jointly trained with the front-end diffractive encoder, following the architecture reported earlier in the Results section.
  • this reflective diffractive sensor 10 design would allow for all-optical, rapid defect or target feature detection within a large sample volume, eliminating the need for mechanical scanning or extensive data storage, transmission and digital processing.
  • One additional potential limitation of the framework is uncontrolled mechanical misalignments among the diffractive layers 20 that constitute the diffractive encoder and decoder networks, as well as possible lateral/axial misalignments that might be observed between the diffractive layers 20 and the test sample/object volume.
  • diffractive designs can be “vaccinated” to such variations by modeling these random variations and misalignments during the optimization phase of the diffractive sensor 10 to build misalignment-resilient physical systems. It has been shown in earlier works that the evolution of diffractive surfaces during the deep learning-based training of a diffractive network can be regularized and guided toward diffractive solutions that can maintain their inference accuracy despite mechanical misalignments.
  • This misalignment-tolerant diffractive network training strategy models the layer-to-layer misalignments, e.g., translations in x, y and z, over random variables and introduces these errors as part of the forward optical model, inducing “vaccination” against such inaccuracies and/or mechanical variations.
  • the same training scheme can also be extended to mitigate the effects of other potential error sources e.g., fabrication inaccuracies, refractive index measurement errors and detection noise, improving the robustness of single-pixel defect detector devices 10.
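The "vaccination" strategy above amounts to sampling random misalignments inside the training loop. A minimal sketch of that sampling step follows; the sigma values, nominal layer positions, and function name are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

rng = np.random.default_rng(1)

def jittered_layer_positions(nominal_z_mm, sigma_lateral=0.1, sigma_axial=0.2):
    """Draw random lateral/axial displacements for each diffractive layer at
    every training iteration; feeding these perturbed positions into the
    forward optical model 'vaccinates' the learned design against physical
    misalignments. The sigma values are illustrative, not from the patent."""
    nominal_z = np.asarray(nominal_z_mm, dtype=float)
    dx = rng.normal(0.0, sigma_lateral, size=nominal_z.size)
    dy = rng.normal(0.0, sigma_lateral, size=nominal_z.size)
    dz = rng.normal(0.0, sigma_axial, size=nominal_z.size)
    return np.stack([dx, dy, nominal_z + dz], axis=1)

# Four diffractive layers at hypothetical nominal axial positions (mm);
# each training step would use a freshly perturbed geometry.
positions = jittered_layer_positions([10.0, 20.0, 40.0, 50.0])
print(positions.shape)
```

The same mechanism extends to other error sources (fabrication tolerances, refractive-index errors, detection noise) by drawing those quantities from random distributions as well.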
  • the diffractive sensor framework can potentially sense the presence of even smaller hidden defects or objects with subwavelength dimensions. While the diffractive defect sensor 10 is diffraction-limited, isolated subwavelength features/defects can still generate traveling waves (through scattering) to be sensed by the diffractive layers 20.
  • However, the diffractive sensor 10 cannot resolve two closely positioned subwavelength features/defects or morphologically distinguish them from larger defects, since it can only process propagating waves from the defect volume, without access to the evanescent waves in the near-field of a defect that carry the super-resolution information.
  • the presented design and the underlying concept can also be applied to other frequency bands of the electromagnetic spectrum, such as the infrared and X-ray, for all-optical detection of hidden objects or defects.
  • the single-pixel diffractive terahertz sensor 10 can be useful for a variety of applications, including industrial quality control, material inspection and security screening. [0088] Methods [0089] Numerical forward model of a single-pixel diffractive terahertz sensor.
  • the system consists of successive diffractive or substrate layers 20 that are modeled as thin dielectric optical modulation elements of different thicknesses, where the $i$th feature on the $l$th layer 20 at a spatial location $(x_i, y_i, z_i)$ represents a complex-valued transmission coefficient, $t_i^l$, which depends on the illumination wavelength ($\lambda$):
[0090] $t_i^l(x_i, y_i, z_i, \lambda) = a^l(x_i, y_i, \lambda)\exp\!\left(j\phi^l(x_i, y_i, \lambda)\right) \quad (1)$
The diffractive layers are connected to each other by free-space propagation, which is modeled through the Rayleigh-Sommerfeld diffraction equation, with an impulse response of $w_i(x, y, z, \lambda)$:
$w_i(x, y, z, \lambda) = \frac{z - z_i}{r^2}\left(\frac{1}{2\pi r} + \frac{1}{j\lambda}\right)\exp\!\left(\frac{j 2\pi r}{\lambda}\right) \quad (2)$
where $r = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2}$ and $j = \sqrt{-1}$.
  • each diffractive feature in a diffractive network can be modeled as the source of a secondary wave, as in Eq. (2).
  • the optical field modulated by the $i$th diffractive feature of the $l$th layer ($l \geq 1$, treating the input object plane as the 0th layer), $u_i^l(x, y, z, \lambda)$, can be written as:
[0094] $u_i^l(x, y, z, \lambda) = w_i^l(x, y, z, \lambda)\, t_i^l(x_i, y_i, z_i, \lambda)\, u^{l-1}(x_i, y_i, z_i, \lambda) \quad (3)$
where $u^{l-1}(x_i, y_i, z_i, \lambda) = \sum_k u_k^{l-1}(x_i, y_i, z_i, \lambda)$ denotes the total optical field formed by the secondary waves of the previous diffractive layer, evaluated at the $i$th feature location of the $l$th layer.
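A minimal numerical sketch of this forward model follows; the grid size, propagation distance, and function names are hypothetical, and the direct summation over secondary waves is written for clarity rather than speed (a production implementation would typically use an FFT-based propagator).

```python
import numpy as np

def rs_impulse_response(x, y, z, xi, yi, zi, lam):
    """Rayleigh-Sommerfeld secondary-wave kernel (Eq. (2)): a diffractive
    feature at (xi, yi, zi) acts as the source of a secondary wave that is
    evaluated at the observation point (x, y, z)."""
    r = np.sqrt((x - xi) ** 2 + (y - yi) ** 2 + (z - zi) ** 2)
    return ((z - zi) / r ** 2) * (1.0 / (2 * np.pi * r) + 1.0 / (1j * lam)) \
        * np.exp(1j * 2 * np.pi * r / lam)

def propagate(field, coords, z0, z1, lam):
    """Propagate a sampled field from plane z0 to plane z1 by summing the
    secondary waves of all source samples (direct, unoptimized summation)."""
    xx, yy = np.meshgrid(coords, coords)
    out = np.zeros((coords.size, coords.size), dtype=complex)
    dA = (coords[1] - coords[0]) ** 2
    for i in range(coords.size):        # i indexes y, j indexes x
        for j in range(coords.size):
            if field[i, j] == 0:
                continue
            out += field[i, j] * dA * rs_impulse_response(
                xx, yy, z1, coords[j], coords[i], z0, lam)
    return out

# Tiny demonstration: an on-axis point source at z = 0 mm observed at
# z = 10 mm; the diffracted intensity peaks on the optical axis.
coords = np.linspace(-2.0, 2.0, 21)            # lateral grid (mm)
field = np.zeros((21, 21), dtype=complex)
field[10, 10] = 1.0
out = propagate(field, coords, z0=0.0, z1=10.0, lam=0.8)
print(np.abs(out).argmax() == 10 * 21 + 10)
```

Cascading `propagate` with per-layer multiplication by the transmission coefficients of Eq. (1) reproduces the layer-to-layer forward model of Eq. (3).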
  • the axial distances between the input aperture 44/output aperture 46, diffractive layers 20 and the object 100 under test can be found in FIG.8B.
  • the amplitude and phase components of the complex transmittance of the $i$th feature of diffractive layer $l$, i.e., $a^l(x_i, y_i, \lambda)$ and $\phi^l(x_i, y_i, \lambda)$ in Eq. (1), are defined as a function of the material thickness, $h_i^l$, as follows:
[0097] $a^l(x_i, y_i, \lambda) = \exp\!\left(-\frac{2\pi \kappa(\lambda)\, h_i^l}{\lambda}\right) \quad (4)$
$\phi^l(x_i, y_i, \lambda) = \left(n(\lambda) - n_{\mathrm{air}}\right)\frac{2\pi h_i^l}{\lambda} \quad (5)$
where $n(\lambda)$ and $\kappa(\lambda)$ are the refractive index and the extinction coefficient of the diffractive layer material, corresponding to the real and imaginary parts of the complex-valued refractive index $\tilde{n}(\lambda)$, i.e., $\tilde{n}(\lambda) = n(\lambda) - j\kappa(\lambda)$.
  • the additional base thickness, $h_{\mathrm{base}}$, is a constant, which is chosen as 0.6 mm to serve as the substrate support for the diffractive layers.
  • to constrain the thickness of each diffractive feature during training, an associated latent trainable variable $h_v$ was defined using:
[00102] $h_{\mathrm{trainable}} = \frac{h_{\max}}{2}\left(\sin(h_v) + 1\right) \quad (7)$
[00103] Note that before the training, the latent variables $h_v$ of all the diffractive features were initialized as 0.
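The thickness parameterization and the thickness-to-transmission mapping can be sketched together. The maximum trainable thickness `h_max` and the material constants `n` and `kappa` below are illustrative assumptions; the 0.6 mm base thickness, the sin-based latent mapping, and the zero initialization follow the disclosure.

```python
import numpy as np

def thickness_from_latent(h_v, h_max=1.0, h_base=0.6):
    """Map an unconstrained latent variable h_v to a physical feature
    thickness: h = h_base + (h_max / 2) * (sin(h_v) + 1), combining Eq. (7)
    with the constant 0.6 mm substrate support. h_max = 1.0 mm is an
    assumed (illustrative) maximum trainable thickness."""
    return h_base + (h_max / 2.0) * (np.sin(h_v) + 1.0)

def transmission(h, lam, n, kappa):
    """Complex transmission coefficient of a feature of thickness h per
    Eqs. (4)-(5): absorption via kappa, phase delay via (n - n_air)."""
    a = np.exp(-2 * np.pi * kappa * h / lam)
    phi = (n - 1.0) * 2 * np.pi * h / lam
    return a * np.exp(1j * phi)

# Latent variables initialized to 0, as in the training procedure; the
# material constants n and kappa below are illustrative, not measured values.
h_v = np.zeros((4, 4))
h = thickness_from_latent(h_v)
t = transmission(h, lam=0.8, n=1.7, kappa=0.05)
print(float(h[0, 0]), bool(np.abs(t[0, 0]) < 1.0))
```

Because the sin mapping is smooth and bounded, gradients can flow through `h_v` during optimization while the physical thickness stays within fabricable limits.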
  • the resulting complex field is calculated using Eq. (3), except that the complex refractive index $\tilde{n}(\lambda)$ of the diffractive layer material is replaced by the $\tilde{n}(\lambda)$ of the object material.
  • when the propagating optical field encounters a defect inside the object 100, one can model the defect as a tiny volume element of a certain size, which can also be treated as a thin optical modulation element.
  • a single-pixel spectroscopic detector 14 positioned at the output plane measures the power spectrum of the resulting optical field within the active area of the detector, D, where the resulting spectral signal can be denoted as o(λ): [00111] o(λ) = ∫∫_D |u_o(x, y, λ)|² dx dy (11), where u_o denotes the optical field at the output plane.
  • the spectral intensity of the diffractive network output was sampled at a pair of wavelengths, λ1 and λ2, resulting in the spectral intensity values o(λ1) and o(λ2).
  • the output detection signal, s, of the diffractive sensor is given by: [00113] s = o(λ1) − o(λ2) (12), which boils down to a differential detection scheme where o(λ1) > o(λ2) indicates the existence of hidden defects and o(λ1) ≤ o(λ2) indicates a defect-free sample.
  • λ1 and λ2 were empirically selected as 0.8 and 1.1 mm, respectively.
  • the numbers of true positive, false positive, true negative and false negative samples, denoted as N_TP, N_FP, N_TN and N_FN, respectively, were analyzed. Based on these, the sensitivity (i.e., true positive rate, TPR), the specificity, the false negative rate (FNR) and the false positive rate (FPR) are reported using: [00115] Sensitivity = TPR = N_TP/(N_TP + N_FN) = 1 − FNR (13), [00116] Specificity = N_TN/(N_TN + N_FP) (14), [00117] False positive rate = FPR = 1 − Specificity = N_FP/(N_FP + N_TN) (15).
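A direct transcription of Eqs. (13)–(15) into code might look like the following (the function and dictionary key names are illustrative, not from the original):

```python
def detection_metrics(n_tp, n_fp, n_tn, n_fn):
    # Eqs. (13)-(15): sensitivity/specificity and the derived FNR/FPR
    # from confusion-matrix counts.
    sensitivity = n_tp / (n_tp + n_fn)   # TPR = 1 - FNR
    specificity = n_tn / (n_tn + n_fp)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "fnr": 1.0 - sensitivity,
        "fpr": 1.0 - specificity,        # = n_fp / (n_fp + n_tn)
    }
```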
  • the optical field at the input aperture of the system has a flat spectral magnitude, i.e., the total power of the illumination beam at λ1 and λ2 is equal in the numerical simulations.
  • the pulsed terahertz source 36 in the experimental TDS set-up, however, exhibited a different spectral profile within the band of operation.
  • a normalization step was performed on the experimentally measured output power spectra of all the test samples using a linear correction factor β, which was obtained from the measured spectral intensity values corresponding to the spectral peaks closest to λ1 and λ2, respectively, using the defect-free sample as the test object 100 after averaging multiple spectral measurements.
  • FIG. 11E illustrates the normalized experimental spectral signal obtained after applying this correction factor.
  • the loss function used for training the presented diffractive terahertz sensor is defined as: [00124] L = L_D + α_E·L_E (18), where α_E is a constant weight balancing the two loss terms.
  • the first loss term, L_D, stands for the defect detection loss.
  • the focal loss was employed, given by: [00126] L_D = −α·(1 − s′)^γ·log(s′), if g = 1; L_D = −(1 − α)·(s′)^γ·log(1 − s′), if g = 0 (19), [00127] where g denotes the ground-truth label of the given 3D object volume, indicating the existence of the hidden defect (g = 1) or not (g = 0), and s′ denotes the normalized detection score. α denotes the coefficient balancing the loss magnitude between the positive and negative samples, and γ is the focusing parameter used to down-weight the importance of the easy-to-classify samples (e.g., 3D objects with relatively larger hidden defect(s)).
  • α and γ were empirically chosen as 0.5 and 4, respectively, throughout the training process.
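Under these stated settings (α = 0.5, γ = 4) and assuming the loss operates on a normalized detection score in (0, 1), a sketch of a focal loss of this form is:

```python
import math

def focal_loss(score, label, alpha=0.5, gamma=4.0, eps=1e-7):
    # Binary focal loss on a normalized score in (0, 1): easy, well-classified
    # samples are down-weighted by the (.)^gamma modulating factor, while
    # alpha balances the positive/negative class contributions.
    s = min(max(score, eps), 1.0 - eps)  # clip for numerical stability
    if label == 1:
        return -alpha * (1.0 - s) ** gamma * math.log(s)
    return -(1.0 - alpha) * s ** gamma * math.log(1.0 - s)
```

The γ = 4 exponent strongly suppresses the gradient contribution of samples the sensor already classifies correctly, focusing training on hard cases such as the smallest hidden defects.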
  • the single-pixel diffractive sensor 10 should also be photon efficient, achieving a decent SNR at its output. Therefore, a loss term, L_E, was added to the training loss function to increase the diffraction power efficiency at the output single-pixel aperture.
  • L_E is defined as: [00129] L_E = η_th − η, if η < η_th; L_E = 0, otherwise (20), where η_th refers to the penalization threshold for the output diffraction efficiency η, which was empirically selected as 0.01 during the training process. η is defined as: η = P_detector / P_input (21), where P_detector denotes the optical power of the field calculated within the active area D of the output single-pixel detector aperture 46 across the two wavelengths λ1 and λ2, and P_input denotes the power of the input illumination beam.
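A sketch of this efficiency penalty, with η_th = 0.01 as stated above (function and argument names are illustrative):

```python
def efficiency_loss(p_detector, p_input, eta_th=0.01):
    # Eq. (21): eta = P_detector / P_input, followed by the Eq. (20)-style
    # hinge penalty: only efficiencies below the threshold eta_th are penalized.
    eta = p_detector / p_input
    return max(eta_th - eta, 0.0)
```

Because the penalty vanishes once η exceeds the threshold, it nudges the design toward adequate output power without competing with the detection loss afterwards.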
  • a defect-free sample was also generated, and its replicas were mixed with each of these defect sample subsets at a 1:1 ratio during the training process, so that the numbers of samples with and without defects were balanced.
  • the error in the thickness of the silicon wafer was also considered in the training forward model.
  • the silicon thickness value was modeled as a random variable that follows a uniform distribution over [0.52, 0.53] mm in the forward model. This ensures that the trained diffractive sensor can provide detection performance that is resilient to variations in the thickness values of the silicon wafer test samples.
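This thickness randomization can be sketched as follows (the function name is illustrative):

```python
import random

def sample_wafer_thickness(rng=random, low=0.52, high=0.53):
    # Draw the silicon wafer thickness (in mm) from U[0.52, 0.53] for each
    # training forward pass, so the trained sensor tolerates wafer-to-wafer
    # thickness variations at test time.
    return rng.uniform(low, high)
```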
  • the single-pixel diffractive sensor model used herein was trained using TensorFlow (v2.5.0, Google LLC).
  • Adam optimizer was selected, and its parameters were taken as the default values.
  • the batch size was set as 32.
  • the learning rate was set as 0.001.
  • a workstation was used with a GeForce GTX 1080Ti GPU (Nvidia Inc.), a Core i7-8700 central processing unit (CPU, Intel Inc.) and 64 GB of RAM, running the Windows 10 operating system (Microsoft Inc.).
  • the training of the diffractive model was performed with 200 epochs, which typically required ⁇ 10 hours.
  • the best model was selected based on the detection performance quantified using the validation data set.
[00139] Fabrication of the test samples.
  • the defects on the silicon wafers (i.e., objects 100) were fabricated through the following procedure.
  • a SiO 2 layer was deposited on the silicon wafers using low-pressure chemical vapor deposition (LPCVD). Defect patterns were defined by photolithography, and the SiO 2 layer was etched in the defect regions using reactive-ion etching (RIE). After removing the remaining photoresist, defects in silicon wafers were formed through deep reactive-ion etching (DRIE) with the SiO2 layer serving as the etch mask. Finally, the SiO2 layer was removed through wet etching using a buffered oxide etchant (BOE). The depth of the defect regions, Dz, was measured with a Dektak 6M profilometer, which was ⁇ 0.25 mm for all the prepared test samples.
  • the diffractive layers 20 were fabricated using a 3D printer (Form 3, Formlabs). To assemble the printed diffractive layers 20 and input objects, a 3D-printed holder 16 (Objet30 Pro, Stratasys) was employed that was designed to ensure the proper placement of these components according to the numerical design.
[00141] Terahertz time-domain spectroscopy set-up. A Ti:Sapphire laser 32 was used to generate femtosecond optical pulses with a 78 MHz repetition rate at a center wavelength of 780 nm.
  • the laser beam was split into two parts: one part was used to pump the terahertz source, a plasmonic photoconductive nano-antenna array, and the other part was used to probe the terahertz detector, a plasmonic photoconductive nano-antenna array providing a high sensitivity and broad detection bandwidth.
  • the terahertz radiation generated by the source 36 was collimated and directed to the scanned sample using an off-axis parabolic mirror 40, as shown in FIG.10A.
  • the output optical signal from the terahertz detector 14 was amplified with a transimpedance amplifier (Femto DHPCA-100) and detected with a lock-in amplifier (Zurich Instruments MFLI).
  • the time-domain signal was obtained.
  • the corresponding spectrum was calculated by taking the Fourier transform of the time-domain signal, resulting in an SNR of 90 dB and an observable bandwidth of 5 THz for a time-domain signal span of 320 ps.
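The spectrum computation from the time-domain trace can be sketched as follows (the sampling values used in the example are arbitrary placeholders, not the set-up's actual 320 ps span):

```python
import numpy as np

def power_spectrum(signal, dt):
    # Fourier-transform a real-valued time-domain trace sampled at interval dt
    # and return the non-negative frequency axis with the corresponding power.
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    power = np.abs(np.fft.rfft(signal)) ** 2
    return freqs, power
```

The frequency resolution is 1/(N·dt), so a longer time-domain span yields a finer spectral grid, and the observable bandwidth is set by the sampling interval dt.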
  • the geometrical, positional, and/or material information of the defect or target feature of the object 100 is determined based on the signal intensities of one or more wavelengths detected at the one or more output detectors 14.
  • the diffractive sensor 10 may also include polarization-sensitive elements that provide polarization-aware sensing capability and sense defects or target features of the object 100 using polarization characteristics based on the signal intensities of one or more wavelengths detected at the one or more output detectors 14.
  • while the experimental setup used the diffractive sensor 10 to sense a defect, the diffractive sensor 10 may also be used to sense certain target features. These may include, for example, a structure or object within the sample (or the lack of a structure, such as a void).
  • the target feature may also include a material type. The invention, therefore, should not be limited, except to the following claims, and their equivalents.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Pathology (AREA)
  • Chemical & Material Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Toxicology (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

A diffractive sensor rapidly detects target features or hidden defects in a sample using a single-pixel spectroscopic detector, without scanning the sample or forming/processing its image. The diffractive sensor comprises passive diffractive layers that are optimized by deep learning to modify the spectrum of terahertz radiation based on the absence/presence of hidden structures or defects. The diffractive sensor all-optically probes the structural information of the sample volume and outputs a spectrum that directly indicates the presence or absence of hidden structures. A diffractive terahertz sensor was trained to detect hidden defects or target features in test samples, and its performance was evaluated by analyzing the detection sensitivity as a function of the size and position of unknown defects. Feasibility was validated by the successful detection of hidden defects under pulsed terahertz illumination. This technique will be useful for various applications, e.g., security screening, biomedical sensing, industrial quality control, anti-counterfeiting measures, and cultural heritage protection.
PCT/US2024/013612 2023-03-16 2024-01-30 Rapid detection of hidden defects and objects using a single-pixel diffractive terahertz sensor Pending WO2024191523A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363490745P 2023-03-16 2023-03-16
US63/490,745 2023-03-16

Publications (2)

Publication Number Publication Date
WO2024191523A2 true WO2024191523A2 (fr) 2024-09-19
WO2024191523A3 WO2024191523A3 (fr) 2025-05-30

Family

ID=92756250

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/013612 Pending WO2024191523A2 (fr) 2023-03-16 2024-01-30 Détection rapide de défauts et d'objets cachés à l'aide d'un capteur térahertz diffractif à pixel unique

Country Status (1)

Country Link
WO (1) WO2024191523A2 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119861050A (zh) * 2025-03-21 2025-04-22 Sun Yat-sen University Super-resolution terahertz single-pixel imaging method, system, device and medium
CN120064349A (zh) * 2025-01-24 2025-05-30 University of Science and Technology Beijing Single-crystal single-pixel data acquisition and analysis method based on neutron time-of-flight technology
CN120655638A (zh) * 2025-08-12 2025-09-16 Shaanxi Yuguang Feili Metal Materials Co., Ltd. Fine metal powder defect detection method and system based on image recognition

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014143235A1 (fr) * 2013-03-14 2014-09-18 Integrated Plasmonics Corporation Spectroscopie à assistance par lumière ambiante
WO2015195746A1 (fr) * 2014-06-18 2015-12-23 Innopix, Inc. Système d'imagerie spectrale pour une détection à distance et non invasive de substances cibles à l'aide de réseaux de filtres spectraux et de réseaux de capture d'image
US20230153600A1 (en) * 2020-05-09 2023-05-18 The Regents Of The University Of California Machine vision using diffractive spectral encoding


Also Published As

Publication number Publication date
WO2024191523A3 (fr) 2025-05-30

Similar Documents

Publication Publication Date Title
Li et al. Rapid sensing of hidden objects and defects using a single-pixel diffractive terahertz sensor
WO2024191523A2 (fr) Rapid detection of hidden defects and objects using a single-pixel diffractive terahertz sensor
Gupta et al. Deep learning enabled laser speckle wavemeter with a high dynamic range
Bai et al. All-optical image classification through unknown random diffusers using a single-pixel diffractive network
JP4707653B2 (ja) 静的マルチモードマルチプレックス分光法のための方法及びシステム
CN1224829C (zh) 微分数值孔径方法及装置
US20180286038A1 (en) Deep learning in label-free cell classification and machine vision extraction of particles
US20230162016A1 (en) Misalignment-resilient diffractive optical neural networks
Zhang et al. Advanced all-optical classification using orbital-angular-momentum-encoded diffractive networks
Mahjoubfar et al. Artificial Intelligence in Label-free Microscopy
Zhao et al. Piston detection in segmented telescopes via multiple neural networks coordination of feature-enhanced images
Yao et al. Fast terahertz image classification with a single-pixel detector
US20240288701A1 (en) Diffractive optical network for seeing through diffusive or scattering media
US20240019378A1 (en) Systems and methods for detecting foodborne pathogens by analyzing spectral data
Wang et al. Miniaturized spectrometer based on MLP neural networks and a frosted glass encoder
CN113390507B (zh) 光谱信息获取方法及光谱探测装置
Hoover et al. Polarization components analysis for invariant discrimination
Gu et al. Disordered-guiding photonic chip enabled high-dimensional light field detection
US20230401436A1 (en) Scale-, shift-, and rotation-invariant diffractive optical networks
Divitt et al. Spatial-spectral correlations of broadband speckle in around-the-corner imaging conditions
US20250138296A1 (en) Diffractive all-optical computing for quantitative phase imaging
CN113324954A (zh) 一种基于光谱成像的棱镜耦合表面等离激元共振测试系统
Algazinov et al. Methods of measuring the spectral characteristics and identifying the components of grain mixtures in real-time separation systems
He et al. Full Poincare mapping for ultra-sensitive polarimetry
Wen et al. A computational spectrometer for the visible, near, and mid-infrared enabled by a single-spinning film encoder

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24771333

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 2024771333

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE


ENP Entry into the national phase

Ref document number: 2024771333

Country of ref document: EP

Effective date: 20251016
