
WO2025116204A1 - Optical system capable of easily inspecting a large area and observation device comprising same - Google Patents


Info

Publication number
WO2025116204A1
Authority
WO
WIPO (PCT)
Prior art keywords
observation
light
image sensor
area
optical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/KR2024/011961
Other languages
English (en)
Korean (ko)
Inventor
류영화
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Psi System Inc
Original Assignee
Psi System Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020230172317A external-priority patent/KR102766468B1/ko
Application filed by Psi System Inc filed Critical Psi System Inc
Publication of WO2025116204A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination

Definitions

  • the present invention relates to an optical system capable of easily inspecting a wide area of an observation target and an observation device including the same.
  • Observation equipment inspects an observation target (a surface, etc.) or an observation area for various purposes, such as determining whether an abnormality has occurred.
  • Observation equipment uses either a magnifying optical system, whose magnification is greater than 1 so that the image formed on the image sensor is larger than the observation target (or observation area), or a reduction optical system, whose magnification is less than 1 so that the image formed on the image sensor is smaller than the observation target (or observation area).
  • Fig. 17a is a drawing showing the configuration of a device for observing an observation object using a reduction optical system
  • Fig. 17b is a drawing showing the configuration of a device for observing an observation object using a magnification optical system.
  • In Fig. 17a, light from the observation area (1710) passes through a reduction optical system (1720) and forms, on the image sensor (1730), an image that is smaller than the observation area (1710).
  • the observation device can inspect the observation area (1710) using the sensing value of the image sensor (1730).
  • When the observation device needs to observe multiple observation areas (1710a, 1710b) adjacent to each other within one observation target, it can observe them without difficulty because it includes a reduction optical system; that is, the observation device including the reduction optical system can collectively sense the observation areas and then observe the observation target as a whole through a post-processing (image processing) step.
  • In Fig. 17b, by contrast, the observation area passes through a magnifying optical system (1740), and an image larger than the observation area (1710) is formed on the image sensor (1750).
  • However, if the observation device is to observe multiple observation areas (1710a, 1710b) adjacent to each other within a single observation target, a problem arises as illustrated in Fig. 17b because it includes a magnifying optical system.
  • Since the size of the image sensor (1750) is larger than that of the observation area (1710), the image sensors (1750a, 1750b) for observing the aforementioned observation areas (1710a, 1710b) cannot be physically arranged side by side.
  • For this reason, when the observation area is smaller than the entire area of the observation object, conventional observation devices including magnifying optical systems must move the observation area across the entire observation object one step at a time, which makes observing the whole object considerably inconvenient (see the illustrative example below).
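  • As an illustrative numerical example (the figures here are not from the original disclosure): if an observation area is w = 1 mm wide and the magnifying optical system has magnification M = 5, the image formed on the image sensor is M·w = 5 mm wide, so each image sensor must be at least 5 mm wide. Two adjacent observation areas are only about 1 mm apart center to center, but the two corresponding image sensors would need to be at least 5 mm apart to avoid overlapping; hence they cannot be placed side by side, which is the problem of Fig. 17b. With a reduction optical system (M < 1), the image is smaller than the observation area (M·w < w), so sensors for adjacent areas can be arranged without conflict.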
  • One embodiment of the present invention aims to provide an optical system and an observation device including the same that can easily observe a wide area even if the observation area is relatively narrower than the observation target.
  • According to one embodiment, an observation device for observing an observation object includes one or more light sources for irradiating light for observation of an observation area; a plurality of imaging systems for transmitting the light to the observation area, or to the observation area and a reference mirror inside the imaging system, to generate object light or interference light containing optical characteristics of the observation object; a plurality of image sensors for receiving the light generated in each of the imaging systems and sensing the optical characteristics of the observation target; and a control unit for controlling the operation of each light source, controlling the operation of each image sensor, and observing the observation object based on the sensing value of each image sensor, wherein each imaging system is characterized in that the size of the image formed on the image sensor is larger than the image of the observation area.
  • each imaging system is characterized by including a deflection element that converts an optical path.
  • the image sensors are characterized in that they are arranged so as not to physically overlap.
  • The deflection element is characterized in that it changes the path of light so that the image sensors are positioned so as not to physically overlap, and each observation area is positioned a preset distance apart in the width direction and the length direction.
  • the deflection element is characterized in that it can be implemented as a mirror, a diffraction grating, a prism, or a metasurface.
  • The control unit is characterized in that it controls the operation of each image sensor in synchronization with the speed or position at which the observation target moves.
  • The control unit is characterized in that it can simultaneously observe an area of the observation target greater than a preset reference value.
  • Another embodiment provides an observation system including a stage for placing and moving an observation object, and an observation device for measuring optical characteristics of the observation object by observing an observation area in synchronization with the speed or position of the observation object on the stage, wherein the observation device includes one or more light sources for irradiating light for observation of the observation area; a plurality of imaging systems for propagating the light to the observation area, or to the observation area and a reference mirror inside the imaging system, thereby generating object light or interference light containing the optical characteristics of the observation object; a plurality of image sensors for receiving the light generated in each of the imaging systems and sensing the optical characteristics of the observation target; and a control unit for controlling the operation of each image sensor and observing the observation object based on the sensing value of each image sensor, wherein each imaging system is implemented so that the size of the image formed on the image sensor is larger than that of the image of the observation area.
  • the imaging system is characterized in that each observation area is arranged without any gaps in one direction, but is arranged at a preset interval in a direction perpendicular thereto.
  • an observation device which includes an optical mask that receives interference light that has passed through a focusing system and generates a plurality of phase-shifting interference patterns, wherein the optical mask includes an optical array that receives the interference light and induces phase shifts at different angles on its own, and a circularly polarizing beam splitter that receives the light that has passed through the optical array and reflects one circularly polarized component and transmits another circularly polarized component.
  • the optical array is implemented as a geometric phase optical element and is characterized by inducing a phase shift with respect to incident interference light.
  • the optical array is characterized in that it is implemented as a structure having a meta surface or a structure including a liquid crystal.
  • the optical array is characterized by separating incident light into a same polarization component and an opposite polarization component.
  • the circularly polarizing beam splitter is characterized by having a helical structure (Chirality).
  • the circularly polarizing beam splitter is characterized in that it reflects a circularly polarized light component having the same rotational direction as its own helical structure among light incident thereon, and transmits a circularly polarized light component rotating in the opposite direction.
  • a method for detecting an optical characteristic of an observation target by an observation device including a first detection process of detecting an interference pattern of object light and reference light transmitted through an optical mask, an acquisition process of obtaining a plurality of interference pattern images by grouping the interference patterns detected in the first detection process according to an optical axis rotation angle of a geometric phase optical pixel, and a second detection process of detecting an optical characteristic of the object light using the plurality of interference pattern images obtained in the acquisition process.
  • An observation device for observing an observation object comprises: at least one light source for irradiating light for observation of an observation area; a plurality of imaging systems for transmitting the light to the observation area, or to the observation area and a reference mirror inside the imaging system, to generate object light or interference light containing optical characteristics of the observation object; a plurality of image sensors for receiving the light generated in each imaging system and sensing the optical characteristics of the observation target; a plurality of optical masks, each placed in front of an image sensor on the optical path, for receiving the interference light passing through each imaging system and generating a plurality of phase-shifting interference patterns; and a control unit for controlling the operation of each image sensor and observing the observation object based on the sensing values of each image sensor, wherein each optical mask comprises an optical array that receives the interference light and induces phase shifts at different angles, and a circularly polarizing beam splitter that receives the light passing through the optical array, reflects one circularly polarized component, and transmits the other circularly polarized component.
  • An observation device for observing an observation object includes one or more light sources for irradiating light for observation of an observation area; a plurality of imaging systems for transmitting the light to the observation area, or to the observation area and a reference mirror inside the imaging system, to generate object light or interference light containing optical characteristics of the observation object; a plurality of image sensors for receiving the light generated in each of the imaging systems and sensing the optical characteristics of the observation target; and a control unit for controlling the operation of each image sensor and observing the observation object based on the sensing value of each image sensor, wherein the imaging systems are arranged so that the observation areas adjoin without gaps in one direction and are spaced at a preset interval in the direction perpendicular thereto.
  • the imaging system is characterized by including an objective lens.
  • the image sensor, the objective lens, and the observation area are characterized in that their centers are arranged on the same line or are arranged at positions within a preset error range from the same line.
  • The objective lens is characterized in that it has an area or diameter larger than that of the image sensor and the observation area.
  • the objective lens is characterized in that it is arranged parallel to an adjacent objective lens in one direction.
  • An observation system includes a stage for placing and moving an observation object, and an observation device for measuring optical characteristics of the observation object by observing an observation area in synchronization with the speed or position of the observation object on the stage, wherein the observation device includes one or more light sources for irradiating light for observation of the observation area; a plurality of imaging systems for propagating the light to the observation area, or to the observation area and a reference mirror inside the imaging system, to generate object light or interference light containing the optical characteristics of the observation object; a plurality of image sensors for receiving the light generated in each imaging system and sensing the optical characteristics of the observation target; and a control unit for controlling the operation of each image sensor and observing the observation object based on the sensing value of each image sensor, wherein the imaging systems are arranged so that the observation areas adjoin without gaps in one direction and are spaced at a preset interval in the direction perpendicular thereto.
  • The observation areas are characterized in that they are arranged without gaps in the direction perpendicular to the direction in which the stage moves the observation object, and are arranged at a preset interval in the direction in which the stage moves the observation object.
  • The control unit is characterized in that it controls the operation of each image sensor in synchronization with the speed of the stage or the position of the observation object.
  • FIG. 1 is a diagram illustrating an example of an observation system according to one embodiment of the present invention.
  • FIG. 2 is a drawing illustrating the configuration of an observation device according to the first embodiment of the present invention.
  • FIG. 3 is a drawing illustrating an example of an observation device according to the first embodiment of the present invention.
  • FIG. 4 is a drawing illustrating an observation area of an observation device according to the first embodiment of the present invention.
  • FIG. 5 is a drawing illustrating a process of observing an observation target by an observation device according to the first embodiment of the present invention.
  • Figure 6 is a drawing illustrating the configuration of an observation device according to the second embodiment of the present invention.
  • FIG. 7 is a drawing illustrating the configuration of an optical mask according to a second embodiment of the present invention.
  • FIG. 8 is a drawing explaining the optical characteristics of an optical array according to the second embodiment of the present invention.
  • FIG. 9 is a drawing illustrating the material properties of a material constituting an optical array according to a second embodiment of the present invention.
  • FIG. 10 is a drawing illustrating the structure of an optical array according to a second embodiment of the present invention.
  • FIG. 11 is a drawing illustrating the structure and chiral volume grating characteristics of a circular polarizing beam splitter according to the second embodiment of the present invention.
  • FIG. 12 is a drawing explaining the optical characteristics of a circular polarizing beam splitter according to the second embodiment of the present invention.
  • FIG. 13 is a drawing explaining the optical characteristics of an optical mask according to the second embodiment of the present invention.
  • FIG. 14 is a drawing illustrating an optical characteristic detection process of an observation device according to a second embodiment of the present invention.
  • FIG. 15 is a diagram illustrating a pixel structure of a phase-shifted interference pattern detected by an image sensor according to a second embodiment of the present invention.
  • FIG. 16 is a flowchart illustrating a method for detecting optical characteristics of an observation target by an observation device according to a second embodiment of the present invention.
  • Figure 17 is a drawing showing the configuration of a device for observing an object of observation using a reduction optical system or a magnification optical system.
  • FIG. 18 is a drawing illustrating an example of an observation device according to a third embodiment of the present invention.
  • FIG. 19 is a drawing illustrating an observation area of an observation device according to a third embodiment of the present invention.
  • Figure 20 is a drawing illustrating a process of observing an observation target by an observation device according to a third embodiment of the present invention.
  • Terms such as first, second, A, B, etc. may be used to describe various components, but the components should not be limited by these terms. The terms are only used to distinguish one component from another.
  • the first component may be referred to as the second component, and similarly, the second component may also be referred to as the first component.
  • The term "and/or" includes any combination of a plurality of related listed items or any one of a plurality of related listed items.
  • each configuration, process, procedure or method included in each embodiment of the present invention may be shared within a scope that is not technically contradictory to each other.
  • FIG. 1 is a diagram illustrating an example of an observation system according to one embodiment of the present invention.
  • an observation system (100) includes an observation device (110) and a stage (120).
  • The observation system (100) observes an observation object moving on the stage (120) for various purposes, such as detecting the shape of the observation object or determining whether an abnormality has occurred on its surface.
  • the observation system (100) observes the observation object by measuring the optical characteristics of the object light reflected from the observation object (e.g., semiconductor, display element, etc.).
  • the optical characteristics measured may include some or all of the phase, amplitude, and polarization components of the object light.
  • the observation device (110) measures the optical characteristics of the observation target to observe the observation area and the observation target.
  • Although the observation device (110) includes an imaging system that operates as a magnifying optical system, it can easily inspect an area wider than the observation area.
  • the observation device (110) observes the observation area in synchronization with the speed or position of the observation target on the stage (120), thereby easily inspecting an area wider than the observation area. Accordingly, the inconvenience of having to move the observation area within the entire area of the observation target to observe, as in the case of conventional observation devices, can be eliminated. A more specific configuration of the observation device (110) will be described later with reference to FIGS. 2 to 16.
  • the stage (120) places and moves an observation target.
  • the stage (120) moves the observation target placed on it toward the observation device (110) at a preset speed.
  • The observation device (110) can therefore observe the entire observation target even though it observes only one fixed position.
  • the stage (120) transmits the movement position with respect to a reference point to the observation device (110) in some cases, so that the observation device (110) can determine how much the observation target has moved from an initial position or a preset point, or where it is currently located.
  • the stage (120) may transmit the movement position at preset intervals, or may transmit its own position when the reference point or the observation target passes a preset point.
  • FIG. 2 is a drawing illustrating a configuration of an observation device according to a first embodiment of the present invention
  • FIG. 3a is a drawing illustrating an example of an implementation of an observation device according to the first embodiment of the present invention.
  • an observation device (110) includes one or more light sources (210a to 210n), a plurality of imaging systems (220a to 220n), a plurality of image sensors (230a to 230n), and a control unit (240).
  • Each imaging system (220) includes a beam splitter (310), a tube lens (320), a deflection element (330), and an objective lens (340).
  • the observation device (110) may further include a communication unit (250).
  • Each light source (210a to 210n) irradiates light for observation of the observation area (350).
  • Each imaging system (220a to 220n) transmits light to an observation area (350) or transmits light between the observation area (350) and a reference mirror (370) to generate object light or interference light containing optical characteristics of an observation target.
  • Each image sensor (230a to 230n) receives object light or interference light generated from each imaging system (220a to 220n) and senses optical characteristics of an observation target.
  • Each image sensor (230a to 230n) receives object light or interference light and senses optical characteristics of the object light, thereby enabling (a user of the device, etc.) to observe the observation target.
  • Each image sensor (230a to 230n) may perform sensing (On) or not perform sensing (Off). Since the image sensors (230a to 230n) perform sensing at a specific point in time under the control of the control unit (240), they may sense optical characteristics of an observation target in synchronization with the speed or position of the stage (120).
  • the control unit (240) controls the operation of each image sensor (230a to 230n), and observes the observation target based on the sensing value of each image sensor (230a to 230n).
  • the control unit (240) controls the operation of each image sensor (230a to 230n) in synchronization with the speed of the stage (120) or the position of the observation target, as will be described later with reference to FIG. 5.
  • The operation of each light source (210a to 210n) may be controlled instead; however, for convenience of control and operation, the control unit (240) may control the operation of each image sensor (230a to 230n).
  • When the control unit (240) controls each image sensor (230a to 230n) to perform sensing each time a preset time has elapsed, an area of the observation target greater than a preset reference value can be observed at one time (as will be described later with reference to FIG. 5).
  • In this case, the control unit (240) can control each image sensor (230a to 230n) to perform sensing at each such point in time.
  • By performing only a post-processing step that combines the sensing values acquired by each image sensor (230a to 230n) over a certain period of time, the control unit (240) can simultaneously observe an area of the observation target greater than the preset reference value, as sketched below.
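  • The following is a minimal sketch, in Python, of the kind of synchronization described above; it is illustrative only and not taken from the original disclosure. The function names (read_stage_position, trigger_sensors) and the assumption that the stage reports a single scalar position are placeholders for whatever interface the stage encoder and image sensors actually provide.

        # Illustrative control loop: trigger every image sensor each time the stage has
        # advanced by one observation-area length h, so that successive acquisitions tile
        # the observation target without gaps along the travel direction.
        def scan(read_stage_position, trigger_sensors, area_length_h, total_travel):
            frames = []                        # sensing values collected for later combination
            next_trigger = 0.0                 # stage position (e.g., in mm) of the next acquisition
            while next_trigger <= total_travel:
                position = read_stage_position()        # position reported by the stage encoder
                if position >= next_trigger:
                    frames.append(trigger_sensors())    # sense all observation areas at once
                    next_trigger += area_length_h       # wait until the stage advances by h again
            return frames

        if __name__ == "__main__":
            # Minimal simulation: the stage advances 0.1 mm per poll; observation-area length h = 1 mm.
            state = {"pos": 0.0}
            def read_stage_position():
                state["pos"] += 0.1
                return state["pos"]
            def trigger_sensors():
                return f"frame at {state['pos']:.1f} mm"
            print(scan(read_stage_position, trigger_sensors, area_length_h=1.0, total_travel=5.0))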
  • the observation device (110) may further include a communication unit (250).
  • the communication unit (250) receives, more specifically, from an encoder (not shown) in the stage (120), the (movement) position of a reference point in the stage (120) or the position of an observation target placed on the stage (120).
  • the communication unit (250) transmits the received information to the control unit (240), thereby allowing the control unit (240) to determine whether the observation target is located at a point where an image is to be sensed and control the image sensor (230).
  • Each imaging system (220a to 220n) can be implemented as a magnifying optical system (an optical system in which the size of the image formed on the image sensor, etc. is larger than that of the target area because the magnification is greater than 1) to observe the phase among the optical characteristics of the observation target, as shown in Fig. 3a.
  • the beam splitter (310) reflects light irradiated from the light source (210) to the observation area (350), and transmits light reflected from the observation area (350) to the image sensor (230).
  • the tube lens (320) and/or the objective lens (340) allow light reflected from the beam splitter (310) and traveling to the observation area (350) to be focused onto the observation area (350). Only the objective lens (340) may be included to perform the above-described operation, or both the tube lens (320) and the objective lens (340) may be included to perform the above-described operation.
  • The deflection element (330) is arranged on the optical path along which light reflected from the beam splitter (310) travels to the observation area (350), and changes the optical path.
  • The deflection element (330) is typically arranged between the tube lens (320) and the objective lens (340), and may be implemented with any element capable of changing the optical path, such as a mirror, a diffraction grating, a prism, or a metasurface.
  • The deflection element (330) redirects the light reflected from the beam splitter (310) toward the observation area (350).
  • the imaging system (220) operates as a magnifying optical system.
  • the area of the image sensor (230) is implemented to be larger than the area of the observation area (350).
  • Without the deflection element (330), the same problem as in the conventional magnifying optical system would arise, namely that the image sensors (230) would have to be arranged so as to physically overlap, as shown in FIG. 17b.
  • With the deflection element (330), the observation areas (350) can be formed so that they are spaced apart by a preset interval (r2) in the width direction and the length direction, as described later with reference to FIG. 4, and the adjacent image sensors (230a to 230n) can be arranged so that they do not physically overlap (r1 > 0).
  • Each light source (210a to 210n) can operate simultaneously, and each image sensor (230a to 230n) can sense the optical characteristics of the object light reflected from the observation target by receiving the interference light interfered by each imaging system (220a to 220n). Accordingly, even if the imaging systems (220) implemented as magnifying optical systems are arranged in an array, the occurrence of problems similar to those in the prior art can be resolved.
  • each imaging system (220a to 220n) can be implemented as a magnifying optical system as shown in Fig. 3b to observe the amplitude or polarization component of the optical characteristics of the observation target.
  • FIG. 3b is a drawing illustrating another embodiment of an observation device according to the first embodiment of the present invention.
  • each imaging system (220) includes a tube lens (320), a deflection element (330), an objective lens (340), a polarizing beam splitter (360), a reference mirror (370), and a QWP (Quarter Wave Plate, 380a to 380c).
  • the polarizing beam splitter (360) splits the light irradiated from the light source (210) into different directions according to the polarization direction, or causes the light reflected from the reference mirror (370) and the observation area (350) to interfere with each other and to advance to the image sensor (230).
  • the QWP (380a) is arranged on the path of light that is branched from the polarizing beam splitter (360) and proceeds to the observation area (350), and converts linearly polarized light incident on it into circularly polarized light and circularly polarized light into linearly polarized light.
  • the light branched from the polarizing beam splitter (360) passes through the QWP (380a) and is converted into either left-handed or right-handed circularly polarized light. Meanwhile, the light reflected from the observation area (350) passes through the QWP (380a) and is converted into linearly polarized light again.
  • QWP (380b) is placed on the path of light that is branched from the polarizing beam splitter (360) and proceeds to the reference mirror (370), and performs the same operation as QWP (380a). However, since the remaining light from the light branched from the polarizing beam splitter (360) is incident on QWP (380b), it has a polarization direction that is perpendicular to the polarization direction of the light that passes through QWP (380a).
  • the QWP (380c) is positioned in front of the image sensor (230) on the light path incident on the image sensor (230) and converts each of the linearly polarized lights into circularly polarized lights. As described above, the lights branched from the polarizing beam splitter (360) pass through each of the QWPs (380a, 380b) twice, and thus have a linearly polarized state with a different polarization direction. Accordingly, the QWP (380c) converts each of the lights into a circularly polarized state in front of the image sensor (230).
  • the tube lens (320) and/or the objective lens (340) focus the light that branches from the polarizing beam splitter (360) and travels to the observation area (350) onto the observation area (350).
  • The deflection element (330) performs the operation described above.
  • Since each of the imaging systems (220a to 220n) has the structure shown in FIG. 3a or FIG. 3b, the observation device (110) according to the first embodiment of the present invention conducts observation as shown in FIGS. 4 and 5.
  • FIG. 4 is a drawing illustrating an observation area of an observation device according to the first embodiment of the present invention
  • FIG. 5 is a drawing illustrating a process in which an observation object is observed by an observation device according to the first embodiment of the present invention.
  • FIGS. 4 and 5 are drawings illustrating a process in which light irradiated from each light source (210) passes through each imaging system (220) and observes an observation area (350) in a top view direction, and for convenience, four imaging systems are illustrated as forming four observation areas, but this is not necessarily limited thereto.
  • The observation area (350) formed by each imaging system (220) is implemented to be smaller than the objective lens (340).
  • Since a deflection element (330) is included in each imaging system (220), the observation areas (350) formed by the imaging systems (220) are spaced apart by a preset interval (r21) or more in the width (w) direction and by a preset interval (r22) or more in the length (h) direction.
  • The preset interval (r21) and the preset interval (r22) may each be an integer multiple of the width (w) and the length (h) of the observation area (350), respectively.
  • For example, the observation areas may be spaced apart by exactly the width (w) and length (h) of the observation area (350), or by n times the width (w) and length (h), respectively (where n is 2 or more), as in the worked example below.
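  • Worked example (values chosen here purely for illustration, not from the original disclosure): suppose the observation area is h = 1 mm long in the stage travel direction and adjacent observation areas are separated by r22 = n·h = 2 mm (n = 2). After the stage has advanced by h, a second acquisition covers the 1 mm strip immediately behind each initial position; after it has advanced by 2·h, a third acquisition covers the remaining 1 mm strip. The n gaps are thus filled after n additional acquisitions, and a continuous swath is obtained without repositioning the optics.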
  • Since the other observation areas (e.g., 350b in FIG. 5) are spaced apart from one observation area (e.g., 350a in FIG. 5), observation of the observation target proceeds as follows.
  • At any one point in time, observation is performed only within the observation areas (350); each image sensor (230) therefore observes the portion of the observation target corresponding to its own observation area (350).
  • the control unit (240) synchronizes with the speed or position of the stage (120) to conduct observation (controls whether to operate the image sensor).
  • The control unit (240) analyzes the speed or position information of the stage (120) to determine whether the stage (120) has moved by the width (w) or length (h) of the observation area (350) according to the movement direction of the stage (120).
  • the control unit (240) operates each image sensor (230) to conduct additional observation.
  • FIG. 5 illustrates an example in which the stage (120) moves in the direction of the length (h) of the observation area (350).
  • the observation target can be observed in an expanded form (510) of the observation area (350) as illustrated in FIG. 5d.
  • the observation device (110) can observe a desired area at one time after an appropriate amount of time has passed.
  • the observation device (110) can observe a desired area at one time even if it includes an imaging system (220) implemented as a magnifying optical system.
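  • The post-processing step of combining the sensing values into the expanded area (510) can be sketched as follows; this is an illustrative Python example, not code from the original disclosure, and the frame sizes and the assumption that each trigger advances the image by a whole number of rows are placeholders.

        # Illustrative stitching of successive acquisitions along the stage travel direction.
        import numpy as np

        def stitch_along_travel(frames, rows_per_step):
            """Place each frame rows_per_step rows further along the stage travel direction."""
            frame_rows, frame_cols = frames[0].shape
            total_rows = frame_rows + rows_per_step * (len(frames) - 1)
            expanded = np.zeros((total_rows, frame_cols), dtype=frames[0].dtype)
            for i, frame in enumerate(frames):
                r0 = i * rows_per_step                   # offset set by how far the stage has moved
                expanded[r0:r0 + frame_rows, :] = frame  # later frames overwrite any overlap
            return expanded

        if __name__ == "__main__":
            frames = [np.full((4, 6), i) for i in range(3)]             # three simulated acquisitions
            print(stitch_along_travel(frames, rows_per_step=4).shape)   # (12, 6): the expanded area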
  • The observation device (110) may include as many light sources (210a to 210n) as there are imaging systems (220a to 220n), with each light source irradiating light to one imaging system, but it is not necessarily limited thereto.
  • The light irradiated from one light source (210) may be split into a plurality of beams by an optical component such as a beam splitter; accordingly, the observation device (110) may include fewer light sources (210) than imaging systems (220a to 220n), or only one.
  • FIG. 6 is a drawing illustrating the configuration of an observation device according to the second embodiment of the present invention
  • FIG. 14 is a drawing illustrating an optical characteristic detection process of an observation device according to the second embodiment of the present invention.
  • an observation device (600) according to a second embodiment of the present invention further includes a plurality of optical masks (610a to 610n) in the configuration of the observation device (110) according to the first embodiment of the present invention.
  • Each optical mask (610a to 610n) receives interference light that has passed through each imaging system (220a to 220n) in front of each image sensor (230a to 230n) on the optical path, and generates a plurality of phase-shifting interference patterns.
  • each optical mask (610a to 610n) is arranged between the QWP (380c) and the image sensor (230) and can perform the above-described operation.
  • By using a geometric phase optical element, each optical mask (610a to 610n) can generate a plurality of phase-shifting interference patterns without requiring multiple exposures or multiple image sensors as in the related art.
  • the specific structure and operation of each optical mask (610a to 610n) are illustrated in FIGS. 7 to 13.
  • FIG. 7 is a drawing illustrating the configuration of an optical mask according to a second embodiment of the present invention.
  • an optical mask (610) includes an optical array (710) and a circular polarizing beam splitter (720).
  • The optical array (710) receives the interference light that has passed through the imaging system (220) and induces phase shifts, doing so at multiple different angles across the interference light.
  • Since the optical array (710) is implemented as a geometric phase optical element as shown in FIGS. 8 and 9, it induces a phase shift in the incident interference light.
  • FIG. 8 is a drawing illustrating optical characteristics of an optical array according to a second embodiment of the present invention
  • FIG. 9 is a drawing illustrating material characteristics of materials constituting an optical array according to a second embodiment of the present invention
  • FIG. 10 is a drawing illustrating a structure of an optical array according to a second embodiment of the present invention.
  • the optical array (710) is implemented as a geometric phase optical element.
  • the optical array (710) may be implemented as a structure having a meta surface as shown in FIG. 9a, or as a structure including a liquid crystal as shown in FIG. 9b.
  • the optical array (710) can be implemented as a structure having a meta surface as illustrated in FIG. 9A.
  • The anisotropic structure (910) having a metasurface is formed of a high-refractive-index material with a rectangular cross-section and a tall columnar shape, and can be arranged in a locally rotated form at each position.
  • The rectangular cross-section of the high-refractive-index material, with a nano-scale size smaller than the operating wavelength, exhibits optical anisotropy, and its rotated arrangement induces rotation of the optical axis.
  • The anisotropic structures (910) having a metasurface can be manufactured by an E-beam lithography process or a semiconductor process capable of precisely fabricating nano-sized structures.
  • Alternatively, the optical array (710) may be implemented as a liquid-crystal-based anisotropic structure (910) as illustrated in FIG. 9b. Since the liquid-crystal-based anisotropic structure (910) itself exhibits the characteristics of an anisotropic material, the structure (910) may be arranged in a rotated form, following the liquid crystal alignment directions θ(x) at each local position, to induce a geometric phase effect. When the optical array (710) is implemented as a liquid-crystal-based anisotropic structure, it can be produced relatively inexpensively.
  • the optical array (710) implemented as a geometric phase optical element has the optical characteristics illustrated in Fig. 8.
  • The optical array (710) produces an additional geometric phase shift owing to the local optical-axis direction of the anisotropic material. Accordingly, when the optical array (710) is implemented with an anisotropic material having a phase retardance Γ, and circularly polarized light in one direction is incident on it, the output light can be written as

        E_GP = T·E_in = cos(Γ/2)·E_in − i·sin(Γ/2)·e^{i2σθ}·E_in^⊥

    where T represents the Jones matrix of the optical array (710), E_in represents the light incident on the optical array (710), E_GP represents the light emitted from the optical array (710), σ = ±1 denotes the handedness of the incident circular polarization (σ = +1 for left-hand circular polarization), E_in^⊥ denotes the circular polarization state orthogonal to the incident light, θ is the rotation angle of the optical axis of the anisotropic material, and Γ = 2π(n_e − n_o)t/λ, where n_e is the refractive index of the fast axis of the anisotropic material, n_o is the refractive index of the slow axis of the anisotropic material, and t is the thickness of the anisotropic material in the direction of light propagation.
  • the optical array (710) having these characteristics is implemented with an anisotropic material that induces a geometrical phase effect, as illustrated in FIG. 10, and includes a plurality of optical pixels (1010) implemented in an array form.
  • the optical pixel (1010) does not receive interference light from different points (of the observation object) as in the past, but multiple (at least three or more) adjacent optical pixels (1010a to 1010d) receive interference light from the same point (of the observation object).
  • Each optical pixel (1010) receiving interference light from the same point is implemented with an anisotropic material having the properties described above, but has a different optical axis rotation angle. For example, as illustrated in FIG. 10, assuming that four optical pixels (1010a to 1010d) adjacent to each other horizontally (x-axis direction) and vertically (y-axis direction) receive interference light from the same point, the optical pixels (1010a to 1010d) can have optical axis rotation angles of 0°, 45°, 90°, and 135°, respectively.
  • the polarization component whose phase is shifted (delayed) experiences a phase shift of 0°, 90°, 180°, and 270°, which is twice the optical axis rotation angle.
  • The optical pixels (1010a to 1010d) within the optical array (710) can be arranged repeatedly with a period of n optical pixels.
  • For example, when the optical pixels (1010a to 1010d) are arranged adjacent to each other horizontally and vertically, they can be arranged repeatedly with a period of two optical pixels in the horizontal and vertical directions within the optical array (710). Even if the optical array (710) with such an arrangement of optical pixels (1010) receives the interference light only once, all of the phase-shifting interference patterns required to analyze the optical characteristics of the observation target can be obtained (see the sketch below).
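  • The relationship between the optical axis rotation angle and the induced phase shift can be checked with a short Jones-calculus sketch; this is an illustrative Python example under standard sign conventions, not code from the original disclosure.

        # Illustrative check: a geometric-phase pixel with optical-axis angle theta imparts a
        # phase of 2*theta on the circular component it converts, so axis angles of
        # 0°, 45°, 90°, 135° yield phase shifts of 0°, 90°, 180°, 270° (up to a constant offset).
        import numpy as np

        def gp_pixel_jones(theta, retardance):
            """Jones matrix of a retarder with the given retardance and optical axis at theta."""
            c, s = np.cos(2 * theta), np.sin(2 * theta)
            return (np.cos(retardance / 2) * np.eye(2)
                    - 1j * np.sin(retardance / 2) * np.array([[c, s], [s, -c]]))

        left_circ = np.array([1, 1j]) / np.sqrt(2)     # left-circular input (one sign convention)
        right_circ = np.array([1, -1j]) / np.sqrt(2)   # the opposite-handed circular state

        for deg in (0, 45, 90, 135):
            out = gp_pixel_jones(np.radians(deg), retardance=np.pi / 2) @ left_circ
            converted = right_circ.conj() @ out        # projection onto the converted component
            print(deg, round(np.degrees(np.angle(converted)) % 360, 1))   # advances by 2*deg per step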
  • Unlike the conventional approach of acquiring the interference patterns with different phase shift angles one after another, the optical array (710) can fundamentally block the influence of external disturbances such as vibration, and multiple image sensors are not needed. Since only a very thin optical mask (610) needs to be placed in front of one image sensor (230), the structure is simple, and the overall volume of the optical mask (610), and of the observation device (600) including it, can be significantly reduced.
  • each polarization component emitted from the optical array (710) is incident on a circular polarization beam splitter (720). Since the circular polarization beam splitter (720) has the characteristics illustrated in FIG. 11, it can operate as illustrated in FIG. 12.
  • FIG. 11 is a drawing illustrating the structure and chiral volume grating characteristics of a circular polarizing beam splitter according to a second embodiment of the present invention
  • FIG. 12 is a drawing explaining the optical characteristics of a circular polarizing beam splitter according to a second embodiment of the present invention
  • FIG. 13 is a drawing explaining the optical characteristics of an optical mask according to the second embodiment of the present invention.
  • the circular polarizing beam splitter (720) has a helical structure, i.e., chiral characteristics.
  • a device in which the molecular directions of anisotropic structures (1110) are sequentially rotated and aligned along the vertical axis (y-axis) direction is called a chiral device.
  • the chiral dopant can induce chiral alignment of the nematic liquid crystal.
  • the period T y of the chiral characteristics can be determined according to the concentration of the chiral dopant.
  • the chiral liquid crystal alignment layer can be fabricated as a volume grating.
  • the propagation of light in a volume grating satisfies the Bragg diffraction condition between the propagation constants of the incident and outgoing light and the grating vector according to Floquet's theorem.
  • the circularly polarizing beam splitter (720) operates as illustrated in FIG. 12.
  • the circularly polarizing beam splitter (720) reflects a circularly polarized component of light incident on it that has the same rotational direction as its own helical structure, and transmits a circularly polarized component rotating in the opposite direction. More specifically, since the volume grating characteristic is selectively exhibited in the circularly polarizing beam splitter (720) depending on the circularly polarizing direction of the incident light, the circularly polarizing beam splitter (720) can selectively perform the function of a mirror or a beam splitter depending on the polarization direction of the incident light. For example, when the circularly polarizing beam splitter (720) has a left helicity as illustrated in FIG. 12, it can reflect a left circularly polarized component of light incident on it, and transmit only a right circularly polarized component.
  • the optical mask (610) operates as illustrated in FIG. 13 as it includes an optical array (710) and a circular polarizing beam splitter (720) having the characteristics described above.
  • Each light passing through the optical array (710) is emitted as a superposition of the same circular polarization component as the incident light, without a phase shift, and the opposite circular polarization component, with a phase shift.
  • a circularly polarizing beam splitter (720) receives light emitted from an optical array (710), reflects only one polarization component (e.g., left-hand circularly polarized light) among the incident polarization components, and transmits the remaining polarization components.
  • One of the transmitted polarization components corresponds to light whose phase has not shifted, and the other corresponds to light whose phase has shifted by twice the optical axis rotation angle.
  • the components transmitting through the circularly polarizing beam splitter (720) interfere with each other to form an interference pattern, and then proceed to the image sensor (230).
  • When the thickness (t) of the optical array (710) is adjusted so that the phase retardance takes an appropriate value (e.g., Γ = π/2), the interference light passing through the optical mask (610) forms the following interference pattern:

        I(θ) = |E_O + E_R·e^{i2θ}|² = |E_O|² + |E_R|² + 2|E_O||E_R|·cos(φ_O − φ_R + 2θ)

    where E_O is the object light component and E_R is the reference light component of the incident light after transmission through the optical array (710) and the circularly polarizing beam splitter (720), φ_O and φ_R are their phases, and I(θ) represents the interference pattern image for the optical axis rotation angle θ.
  • the image sensor (230) receives a plurality of phase-shifting interference patterns transmitted through the optical mask (610) and measures the optical characteristics of the object light reflected from the observation target based on the received interference patterns. As illustrated in FIG. 15, the image sensor (230) can acquire (a plurality of) phase-shifting interference pattern images that are phase-shifted by the same angle depending on the optical axis rotation angle.
  • FIG. 15 is a diagram illustrating a pixel structure of a phase-shifted interference pattern detected by a detection unit according to a second embodiment of the present invention.
  • the image sensor (230) can group a plurality of interference patterns that are phase-shifted by the same angle according to the optical axis rotation angle, thereby obtaining a plurality of phase-shifted interference patterns.
  • the image sensor (230) can detect optical characteristics (phase, amplitude, polarization) of object light from the obtained plurality of phase-shifted interference patterns.
  • Since the image sensor (230) receives at least three interference patterns shifted by different angles from the optical mask (610), the amplitude of the object light, the amplitude of the reference light, and the phase of the object light relative to the reference light can each be detected. For example, with the four patterns I(0°), I(45°), I(90°), I(135°) (phase shifts of 0°, 90°, 180°, 270°), the standard phase-shifting relations give

        A_O·A_R·cos Δφ = [I(0°) − I(90°)]/4,   A_O·A_R·sin Δφ = [I(135°) − I(45°)]/4,
        Δφ = tan⁻¹{[I(135°) − I(45°)] / [I(0°) − I(90°)]},   A_O² + A_R² = [I(0°) + I(45°) + I(90°) + I(135°)]/4

    where A_R represents the amplitude of the reference light, A_O represents the amplitude of the object light, and Δφ represents the phase of the object light relative to the reference light.
  • the image sensor (230) can detect the phase and amplitude of the object light.
  • the image sensor (230) can detect the polarization state of the object light by using a plurality of interference patterns received.
  • The image sensor (230) can detect the Stokes parameters ({S0, S1, S2, S3}) representing the polarization of the light from the amplitude of the left-circular polarization, the phase of the left-circular polarization, the amplitude of the right-circular polarization, and the phase of the right-circular polarization.
  • the image sensor (230) can detect the Stokes parameter according to the following equation.
  • Here, a_r is the amplitude of the right-circular polarization, a_l is the amplitude of the left-circular polarization, and φ_r and φ_l are the phases of the right- and left-circular polarization, respectively.
  • Each Stokes parameter is then calculated as follows:

        S_0 = a_r² + a_l²,   S_1 = 2·a_r·a_l·cos(φ_r − φ_l),   S_2 = 2·a_r·a_l·sin(φ_r − φ_l),   S_3 = a_r² − a_l²
  • the image sensor (230) can detect the polarization state from the amplitude and phase of the object light as described above.
  • In this way, the image sensor (230) can detect the optical characteristics of the object light. Accordingly, the image sensor (230) can detect a three-dimensional shape (of the observation target) with a pixel size corresponding to one group of optical pixels.
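  • The grouping and phase-detection steps described above can be sketched as follows; this is an illustrative Python example using the standard four-step relations, not code from the original disclosure, and the assignment of optical-axis angles to positions within the 2x2 super-pixel is an assumption made only for illustration.

        # Illustrative grouping of the raw sensor image into four phase-shifted interference
        # images (axis angles 0°, 45°, 90°, 135° -> phase shifts 0°, 90°, 180°, 270°) followed
        # by four-step retrieval of the phase of the object light relative to the reference light.
        import numpy as np

        def group_phase_shifted(raw):
            """Split a raw image into the four phase-shifted sub-images of the 2x2 super-pixels."""
            i0   = raw[0::2, 0::2]   # axis angle   0°  -> phase shift   0°
            i90  = raw[0::2, 1::2]   # axis angle  45°  -> phase shift  90°
            i180 = raw[1::2, 1::2]   # axis angle  90°  -> phase shift 180°
            i270 = raw[1::2, 0::2]   # axis angle 135°  -> phase shift 270°
            return i0, i90, i180, i270

        def four_step_phase(i0, i90, i180, i270):
            """Standard four-bucket relation for the object-vs-reference phase."""
            return np.arctan2(i270 - i90, i0 - i180)

        if __name__ == "__main__":
            # Simulate a raw image whose super-pixels encode a known phase of 30 degrees.
            phase = np.radians(30.0)
            shifts = {(0, 0): 0.0, (0, 1): np.pi / 2, (1, 1): np.pi, (1, 0): 3 * np.pi / 2}
            raw = np.zeros((8, 8))
            for (row, col), shift in shifts.items():
                raw[row::2, col::2] = 2.0 + np.cos(phase + shift)   # DC term + interference term
            print(np.degrees(four_step_phase(*group_phase_shifted(raw))).round(1))  # ~30.0 everywhere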
  • The observation device (600) may include as many light sources (210a to 210n) as there are imaging systems (220a to 220n), with each light source irradiating light to one imaging system, but it is not necessarily limited thereto.
  • The light irradiated from one light source (210) may be split into a plurality of beams by an optical component such as a beam splitter; accordingly, the observation device (600) may include fewer light sources (210) than imaging systems (220a to 220n), or only one.
  • FIG. 16 is a flowchart illustrating a method for detecting optical characteristics of an observation target by an observation device according to a second embodiment of the present invention.
  • the image sensor (230) detects an interference pattern of the object light and reference light transmitted through the optical mask (610) (S1610).
  • More specifically, the image sensor (230) detects the interference pattern formed by the object light and the reference light, which are brought into the same polarization state after being transmitted through the optical mask (610).
  • the image sensor (230) groups the detected interference patterns according to the optical axis rotation angle of the geometric phase optical pixels to obtain multiple interference pattern images (S1620).
  • the image sensor (230) detects the optical characteristics of the object light using the acquired multiple interference pattern images (S1630).
  • FIG. 18 is a drawing illustrating an example of an observation device according to a third embodiment of the present invention.
  • an observation device is also implemented in the same manner as the observation device (110) according to the first embodiment of the present invention, but may have different types of imaging systems (220a to 220n).
  • Each imaging system (220) includes a beam splitter (310), a tube lens (320), and an objective lens (1810).
  • the beam splitter (310) reflects light irradiated from the light source (210) to the observation area (1820), and transmits light reflected from the observation area (1820) to the image sensor (230).
  • the tube lens (320) and/or the objective lens (1810) focus the light that is reflected from the beam splitter (310) and travels to the observation area (1820) onto the observation area (1820). Only the objective lens (1810) may be included to perform the above-described operation, or both the tube lens (320) and the objective lens (1810) may be included to perform the above-described operation.
  • The objective lens (1810) has a (single-sided) area or diameter significantly larger than the area of the observation area (1820), and also larger than that of the image sensor (230); its (single-sided) area or diameter is several times to several tens of times that of the observation area (1820). Accordingly, the objective lens (1810) can focus light reflected from the beam splitter (310), or light passing through the tube lens (320), onto the observation area (1820), and directs light reflected from the observation area (1820) back toward the tube lens (320) or the beam splitter (310).
  • The centers of the image sensor (230), the objective lens (1810, together with the tube lens), and the observation area (1820) are arranged on the same line, or at positions within a preset error range from that line. Since the (cross-sectional) area or diameter of the objective lens (1810) is larger than the size or diameter of the image sensor (230) or the observation area (1820), when each optical component (230, 1810, 1820) is arranged as described above, the interval (r2) between adjacent objective lenses (1810) is smaller than the interval (r1) between adjacent image sensors (230) or between observation areas (1820). Therefore, even if the imaging systems (220) are implemented as magnifying optical systems, each optical component within the imaging system can be arranged and operated without difficulty.
  • The observation areas (1820) are arranged side by side in the direction perpendicular to the moving direction of the stage (120) (on the same plane), as will be described later with reference to FIG. 19.
  • Each light source (210a to 210n) can perform operation simultaneously, and each image sensor (230a to 230n) can sense the optical characteristics of the object light reflected from the observation target by receiving the interference light interfered by each imaging system (220a to 220n). Accordingly, even if the imaging systems (220) implemented as magnifying optical systems are arranged in an array, the occurrence of problems similar to those in the prior art can be resolved.
  • Each imaging system (220a to 220n) can also be implemented as a magnifying optical system (an optical system in which the size of the image formed on the image sensor, etc. is larger than that of the target area because the magnification is greater than 1) to observe the amplitude or polarization component among the optical characteristics of the observation target, as illustrated in Fig. 18b.
  • FIG. 18b is a drawing illustrating another embodiment of an observation device according to a third embodiment of the present invention.
  • each imaging system (220) includes a tube lens (320), an objective lens (1810), a polarizing beam splitter (1830), a reference mirror (370), and QWP (Quarter Wave Plate, 1840a to 1840c).
  • the polarizing beam splitter (1830) splits the light irradiated from the light source (210) into different directions according to the polarization direction, or causes the light reflected from the reference mirror (370) and the observation area (1820) to interfere with each other and to advance to the image sensor (230).
  • the QWP (1840a) is arranged on the path of light that is branched from the polarizing beam splitter (1830) and proceeds to the observation area (1820), and converts linearly polarized light incident on it into circularly polarized light and circularly polarized light into linearly polarized light.
  • the light branched from the polarizing beam splitter (1830) passes through the QWP (1840a) and is converted into either left-handed or right-handed circularly polarized light. Meanwhile, the light reflected from the observation area (1820) passes through the QWP (1840a) and is converted into linearly polarized light again.
  • QWP (1840b) is placed on the path of light that proceeds to the reference mirror (370) among the light branched from the polarizing beam splitter (1830), and performs the same operation. However, since the remaining light among the light branched from the polarizing beam splitter (1830) is incident on QWP (1840b), it has a polarization direction that is perpendicular to the polarization direction of the light that passes through QWP (1840a).
  • the QWP (1840c) is positioned in front of the image sensor (230) on the light path incident on the image sensor (230) and converts each of the linearly polarized lights into circularly polarized lights. As described above, the lights branched from the polarizing beam splitter (1830) pass through each of the QWPs (1840a, 1840b) twice, and thus have a linearly polarized state with a different polarization direction. Accordingly, the QWP (1840c) converts each of the lights into a circularly polarized state in front of the image sensor (230).
  • The tube lens (320) and/or the objective lens (1810) focuses the light branched from the polarizing beam splitter (1830) onto the observation area (1820), and directs the light reflected from the observation area (1820) back toward the tube lens (320) or the polarizing beam splitter (1830).
  • Since each of the imaging systems (220a to 220n) has the structure shown in FIG. 18a or FIG. 18b, the observation device (110) according to the third embodiment of the present invention performs observation as shown in FIGS. 19 and 20.
  • FIG. 19 is a drawing illustrating an observation area of an observation device according to a third embodiment of the present invention
  • FIG. 20 is a drawing illustrating a process in which an observation object is observed by an observation device according to a third embodiment of the present invention
  • FIGS. 19 and 20 are drawings illustrating, in a top view, the process in which the light emitted from each light source (210) passes through each imaging system (220) and observes an observation area (1820); for convenience, three imaging systems are illustrated as forming three observation areas, but the invention is not necessarily limited thereto.
  • The centers of the image sensor (230), the objective lens (1810), and the observation area (1820) within one imaging system (220) are arranged to coincide or to lie at similar positions, and the (cross-sectional) area or diameter of the image sensor (230) is implemented to be larger than that of the observation area (1820).
  • The objective lens (1810) of each imaging system is arranged parallel to the adjacent objective lens, and the observation areas (1820) of the imaging systems are arranged without a gap (side by side) in the direction perpendicular to the moving direction of the stage (120) (on the same plane). As illustrated in FIG. 19, the objective lenses (1810a to 1810c) are arranged so that there is physically no gap between them (in parallel), and the observation areas (1820) are arranged without a gap in the direction perpendicular to the moving direction of the stage (120).
  • The observation device (110) can observe part or all of the observation target's area in the direction perpendicular to the movement direction of the stage (120), depending on the number of imaging systems (220), and the observation areas (1820) are formed at a preset interval from each other in the movement direction of the stage (120).
  • The preset interval may be an integer multiple of the length of the observation area (1820) (measured in the movement direction of the stage (120)).
  • That is, the arrangement has a structure in which n observation areas can be placed between adjacent observation areas (1820) in the movement direction of the stage (120) (an illustrative coverage sketch of this scan geometry follows these description excerpts).
  • Observation of the observation target can then be performed as described above with reference to FIG. 5.
  • Although FIG. 16 describes each process as being executed sequentially, this is merely an example for explaining the technical idea of one embodiment of the present invention.
  • A person having ordinary skill in the art to which one embodiment of the present invention belongs may, without departing from the essential characteristics of that embodiment, change the order described in each drawing or apply various modifications and variations, such as executing one or more of the processes in parallel. Therefore, FIG. 16 is not limited to a chronological order.
  • the processes illustrated in FIG. 16 can be implemented as computer-readable codes on a computer-readable recording medium.
  • the computer-readable recording medium includes all types of recording devices that store data that can be read by a computer system. That is, the computer-readable recording medium includes storage media such as magnetic storage media (e.g., ROM, floppy disk, hard disk, etc.) and optical reading media (e.g., CD-ROM, DVD, etc.).
  • the computer-readable recording medium can be distributed to computer systems connected to a network, so that the computer-readable codes can be stored and executed in a distributed manner.
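
The magnification relation referred to in the description excerpts above (magnification greater than 1, so the image formed on the image sensor is larger than the observation area) can be illustrated with a minimal numerical sketch. The values used below (observation-area width, magnification) are assumed for illustration only and are not dimensions disclosed in the application.

```python
# Minimal sketch of the magnifying-optical-system relation: with magnification
# M > 1, the image formed on the image sensor (230) is larger than the
# observation area (1820).  All numeric values are assumed for illustration.

def image_size_on_sensor(observation_area_mm: float, magnification: float) -> float:
    """Size of the image of the observation area at the image-sensor plane."""
    return observation_area_mm * magnification

observation_area_mm = 0.5   # assumed width of one observation area (1820), in mm
magnification = 4.0         # assumed magnification M > 1 of one imaging system (220)

image_mm = image_size_on_sensor(observation_area_mm, magnification)
print(f"{observation_area_mm} mm observation area -> {image_mm} mm image on the sensor")
# The 2.0 mm image exceeds the 0.5 mm observation area, which is why adjacent
# image sensors cannot simply be tiled edge to edge the way the observation areas are.
```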
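
The polarization path through the polarizing beam splitter (1830), the QWPs (1840a to 1840c), and the reference mirror (370) can likewise be illustrated with a minimal Jones-calculus sketch. This is a simplified model under assumed conventions (QWP fast axes at 45 degrees, the mirror and sample reflections treated as identity operations), not a specification of the disclosed device.

```python
# Minimal Jones-calculus sketch of the double pass through a quarter-wave plate.
# Assumptions: fast axis at 45 degrees; reflection treated as identity.
import numpy as np

def rot(theta: float) -> np.ndarray:
    """Rotation matrix acting on Jones vectors."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def qwp(theta: float) -> np.ndarray:
    """Jones matrix of a quarter-wave plate with its fast axis at angle theta."""
    return rot(theta) @ np.diag([1, 1j]) @ rot(-theta)

H = np.array([1, 0], dtype=complex)       # horizontally polarized light leaving the PBS (1830)
qwp45 = qwp(np.pi / 4)                    # QWP (1840a) with fast axis at 45 degrees (assumed)

one_pass = qwp45 @ H                      # toward the observation area: linear -> circular
round_trip = qwp45 @ qwp45 @ H            # out and back through the same QWP

print(np.round(np.abs(one_pass), 3))      # [0.707 0.707]: equal components, i.e. circular light
print(np.round(np.abs(round_trip), 3))    # [0. 1.]: the returning beam is vertically polarized
# Because the returning beam is orthogonal to the outgoing one, the polarizing
# beam splitter (1830) routes it toward the image sensor (230) rather than back
# toward the light source (210), as stated in the description.
```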
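
The staggered arrangement of observation areas and the stage scan can be illustrated with a short coverage sketch. The number of imaging systems, the observation-area length, the stagger multiple, the target length, and the stage step below are all assumed values chosen only to show that every stripe of the target is eventually imaged.

```python
# Minimal sketch of the scan geometry: observation areas (1820) abut across the
# stage width and are staggered along the stage motion by an integer multiple of
# the observation-area length.  All values are assumed for illustration only.
import numpy as np

n_systems  = 3      # imaging systems (220a to 220c), as drawn in FIG. 19 (assumed)
area_len   = 1.0    # observation-area length along the stage motion (assumed units)
stagger    = 2      # adjacent areas offset by an integer multiple of area_len (assumed)
target_len = 10.0   # length of the observation target along the stage motion (assumed)
step       = area_len            # stage advances one observation-area length per exposure

# Starting offset of each observation area along the stage-motion direction.
offsets = np.arange(n_systems) * stagger * area_len

# Each imaging system covers its own band across the width; track which stripe
# of the target (along the motion direction) each system has imaged.
covered = [np.zeros(int(target_len / area_len), dtype=bool) for _ in range(n_systems)]
for shift in np.arange(0.0, target_len + offsets.max() + step, step):
    for i, off in enumerate(offsets):
        idx = int(round((shift - off) / area_len))   # stripe currently under system i
        if 0 <= idx < covered[i].size:
            covered[i][idx] = True

# Once the stage has moved far enough, every stripe of every band has been imaged.
print(all(band.all() for band in covered))   # expected output: True
```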

Abstract

Disclosed are an optical system capable of easily inspecting a large area and an observation device comprising the same. According to one aspect of the present embodiment, the observation device for observing an observation target comprises: a plurality of light sources for emitting light in order to observe an observation area; a plurality of imaging systems for propagating the light toward the observation area, each containing a reference mirror for generating interference light that carries the optical characteristics of the observation target; a plurality of image sensors for receiving the interference light interfered by each imaging system and sensing the optical characteristics of the object light reflected from the observation target; and a control unit for controlling the operation of each light source and observing the observation target on the basis of the value sensed by each image sensor, wherein each imaging system is implemented such that the size of the image formed on the image sensor is larger than the image of the observation area.
PCT/KR2024/011961 2023-12-01 2024-08-12 Système optique capable d'inspecter facilement une grande zone et dispositif d'observation le comprenant Pending WO2025116204A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2023-0172317 2023-12-01
KR1020230172317A KR102766468B1 (ko) 2023-12-01 2023-12-01 Optical system capable of easily inspecting a large area and observation device comprising the same
KR20240102305 2024-08-01
KR10-2024-0102305 2024-08-01

Publications (1)

Publication Number Publication Date
WO2025116204A1 true WO2025116204A1 (fr) 2025-06-05

Family

ID=95897214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2024/011961 Pending WO2025116204A1 (fr) 2023-12-01 2024-08-12 Système optique capable d'inspecter facilement une grande zone et dispositif d'observation le comprenant

Country Status (2)

Country Link
TW (1) TW202524172A (fr)
WO (1) WO2025116204A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1175018A (ja) * 1997-06-27 1999-03-16 Acer Peripherals Inc Optical scanner with switchable magnification
JP2012127856A (ja) * 2010-12-16 2012-07-05 Nuflare Technology Inc Pattern inspection apparatus and pattern inspection method
KR20140103607A (ko) * 2013-02-18 2014-08-27 Samsung Electro-Mechanics Co., Ltd. Lens inspection apparatus
JP2018063148A (ja) * 2016-10-12 2018-04-19 Disco Corp Measuring apparatus
KR20220096035A (ko) * 2020-12-30 2022-07-07 Korea Photonics Technology Institute Holographic microscope

Also Published As

Publication number Publication date
TW202524172A (zh) 2025-06-16

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24897805

Country of ref document: EP

Kind code of ref document: A1