
WO2020026432A1 - Optical system evaluation device and optical system evaluation method - Google Patents


Info

Publication number
WO2020026432A1
WO2020026432A1 (application PCT/JP2018/029221)
Authority
WO
WIPO (PCT)
Prior art keywords
wavefront aberration
light
optical system
distribution function
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/029221
Other languages
French (fr)
Japanese (ja)
Other versions
WO2020026432A9 (en)
Inventor
将敬 鈴木
柳澤 隆行
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to PCT/JP2018/029221
Publication of WO2020026432A1
Publication of WO2020026432A9
Anticipated expiration
Ceased legal status


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 Testing optical properties
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene

Definitions

  • The present invention relates to an optical system evaluation device and an optical system evaluation method for estimating the wavefront aberration of an imaging optical system.
  • The characteristics of an imaging device used for remote sensing are likely to change under the influence of vibration or the surrounding environment. Conventionally, therefore, techniques are known that correct the characteristics of such an imaging device at any time during its operation so that high-quality captured images can be obtained continuously. In order to correct the characteristics of the imaging device appropriately, it is necessary to grasp the characteristics of the imaging device at the time an image is captured; this is done by evaluating an image captured by the operating imaging device.
  • The wavefront aberration of the imaging optical system may be used as one index for evaluating the characteristics of the imaging device.
  • The wavefront aberration is measured by a device for evaluating an optical system (hereinafter, an "optical system evaluation device"). Based on the measured wavefront aberration, the imaging optical system is deformed, for example, so that the wavefront aberration approaches zero, thereby compensating for the resolution degradation caused by the wavefront aberration.
  • Non-Patent Document 1 discloses a technique in which a plurality of evaluation light sources are arranged on the ground at known intervals, the light sources are imaged from the sky, and a high-resolution point spread function (PSF) is obtained from the plurality of point images included in the captured image.
  • Here, the PSF is a function indicating the intensity distribution of the point image corresponding to each light emitted from the plurality of evaluation light sources.
  • The method disclosed in Non-Patent Document 1 assumes that a plurality of evaluation light sources arranged on the ground are imaged from the sky; when clouds are present above the light sources, the light sources cannot be imaged from above. The method therefore has the problem that the wavefront aberration cannot be estimated promptly, because execution of the estimation depends on the weather.
  • The present invention has been made to solve the above problem, and its purpose is to obtain an optical system evaluation device and an optical system evaluation method capable of estimating the wavefront aberration of an imaging optical system without arranging an evaluation light source on the ground.
  • An optical system evaluation device according to the present invention includes: a spatial light modulator that adds a wavefront aberration to each light from a plurality of point light sources; an imaging optical system that forms an image of each light to which the wavefront aberration has been added by the spatial light modulator; a plurality of imaging elements that convert each light imaged by the imaging optical system to generate electric signals; a distribution function calculation unit that calculates, from the electric signals generated by the plurality of imaging elements, a point spread function indicating the intensity distribution of the point image corresponding to each light; a pixel phase estimation unit that estimates, from the electric signals generated by the plurality of imaging elements, the pixel phase of the point image corresponding to each light; a distribution function generation unit that generates, from the plurality of point spread functions and based on the estimated pixel phases, a point spread function of higher resolution; and a wavefront aberration estimation unit that estimates the wavefront aberration of the imaging optical system from the high-resolution point spread function generated by the distribution function generation unit.
  • According to the present invention, the wavefront aberration of the imaging optical system can be estimated without arranging an evaluation light source on the ground.
  • FIG. 1 is a configuration diagram illustrating an optical system evaluation device according to a first embodiment.
  • FIG. 2 is a hardware configuration diagram illustrating hardware of an image processing unit 4.
  • FIG. 3 is a hardware configuration diagram of a computer when the image processing unit 4 is realized by software or firmware.
  • FIG. 4 is a flowchart illustrating a processing procedure when the image processing unit 4 is realized by software or firmware.
  • FIG. 5 is an explanatory diagram showing a plurality of point images each having a shape that spreads evenly toward the periphery, and a PSF having a shape with high central symmetry.
  • FIG. 6 is an explanatory diagram showing a plurality of point images each having a shape that spreads unevenly toward the periphery, and a PSF having a shape with low central symmetry.
  • FIG. 7 is an explanatory diagram showing point images (1) to (3), point image intensity distributions, and pixel values, which are the electric signals generated by the imaging elements 3-1 to 3-J, corresponding to each light.
  • FIG. 8 is an explanatory diagram illustrating the relationship between a point image pixel phase and a pixel value of an imaging element.
  • FIG. 9 is an explanatory diagram illustrating the process of generating a high-resolution PSF by a distribution function generation unit 7.
  • FIG. 10 is a configuration diagram illustrating an optical system evaluation device according to a second embodiment.
  • FIG. 11 is a hardware configuration diagram illustrating hardware of an image processing unit 31.
  • FIG. 12 is a flowchart illustrating a processing procedure when the image processing unit 31 is realized by software or firmware.
  • FIG. 1 is a configuration diagram showing an optical system evaluation device according to the first embodiment.
  • The spatial light modulator 1 adds a wavefront aberration to each light from a plurality of point light sources.
  • Here, a point light source includes an object that emits light itself or an object that reflects incident light.
  • The spatial light modulator 1 is realized by, for example, a spatial light modulation element.
  • A spatial light modulation element is a device having an optical element capable of modulating the phase, the polarization direction, the amplitude, the intensity, or the like of the light incident on it.
  • Alternatively, the spatial light modulator 1 may be realized by a deformable mirror capable of changing the shape of its mirror surface.
  • The spatial light modulator 1 may also be realized by a focus position adjustment mechanism capable of shifting the position of each of the plurality of imaging elements. By shifting an imaging element away from the focal position, the point image corresponding to each light is defocused, yielding a point image different from the in-focus point image.
  • The imaging optical system 2 is realized by a lens or a mirror.
  • The imaging optical system 2 forms an image of each light, to which the wavefront aberration has been added by the spatial light modulator 1, on the imaging planes of the plurality of imaging elements 3-1 to 3-J.
  • J is an integer of 2 or more.
  • The imaging unit 3 has a plurality of imaging elements 3-1 to 3-J.
  • The imaging elements 3-1 to 3-J convert the light formed on their imaging planes by the imaging optical system 2 to generate electric signals.
  • The imaging elements 3-1 to 3-J output the generated electric signals to a distribution function calculation unit 5 and a pixel phase estimation unit 6, both described later.
  • The image processing unit 4 includes a distribution function calculation unit 5, a pixel phase estimation unit 6, a distribution function generation unit 7, a wavefront aberration estimation unit 8, and a wavefront aberration output unit 9.
  • The image processing unit 4 analyzes the electric signals generated by the imaging elements 3-1 to 3-J of the imaging unit 3 and performs a process of estimating the wavefront aberration of the imaging optical system 2.
  • FIG. 2 is a hardware configuration diagram illustrating hardware of the image processing unit 4.
  • The distribution function calculation unit 5 is realized by, for example, a distribution function calculation circuit 11 illustrated in FIG. 2.
  • The distribution function calculation unit 5 calculates, from the electric signals generated by the imaging elements 3-1 to 3-J, a point spread function (PSF) indicating the intensity distribution of the point image corresponding to each light.
  • The distribution function calculation unit 5 outputs the calculated plurality of PSFs to the distribution function generation unit 7.
  • The pixel phase estimation unit 6 is realized by, for example, a pixel phase estimation circuit 12 shown in FIG. 2.
  • The pixel phase estimation unit 6 estimates the pixel phase of the point image corresponding to each light (hereinafter, a "point image pixel phase") from the electric signals generated by the imaging elements 3-1 to 3-J.
  • The pixel phase estimation unit 6 outputs the plurality of point image pixel phases, estimated for the plurality of point images corresponding to the respective lights, to the distribution function generation unit 7.
  • The distribution function generation unit 7 is realized by, for example, a distribution function generation circuit 13 illustrated in FIG. 2.
  • The distribution function generation unit 7 generates, from the plurality of PSFs calculated by the distribution function calculation unit 5 and based on the plurality of point image pixel phases estimated by the pixel phase estimation unit 6, a PSF having a higher resolution than each of the plurality of PSFs (hereinafter, a "high-resolution PSF").
  • The distribution function generation unit 7 outputs the generated high-resolution PSF to the wavefront aberration estimation unit 8.
  • The wavefront aberration estimation unit 8 is realized by, for example, a wavefront aberration estimation circuit 14 illustrated in FIG. 2.
  • The wavefront aberration estimation unit 8 estimates the wavefront aberration of the imaging optical system 2 from the high-resolution PSF generated by the distribution function generation unit 7, and outputs the estimated wavefront aberration to the wavefront aberration output unit 9.
  • The wavefront aberration output unit 9 is realized by, for example, a wavefront aberration output circuit 15 illustrated in FIG. 2.
  • The wavefront aberration output unit 9 stores the wavefront aberration estimated by the wavefront aberration estimation unit 8 in an external storage device or the like, and outputs an image indicating the wavefront aberration, or a coefficient indicating the wavefront aberration, to an output device such as a display (not shown).
  • Here, it is assumed that each of the distribution function calculation unit 5, the pixel phase estimation unit 6, the distribution function generation unit 7, the wavefront aberration estimation unit 8, and the wavefront aberration output unit 9, which are the components of the image processing unit 4, is realized by dedicated hardware as shown in FIG. 2. That is, the image processing unit 4 is assumed to be realized by the distribution function calculation circuit 11, the pixel phase estimation circuit 12, the distribution function generation circuit 13, the wavefront aberration estimation circuit 14, and the wavefront aberration output circuit 15.
  • Each of the distribution function calculation circuit 11, the pixel phase estimation circuit 12, the distribution function generation circuit 13, the wavefront aberration estimation circuit 14, and the wavefront aberration output circuit 15 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The components of the image processing unit 4 are not limited to dedicated hardware; the image processing unit 4 may also be realized by software, firmware, or a combination of software and firmware.
  • The software or firmware is stored as a program in the memory of a computer.
  • Here, a computer means hardware that executes the program, and corresponds to, for example, a CPU (Central Processing Unit), a central processing device, a processing device, an arithmetic device, a microprocessor, a microcomputer, a processor, or a DSP (Digital Signal Processor).
  • FIG. 3 is a hardware configuration diagram of a computer when the image processing unit 4 is realized by software or firmware.
  • When the image processing unit 4 is realized by software or firmware, a program for causing a computer to execute the processing procedures of the distribution function calculation unit 5, the pixel phase estimation unit 6, the distribution function generation unit 7, the wavefront aberration estimation unit 8, and the wavefront aberration output unit 9 is stored in the memory 22, and the processor 21 of the computer executes the program stored in the memory 22.
  • FIG. 4 is a flowchart illustrating a processing procedure when the image processing unit 4 is realized by software or firmware.
  • FIG. 2 illustrates an example in which each component of the image processing unit 4 is realized by dedicated hardware, and FIG. 3 illustrates an example in which the image processing unit 4 is realized by software or firmware. However, this is only an example; some components of the image processing unit 4 may be realized by dedicated hardware while the remaining components are realized by software or firmware.
  • The plurality of point light sources may be a large number of stars distant from the earth or a large number of lights in an urban area at night.
  • When the optical system evaluation device is mounted on, for example, an artificial satellite, the imaging unit 3 can capture a large number of stars without being affected by the weather. Since the angle subtended by a star distant from the earth is sufficiently smaller than the instantaneous field of view of the imaging unit 3, such a star can be regarded as a point light source.
  • When the imaging unit 3 images a large number of lights in an urban area at night, clouds above that area may prevent the lights from being imaged. However, while an artificial satellite equipped with the optical system evaluation device travels along its orbit, the imaging unit 3 can image the lights of other urban areas whose skies are not clouded. Compared with imaging a plurality of evaluation light sources arranged at a fixed place on the ground, the number of imaging opportunities therefore increases and the influence of the weather is reduced.
  • When light from a plurality of point light sources enters, the spatial light modulator 1 adds a wavefront aberration to each of the incident lights.
  • The wavefront aberration added to each light by the spatial light modulator 1 is one that causes the point image corresponding to each light to have a shape that spreads evenly toward the periphery, as shown in FIG. 5. That is, the spatial light modulator 1 defocuses the point image corresponding to each light so that the change of the point image intensity with respect to the coordinates of the intensity distribution on the imaging planes of the imaging elements 3-1 to 3-J becomes gentle.
  • Alternatively, the spatial light modulator 1 gives a third-order astigmatism having high central symmetry to the point image corresponding to each light.
  • FIG. 5 is an explanatory diagram showing a plurality of point images each having a shape spreading evenly toward the periphery and a PSF having a shape with high central symmetry.
  • FIG. 6 is an explanatory diagram showing a plurality of point images each having a shape that spreads unevenly toward the periphery, and a PSF having a shape with low central symmetry. The details of FIGS. 5 and 6 will be described later.
  • FIG. 7 is an explanatory diagram showing point images (1) to (3), point image intensity distributions, and pixel values, which are the electric signals generated by the imaging elements 3-1 to 3-J, corresponding to the respective lights.
  • FIG. 7 illustrates a mode in which the imaging elements 3-1 to 3-J are arranged one-dimensionally for simplification of the drawing.
  • FIG. 7 shows an example in which the number of lights is three, and accordingly shows three point images (1) to (3).
  • The point image (1) is formed on the imaging planes of the imaging elements 3-2 to 3-4. Since the center position of the point image (1) substantially coincides with the center position of the imaging element 3-3, most of the point image (1) is formed on the imaging plane of the imaging element 3-3. Therefore, the pixel value of the imaging element 3-3 is larger than the respective pixel values of the imaging elements 3-2 and 3-4, and the pixel values of the imaging elements 3-2 and 3-4 are almost the same.
  • The point image (2) is formed on the imaging planes of the imaging elements 3-7 to 3-9. The center position of the point image (2) lies within the imaging plane of the imaging element 3-8 but is slightly shifted, to the right in the figure, from the center position of the imaging element 3-8.
  • Most of the point image (2) is therefore still formed on the imaging plane of the imaging element 3-8; however, compared with the case where the center position of the point image (2) coincides with the center position of the imaging element 3-8, a larger part of the point image (2) falls on the imaging plane of the imaging element 3-9. Consequently, the pixel value of the imaging element 3-8 is larger than the pixel values of the imaging elements 3-7 and 3-9, and the pixel value of the imaging element 3-9 is smaller than that of the imaging element 3-8 but larger than that of the imaging element 3-7.
  • The point image (3) is formed on the imaging planes of the imaging elements 3-11 to 3-14. The center position of the point image (3) lies within the imaging plane of the imaging element 3-12 but is greatly shifted, to the right in the figure, from the center position of the imaging element 3-12. Because of this large shift, most of the point image (3) is formed on the imaging planes of the imaging elements 3-12 and 3-13. Therefore, among the pixel values of the imaging elements 3-11 to 3-14, the pixel value of the imaging element 3-12 is the largest, followed by that of the imaging element 3-13.
  • The pixel values of the imaging elements 3-11 and 3-14 are smaller than the pixel values of the imaging elements 3-12 and 3-13.
  • Upon receiving the electric signals from the imaging elements 3-1 to 3-J, the distribution function calculation unit 5 calculates, from the electric signals, the PSF indicating the intensity distribution of the point image corresponding to each light (step ST1 in FIG. 4).
  • In the example of FIG. 7, the PSF indicating the intensity distribution of the point image (1) is represented by the series of pixel values of the imaging elements 3-2 to 3-4, the PSF of the point image (2) by the series of pixel values of the imaging elements 3-7 to 3-9, and the PSF of the point image (3) by the series of pixel values of the imaging elements 3-11 to 3-14.
  • The distribution function calculation unit 5 outputs the PSF corresponding to each light to the distribution function generation unit 7.
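  • As an illustration of this step (the patent gives no implementation; the function and variable names below are ours), the per-point-image PSFs can be pictured as runs of above-threshold pixel values split out of a one-dimensional line of sensor outputs:

```python
def extract_psfs(pixel_values, threshold=0.0):
    """Split a 1-D line of sensor pixel values into per-point-image PSF
    samples: each contiguous run of pixels above the detection threshold
    is taken as the sampled PSF of one point image."""
    psfs, current = [], []
    for v in pixel_values:
        if v > threshold:
            current.append(v)
        elif current:
            psfs.append(current)
            current = []
    if current:
        psfs.append(current)
    return psfs

# Three point images on a 14-element line, loosely mirroring the FIG. 7 example:
line = [0, 0, 0.1, 0.7, 0.1, 0, 0.2, 0.5, 0.3, 0, 0.1, 0.5, 0.4, 0.1]
psfs = extract_psfs(line)
# → [[0.1, 0.7, 0.1], [0.2, 0.5, 0.3], [0.1, 0.5, 0.4, 0.1]]
```

  • A real implementation would work on 2-D pixel arrays and a calibrated detection threshold; the sketch only shows the grouping idea.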
  • Upon receiving the electric signals from the imaging elements 3-1 to 3-14, the pixel phase estimation unit 6 estimates, from the electric signals, a point image pixel phase for the point image corresponding to each light, and outputs the estimated point image pixel phases to the distribution function generation unit 7 (step ST2 in FIG. 4).
  • The pixel phase estimation unit 6 estimates the point image pixel phases so that the distribution function generation unit 7, described later, can generate a high-resolution PSF even when the plurality of point light sources are not arranged at equal intervals.
  • As shown in FIG. 8, the point image pixel phase corresponds to the center position of the point image.
  • FIG. 8 is an explanatory diagram showing the relationship between the point image pixel phase and the pixel value of the image sensor.
  • The pixel phase estimation unit 6 can estimate the point image pixel phase by calculating the barycentric (centroid) position of the point image; since pixel phase estimation by centroid calculation is a known technique, a detailed description is omitted. The pixel phase estimation unit 6 can also estimate the point image pixel phase with an accuracy of less than one pixel by using a fitting function having the same shape as the point image intensity distribution; pixel phase estimation using a fitting function is likewise a known technique, and a detailed description is omitted.
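  • As a minimal sketch of the centroid approach (our illustration, not the patent's implementation), the pixel phase can be taken as the fractional offset of the intensity barycenter from the nearest pixel center:

```python
def centroid_pixel_phase(pixel_values):
    """Estimate the point image pixel phase as the fractional offset of
    the intensity centroid (barycenter) from the nearest pixel center."""
    total = sum(pixel_values)
    center = sum(i * v for i, v in enumerate(pixel_values)) / total
    return center - round(center)

# A point image slightly right of pixel 1: centroid at 1.1, phase 0.1 pixel.
phase = centroid_pixel_phase([0.2, 0.5, 0.3])
```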
  • Ideally, the PSF of the imaging optical system 2 would be used as the fitting function; however, the PSF of the imaging optical system 2 is unknown. The pixel phase estimation unit 6 therefore uses a symmetric function, such as a Gaussian function or a sinc function, as the fitting function.
  • If the PSF of the imaging optical system 2 has a shape with low central symmetry as shown in FIG. 6, the pixel phase estimation unit 6 cannot estimate the point image pixel phase with high accuracy even when using a symmetric fitting function; the same applies when the point image pixel phase is estimated by centroid calculation.
  • For this reason, the spatial light modulator 1 adds a wavefront aberration to each incident light so that the point image pixel phase can be estimated with high accuracy even when a symmetric function is used as the fitting function. That is, by adding a wavefront aberration that makes each point image spread evenly toward the periphery, the spatial light modulator 1 turns the PSF of the imaging optical system 2 into a PSF with high central symmetry, as shown in FIG. 5.
  • As a result, the pixel phase estimation unit 6 can estimate the point image pixel phase with high accuracy using the symmetric fitting function, and likewise when estimating it by centroid calculation.
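  • When the point image is made centrally symmetric, fitting a symmetric function can reduce to a tiny computation. As a hedged illustration (the patent does not prescribe this particular method), a Gaussian fit over the brightest pixel and its two neighbours has a closed-form solution, because the logarithm of a Gaussian is a parabola:

```python
import math

def gaussian_subpixel_peak(v):
    """Sub-pixel peak position of a Gaussian-shaped point image, from a
    parabola fitted to the log of the brightest pixel and its neighbours."""
    k = max(range(1, len(v) - 1), key=lambda i: v[i])
    l0, l1, l2 = math.log(v[k - 1]), math.log(v[k]), math.log(v[k + 1])
    # Vertex of the parabola through (k-1, l0), (k, l1), (k+1, l2):
    return k + 0.5 * (l0 - l2) / (l0 - 2.0 * l1 + l2)

# Samples of a Gaussian centered at 1.3 pixels: the fit recovers 1.3
# (up to floating-point error), since a log-Gaussian is exactly parabolic.
samples = [math.exp(-(i - 1.3) ** 2 / 1.28) for i in range(3)]
peak = gaussian_subpixel_peak(samples)
```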
  • The distribution function generation unit 7 receives the plurality of PSFs calculated by the distribution function calculation unit 5 and the plurality of point image pixel phases estimated by the pixel phase estimation unit 6. As illustrated in FIG. 9, the distribution function generation unit 7 generates, from the plurality of PSFs and based on the plurality of point image pixel phases, a high-resolution PSF having a higher resolution than each of the PSFs (step ST3 in FIG. 4), and outputs the generated high-resolution PSF to the wavefront aberration estimation unit 8.
  • FIG. 9 is an explanatory diagram illustrating the generation processing of the high-resolution PSF by the distribution function generation unit 7.
  • The process of generating the high-resolution PSF by the distribution function generation unit 7 will now be described concretely with reference to FIG. 9.
  • In FIG. 9, P n (n = 0, 1, ..., k, ..., N-1) denotes each of the N point images. The point image pixel phase of the point image P 0 is 0 [pixel], that of P 1 is Δ [pixel], that of P k is k·Δ [pixel], and that of P N-1 is (N-1)·Δ [pixel].
  • In FIG. 9, for simplicity of the drawing, the M pixel values of the PSF for each of the point images P 0 to P N-1 are shown arranged one-dimensionally; in practice, the PSF for each point image has M × M pixel values arranged two-dimensionally.
  • The M pixel values of the PSF for the point image P 0 are P 0 (0), P 0 (1), ..., P 0 (M-1); those for P 1 are P 1 (0), P 1 (1), ..., P 1 (M-1); those for P k are P k (0), P k (1), ..., P k (M-1); and those for P N-1 are P N-1 (0), P N-1 (1), ..., P N-1 (M-1).
  • First, the distribution function generation unit 7 focuses on the first pixel values from the left in the figure, P 0 (0), P 1 (0), ..., P k (0), ..., P N-1 (0), and arranges these pixel values from the left in ascending order of the point image pixel phase of the point image to which each pixel value belongs.
  • Next, the distribution function generation unit 7 focuses on the second pixel values from the left, P 0 (1), P 1 (1), ..., P k (1), ..., P N-1 (1), and arranges them in ascending order of the point image pixel phase in the same manner, to the right of the previously arranged pixel values.
  • The distribution function generation unit 7 arranges the third and subsequent sets of pixel values in the same manner. For example, the M-th pixel values P 0 (M-1), P 1 (M-1), ..., P k (M-1), ..., P N-1 (M-1) are arranged, in ascending order of the point image pixel phase, to the right of the pixel values P 0 (M-2), ..., P N-1 (M-2) arranged before them.
  • By arranging the M pixel values of the PSFs for the point images P 0 to P N-1 as described above, the distribution function generation unit 7 obtains a PSF in which the pixel values are arranged as follows: P 0 (0), P 1 (0), ..., P N-1 (0), P 0 (1), P 1 (1), ..., P N-1 (1), ..., P 0 (M-1), P 1 (M-1), ..., P N-1 (M-1).
  • Finally, the distribution function generation unit 7 normalizes the pixel values so that, for example, their sum becomes 1, thereby generating a high-resolution PSF whose resolution is N times higher than that of each of the PSFs for the point images P 0 to P N-1.
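  • The interleaving described above can be sketched compactly. This is a 1-D illustration under our own naming (the patent specifies no code), and it assumes the N point image pixel phases are distinct:

```python
def high_resolution_psf(psfs, phases):
    """Interleave N coarsely sampled PSFs into one PSF sampled N times more
    densely: at each coarse pixel index, the N samples are ordered by
    ascending point image pixel phase, then the result is normalized to
    unit sum."""
    order = sorted(range(len(psfs)), key=lambda n: phases[n])
    merged = [psfs[n][m] for m in range(len(psfs[0])) for n in order]
    total = sum(merged)
    return [v / total for v in merged]

# Two 3-sample PSFs whose point images are offset by half a pixel:
hi = high_resolution_psf([[0.1, 0.8, 0.1], [0.2, 0.6, 0.2]], [0.0, 0.5])
# hi has 6 samples and sums to 1.
```

  • In the 2-D case of the patent, the same ordering is applied per two-dimensional pixel index; the sketch keeps the 1-D layout of FIG. 9.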
  • Upon receiving the high-resolution PSF from the distribution function generation unit 7, the wavefront aberration estimation unit 8 estimates the wavefront aberration of the imaging optical system 2 from the high-resolution PSF (step ST4 in FIG. 4). Since the process of estimating a wavefront aberration from a PSF is itself a known technique, a detailed description is omitted.
  • The wavefront aberration estimation unit 8 outputs the wavefront aberration of the imaging optical system 2 to the wavefront aberration output unit 9.
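  • One well-known family of techniques for estimating a wavefront (pupil phase) from a measured PSF is iterative Fourier phase retrieval (Gerchberg–Saxton / error reduction). The sketch below is our 1-D illustration of that general idea with a naive DFT; it is not the specific method the patent claims, and the aperture mask and iteration count are assumptions:

```python
import cmath, math

def dft(x, inverse=False):
    """Naive DFT, adequate for this small illustration."""
    N, s = len(x), (1 if inverse else -1)
    out = [sum(x[n] * cmath.exp(s * 2j * math.pi * k * n / N) for n in range(N))
           for k in range(N)]
    return [v / N for v in out] if inverse else out

def psf_of(pupil):
    """PSF as the squared modulus of the pupil's Fourier transform."""
    return [abs(f) ** 2 for f in dft(pupil)]

def retrieve_pupil_phase(psf, aperture, iterations=100):
    """Error-reduction phase retrieval: alternately impose the measured
    focal-plane amplitude (sqrt of the PSF) and the known pupil amplitude,
    keeping the phase each time."""
    amp = [math.sqrt(max(v, 0.0)) for v in psf]
    pupil = [complex(a) for a in aperture]  # start from a flat wavefront
    for _ in range(iterations):
        field = dft(pupil)
        field = [a * f / abs(f) if abs(f) > 1e-12 else complex(a)
                 for a, f in zip(amp, field)]
        back = dft(field, inverse=True)
        pupil = [a * b / abs(b) if abs(b) > 1e-12 else complex(a)
                 for a, b in zip(aperture, back)]
    return pupil
```

  • The mismatch between psf_of(pupil) and the measured PSF can serve as a convergence monitor; practical 2-D implementations replace the naive DFT with an FFT and must handle the usual phase-retrieval ambiguities (global phase offset, image shift, twin image).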
  • When receiving the wavefront aberration of the imaging optical system 2 from the wavefront aberration estimation unit 8, the wavefront aberration output unit 9 stores the wavefront aberration in an external storage device or the like, and outputs an image indicating the wavefront aberration, or a coefficient indicating the wavefront aberration, to an output device such as a display (not shown).
  • As described above, the optical system evaluation device according to the first embodiment is configured to include: the spatial light modulator 1 that adds a wavefront aberration to each light from a plurality of point light sources; the distribution function calculation unit 5 that calculates, from the electric signals generated by the imaging elements 3-1 to 3-J, a point spread function indicating the intensity distribution of the point image corresponding to each light; the pixel phase estimation unit 6 that estimates, from the electric signals generated by the imaging elements 3-1 to 3-J, the point image pixel phase of the point image corresponding to each light; and the distribution function generation unit 7 that generates, from the plurality of PSFs calculated by the distribution function calculation unit 5 and based on the point image pixel phases estimated by the pixel phase estimation unit 6, a high-resolution PSF having a higher resolution than the plurality of PSFs. Therefore, the optical system evaluation device can estimate the wavefront aberration of the imaging optical system without arranging an evaluation light source on the ground.
  • Embodiment 2. The second embodiment describes an optical system evaluation device in which the wavefront aberration estimation unit 35 estimates the wavefront aberration of the imaging optical system 2 from the plurality of high-resolution PSFs generated by the distribution function generation unit 34.
  • FIG. 10 is a configuration diagram showing an optical system evaluation device according to the second embodiment. In FIG. 10, the same reference numerals as those in FIG. 1 denote the same or corresponding parts, and a description thereof is omitted.
  • The spatial light modulator 30 adds a wavefront aberration to each light incident from the plurality of point light sources, whether that light is emitted or reflected by the sources.
  • Unlike the spatial light modulator 1 shown in FIG. 1, the spatial light modulator 30 repeatedly outputs each wavefront-aberration-added light to the imaging optical system 2 while changing the wavefront aberration applied to each light.
  • the image processing unit 31 includes a distribution function calculation unit 32, a pixel phase estimation unit 33, a distribution function generation unit 34, a wavefront aberration estimation unit 35, and a wavefront aberration output unit 9.
  • the image processing unit 31 analyzes the electric signals generated by the imaging elements 3-1 to 3-J of the imaging unit 3 and performs a process of estimating the wavefront aberration of the imaging optical system 2.
  • FIG. 11 is a hardware configuration diagram showing the hardware of the image processing unit 31. In FIG. 11, the same reference numerals as those in FIG. 2 denote the same or corresponding parts.
  • The distribution function calculation unit 32 is realized by, for example, the distribution function calculation circuit 41 shown in FIG. 11.
  • the distribution function calculation unit 32 calculates the PSF corresponding to each light from the electric signals generated by the imaging elements 3-1 to 3-J, similarly to the distribution function calculation unit 5 illustrated in FIG.
  • Unlike the distribution function calculation unit 5 shown in FIG. 1, the distribution function calculation unit 32 calculates the PSF corresponding to each light from the electric signals generated by the imaging elements 3-1 to 3-J every time those imaging elements convert each light and generate electric signals.
  • the distribution function calculation unit 32 outputs the calculated plurality of PSFs to the distribution function generation unit 34.
  • The pixel phase estimation unit 33 is realized by, for example, the pixel phase estimation circuit 42 shown in FIG. 11. Like the pixel phase estimation unit 6 shown in FIG. 1, the pixel phase estimation unit 33 estimates a point image pixel phase for the point image corresponding to each light from the electric signals generated by the imaging elements 3-1 to 3-J.
  • Unlike the pixel phase estimation unit 6 shown in FIG. 1, the pixel phase estimation unit 33 estimates the point image pixel phases every time the imaging elements 3-1 to 3-J convert each light and generate electric signals.
  • the pixel phase estimating unit 33 outputs the plurality of estimated point image pixel phases to the distribution function generating unit 34.
  • The distribution function generation unit 34 is realized by, for example, the distribution function generation circuit 43 shown in FIG. 11. Like the distribution function generation unit 7 shown in FIG. 1, the distribution function generation unit 34 generates, based on the plurality of point image pixel phases estimated by the pixel phase estimation unit 33, a high-resolution PSF whose resolution is higher than that of each of the plurality of PSFs calculated by the distribution function calculation unit 32. Unlike the distribution function generation unit 7 shown in FIG. 1, the distribution function generation unit 34 generates such a high-resolution PSF every time the distribution function calculation unit 32 calculates the plurality of PSFs and the pixel phase estimation unit 33 estimates the plurality of point image pixel phases. The distribution function generation unit 34 outputs each generated high-resolution PSF to the wavefront aberration estimation unit 35.
  • The wavefront aberration estimation unit 35 is realized by, for example, the wavefront aberration estimation circuit 44 shown in FIG. 11.
  • the wavefront aberration estimator 35 estimates the wavefront aberration of the imaging optical system 2 from the plurality of high-resolution PSFs generated by the distribution function generator 34.
  • the wavefront aberration estimating unit 35 outputs the estimated wavefront aberration of the imaging optical system 2 to the wavefront aberration output unit 9.
  • In FIG. 10, each of the components of the image processing unit 31 (the distribution function calculation unit 32, the pixel phase estimation unit 33, the distribution function generation unit 34, the wavefront aberration estimation unit 35, and the wavefront aberration output unit 9) is assumed to be realized by dedicated hardware as shown in FIG. 11. That is, the image processing unit 31 is assumed to be realized by the distribution function calculation circuit 41, the pixel phase estimation circuit 42, the distribution function generation circuit 43, the wavefront aberration estimation circuit 44, and the wavefront aberration output circuit 15.
  • Each of the distribution function calculation circuit 41, the pixel phase estimation circuit 42, the distribution function generation circuit 43, the wavefront aberration estimation circuit 44, and the wavefront aberration output circuit 15 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination thereof.
  • The components of the image processing unit 31 are not limited to dedicated hardware; the image processing unit 31 may instead be realized by software, firmware, or a combination of software and firmware.
  • When the image processing unit 31 is realized by software or firmware, a program for causing a computer to execute the processing procedures of the distribution function calculation unit 32, the pixel phase estimation unit 33, the distribution function generation unit 34, the wavefront aberration estimation unit 35, and the wavefront aberration output unit 9 is stored in the memory 22 shown in FIG. 3, and the processor 21 of the computer executes the program stored in the memory 22.
  • FIG. 12 is a flowchart illustrating a processing procedure when the image processing unit 31 is realized by software or firmware.
  • The spatial light modulator 30 repeatedly outputs each wavefront-aberration-added light to the imaging optical system 2 while changing the wavefront aberration applied to each light from the plurality of point light sources.
  • The spatial light modulator 30 adds H different wavefront aberrations to the incident light, where H is an integer of 2 or more.
  • Each wavefront aberration added by the spatial light modulator 30 is one that causes the point image corresponding to each light to have a shape that spreads evenly toward the periphery, as shown in FIG. 5.
  • Each time the spatial light modulator 30 adds a wavefront aberration to each light, it outputs the wavefront-aberration-added light to the imaging optical system 2.
  • When each light to which the wavefront aberration has been added by the spatial light modulator 30 is incident, the imaging optical system 2 forms an image of each light on the imaging surfaces of the imaging elements 3-1 to 3-J.
  • the imaging devices 3-1 to 3-J convert each light imaged by the imaging optical system 2 to generate an electric signal.
  • the imaging unit 3 outputs the electric signals generated by the imaging devices 3-1 to 3-J to each of the distribution function calculation unit 32 and the pixel phase estimation unit 33.
  • Upon receiving the electric signals, the distribution function calculation unit 32, like the distribution function calculation unit 5 shown in FIG. 1, calculates the PSF indicating the intensity distribution of the point image corresponding to each light (step ST11 in FIG. 12).
  • The distribution function calculation unit 32 outputs the calculated PSF to the distribution function generation unit 34.
  • Upon receiving the electric signals, the pixel phase estimation unit 33, like the pixel phase estimation unit 6 shown in FIG. 1, estimates the point image pixel phase for the point image corresponding to each light (step ST12 in FIG. 12).
  • the pixel phase estimating unit 33 outputs the estimated point image pixel phase to the distribution function generating unit 34.
  • The distribution function generation unit 34 receives the plurality of PSFs calculated by the distribution function calculation unit 32 and the plurality of point image pixel phases estimated by the pixel phase estimation unit 33.
  • Like the distribution function generation unit 7 shown in FIG. 1, the distribution function generation unit 34 generates, based on the plurality of point image pixel phases, a high-resolution PSF whose resolution is higher than that of each of the plurality of PSFs (step ST13 in FIG. 12).
  • the distribution function generator 34 outputs the generated high-resolution PSF to the wavefront aberration estimator 35.
  • If the number of high-resolution PSFs generated by the distribution function generation unit 34 is less than H (step ST14: NO), the processing of steps ST11 to ST13 is repeated. If H high-resolution PSFs have been generated (step ST14: YES), the wavefront aberration estimation unit 35 estimates the wavefront aberration of the imaging optical system 2 from the H high-resolution PSFs generated by the distribution function generation unit 34 (step ST15 in FIG. 12).
  • The wavefront aberration estimation unit 35 can estimate the wavefront aberration from the H high-resolution PSFs by using, for example, the known phase diversity method described in Non-Patent Documents 2 and 3 below.
  • Non-Patent Document 2: J. J. Green, D. C. Redding, S. B. Shaklan, S. A. Basinger, “Extreme wave front sensing accuracy for the Eclipse coronagraphic space telescope,” Proc. SPIE 4860, 266 (2003).
  • Non-Patent Document 3: R. G. Paxman, T. J. Schulz, and J. R. Fienup, “Joint estimation of object and aberrations by using phase diversity,” J. Opt. Soc. Am. A, Vol. 9, No. 7 (1992).
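The phase diversity approach cited above can be illustrated with a heavily simplified sketch: a single unknown defocus-like coefficient is recovered by grid search from H = 3 PSFs observed with known diversity phases. The forward model, pupil size, units, and the grid search (in place of the iterative optimizers used in the literature) are illustrative assumptions, not taken from the patent or the cited papers.

```python
import numpy as np

def simulate_psf(coeff, diversity, n=32):
    """Toy forward model: PSF of a circular pupil whose phase is an
    unknown defocus-like term `coeff` plus a known `diversity` term."""
    y, x = np.indices((n, n)) - n // 2
    r2 = (x**2 + y**2) / (n // 4)**2           # squared radius, 1 at pupil edge
    pupil = (r2 <= 1.0).astype(float)
    field = pupil * np.exp(1j * (coeff + diversity) * (2.0 * r2 - 1.0))
    psf = np.abs(np.fft.fft2(field))**2
    return psf / psf.sum()                     # normalize total intensity to 1

def estimate_by_phase_diversity(observed, diversities, candidates):
    """Pick the coefficient that jointly best explains all H observed PSFs."""
    def cost(c):
        return sum(np.sum((simulate_psf(c, d) - o)**2)
                   for o, d in zip(observed, diversities))
    return min(candidates, key=cost)

# H = 3 diversity settings; observations synthesized with true coefficient 0.7
diversities = [0.0, 1.0, 2.0]
observed = [simulate_psf(0.7, d) for d in diversities]
candidates = np.linspace(0.0, 2.0, 21)         # 0.0, 0.1, ..., 2.0
best = estimate_by_phase_diversity(observed, diversities, candidates)
```

Because every diversity image must be explained by the same unknown coefficient, the joint cost has a sharper minimum than any single image alone, which is the benefit the diversity measurements provide.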
  • As described above, in the second embodiment, the spatial light modulator 30 repeatedly outputs each wavefront-aberration-added light to the imaging optical system 2 while changing the wavefront aberration applied to each light from the plurality of point light sources.
  • Each time the imaging elements 3-1 to 3-J convert each light and generate electric signals, the distribution function calculation unit 32 calculates the PSF indicating the intensity distribution of the point image corresponding to each light, and the pixel phase estimation unit 33 estimates the point image pixel phase for the point image corresponding to each light.
  • Each time the distribution function calculation unit 32 calculates the plurality of PSFs and the pixel phase estimation unit 33 estimates the point image pixel phases, the distribution function generation unit 34 generates, based on the estimated point image pixel phases, a high-resolution PSF whose resolution is higher than that of each of the plurality of PSFs.
  • The wavefront aberration estimation unit 35 then estimates the wavefront aberration of the imaging optical system 2 from the plurality of high-resolution PSFs generated by the distribution function generation unit 34. Therefore, the optical system evaluation device of the second embodiment can estimate the wavefront aberration more accurately than the optical system evaluation device of the first embodiment.
  • It should be noted that, within the scope of the invention, the embodiments may be freely combined, and any component of each embodiment may be modified or omitted.
  • the present invention is suitable for an optical system evaluation device and an optical system evaluation method for estimating the wavefront aberration of an imaging optical system.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing Of Optical Devices Or Fibers (AREA)
  • Exposure Of Semiconductors, Excluding Electron Or Ion Beam Exposure (AREA)
  • Studio Devices (AREA)

Abstract

This optical system evaluation device comprises a spatial light modulation unit (1) for adding wavefront aberration to the light from a plurality of point light sources, a spread function calculation unit (5) for using electrical signals generated by imaging elements (3-1 to 3-J) to calculate point image spread functions indicating the intensity distributions of point images corresponding to the light from the respective light sources, a pixel phase estimation unit (6) for using the electrical signals generated by the imaging elements (3-1 to 3-J) to estimate point image pixel phases for the point images corresponding to the light from the respective light sources, and a spread function generation unit (7) for using the point image pixel phases estimated by the pixel phase estimation unit (6) to generate a high-resolution PSF having a higher resolution than the plurality of PSFs calculated by the spread function calculation unit (5). A wavefront aberration estimation unit (8) estimates the wavefront aberration of the image formation optical system on the basis of the high-resolution PSF generated by the spread function generation unit (7).

Description

Optical system evaluation apparatus and optical system evaluation method

The present invention relates to an optical system evaluation device and an optical system evaluation method for estimating the wavefront aberration of an imaging optical system.

The characteristics of an imaging device used for remote sensing are likely to change under the influence of vibration or the surrounding environment.
Therefore, techniques are known that correct the characteristics of a remote-sensing imaging device as needed, so that high-quality captured images can be obtained continuously during operation. To correct the characteristics of the imaging device appropriately, it is necessary to know those characteristics at the time an image is captured. The characteristics are grasped by evaluating images captured by the imaging device in operation.

The wavefront aberration of the imaging optical system is sometimes used as one index for evaluating the characteristics of an imaging device.
When the wavefront aberration is used as such an index, it is measured by a device for evaluating the optical system (hereinafter referred to as an "optical system evaluation device"). Based on the measured wavefront aberration, the imaging optical system is deformed, for example, so that the wavefront aberration approaches zero, thereby compensating for the resolution degradation caused by the wavefront aberration.

Non-Patent Document 1 below discloses a technique in which a plurality of evaluation light sources are placed on the ground at known intervals, the light sources are imaged from the sky, and a high-resolution point spread function (PSF) is obtained from the plurality of point images contained in the captured image.
The PSF is a function indicating the intensity distribution of the point images corresponding to the light emitted from the plurality of evaluation light sources.

Non-Patent Document 1: “In-Flight Performance Assessment of Imaging Systems Using The Specular Array Radiometric Calibration (SPARC) Method,” 11th Annual Joint Agency Commercial Imagery Evaluation (JACIE) Workshop (2012)

The technique disclosed in Non-Patent Document 1 presupposes that a plurality of evaluation light sources are placed on the ground and imaged from the sky; when clouds are present above the light sources, however, the light sources cannot be imaged from the sky.
Consequently, with the technique disclosed in Non-Patent Document 1, execution of the estimation depends on the weather, so the wavefront aberration cannot always be estimated promptly.

The present invention has been made to solve the above problem, and an object of the present invention is to provide an optical system evaluation device and an optical system evaluation method capable of estimating the wavefront aberration of an imaging optical system without placing an evaluation light source on the ground.

An optical system evaluation device according to the present invention includes: a spatial light modulation unit that adds a wavefront aberration to each light from a plurality of point light sources; an imaging optical system that forms an image of each light to which the wavefront aberration has been added by the spatial light modulation unit; a plurality of imaging elements that convert each light imaged by the imaging optical system to generate electric signals; a distribution function calculation unit that calculates, from the electric signals generated by the plurality of imaging elements, a point spread function indicating the intensity distribution of the point image corresponding to each light; a pixel phase estimation unit that estimates, from the electric signals generated by the plurality of imaging elements, the pixel phase of the point image corresponding to each light; a distribution function generation unit that, based on the pixel phases estimated by the pixel phase estimation unit, generates from the plurality of point spread functions calculated by the distribution function calculation unit a high-resolution point spread function whose resolution is higher than that of the plurality of point spread functions; and a wavefront aberration estimation unit that estimates the wavefront aberration of the imaging optical system from the high-resolution point spread function generated by the distribution function generation unit.

According to the present invention, the wavefront aberration of the imaging optical system can be estimated without placing an evaluation light source on the ground.

FIG. 1 is a configuration diagram showing an optical system evaluation device according to the first embodiment.
FIG. 2 is a hardware configuration diagram showing the hardware of the image processing unit 4.
FIG. 3 is a hardware configuration diagram of a computer in the case where the image processing unit 4 is realized by software or firmware.
FIG. 4 is a flowchart showing a processing procedure in the case where the image processing unit 4 is realized by software or firmware.
FIG. 5 is an explanatory diagram showing a plurality of point images whose shapes spread evenly toward the periphery, and a PSF with high central symmetry.
FIG. 6 is an explanatory diagram showing a plurality of point images whose shapes spread unevenly toward the periphery, and a PSF with low central symmetry.
FIG. 7 is an explanatory diagram showing the point images (1) to (3) corresponding to the respective lights, the point image intensity distributions, and the pixel values, which are the electric signals generated by the imaging elements 3-1 to 3-J.
FIG. 8 is an explanatory diagram showing the relationship between the point image pixel phase and the pixel values of an imaging element.
FIG. 9 is an explanatory diagram showing the high-resolution PSF generation process performed by the distribution function generation unit 7.
FIG. 10 is a configuration diagram showing an optical system evaluation device according to the second embodiment.
FIG. 11 is a hardware configuration diagram showing the hardware of the image processing unit 31.
FIG. 12 is a flowchart showing a processing procedure in the case where the image processing unit 31 is realized by software or firmware.

Hereinafter, in order to describe the present invention in more detail, embodiments for carrying out the invention will be described with reference to the accompanying drawings.

Embodiment 1.
FIG. 1 is a configuration diagram showing an optical system evaluation device according to the first embodiment.
In FIG. 1, the spatial light modulator 1 adds a wavefront aberration to each light from a plurality of point light sources. Here, a point light source includes an object that itself emits light, or an object that reflects incident light.
The spatial light modulator 1 is realized by, for example, a spatial light modulation device, that is, a device having an optical element capable of modulating the phase, polarization direction, amplitude, intensity, or the like of the light incident on it.
The spatial light modulator 1 may also be realized by a deformable mirror whose mirror surface can be deformed.
Alternatively, the spatial light modulator 1 may be realized by a focus position adjustment mechanism capable of shifting the positions of the plurality of imaging elements. Shifting an imaging element away from the focal position defocuses the point image corresponding to each light, yielding a point image different from the in-focus point image.
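The effect of such a focus shift can be illustrated with a standard Fourier-optics toy model (not taken from the patent): a quadratic pupil phase stands in for the defocus, and the resulting point image spreads out, lowering its peak relative to the in-focus one. The grid size and phase scaling below are arbitrary choices for illustration.

```python
import numpy as np

def pupil_psf(defocus_rad, n=64):
    """PSF of a circular pupil carrying a quadratic (defocus-like) phase,
    a stand-in for shifting an imaging element off the focal plane."""
    y, x = np.indices((n, n)) - n // 2
    r2 = (x**2 + y**2) / (n // 4)**2          # squared radius, 1 at pupil edge
    pupil = (r2 <= 1.0).astype(float)
    field = pupil * np.exp(1j * defocus_rad * (2.0 * r2 - 1.0))
    psf = np.abs(np.fft.fft2(field))**2
    return psf / psf.sum()                    # normalize total intensity to 1

in_focus = pupil_psf(0.0)
defocused = pupil_psf(2.0)   # the defocused point image has a lower peak
```

Since both PSFs are normalized to the same total energy, the lower peak of the defocused PSF directly reflects the broader point image used by the evaluation device.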

The imaging optical system 2 is realized by a lens, a mirror, or the like.
The imaging optical system 2 forms an image of each light, to which the wavefront aberration has been added by the spatial light modulator 1, on the imaging surfaces of the plurality of imaging elements 3-1 to 3-J, where J is an integer of 2 or more.
The imaging unit 3 has the plurality of imaging elements 3-1 to 3-J. The imaging elements 3-1 to 3-J convert the light imaged on their imaging surfaces by the imaging optical system 2 into electric signals, and output the generated electric signals to each of the distribution function calculation unit 5 and the pixel phase estimation unit 6 described later.

The image processing unit 4 includes a distribution function calculation unit 5, a pixel phase estimation unit 6, a distribution function generation unit 7, a wavefront aberration estimation unit 8, and a wavefront aberration output unit 9.
The image processing unit 4 analyzes the electric signals generated by the imaging elements 3-1 to 3-J of the imaging unit 3 and estimates the wavefront aberration of the imaging optical system 2.
FIG. 2 is a hardware configuration diagram showing the hardware of the image processing unit 4.

The distribution function calculation unit 5 is realized by, for example, the distribution function calculation circuit 11 shown in FIG. 2.
The distribution function calculation unit 5 calculates, from the electric signals generated by the imaging elements 3-1 to 3-J, a point spread function (PSF) indicating the intensity distribution of the point image corresponding to each light. Hereinafter, the point spread functions calculated for the plurality of point images corresponding to the plurality of lights are simply referred to as the "plurality of PSFs."
The distribution function calculation unit 5 outputs the calculated plurality of PSFs to the distribution function generation unit 7.
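As a minimal sketch of how a discrete PSF might be formed from the electric signals (pixel values) around one point image: subtract a crude background and normalize the cutout to unit total intensity. The cutout values and the background model are assumptions for illustration, not the patent's method.

```python
import numpy as np

def psf_from_pixels(pixel_values):
    """Turn a cutout of detector pixel values around one point image into a
    discrete PSF: subtract a crude background and normalize to unit sum."""
    img = np.asarray(pixel_values, dtype=float)
    img = img - img.min()                     # crude background removal
    total = img.sum()
    if total == 0.0:
        raise ValueError("point image contains no signal")
    return img / total

# hypothetical 5x5 cutout of detector pixel values around one point image
cutout = [
    [0, 1, 2, 1, 0],
    [1, 4, 8, 4, 1],
    [2, 8, 16, 8, 2],
    [1, 4, 8, 4, 1],
    [0, 1, 2, 1, 0],
]
psf = psf_from_pixels(cutout)
```

The normalization makes PSFs from different point images directly comparable regardless of each source's brightness.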

The pixel phase estimation unit 6 is realized by, for example, the pixel phase estimation circuit 12 shown in FIG. 2.
The pixel phase estimation unit 6 estimates, from the electric signals generated by the imaging elements 3-1 to 3-J, the pixel phase of the point image corresponding to each light (hereinafter referred to as the "point image pixel phase"). Hereinafter, the point image pixel phases estimated for the plurality of point images corresponding to the plurality of lights are simply referred to as the "plurality of point image pixel phases."
The pixel phase estimation unit 6 outputs the estimated plurality of point image pixel phases to the distribution function generation unit 7.
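One generic way to estimate such a point image pixel phase is the fractional part of the point image's intensity centroid along each axis. The sketch below assumes this centroid-based estimator, which the patent does not necessarily prescribe; the spot values are hypothetical.

```python
import numpy as np

def point_image_pixel_phase(psf):
    """Sub-pixel position estimate for a point image: the fractional part
    of its intensity centroid along each axis (a generic estimator)."""
    psf = np.asarray(psf, dtype=float)
    ys, xs = np.indices(psf.shape)
    cy = (ys * psf).sum() / psf.sum()
    cx = (xs * psf).sum() / psf.sum()
    return cy % 1.0, cx % 1.0

# hypothetical point image whose centroid sits 0.25 pixel right of a pixel center
spot = np.array([
    [0.0, 0.0, 0.0],
    [0.0, 3.0, 1.0],
    [0.0, 0.0, 0.0],
])
phase_y, phase_x = point_image_pixel_phase(spot)   # -> (0.0, 0.25)
```

Any estimator that returns the point image's offset relative to the pixel grid could be substituted here; the centroid is simply the most common choice.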

The distribution function generation unit 7 is realized by, for example, the distribution function generation circuit 13 shown in FIG. 2.
Based on the plurality of point image pixel phases estimated by the pixel phase estimation unit 6, the distribution function generation unit 7 generates, from the plurality of PSFs calculated by the distribution function calculation unit 5, a PSF whose resolution is higher than that of each of the plurality of PSFs (hereinafter referred to as the "high-resolution PSF").
The distribution function generation unit 7 outputs the generated high-resolution PSF to the wavefront aberration estimation unit 8.
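In one dimension, a simple way to combine coarse PSFs with known pixel phases into a higher-resolution PSF is to interleave their samples on a finer grid. The sketch below assumes the phases are evenly spaced at k/factor — a simplification for illustration, and the coarse sample values are hypothetical.

```python
import numpy as np

def interleave_1d(psfs, phases, factor):
    """Merge `factor` coarse 1-D PSFs, sampled at distinct sub-pixel
    phases assumed to equal k/factor, into one PSF on a grid `factor`
    times finer: each coarse PSF fills every `factor`-th fine bin."""
    fine = np.zeros(len(psfs[0]) * factor)
    for psf, phase in zip(psfs, phases):
        k = int(round(phase * factor)) % factor
        fine[k::factor] = psf
    return fine

# two hypothetical coarse samplings of the same point image, half a pixel apart
coarse_a = np.array([0.0, 1.0, 4.0, 1.0, 0.0])    # pixel phase 0.0
coarse_b = np.array([0.0, 2.0, 3.0, 0.5, 0.0])    # pixel phase 0.5
high_res = interleave_1d([coarse_a, coarse_b], [0.0, 0.5], factor=2)
```

For arbitrary (unevenly spaced) phases, interpolation or a least-squares reconstruction onto the fine grid would replace the direct slotting shown here.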

The wavefront aberration estimation unit 8 is realized by, for example, the wavefront aberration estimation circuit 14 shown in FIG. 2.
The wavefront aberration estimation unit 8 estimates the wavefront aberration of the imaging optical system 2 from the high-resolution PSF generated by the distribution function generation unit 7.
The wavefront aberration estimation unit 8 outputs the estimated wavefront aberration of the imaging optical system 2 to the wavefront aberration output unit 9.

The wavefront aberration output unit 9 is realized by, for example, the wavefront aberration output circuit 15 shown in FIG. 2.
The wavefront aberration output unit 9 stores the wavefront aberration estimated by the wavefront aberration estimation unit 8 in an external storage device or the like.
The wavefront aberration output unit 9 also outputs an image representing the wavefront aberration estimated by the wavefront aberration estimation unit 8, or coefficients representing it, to an output device such as a display (not shown).

In FIG. 1, each of the components of the image processing unit 4 (the distribution function calculation unit 5, the pixel phase estimation unit 6, the distribution function generation unit 7, the wavefront aberration estimation unit 8, and the wavefront aberration output unit 9) is assumed to be realized by dedicated hardware as shown in FIG. 2. That is, the image processing unit 4 is assumed to be realized by the distribution function calculation circuit 11, the pixel phase estimation circuit 12, the distribution function generation circuit 13, the wavefront aberration estimation circuit 14, and the wavefront aberration output circuit 15.
Each of the distribution function calculation circuit 11, the pixel phase estimation circuit 12, the distribution function generation circuit 13, the wavefront aberration estimation circuit 14, and the wavefront aberration output circuit 15 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.

The components of the image processing unit 4 are not limited to dedicated hardware; the image processing unit 4 may instead be realized by software, firmware, or a combination of software and firmware.
Software or firmware is stored as a program in the memory of a computer. Here, a computer means hardware that executes the program, and corresponds to, for example, a CPU (Central Processing Unit), a central processor, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a processor, or a DSP (Digital Signal Processor).

FIG. 3 is a hardware configuration diagram of a computer in a case where the image processing unit 4 is realized by software, firmware, or the like.
When the image processing unit 4 is realized by software, firmware, or the like, a program for causing the computer to execute the processing procedures of the distribution function calculation unit 5, the pixel phase estimation unit 6, the distribution function generation unit 7, the wavefront aberration estimation unit 8, and the wavefront aberration output unit 9 is stored in the memory 22. The processor 21 of the computer then executes the program stored in the memory 22.
FIG. 4 is a flowchart illustrating the processing procedure in a case where the image processing unit 4 is realized by software, firmware, or the like.

FIG. 2 illustrates an example in which each component of the image processing unit 4 is realized by dedicated hardware, and FIG. 3 illustrates an example in which the image processing unit 4 is realized by software, firmware, or the like. However, these are merely examples; some components of the image processing unit 4 may be realized by dedicated hardware while the remaining components are realized by software, firmware, or the like.

Next, the operation of the optical system evaluation device shown in FIG. 1 will be described.
The light incident on the spatial light modulation unit 1 is light emitted or reflected by each of a plurality of point light sources.
The plurality of point light sources may be, for example, a large number of stars far from the earth, or a large number of lights in an urban area at night.
When the optical system evaluation device shown in FIG. 1 is mounted on, for example, an artificial satellite, the imaging unit 3 can image a large number of stars regardless of the weather.
Since the subtended angle of a star far from the earth is sufficiently smaller than the instantaneous field of view of the imaging unit 3, such a star can be regarded as a point light source.
When the imaging unit 3 images a large number of lights in urban areas at night, clouds above one urban area may prevent its lights from being imaged. However, while the artificial satellite carrying the optical system evaluation device orbits the earth, the imaging unit 3 can instead image the lights of a different, cloud-free urban area. Compared with imaging a plurality of evaluation light sources placed on the ground, the imaging opportunities therefore increase and the influence of the weather is reduced.

When light from the plurality of point light sources is incident, the spatial light modulation unit 1 adds a wavefront aberration to each incident light.
As shown in FIG. 5, the wavefront aberration added to each light by the spatial light modulation unit 1 is one that gives the point image corresponding to each light a shape that spreads evenly toward its periphery.
That is, the spatial light modulation unit 1 defocuses the point image corresponding to each light so that the point image intensity varies gently with position on the imaging planes of the imaging elements 3-1 to 3-J. Alternatively, the spatial light modulation unit 1 gives each point image third-order astigmatism with high central symmetry.
FIG. 5 is an explanatory diagram showing a plurality of point images, each having a shape spreading evenly toward the periphery, and a PSF with a highly centrally symmetric shape.
FIG. 6 is an explanatory diagram showing a plurality of point images, each having a shape spreading unevenly toward the periphery, and a PSF with a shape of low central symmetry.
The contents of FIGS. 5 and 6 will be described later.
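As an illustration of the defocus modulation described above, the snippet below builds a rotationally symmetric defocus phase map (the Zernike defocus term) over a unit pupil. This is only a sketch; the grid size, the amplitude in waves, and the function name are choices made here, not part of the disclosure.

```python
import numpy as np

def defocus_wavefront(n_pix=64, coeff_waves=1.0):
    """Zernike defocus term sqrt(3) * (2*rho^2 - 1) over a unit pupil.

    coeff_waves is the defocus amplitude in waves; the returned map is the
    phase (in waves) that a spatial light modulator would add to the pupil.
    Values outside the unit circle are set to zero (outside the aperture).
    """
    y, x = np.mgrid[-1:1:1j * n_pix, -1:1:1j * n_pix]
    rho2 = x**2 + y**2
    phase = coeff_waves * np.sqrt(3.0) * (2.0 * rho2 - 1.0)
    phase[rho2 > 1.0] = 0.0
    return phase

wf = defocus_wavefront()
```

Because the map depends only on rho^2, it is symmetric under left-right and up-down flips, which is the central symmetry that makes the resulting PSF spread evenly toward its periphery.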

When each light to which the wavefront aberration has been added by the spatial light modulation unit 1 is incident, the imaging optical system 2 forms an image of each light on the imaging planes of the imaging elements 3-1 to 3-J.
The imaging elements 3-1 to 3-J convert each light imaged by the imaging optical system 2 into an electric signal.
The imaging unit 3 outputs the electric signals generated by the imaging elements 3-1 to 3-J to each of the distribution function calculation unit 5 and the pixel phase estimation unit 6.
FIG. 7 is an explanatory diagram showing the point images (1) to (3) corresponding to the respective lights, the point image intensity distributions, and the pixel values, which are the electric signals generated by the imaging elements 3-1 to 3-J.
For simplicity of the drawing, FIG. 7 depicts the imaging elements 3-1 to 3-J as if they were arranged one-dimensionally; in practice, the imaging elements 3-1 to 3-J are arranged two-dimensionally.
FIG. 7 shows an example in which J = 14, and fourteen imaging elements 3-1 to 3-14 are depicted.
FIG. 7 also shows an example in which the number of lights is three, and three point images (1) to (3) are depicted.

The point image (1) is formed on the imaging planes of the imaging elements 3-2 to 3-4. Since the center position of the point image (1) substantially coincides with the center position of the imaging element 3-3, most of the point image (1) falls on the imaging plane of the imaging element 3-3.
Therefore, the pixel value of the imaging element 3-3 is larger than the pixel values of the imaging elements 3-2 and 3-4, and the pixel values of the imaging elements 3-2 and 3-4 are almost the same.
The point image (2) is formed on the imaging planes of the imaging elements 3-7 to 3-9. The center position of the point image (2) lies within the imaging plane of the imaging element 3-8, but is shifted slightly to the right in the figure from the center position of the imaging element 3-8. Most of the point image (2) still falls on the imaging plane of the imaging element 3-8. However, because of this slight rightward shift, a larger part of the point image (2) also falls on the imaging plane of the imaging element 3-9 than would be the case if the center of the point image (2) coincided with the center of the imaging element 3-8.
Therefore, the pixel value of the imaging element 3-8 is larger than the pixel values of both the imaging elements 3-7 and 3-9, and the pixel value of the imaging element 3-9 is smaller than that of the imaging element 3-8 but larger than that of the imaging element 3-7.

The point image (3) is formed on the imaging planes of the imaging elements 3-11 to 3-14. The center position of the point image (3) lies within the imaging plane of the imaging element 3-12, but is shifted considerably to the right in the figure from the center position of the imaging element 3-12. Because of this large rightward shift, most of the point image (3) falls on the imaging planes of the imaging elements 3-12 and 3-13.
Therefore, among the pixel values of the imaging elements 3-11 to 3-14, the pixel value of the imaging element 3-12 is the largest, followed by that of the imaging element 3-13, although the difference between the two is small.
The pixel values of the imaging elements 3-11 and 3-14 are smaller than those of both the imaging elements 3-12 and 3-13.

Upon receiving the electric signals from the imaging elements 3-1 to 3-14, the distribution function calculation unit 5 calculates, from the electric signals, a PSF indicating the intensity distribution of the point image corresponding to each light (step ST1 in FIG. 4).
As shown in FIG. 7, when there are three point images (1) to (3), the PSF indicating the intensity distribution of the point image (1) is represented by the series of pixel values of the imaging elements 3-2 to 3-4.
The PSF indicating the intensity distribution of the point image (2) is represented by the series of pixel values of the imaging elements 3-7 to 3-9, and the PSF indicating the intensity distribution of the point image (3) is represented by the series of pixel values of the imaging elements 3-11 to 3-14.
The distribution function calculation unit 5 outputs the PSF corresponding to each light to the distribution function generation unit 7.
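One way to realize this step is to cut a fixed-size window of pixel values around each point image and normalize it. The sketch below assumes the point image centers have already been detected and a background-free image; the window size, function name, and normalization choice are illustrative assumptions, not details given in the text.

```python
import numpy as np

def extract_psfs(image, peak_positions, half_width=2):
    """Cut a (2*half_width+1)-pixel square window around each point image.

    image          : 2-D array of pixel values from the imaging elements.
    peak_positions : list of (row, col) integer centers, one per point image
                     (assumed detected beforehand and away from the edges).
    Each window is the sampled PSF of one point source; windows are
    normalized so that their pixel values sum to 1.
    """
    psfs = []
    w = half_width
    for r, c in peak_positions:
        win = image[r - w:r + w + 1, c - w:c + w + 1].astype(float)
        psfs.append(win / win.sum())
    return psfs
```

With the three point images of FIG. 7, this would yield one small normalized array per point image, each playing the role of one sampled PSF.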

Upon receiving the electric signals from the imaging elements 3-1 to 3-14, the pixel phase estimation unit 6 estimates, from the electric signals, a point image pixel phase for the point image corresponding to each light, and outputs each point image pixel phase to the distribution function generation unit 7 (step ST2 in FIG. 4).
The pixel phase estimation unit 6 estimates the point image pixel phases so that the distribution function generation unit 7, described later, can generate a high-resolution PSF even when the plurality of point light sources are not arranged at equal intervals.
As shown in FIG. 8, the point image pixel phase corresponds to the center position of a point image.
FIG. 8 is an explanatory diagram showing the relationship between the point image pixel phase and the pixel values of the imaging elements.
If the pixel values of the R (R is an integer of 2 or more) imaging elements covering one point image intensity distribution are G0 to GR, then the center position of the point image, which corresponds to the point image pixel phase, coincides with the centroid of the pixel values G0 to GR as a whole.
Therefore, the pixel phase estimation unit 6 can estimate the point image pixel phase by calculating this centroid position. Estimating a pixel phase by calculating a centroid is itself a known technique, and a detailed description is omitted.
The pixel phase estimation unit 6 can also estimate the point image pixel phase with sub-pixel accuracy by using a fitting function having the same shape as the point image intensity distribution. Estimating a pixel phase using a fitting function is itself a known technique, and a detailed description is omitted.
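The centroid calculation referred to above can be sketched in a few lines for the one-dimensional case (the two-dimensional case applies the same formula per axis). The function name and the convention of reporting the phase as the fractional part of the centroid are choices made here for illustration.

```python
import numpy as np

def point_image_pixel_phase(values):
    """Centroid-based estimate of a point image's center position.

    values : 1-D array of pixel values G0..GR covering one point image.
    Returns (center, phase): the intensity-weighted centroid in pixel
    units, and its fractional part, taken here as the point image pixel
    phase (the sub-pixel offset of the center within its pixel).
    """
    values = np.asarray(values, dtype=float)
    idx = np.arange(values.size)
    center = float((idx * values).sum() / values.sum())
    return center, center % 1.0
```

For the symmetric pixel values of point image (1) in FIG. 7, the centroid falls on the center pixel and the phase is 0; for the shifted point image (2), the centroid moves toward the brighter neighbor and the phase becomes nonzero.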

When the pixel phase estimation unit 6 estimates the point image pixel phase using a fitting function, the PSF of the imaging optical system 2 could be used as the fitting function. In the optical system evaluation device shown in FIG. 1, however, the PSF of the imaging optical system 2 is unknown.
When the PSF of the imaging optical system 2 is unknown, the pixel phase estimation unit 6 uses a symmetric function, such as a Gaussian function or a sinc function, as the fitting function.
When the PSF of the imaging optical system 2 has a shape of low central symmetry as shown in FIG. 6, the pixel phase estimation unit 6 cannot estimate the point image pixel phase with high accuracy even if it uses a symmetric function as the fitting function. Likewise, when the PSF has a shape of low central symmetry, the pixel phase estimation unit 6 cannot estimate the point image pixel phase with high accuracy by calculating the centroid position.
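As one concrete example of fitting a symmetric function, a Gaussian fitted through the peak pixel and its two neighbors has a closed-form solution, because the logarithm of a Gaussian is a parabola. This three-point interpolation is one standard realization of the Gaussian fit mentioned above; presenting it this way (rather than a full least-squares fit) is a choice made here.

```python
import math

def gaussian_subpixel_peak(g_left, g_center, g_right):
    """Three-point Gaussian interpolation of a peak position.

    Fits a Gaussian exactly through the peak pixel and its two neighbors
    (all values must be positive, with g_center the largest).  Returns the
    peak offset from the center pixel in pixel units; for a peak that truly
    lies within the center pixel the offset is in (-0.5, 0.5).
    """
    la, lb, lc = math.log(g_left), math.log(g_center), math.log(g_right)
    # Vertex of the parabola through (−1, la), (0, lb), (1, lc).
    return 0.5 * (la - lc) / (la - 2.0 * lb + lc)
```

For samples taken from an exact Gaussian the recovered offset is exact, which is why this estimator works well precisely when the modulated PSF is highly centrally symmetric.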

In the optical system evaluation device shown in FIG. 1, the spatial light modulation unit 1 adds a wavefront aberration to each incident light so that the point image pixel phase can be estimated with high accuracy even when a symmetric function is used as the fitting function.
That is, by adding a wavefront aberration that gives each point image a shape spreading evenly toward its periphery, the spatial light modulation unit 1 makes the PSF of the imaging optical system 2 a PSF with a highly centrally symmetric shape, as shown in FIG. 5.
When the PSF of the imaging optical system 2 has a highly centrally symmetric shape as shown in FIG. 5, the pixel phase estimation unit 6 can estimate the point image pixel phase with high accuracy by using a symmetric function as the fitting function. Likewise, when the PSF has a highly centrally symmetric shape, the pixel phase estimation unit 6 can estimate the point image pixel phase with high accuracy by calculating the centroid position.

The distribution function generation unit 7 receives the plurality of PSFs calculated by the distribution function calculation unit 5 and the plurality of point image pixel phases estimated by the pixel phase estimation unit 6.
As shown in FIG. 9, based on the plurality of point image pixel phases estimated by the pixel phase estimation unit 6, the distribution function generation unit 7 generates, from the plurality of PSFs calculated by the distribution function calculation unit 5, a high-resolution PSF having a higher resolution than each of those PSFs (step ST3 in FIG. 4).
The distribution function generation unit 7 outputs the generated high-resolution PSF to the wavefront aberration estimation unit 8.
FIG. 9 is an explanatory diagram showing the high-resolution PSF generation processing by the distribution function generation unit 7.
Hereinafter, the high-resolution PSF generation processing by the distribution function generation unit 7 will be described concretely with reference to FIG. 9.

FIG. 9 shows an example in which the distribution function generation unit 7 generates, from the PSFs of N point images Pn (n = 0, 1, ..., k, k+1, ..., N-1), a high-resolution PSF whose resolution is N times higher than that of those PSFs.
The point images P0 to PN-1 are assumed to be, for example, a group of point images whose point image pixel phases are shifted from one another by δ = 1/N [pixel].
For example, the point image pixel phase of the point image P0 is 0 [pixel], that of the point image P1 is δ [pixel], that of the point image Pk is k × δ [pixel], and that of the point image PN-1 is (N-1) × δ [pixel].
For simplicity of explanation, FIG. 9 depicts the M pixel values of the PSF of each of the point images P0 to PN-1 as if they were arranged one-dimensionally. In practice, the PSF of each of the point images P0 to PN-1 has M × M pixel values arranged two-dimensionally.
For example, the M pixel values of the PSF of the point image P0 are P0(0), P0(1), ..., P0(M-1).
Similarly, the M pixel values of the PSF of the point image P1 are P1(0), P1(1), ..., P1(M-1); those of the point image Pk are Pk(0), Pk(1), ..., Pk(M-1); and those of the point image PN-1 are PN-1(0), PN-1(1), ..., PN-1(M-1).

First, among the M pixel values of the PSF of each of the point images P0 to PN-1, the distribution function generation unit 7 focuses on the first pixel values from the left in the figure: P0(0), P1(0), ..., Pk(0), ..., PN-1(0).
The distribution function generation unit 7 arranges these pixel values P0(0), P1(0), ..., Pk(0), ..., PN-1(0) from the left side of the figure, in ascending order of the point image pixel phase of the point image to which each pixel value belongs.
Next, among the M pixel values of the PSF of each of the point images P0 to PN-1, the distribution function generation unit 7 focuses on the second pixel values from the left in the figure: P0(1), P1(1), ..., Pk(1), ..., PN-1(1).
The distribution function generation unit 7 arranges these pixel values P0(1), P1(1), ..., Pk(1), ..., PN-1(1) to the right of the previously arranged pixel value PN-1(0), again in ascending order of the point image pixel phase.

The distribution function generation unit 7 arranges the third and subsequent sets of M pixel values from the left in the figure in the same manner as the second set.
For example, the (M-1)-th pixel values P0(M-1), P1(M-1), ..., Pk(M-1), ..., PN-1(M-1) are arranged by the distribution function generation unit 7 to the right of the previously arranged pixel value PN-1(M-2), in ascending order of the point image pixel phase.

By arranging the M pixel values of the PSF of each of the point images P0 to PN-1 as described above, the distribution function generation unit 7 obtains a PSF in which the pixel values are arranged as follows:
P0(0), P1(0), ..., PN-1(0), P0(1), P1(1), ..., PN-1(1), ..., P0(M-1), P1(M-1), ..., PN-1(M-1)
Having obtained the PSF with the pixel values arranged in this way, the distribution function generation unit 7 normalizes the pixel values, for example so that their sum is 1, thereby generating a high-resolution PSF whose resolution is N times higher than that of each of the PSFs of the point images P0 to PN-1.
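The interleaving just described can be sketched compactly for the one-dimensional case: with the N PSFs stacked as rows of an (N, M) array in ascending phase order, transposing and flattening produces exactly the ordering P0(0), P1(0), ..., PN-1(0), P0(1), and so on. The function name and the use of NumPy are illustrative choices.

```python
import numpy as np

def interleave_psfs(psfs):
    """Build a high-resolution PSF from N low-resolution PSFs.

    psfs : array of shape (N, M), one M-sample PSF per point image,
           ordered by ascending point image pixel phase 0, 1/N, ..., (N-1)/N.
    Sample j of every PSF is taken in ascending phase order, giving an
    N*M-sample PSF whose sampling pitch is 1/N pixel.  The result is
    normalized so that its values sum to 1.
    """
    psfs = np.asarray(psfs, dtype=float)
    hi_res = psfs.T.reshape(-1)   # P0(0), P1(0), ..., PN-1(0), P0(1), ...
    return hi_res / hi_res.sum()
```

For the two-dimensional case of the text (M × M pixel values per PSF), the same interleaving is applied along both axes.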

Upon receiving the high-resolution PSF from the distribution function generation unit 7, the wavefront aberration estimation unit 8 estimates the wavefront aberration of the imaging optical system 2 from the high-resolution PSF (step ST4 in FIG. 4).
The process of estimating a wavefront aberration from a PSF is itself a known technique, and a detailed description is omitted.
The wavefront aberration estimation unit 8 outputs the wavefront aberration of the imaging optical system 2 to the wavefront aberration output unit 9.
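The text treats PSF-to-wavefront estimation as known; one widely used realization is iterative Fourier-transform (Gerchberg-Saxton-type) phase retrieval, sketched below. This is only an illustrative stand-in for the unspecified known technique: the grid, aperture, seed, and iteration count are assumptions, and the recovered phase carries the usual piston/tilt and twin-image ambiguities of phase retrieval.

```python
import numpy as np

def retrieve_pupil_phase(psf, aperture, n_iter=200, seed=0):
    """Gerchberg-Saxton-style phase retrieval from a measured PSF.

    psf      : measured point spread function (focal-plane intensity,
               peak centered), same array size as the pupil grid.
    aperture : binary pupil amplitude (1 inside the aperture, 0 outside).
    Alternates between the pupil plane, where the amplitude is forced to
    the aperture, and the focal plane, where the amplitude is forced to
    sqrt(psf); the recovered pupil phase estimates the wavefront aberration.
    """
    target_amp = np.sqrt(np.fft.ifftshift(psf))  # focal-plane amplitude, FFT order
    rng = np.random.default_rng(seed)
    pupil = aperture * np.exp(1j * rng.uniform(-0.1, 0.1, aperture.shape))
    for _ in range(n_iter):
        field = np.fft.fft2(pupil)
        field = target_amp * np.exp(1j * np.angle(field))  # impose focal amplitude
        pupil = np.fft.ifft2(field)
        pupil = aperture * np.exp(1j * np.angle(pupil))    # impose pupil amplitude
    return np.angle(pupil) * aperture
```

The higher the resolution of the input PSF, the finer the focal-plane amplitude constraint, which is why the high-resolution PSF of the distribution function generation unit 7 benefits this step.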

Upon receiving the wavefront aberration of the imaging optical system 2 from the wavefront aberration estimation unit 8, the wavefront aberration output unit 9 stores the wavefront aberration in an external storage device or the like.
The wavefront aberration output unit 9 also outputs an image indicating the wavefront aberration, a coefficient indicating the wavefront aberration, or the like to an output device such as a display (not shown).

As described above, the optical system evaluation device according to the first embodiment includes the spatial light modulation unit 1 that adds a wavefront aberration to each light from a plurality of point light sources; the distribution function calculation unit 5 that calculates, from the electric signals generated by the imaging elements 3-1 to 3-J, a point spread function indicating the intensity distribution of the point image corresponding to each light; the pixel phase estimation unit 6 that estimates, from the electric signals generated by the imaging elements 3-1 to 3-J, a point image pixel phase for the point image corresponding to each light; and the distribution function generation unit 7 that generates, based on the point image pixel phases estimated by the pixel phase estimation unit 6, a high-resolution PSF having a higher resolution than the plurality of PSFs calculated by the distribution function calculation unit 5; and the device is configured so that the wavefront aberration estimation unit 8 estimates the wavefront aberration of the imaging optical system from the high-resolution PSF generated by the distribution function generation unit 7. Therefore, the optical system evaluation device can estimate the wavefront aberration of the imaging optical system without placing evaluation light sources on the ground.

Embodiment 2.
In the second embodiment, an optical system evaluation device in which the wavefront aberration estimation unit 35 estimates the wavefront aberration of the imaging optical system 2 from a plurality of PSFs generated by the distribution function generation unit 34 will be described.

FIG. 10 is a configuration diagram showing an optical system evaluation device according to the second embodiment.
In FIG. 10, the same reference numerals as in FIG. 1 denote the same or corresponding parts, and their description is omitted.
Like the spatial light modulation unit 1 shown in FIG. 1, the spatial light modulation unit 30 is a spatial light modulator that, when light emitted or reflected by each of a plurality of point light sources is incident, adds a wavefront aberration to each light.
Unlike the spatial light modulation unit 1 shown in FIG. 1, the spatial light modulation unit 30 repeatedly outputs, to the imaging optical system 2, each light to which a wavefront aberration has been added, while changing the wavefront aberration added to each light from the plurality of point light sources.

The image processing unit 31 includes a distribution function calculation unit 32, a pixel phase estimation unit 33, a distribution function generation unit 34, a wavefront aberration estimation unit 35, and a wavefront aberration output unit 9.
The image processing unit 31 analyzes the electric signals generated by the imaging elements 3-1 to 3-J of the imaging unit 3 and performs a process of estimating the wavefront aberration of the imaging optical system 2.
FIG. 11 is a hardware configuration diagram showing the hardware of the image processing unit 31.
In FIG. 11, the same reference numerals as in FIG. 2 denote the same or corresponding parts, and their description is omitted.

The distribution function calculation unit 32 is realized by, for example, the distribution function calculation circuit 41 shown in FIG. 11.
Like the distribution function calculation unit 5 shown in FIG. 1, the distribution function calculation unit 32 calculates the PSF corresponding to each light from the electric signals generated by the imaging elements 3-1 to 3-J.
Unlike the distribution function calculation unit 5 shown in FIG. 1, the distribution function calculation unit 32 calculates the PSF corresponding to each light from the electric signals generated by the imaging elements 3-1 to 3-J every time the imaging elements 3-1 to 3-J convert the respective lights and generate electric signals.
The distribution function calculation unit 32 outputs the plurality of calculated PSFs to the distribution function generation unit 34.

The pixel phase estimation unit 33 is realized by, for example, the pixel phase estimation circuit 42 shown in FIG. 11.
Like the pixel phase estimation unit 6 shown in FIG. 1, the pixel phase estimation unit 33 estimates, from the electric signals generated by the imaging elements 3-1 to 3-J, a point image pixel phase for the point image corresponding to each light.
Unlike the pixel phase estimation unit 6 shown in FIG. 1, the pixel phase estimation unit 33 estimates the point image pixel phase for the point image corresponding to each light every time the imaging elements 3-1 to 3-J convert the respective lights and generate electric signals.
The pixel phase estimation unit 33 outputs the plurality of estimated point image pixel phases to the distribution function generation unit 34.

The distribution function generation unit 34 is realized by, for example, the distribution function generation circuit 43 shown in FIG. 11.
Like the distribution function generation unit 7 shown in FIG. 1, the distribution function generation unit 34 generates, based on the plurality of point image pixel phases estimated by the pixel phase estimation unit 33, a high-resolution PSF having a higher resolution than each of the plurality of PSFs calculated by the distribution function calculation unit 32.
Unlike the distribution function generation unit 7 shown in FIG. 1, the distribution function generation unit 34 generates such a high-resolution PSF every time the distribution function calculation unit 32 calculates a plurality of PSFs and the pixel phase estimation unit 33 estimates a plurality of point image pixel phases.
The distribution function generation unit 34 outputs the generated high-resolution PSFs to the wavefront aberration estimation unit 35.

The wavefront aberration estimating unit 35 is realized by, for example, a wavefront aberration estimating circuit 44 shown in FIG. 11.
The wavefront aberration estimating unit 35 estimates the wavefront aberration of the imaging optical system 2 from the plurality of high-resolution PSFs generated by the distribution function generating unit 34.
The wavefront aberration estimating unit 35 outputs the estimated wavefront aberration of the imaging optical system 2 to the wavefront aberration output unit 9.

In FIG. 10, it is assumed that the distribution function calculation unit 32, the pixel phase estimating unit 33, the distribution function generating unit 34, the wavefront aberration estimating unit 35, and the wavefront aberration output unit 9, which are the components of the image processing unit 31, are each realized by dedicated hardware as shown in FIG. 11. That is, it is assumed that the image processing unit 31 is realized by the distribution function calculation circuit 41, the pixel phase estimating circuit 42, the distribution function generating circuit 43, the wavefront aberration estimating circuit 44, and the wavefront aberration output circuit 15.
Here, each of the distribution function calculation circuit 41, the pixel phase estimating circuit 42, the distribution function generating circuit 43, the wavefront aberration estimating circuit 44, and the wavefront aberration output circuit 15 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination thereof.

The components of the image processing unit 31 are not limited to those realized by dedicated hardware; the image processing unit 31 may be realized by software, firmware, or a combination of software and firmware.
When the image processing unit 31 is realized by software, firmware, or the like, a program for causing a computer to execute the processing procedures of the distribution function calculation unit 32, the pixel phase estimating unit 33, the distribution function generating unit 34, the wavefront aberration estimating unit 35, and the wavefront aberration output unit 9 is stored in the memory 22 shown in FIG. 3. Then, the processor 21 of the computer executes the program stored in the memory 22.
FIG. 12 is a flowchart showing the processing procedure in the case where the image processing unit 31 is realized by software, firmware, or the like.

Next, the operation of the optical system evaluation device shown in FIG. 10 will be described.
The spatial light modulating unit 30 repeatedly outputs, to the imaging optical system 2, the respective lights to which the wavefront aberration has been added, while changing the wavefront aberration added to the respective lights from the plurality of point light sources.
In the optical system evaluation device shown in FIG. 10, the spatial light modulating unit 30 adds different wavefront aberrations to the incident light H times, where H is an integer of 2 or more.
The wavefront aberration added by the spatial light modulating unit 30 is, as shown in FIG. 5, a wavefront aberration that gives the point image corresponding to each light a shape that spreads evenly toward the periphery.
Each time the spatial light modulating unit 30 adds a wavefront aberration to the respective lights, it outputs the respective lights to which the wavefront aberration has been added to the imaging optical system 2.
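The embodiment does not prescribe how the H different diversity aberrations are produced in practice. As an illustrative sketch in Python (the function name and amplitude values are hypothetical and not taken from the embodiment), one common choice is a sequence of Zernike defocus phase maps of increasing amplitude, each of which would be programmed onto the spatial light modulating unit 30 in turn:

```python
import numpy as np

def defocus_phase_maps(n_pixels, amplitudes_rad):
    """Generate one pupil phase map per diversity amplitude.

    Each map is the Zernike defocus term 2*r^2 - 1 over the unit
    pupil, scaled by the requested amplitude in radians; points
    outside the pupil are set to zero.
    """
    y, x = np.mgrid[-1:1:1j * n_pixels, -1:1:1j * n_pixels]
    r2 = x**2 + y**2
    pupil = r2 <= 1.0
    defocus = (2.0 * r2 - 1.0) * pupil
    return [a * defocus for a in amplitudes_rad]

maps = defocus_phase_maps(64, [0.5, 1.0, 1.5])  # H = 3 diversity settings
```

A defocus-like term also matches the shape requirement above, since it spreads the point image evenly toward the periphery.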

When the respective lights to which the wavefront aberration has been added by the spatial light modulating unit 30 are incident, the imaging optical system 2 forms an image of each light on the imaging surfaces of the imaging elements 3-1 to 3-J.
The imaging elements 3-1 to 3-J convert the respective lights imaged by the imaging optical system 2 to generate electric signals.
The imaging unit 3 outputs the electric signals generated by the imaging elements 3-1 to 3-J to each of the distribution function calculation unit 32 and the pixel phase estimating unit 33.

Each time the distribution function calculation unit 32 receives the electric signals from the imaging elements 3-1 to 3-J, it calculates, from the received electric signals, a PSF indicating the intensity distribution of the point image corresponding to each light, similarly to the distribution function calculation unit 5 shown in FIG. 1 (step ST11 in FIG. 12).
The distribution function calculation unit 32 outputs the calculated PSF to the distribution function generating unit 34.
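As a rough illustration of step ST11 (the function and variable names are hypothetical; the embodiment does not specify the exact computation), a sampled PSF can be obtained from the sensor signal by cropping a window around each point image and normalizing the pixel intensities to unit sum:

```python
import numpy as np

def sampled_psf(frame, center, half_width):
    """Crop a window around a point image and normalize it.

    frame      : 2-D array of sensor pixel values (one point image).
    center     : (row, col) integer pixel nearest the point image.
    half_width : half-size of the square crop window.
    """
    r, c = center
    w = half_width
    crop = frame[r - w:r + w + 1, c - w:c + w + 1].astype(float)
    crop -= crop.min()              # crude background removal
    return crop / crop.sum()        # intensities sum to 1, i.e. a PSF

# toy frame: a single bright pixel on a dark background
frame = np.zeros((32, 32))
frame[16, 17] = 100.0
psf = sampled_psf(frame, (16, 17), 4)
```

A real implementation would use a calibrated background estimate rather than the window minimum, but the normalization step is the same.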

Each time the pixel phase estimating unit 33 receives the electric signals from the imaging elements 3-1 to 3-J, it estimates, from the received electric signals, a point image pixel phase for the point image corresponding to each light, similarly to the pixel phase estimating unit 6 shown in FIG. 1 (step ST12 in FIG. 12).
The pixel phase estimating unit 33 outputs the estimated point image pixel phase to the distribution function generating unit 34.
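One common way to realize step ST12, shown here only as a hedged sketch (all names are hypothetical), is to take the intensity-weighted centroid of the point image and keep its fractional part, which expresses where inside a pixel the point image falls:

```python
import numpy as np

def point_image_pixel_phase(frame):
    """Estimate the sub-pixel (fractional) position of a point image.

    Returns the fractional parts of the intensity-weighted centroid,
    i.e. how far the point image sits inside its nearest pixel.
    """
    rows, cols = np.indices(frame.shape)
    total = frame.sum()
    cy = (rows * frame).sum() / total
    cx = (cols * frame).sum() / total
    return cy % 1.0, cx % 1.0     # fractional (sub-pixel) parts

# a point image straddling two pixels equally gives a phase of 0.5
frame = np.zeros((9, 9))
frame[4, 4] = frame[4, 5] = 50.0
phase_y, phase_x = point_image_pixel_phase(frame)
```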

The distribution function generating unit 34 receives the plurality of PSFs calculated by the distribution function calculation unit 32 and the plurality of point image pixel phases estimated by the pixel phase estimating unit 33.
Similarly to the distribution function generating unit 7 shown in FIG. 1, the distribution function generating unit 34 generates, from the plurality of PSFs calculated by the distribution function calculation unit 32, a high-resolution PSF having a higher resolution than each of the plurality of PSFs, on the basis of the plurality of point image pixel phases estimated by the pixel phase estimating unit 33 (step ST13 in FIG. 12).
The distribution function generating unit 34 outputs the generated high-resolution PSF to the wavefront aberration estimating unit 35.
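Step ST13 can be pictured as a shift-and-add (interlacing) operation: each coarsely sampled PSF is placed onto a finer grid at the offset selected by its sub-pixel phase. The following Python sketch assumes idealized phases that exactly fill the fine grid; the names and the rounding of phases to grid cells are illustrative assumptions, not the embodiment's method:

```python
import numpy as np

def interlace_psfs(psfs, phases, upsample):
    """Place several coarsely sampled PSFs onto one fine grid.

    psfs    : list of 2-D PSFs, all the same shape.
    phases  : list of (phase_y, phase_x) sub-pixel offsets in [0, 1).
    upsample: integer grid-refinement factor (e.g. 2 or 4).
    """
    h, w = psfs[0].shape
    fine = np.zeros((h * upsample, w * upsample))
    hits = np.zeros_like(fine)
    for psf, (py, px) in zip(psfs, phases):
        # each coarse sample lands on the fine-grid cell selected
        # by its sub-pixel phase
        oy = int(round(py * upsample)) % upsample
        ox = int(round(px * upsample)) % upsample
        fine[oy::upsample, ox::upsample] += psf
        hits[oy::upsample, ox::upsample] += 1.0
    fine[hits > 0] /= hits[hits > 0]
    return fine

# four PSFs whose phases tile a 2x fine grid completely
psfs = [np.ones((3, 3)) / 9.0 for _ in range(4)]
phases = [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
hires = interlace_psfs(psfs, phases, upsample=2)
```

In practice the measured phases do not tile the grid exactly, so an interpolation or regularized reconstruction step would replace the simple averaging shown here.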

If the number of high-resolution PSFs generated by the distribution function generating unit 34 is less than H (step ST14: NO), the processing of steps ST11 to ST13 is repeated.
If the number of high-resolution PSFs generated by the distribution function generating unit 34 is H (step ST14: YES), the wavefront aberration estimating unit 35 estimates the wavefront aberration of the imaging optical system 2 from the H high-resolution PSFs generated by the distribution function generating unit 34 (step ST15 in FIG. 12).
The wavefront aberration estimating unit 35 can estimate the wavefront aberration from the H high-resolution PSFs by using, for example, a known phase diversity method.
The wavefront aberration estimated from the H high-resolution PSFs is more accurate than the wavefront aberration estimated by the wavefront aberration estimating unit 8 shown in FIG. 1.
The following Non-Patent Documents 2 and 3 disclose the phase diversity method.
[Non-Patent Document 2]
J. J. Green, D. C. Redding, S. B. Shaklan, S. A. Basinger, "Extreme wave front sensing accuracy for the Eclipse coronagraphic space telescope," Proc. SPIE 4860, 266, (2003).
[Non-Patent Document 3]
R. G. Paxman, T. J. Schulz, and J. R. Fienup, "Joint estimation of object and aberrations by using phase diversity," J. Opt. Soc. Am. A, Vol. 9, No. 7 (1992).
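As a minimal sketch of the phase diversity principle behind step ST15 (a Fourier-optics toy model; the names and the brute-force cost comparison are illustrative assumptions, and a practical implementation would minimize the cost over Zernike coefficients as in the cited documents), the known diversity phases make the residual between modeled and measured PSFs a cost function that is minimized at the true aberration:

```python
import numpy as np

def model_psf(pupil, phase):
    """Far-field PSF of a pupil with the given phase map."""
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fft2(field))**2
    return psf / psf.sum()

def diversity_cost(aberration, pupil, diversity_phases, measured_psfs):
    """Sum of squared residuals over all H diversity channels."""
    return sum(
        np.sum((model_psf(pupil, aberration + d) - m)**2)
        for d, m in zip(diversity_phases, measured_psfs)
    )

# toy demonstration: the cost is minimized at the true aberration
y, x = np.mgrid[-1:1:32j, -1:1:32j]
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)
defocus = (2.0 * r2 - 1.0) * pupil
true_aberration = 0.3 * defocus
diversities = [np.zeros_like(defocus), 0.8 * defocus]  # known SLM settings
measured = [model_psf(pupil, true_aberration + d) for d in diversities]
cost_true = diversity_cost(true_aberration, pupil, diversities, measured)
cost_zero = diversity_cost(0.0 * defocus, pupil, diversities, measured)
```

Because the diversity phases are known, the joint fit over all H channels resolves sign and twin ambiguities that a single-PSF fit cannot, which is why the estimate improves over Embodiment 1.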

As described above, in the optical system evaluation device of Embodiment 2, the spatial light modulating unit 30 repeatedly outputs the respective lights to which the wavefront aberration has been added to the imaging optical system 2 while changing the wavefront aberration added to the respective lights from the plurality of point light sources. Each time the respective lights are converted by the imaging elements 3-1 to 3-J to generate electric signals, the distribution function calculation unit 32 calculates, from the generated electric signals, a PSF indicating the intensity distribution of the point image corresponding to each light. Each time the respective lights are converted by the imaging elements 3-1 to 3-J to generate electric signals, the pixel phase estimating unit 33 estimates, from the generated electric signals, a point image pixel phase for the point image corresponding to each light. Each time the plurality of PSFs are calculated by the distribution function calculation unit 32 and the point image pixel phases are estimated by the pixel phase estimating unit 33, the distribution function generating unit 34 generates, on the basis of the point image pixel phases estimated by the pixel phase estimating unit 33, a high-resolution PSF having a higher resolution than each of the plurality of PSFs from the plurality of PSFs calculated by the distribution function calculation unit 32.
The wavefront aberration estimating unit 35 estimates the wavefront aberration of the imaging optical system 2 from the plurality of high-resolution PSFs generated by the distribution function generating unit 34. Therefore, the optical system evaluation device of Embodiment 2 can estimate the wavefront aberration more accurately than the optical system evaluation device of Embodiment 1.

It should be noted that, within the scope of the present invention, the embodiments may be freely combined, any component of each embodiment may be modified, and any component may be omitted in each embodiment.

The present invention is suitable for an optical system evaluation device and an optical system evaluation method for estimating the wavefront aberration of an imaging optical system.

1 spatial light modulating unit, 2 imaging optical system, 3 imaging unit, 3-1 to 3-J imaging elements, 4 image processing unit, 5 distribution function calculation unit, 6 pixel phase estimating unit, 7 distribution function generating unit, 8 wavefront aberration estimating unit, 9 wavefront aberration output unit, 11 distribution function calculation circuit, 12 pixel phase estimating circuit, 13 distribution function generating circuit, 14 wavefront aberration estimating circuit, 15 wavefront aberration output circuit, 21 processor, 22 memory, 30 spatial light modulating unit, 31 image processing unit, 32 distribution function calculation unit, 33 pixel phase estimating unit, 34 distribution function generating unit, 35 wavefront aberration estimating unit, 41 distribution function calculation circuit, 42 pixel phase estimating circuit, 43 distribution function generating circuit, 44 wavefront aberration estimating circuit.

Claims (4)

[Claim 1]
An optical system evaluation device comprising:
a spatial light modulating unit that adds a wavefront aberration to each light from a plurality of point light sources;
an imaging optical system that forms an image of each light to which the wavefront aberration has been added by the spatial light modulating unit;
a plurality of imaging elements that convert each light imaged by the imaging optical system to generate electric signals;
a distribution function calculation unit that calculates, from the electric signals generated by the plurality of imaging elements, a point spread function indicating an intensity distribution of a point image corresponding to each light;
a pixel phase estimating unit that estimates, from the electric signals generated by the plurality of imaging elements, a pixel phase of the point image corresponding to each light;
a distribution function generating unit that generates, on the basis of the pixel phase estimated by the pixel phase estimating unit, a high-resolution point spread function having a higher resolution than the plurality of point spread functions, from the plurality of point spread functions calculated by the distribution function calculation unit; and
a wavefront aberration estimating unit that estimates a wavefront aberration of the imaging optical system from the high-resolution point spread function generated by the distribution function generating unit.
[Claim 2]
The optical system evaluation device according to claim 1, wherein
the spatial light modulating unit repeatedly outputs each light to which the wavefront aberration has been added to the imaging optical system while changing the wavefront aberration added to each light from the plurality of point light sources,
the distribution function calculation unit calculates, each time each light is converted by the plurality of imaging elements to generate the electric signals, the point spread function indicating the intensity distribution of the point image corresponding to each light from the electric signals generated by the plurality of imaging elements,
the pixel phase estimating unit estimates, each time each light is converted by the plurality of imaging elements to generate the electric signals, the pixel phase of the point image corresponding to each light from the electric signals generated by the plurality of imaging elements,
the distribution function generating unit generates, each time the plurality of point spread functions are calculated by the distribution function calculation unit and the pixel phases are estimated by the pixel phase estimating unit, the high-resolution point spread function from the plurality of point spread functions calculated by the distribution function calculation unit on the basis of the pixel phases estimated by the pixel phase estimating unit, and
the wavefront aberration estimating unit estimates the wavefront aberration of the imaging optical system from the plurality of high-resolution point spread functions generated by the distribution function generating unit.
[Claim 3]
The optical system evaluation device according to claim 1, wherein the spatial light modulating unit adds, to the light, a wavefront aberration that causes the point image to spread evenly toward the periphery.
[Claim 4]
An optical system evaluation method comprising:
adding, by a spatial light modulating unit, a wavefront aberration to each light from a plurality of point light sources;
forming, by an imaging optical system, an image of each light to which the wavefront aberration has been added by the spatial light modulating unit;
converting, by a plurality of imaging elements, each light imaged by the imaging optical system to generate electric signals;
calculating, by a distribution function calculation unit, from the electric signals generated by the plurality of imaging elements, a point spread function indicating an intensity distribution of a point image corresponding to each light;
estimating, by a pixel phase estimating unit, from the electric signals generated by the plurality of imaging elements, a pixel phase of the point image corresponding to each light;
generating, by a distribution function generating unit, on the basis of the pixel phase estimated by the pixel phase estimating unit, a high-resolution point spread function having a higher resolution than the plurality of point spread functions, from the plurality of point spread functions calculated by the distribution function calculation unit; and
estimating, by a wavefront aberration estimating unit, a wavefront aberration of the imaging optical system from the high-resolution point spread function generated by the distribution function generating unit.
PCT/JP2018/029221 2018-08-03 2018-08-03 Optical system evaluation device and optical system evaluation method Ceased WO2020026432A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/029221 WO2020026432A1 (en) 2018-08-03 2018-08-03 Optical system evaluation device and optical system evaluation method


Publications (2)

Publication Number Publication Date
WO2020026432A1 true WO2020026432A1 (en) 2020-02-06
WO2020026432A9 WO2020026432A9 (en) 2021-02-11

Family

ID=69232421


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014164004A (en) * 2013-02-22 2014-09-08 Hitachi High-Technologies Corp Fluorescence microscope
JP2014531160A (en) * 2011-10-11 2014-11-20 レイセオン カンパニー Blur calibration system for electro-optic sensor and method using moving multi-focus multi-target constellation
JP2014224766A (en) * 2013-05-16 2014-12-04 三菱電機株式会社 Imaging performance evaluation device




Legal Events

Date Code Title Description
121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18928225; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122: Ep: pct application non-entry in european phase (Ref document number: 18928225; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: JP)