
WO2011121763A1 - Image processing apparatus and image capturing apparatus using same - Google Patents


Info

Publication number
WO2011121763A1
WO2011121763A1 (application PCT/JP2010/055865)
Authority
WO
WIPO (PCT)
Prior art keywords
image
filter
transfer function
imaging
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2010/055865
Other languages
French (fr)
Japanese (ja)
Inventor
弘至 畠山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to PCT/JP2010/055865 priority Critical patent/WO2011121763A1/en
Priority to CN201180017173.9A priority patent/CN102822863B/en
Priority to PCT/JP2011/055610 priority patent/WO2011122284A1/en
Priority to JP2012508186A priority patent/JP5188651B2/en
Priority to US13/204,453 priority patent/US8514304B2/en
Publication of WO2011121763A1 publication Critical patent/WO2011121763A1/en
Anticipated expiration legal-status Critical
Priority to US13/849,781 priority patent/US8692909B2/en
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/73: Deblurring; Sharpening
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60: Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61: Noise processing where the noise originates only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H04N25/611: Correction of chromatic aberration

Definitions

  • The present invention relates to an image processing apparatus that performs image processing, and more particularly to an image processing apparatus that performs image restoration (recovery).
  • An image obtained by an imaging device such as a digital camera is deteriorated due to blur.
  • the image blur is caused by spherical aberration, coma aberration, field curvature, astigmatism, and the like of the imaging optical system.
  • An optical transfer function (OTF), obtained by Fourier transforming the point spread function (hereinafter PSF), is the frequency-space representation of the aberration and is expressed as a complex number. The absolute value of the OTF is the amplitude component, called the MTF (Modulation Transfer Function), and its argument is the phase component, called the PTF (Phase Transfer Function).
  • the OTF of the imaging optical system affects (deteriorates) the amplitude component and phase component of the image. For this reason, an image deteriorated by the influence of the OTF (hereinafter referred to as a deteriorated image) is an image in which each point of the subject is asymmetrically blurred like coma.
  • FIGS. 13A, 13B, and 13C are schematic diagrams showing the spread of the point spread function (PSF) on a plane perpendicular to the principal ray (the ray passing through the center of the pupil of the optical system). In this plane, two mutually perpendicular lines through the optical axis are taken as axes x1 and x2, and the angle θ formed between axis x1 and an arbitrary straight line through the optical axis is called the azimuth angle θ; the direction at azimuth angle θ is the azimuth direction. The azimuth direction is a general term for all such directions, including the sagittal and meridional directions as well as every other angle θ.
  • FIG. 13A is a diagram schematically showing a PSF in which coma aberration occurs.
  • For an optical system composed of rotationally symmetric lenses with no decentering of the optical axis, the PSF at any field angle off the optical axis is line-symmetric with respect to the straight line on the image plane passing through the optical axis and the principal ray; in FIG. 13A, the PSF is line-symmetric with respect to axis x2.
  • FIG. 13B shows a PSF with no phase shift. Although the shape is symmetric within each azimuth direction, the amplitude (MTF) differs between directions, so the spread of the PSF differs along axis x1 and axis x2.
  • The PSF on the optical axis, manufacturing errors aside, has neither phase shift nor azimuth dependence of amplitude deterioration, and is therefore rotationally symmetric as shown in FIG. 13C. As FIGS. 13A and 13B illustrate, a PSF becomes asymmetric when the phase (PTF) differs within an azimuth direction or the amplitude (MTF) differs between azimuth directions, and this asymmetry hinders the generation of high-definition images.
  • Patent Document 1 provides an adjustment parameter for designing an image restoration filter, as shown in Equation 1. With this single parameter, the degree of image recovery can be adjusted over the range from the original photographed image to the maximally recovered image.
  • F(u, v) and G(u, v) are the Fourier transforms of the restored image and the degraded image, respectively. Equation 2 gives the frequency characteristic M(u, v) of the Wiener filter, which can be written as M(u, v) = (1/H(u, v)) · |H(u, v)|² / (|H(u, v)|² + SNR), where H(u, v) is the optical transfer function (OTF), |H(u, v)| is its absolute value (the MTF), and SNR is a term determined by the intensity ratio of the noise signal to the image signal.
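For a numerical sketch of this frequency characteristic, Equation 2 can be evaluated directly. In the helper below, rewriting 1/H · |H|²/(|H|² + SNR) as conj(H)/(|H|² + SNR) is an implementation choice (it avoids dividing by H where it vanishes), not something taken from the patent:

```python
import numpy as np

def wiener_filter(H, snr):
    """Frequency characteristic M(u, v) of the Wiener filter (Equation 2).

    H   : complex OTF sample(s) H(u, v)
    snr : scalar noise term in the denominator

    M = (1/H) * |H|^2 / (|H|^2 + SNR) = conj(H) / (|H|^2 + SNR)
    """
    H = np.asarray(H, dtype=complex)
    return np.conj(H) / (np.abs(H) ** 2 + snr)
```

With snr = 0 this reduces to the pure inverse filter 1/H; a positive snr suppresses the gain at frequencies where the MTF is small.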
  • Hereinafter, processing that restores an image using an image restoration filter based on the optical transfer function (OTF), such as the Wiener filter or the filter of Patent Document 1, is referred to as image restoration processing.
  • The above-described Wiener filter and the image restoration filter of Patent Document 1 can correct the deterioration of the amplitude and phase components caused by the imaging optical system, but cannot correct the difference in the amplitude component between azimuth directions. With the Wiener filter, if the MTF differs between azimuth directions before recovery, that difference is in fact expanded after recovery. This will be described with reference to FIG. 14.
  • FIG. 14 is a diagram showing the MTF before the image restoration process and the MTF after the image restoration process using the Wiener filter.
  • the broken line (a) and the solid line (b) are the MTFs in the first azimuth direction and the second azimuth direction before recovery, respectively.
  • the broken line (c) and the solid line (d) are the MTFs in the first azimuth direction and the second azimuth direction after recovery, respectively.
  • the first and second azimuth directions are, for example, the sagittal direction and the meridional direction.
  • The Wiener filter lowers the recovery gain (degree of recovery) where the MTF is low, so as not to amplify noise, and recovers more strongly where the MTF is high.
  • Accordingly, the broken line (a), the azimuth direction with the lower MTF, receives a lower recovery gain than the azimuth direction (b) with the higher MTF. As a result, the difference between the post-recovery MTFs (c) and (d) is larger than the difference between the pre-recovery MTFs (a) and (b). That is, asymmetric aberration remains in the image despite the image restoration processing. The same applies to the image restoration filter disclosed in Patent Document 1.
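This enlargement can be checked with a few lines of arithmetic. The MTF and SNR values below are purely illustrative, chosen so that both azimuths lie in the regime where the lower MTF receives the lower Wiener gain:

```python
import numpy as np

snr = 0.25
mtf = np.array([0.2, 0.4])                # MTFs (a) and (b) before recovery
gain = mtf / (mtf ** 2 + snr)             # |M(u, v)| of the Wiener filter
recovered = mtf * gain                    # post-recovery MTFs (c) and (d)
diff_before = mtf[1] - mtf[0]             # 0.20
diff_after = recovered[1] - recovered[0]  # ~0.25: the azimuth gap widens
```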
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide an image processing apparatus capable of reducing asymmetric aberrations that can be caused by image restoration processing and obtaining a higher definition image.
  • To this end, the present invention provides image acquisition means for acquiring an input image, and image recovery means for generating a recovered image by recovering the input image with an image recovery filter generated or selected based on a transfer function of the imaging system used to form the subject image as the input image. The image restoration filter is characterized in that the difference in the absolute value of the transfer function between two azimuth directions when the restored image is obtained from the subject is smaller than the difference in the absolute value of the transfer function between those two azimuth directions in the imaging system.
  • the effect of the present invention is that a high-definition restored image with reduced asymmetric aberration can be generated.
  • Explanatory drawing of the image processing method of an embodiment of the present invention
  • Explanatory drawing of the image restoration filter used in the image processing method of the embodiment
  • First explanatory drawing of the amount of MTF correction between azimuth directions
  • Second explanatory drawing of the amount of MTF correction between azimuth directions
  • Block diagram showing the basic configuration of the imaging apparatus
  • Explanatory drawing of the selection and correction of an image restoration filter in Embodiment 1
  • Flowchart showing the image processing procedure of Embodiment 1
  • Diagram showing the MTF change before and after the image processing of Embodiment 1
  • Illustration of an edge enhancement filter
  • Edge cross section with an edge enhancement filter applied
  • Explanatory drawing of the image processing system of Embodiment 2 of the present invention
  • FIG. 1 shows processing steps from image input to output.
  • In step S11, an image generated by the imaging system is acquired. The image acquired in this image acquisition step is referred to as the input image.
  • In step S12, an image restoration filter corresponding to the shooting condition of the input image acquired in step S11 is generated. This step may instead select an appropriate filter from a plurality of image restoration filters prepared in advance, or correct a filter after selecting it.
  • In step S13, image restoration processing (correction processing) is executed using the image restoration filter generated or selected in step S12 based on the transfer function of the imaging system (the optical transfer function of the imaging optical system). More specifically, the phase component (PTF) of the input image is corrected toward a target value of zero, and the amplitude component (MTF) is corrected so that the difference between two azimuth directions decreases.
  • In step S14, the corrected image produced in step S13 is output as the output image.
  • image processing steps may be inserted before, after, or during the process of FIG.
  • the other image processing is processing such as electronic aberration correction such as distortion aberration correction and peripheral light amount correction, demosaicing, gamma conversion, and image compression.
  • The MTF is the amplitude component (absolute value) of the transfer function of the imaging system (the optical transfer function of the imaging optical system); when the subject (the object in the object field) is a white point light source, the MTF can be regarded as the spectrum of the image.
  • An image acquired by the image acquisition process (hereinafter referred to as an input image) is a digital image obtained by capturing an image with an image sensor via an imaging optical system.
  • The digital image obtained here is degraded, relative to the object in the field, by the optical transfer function (OTF) based on the aberrations of the imaging optical system, which includes the lens and various optical filters. The optical transfer function here is preferably a transfer function based on the aberrations of the optical elements of the imaging optical system, as described above, and on the other characteristics of the imaging device.
  • the imaging optical system can also use a mirror (reflection surface) having a curvature in addition to the lens.
  • The input image is expressed in a color space, for example RGB. Besides RGB, hue, lightness, and saturation can be expressed in LCH, and luminance and color-difference signals in YCbCr; other color spaces include XYZ, Lab, Yuv, JCh, and color temperature. The present invention can be applied to the values of color components expressed in any of these commonly used color spaces.
  • The input image may be a mosaic image having the signal value of one color component at each pixel, or a demosaic image in which color interpolation processing (demosaicing) has given each pixel signal values of a plurality of color components. A mosaic image, being an image prior to processing such as color interpolation (demosaicing), gamma conversion, and image compression such as JPEG, is also called a RAW image.
  • When color filters with different spectral transmittances are arranged pixel by pixel on a single image sensor, a mosaic image having the signal value of one color component at each pixel is obtained, and color interpolation processing then yields an image having signal values of a plurality of color components at each pixel. When, instead, a color filter with a different spectral transmittance is arranged for each of several image sensors, each sensor captures the image signal of a different color component; since the sensors carry the signal values of the respective color components for corresponding pixels, an image having signal values of a plurality of color components at each pixel is obtained without color interpolation processing.
  • The correction information includes information about the imaging state (imaging state information), such as the focal length (zoom position), aperture value, shooting distance (focus distance), exposure time, and ISO sensitivity of the lens.
  • Although the input image has been described as a digital image obtained by capturing with an image sensor through an imaging optical system, the input image may also be a digital image obtained by an imaging system that includes no imaging optical system. A scanner (reading device) or an X-ray imaging apparatus that captures with the image sensor in close contact with the subject surface has no imaging optical system such as a lens, yet the image generated by image sampling at the sensor is still degraded to some extent. The "optical transfer function" referred to in the embodiments of the present invention is therefore an optical transfer function in the broad sense, including the system transfer function of such an imaging system without an imaging optical system.
  • FIG. 2A is a schematic diagram of an image restoration filter in which convolution processing is performed on pixels of an input image in real space.
  • The number of taps (cells) of the image restoration filter can be determined according to the aberration characteristics of the imaging system and the required restoration accuracy. Since the image is two-dimensional, the filter is generally a two-dimensional filter whose taps correspond to pixels of the image. FIG. 2A shows an 11 × 11 tap two-dimensional image restoration filter as an example; the number of taps is set according to the required image quality, image processing capability, aberration characteristics, and so on. The values in the taps are omitted in FIG. 2A, and one cross-section of the filter is shown in FIG. 2B.
  • the distribution of the values (coefficient values) of each tap of the image restoration filter plays a role of ideally returning the signal value spatially spread by the aberration to the original one point during the convolution process.
  • To generate the filter, the optical transfer function (OTF) of the imaging optical system is calculated or measured. The image restoration filter used in the present invention additionally has the function of correcting the difference in MTF between azimuth directions.
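One plausible route from a calculated or measured OTF to the tap kernel of FIG. 2A is to inverse-Fourier-transform the filter's frequency characteristic and crop the central window. The sketch below uses the conventional Wiener characteristic purely for illustration; the cropping and gain normalization are assumptions, not steps taken from the patent:

```python
import numpy as np

def restoration_kernel(H, snr=0.01, taps=11):
    """Hypothetical sketch: cut a real-space tap kernel out of the inverse
    Fourier transform of a restoration filter's frequency characteristic.

    H : N x N complex OTF sampled on the DFT frequency grid (N >= taps).
    """
    M = np.conj(H) / (np.abs(H) ** 2 + snr)         # frequency characteristic
    kernel = np.fft.fftshift(np.fft.ifft2(M).real)  # real-space response, centred
    c, r = kernel.shape[0] // 2, taps // 2
    k = kernel[c - r:c + r + 1, c - r:c + r + 1]    # keep central taps x taps
    return k / k.sum()                              # preserve overall gain of 1
```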
  • First, a conventional Wiener filter will be described with reference to FIG. 15. FIG. 15A shows a meridional section of the point spread function (PSF) of one color component at a certain position in the image, and FIG. 16 shows its frequency characteristics.
  • FIG. 16M shows the MTF, the amplitude component, and FIG. 16P shows the PTF, the phase component. The frequency characteristics corresponding to the PSFs of FIGS. 15A, 15B, and 15C are drawn in FIG. 16 as a broken line (a), a two-dot chain line (b), and a one-dot chain line (c), respectively.
  • The PSF before correction, shown in FIG. 15A, has an asymmetric shape due to coma and the like; its amplitude response falls at higher frequencies, as shown by the broken line (a) in FIG. 16M, and a phase shift occurs, as shown by the broken line (a) in FIG. 16P. The MTF corresponding to FIG. 15B is 1 over the entire frequency range, as shown by the two-dot chain line (b) in FIG. 16M, and the PTF is zero over the entire frequency range, as shown by the two-dot chain line (b) in FIG. 16P.
  • The PSF recovered from that of FIG. 15A by the Wiener filter becomes symmetric through correction of the phase, as shown in FIG. 15C, and its spread becomes small and sharp through improvement of the amplitude. The MTF corresponding to FIG. 15C shows a reduced recovery gain, as indicated by the one-dot chain line (c) in FIG. 16M, while the PTF is zero over the entire frequency range, as indicated by the one-dot chain line (c) in FIG. 16P. The reason the PTF is corrected to zero even though the recovery gain is suppressed will be explained using Equation 3.
  • Consider a subject whose frequency characteristic has no phase shift and whose amplitude characteristic is 1 over the entire frequency range, i.e. a white point light source. The frequency characteristic of the image obtained through the imaging optical system is then the optical transfer function (OTF) itself, and the image has a pixel value distribution in the form of the PSF. If this input-image frequency characteristic, the OTF, is multiplied by the frequency characteristic of the image restoration filter, the frequency characteristic of the restored image is obtained. Written as an equation, as in Equation 3, H(u, v), the OTF, cancels out, and the frequency characteristic of the recovered image becomes the right-hand side.
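This cancellation can be verified numerically with a single illustrative OTF sample (the numbers are arbitrary):

```python
import numpy as np

# For a white point source the input spectrum is H(u,v) itself; multiplying
# by the Wiener characteristic cancels the phase of H even though the
# amplitude gain is suppressed (illustrative numbers, not from the patent).
H = 0.5 * np.exp(1j * 0.8)               # one OTF sample, 0.8 rad phase shift
snr = 0.1
M = np.conj(H) / (np.abs(H) ** 2 + snr)  # Wiener frequency characteristic
recovered = H * M                        # = |H|^2 / (|H|^2 + snr)
# recovered is purely real (PTF = 0) with amplitude 0.25/0.35 < 1
```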
  • Even when the phase degradation component is corrected for every azimuth direction so that the PSF becomes symmetric within each azimuth direction, the amplitude component still differs between azimuth directions, so the PSF is not corrected to rotational symmetry. For example, the line-symmetric coma of FIG. 13A is corrected only as far as a point-symmetric PSF like the astigmatic one of FIG. 13B: in each azimuth direction the phase is corrected, but because the amplitudes differ between directions, only a point-symmetric state is reached. Moreover, as described with reference to FIG. 14, the MTF difference between azimuth directions is not corrected but rather enlarged; the conventional image restoration method therefore cannot sufficiently correct asymmetric aberration.
  • On the other hand, the rOTF portion of Equation 4 represents the post-recovery frequency characteristic of an image captured of a white point light source, where rOTF is an arbitrary function. Since the phase degradation component of the restored image should be zero, rOTF need not have a phase component; having only a real part, it is substantially equal to rMTF. Although rOTF preferably has only a real part, an imaginary part with a value within an allowable range is of course still within the scope of the present invention. In other words, with the image restoration filter expressed by Equation 4, any subject, not only a point light source, can be imaged as if it had been captured by an imaging optical system whose optical transfer function (OTF) has the characteristic rOTF.
  • In particular, if an OTF (rH(u, v)) common to the azimuth directions is used as in Equation 5, an image equivalent to one captured with an imaging optical system having no MTF difference between azimuth directions is obtained.
  • the image restoration filter used in this embodiment is a filter that reduces the MTF difference between the two azimuth directions of the imaging system.
  • FIG. 3 is a diagram showing changes in MTF in two azimuth directions before recovery and after recovery when processing is performed using the image recovery filter of the present invention when the subject is a point light source.
  • the broken line (a) and the solid line (b) are the MTFs before recovery in the first and second azimuth directions, respectively, and the broken line (c) and the solid line (d) are the MTFs after the recovery in the first and second azimuth directions, respectively.
  • The MTF before recovery differs between azimuth directions, as shown by curves (a) and (b), but the MTF after recovery is aligned between the azimuth directions, as shown by curves (c) and (d). Curves (a) and (b) correspond, for example, to the MTFs in the meridional and sagittal directions.
  • In this way, the image can be restored while the difference in MTF between azimuth directions is corrected by this image restoration filter.
  • Although Equation 5 uses an OTF (rH(u, v)) common to the azimuth directions, it is sufficient that the difference in rH(u, v) between azimuth directions be smaller than the difference in the OTF before recovery; the degree of rotational symmetry can thus be controlled through the correction. An example is shown in FIG. 4. Even if the recovered MTFs of the two azimuth directions do not coincide, as with curves (c) and (d) in FIG. 4, the MTF difference between azimuth directions is reduced relative to curves (c) and (d) in FIG. 14, and the asymmetry of the PSF is accordingly reduced. To obtain an asymmetry correction effect, it is desirable to use a filter that recovers the image so that this difference becomes at least smaller than the MTF difference between azimuth directions before recovery.
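The idea of recovering both azimuth directions toward a common target, as in Equation 5, can be sketched in one dimension. Taking the common target rH as the mean of the two MTFs, and regularizing the division Wiener-style, are assumptions made here for illustration:

```python
import numpy as np

def azimuth_balancing_gains(mtf_a, mtf_b, snr=0.01):
    """Sketch of the Equation 5 idea with assumed details: both azimuth
    directions are recovered toward one common target MTF (here simply the
    mean of the two), using a regularized division instead of a bare 1/MTF."""
    target = 0.5 * (mtf_a + mtf_b)             # rH common to both azimuths
    g_a = target * mtf_a / (mtf_a ** 2 + snr)  # recovery gain, azimuth 1
    g_b = target * mtf_b / (mtf_b ** 2 + snr)  # recovery gain, azimuth 2
    return g_a, g_b
```

For mtf_a = 0.3 and mtf_b = 0.6, the recovered responses 0.3·g_a ≈ 0.405 and 0.6·g_b ≈ 0.438 both land near the common target 0.45, so the azimuth difference shrinks from 0.30 to about 0.03.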
  • In other words, the image restoration filter is configured so that the difference in MTF (the absolute value of the transfer function) between two azimuth directions when the restored image is obtained from the subject is smaller than the difference in MTF between those two azimuth directions in the imaging system. Equivalently, the filter recovers the image so that the difference in spectrum between the two azimuth directions in the restored image is smaller than the difference in spectrum between the two azimuth directions in the input image. The image restoration filter of the present embodiment is generated based on a corrected transfer function, obtained by correcting a transfer function whose frequency characteristics differ between the two azimuth directions so that the difference in its absolute value between the two azimuth directions is reduced.
  • Note that because the H(u, v) portion of Equation 5 differs for each azimuth direction while rH(u, v) is common to them, the resulting image restoration filter has an asymmetric coefficient array; that is, the cross-section of FIG. 2B differs for each azimuth direction.
  • The optical transfer function can include not only the imaging optical system but also factors that degrade the OTF during the imaging process. For example, an optical low-pass filter with birefringence suppresses the high-frequency components of the frequency characteristic of the optical transfer function, and the shape and aperture ratio of the pixel apertures of the image sensor also affect the frequency characteristic.
  • A corrected image (recovered image) is obtained by convolving the degraded image with the image recovery filter. In the convolution, the filter is centered on the pixel whose signal value is to be improved; for each pixel where the image and the filter overlap, the product of the image signal value and the filter coefficient is taken, and the sum of these products replaces the signal value of the central pixel.
  • the advantage of applying an image restoration filter to the input image or performing convolution processing is that the image can be restored without performing Fourier transform or inverse Fourier transform of the image in the image restoration processing.
  • the load for convolution processing is smaller than the load for performing Fourier transform. Therefore, it is possible to reduce the processing burden when performing the image restoration process.
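The real-space application described above can be sketched directly. The naive double loop is for clarity rather than speed, and the edge-replicating border handling is an assumption:

```python
import numpy as np

def convolve_restore(image, kernel):
    """Centre the restoration filter on each pixel, take the products of
    coefficient and signal values over the overlap, and let their sum
    replace the centre pixel. (The kernel is applied as stored; flip it
    beforehand if true convolution is required.)"""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    out = np.empty(image.shape, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out
```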
  • Although the numbers of vertical and horizontal taps of the image restoration filter have been discussed above, the two need not be equal and can be changed arbitrarily, as long as they are taken into account in the convolution processing.
  • Since the image restoration processing of the present invention performs, with high accuracy, the inverse of a degradation process that is linear, it is preferable that the input image not have undergone adaptive nonlinear processing; that is, it is more preferable to apply the processing to the mosaic (RAW) image.
  • That said, the image restoration processing of the present invention is applicable whether the input image is a mosaic image or a demosaic image: if the degradation caused by the color interpolation processing is linear, it can be taken into account as a degradation function when the image restoration filter is generated. When the required recovery accuracy is low, or when only images that have already undergone various image processing are available, applying the image recovery processing to the demosaic image still yields the effect of reducing blur asymmetry.
  • The corrected image (recovered image) obtained by the above processing is output to the desired device; in an imaging apparatus, it is output to the display unit or a recording medium. When further image processing is to be performed on the image that has undergone image restoration processing, it may instead be output to the device that executes the subsequent process.
  • The image processing of the present invention has been described one step at a time, but steps that can be processed simultaneously may be processed together, and necessary processing steps may be added before or after each step as appropriate. The expressions and equalities used in the description do not limit the image processing of the present invention to a specific algorithm; modifications are possible as necessary within the range in which the object can be achieved.
  • FIG. 5 is a schematic diagram of the configuration of the imaging apparatus according to the first embodiment.
  • a subject image (not shown) is formed on the image sensor 102 by the imaging optical system 101.
  • the imaging element 102 converts the imaged light into an electric signal (photoelectric conversion), and the A / D converter 103 converts the electric signal into a digital signal.
  • the image processing unit 104 performs image processing on the digital signal (input image) together with predetermined processing.
  • the predetermined processing is processing such as electronic aberration correction such as magnification chromatic aberration correction, distortion aberration correction, and peripheral light amount correction, demosaicing, gamma conversion, and image compression.
  • the imaging state information of the imaging device is obtained from the state detection unit 107.
  • the state detection unit 107 may obtain imaging state information directly from the system controller 110.
  • imaging state information regarding the imaging optical system 101 may be obtained from the imaging system control unit 106.
  • an image restoration filter corresponding to the imaging state is selected from the storage unit 108, and image restoration processing is performed on the image input to the image processing unit 104.
  • The image restoration filter selected from the storage unit 108 according to the imaging state may be used as it is, or a filter prepared in advance may be corrected into an image restoration filter more suitable for the imaging state and then used.
  • the output image processed by the image processing unit 104 is stored in the image recording medium 109 in a predetermined format.
  • This output image is an image in which the asymmetry of the aberration is corrected and the sharpness is improved.
  • The display unit 105 may display an image that has undergone predetermined display processing after the image restoration processing, or, for high-speed display, an image with no correction processing or only simple correction processing.
  • the above-described series of control is performed by the system controller 110, and mechanical driving of the imaging system is performed by the imaging system control unit 106 according to an instruction from the system controller 110.
  • The aperture diameter of the diaphragm 101a is controlled as an F-number shooting state setting.
  • The position of the focus lens 101b is controlled by an autofocus (AF) mechanism or a manual focus mechanism (neither shown) to adjust the focus according to the shooting distance.
  • This imaging system may include an optical element such as a low-pass filter or an infrared cut filter.
  • Since these optical elements influence the optical transfer function (OTF), the recovery processing can be performed with higher accuracy if their influence is taken into consideration when the image restoration filter is created.
  • As for the infrared cut filter, it affects the PSF of each RGB channel, which is an integral of the point spread functions (PSFs) over the spectral wavelengths, and in particular the PSF of the R channel.
  • the imaging optical system 101 is configured as a part of the imaging apparatus, but may be an interchangeable type as in a single-lens reflex camera. Functions such as aperture diameter control and manual focus may not be used depending on the purpose of the imaging apparatus.
  • It is desirable that the image restoration processing of the present invention be performed while changing the filter according to the image height.
  • the image processing unit 104 includes at least a calculation unit and a temporary storage unit (buffer).
  • the image is temporarily written (stored) and read out from the storage unit as necessary for each step of the image processing.
  • The unit used for temporary storage is not limited to the temporary storage unit (buffer); the storage unit 108 may also be used, and a unit having a storage function can be selected and used as appropriate according to its data capacity and communication speed.
  • the storage unit 108 stores data such as an image restoration filter and correction information.
  • FIG. 6 schematically illustrates imaging state information and a plurality of image restoration filters (black circles) stored in the storage unit 108 based on the imaging state information.
  • The image restoration filters stored in the storage unit 108 are arranged discretely in an imaging state space whose axes are three imaging states: focus position (state A), aperture value (state B), and subject distance (focusing distance) (state C).
  • the coordinates of each point (black circle) in the imaging state space indicate the image restoration filter stored in the storage unit 108.
  • In FIG. 6, the image restoration filters are arranged at grid points on lines orthogonal to each imaging state axis, but they may also be arranged away from the grid points.
  • The types of imaging states are not limited to focal length, aperture value, and subject distance, and their number need not be three; an imaging state space of four or more dimensions may be constructed from four or more imaging states, and the image restoration filters may be arranged discretely within it.
  • In FIG. 6, the imaging state indicated by the large white circle is the actual imaging state detected by the state detection unit 107.
  • If a stored image restoration filter exists at, or near, the position corresponding to the actual imaging state, it can be selected and used for the image restoration processing.
  • One method of selecting an image restoration filter near the position corresponding to the actual imaging state is to calculate the distance in the imaging state space (the difference between imaging states) between the actual imaging state and each imaging state for which a filter is stored, and then select the filter at the shortest distance. With this method, the image restoration filter at the position indicated by the small white circle in FIG. 6 is selected.
  • Another method is to use as an evaluation function the product of the distance in the imaging state space and a directional weight, and to select the image restoration filter with the highest evaluation value.
  • The distance (state difference amount) in the imaging state space between the actual imaging state and each imaging state for which an image restoration filter is stored is calculated, and the image restoration filter at the position with the shortest distance (smallest state difference amount) is selected.
  • By selecting this filter, the correction amount applied to the image restoration filter can be kept small, and a filter close to the one that would originally be generated for the actual imaging state can be obtained.
  • the image restoration filter at the position indicated by a small white circle is selected.
  • State difference amounts ⁇ A, ⁇ B, and ⁇ C between the imaging state corresponding to the selected image restoration filter and the actual imaging state are calculated.
  • a state correction coefficient is calculated based on the state difference amount, and the selected image restoration filter is corrected using the state correction coefficient. Thereby, an image restoration filter corresponding to an actual imaging state can be generated.
  • an image restoration filter suitable for the imaging state can be generated.
  • the coefficient values of the corresponding taps between the two-dimensional image restoration filters may be interpolated using linear interpolation, polynomial interpolation, spline interpolation, or the like.
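The tap-wise interpolation just described can be sketched as follows; a minimal NumPy sketch assuming the two stored filters have the same tap layout and that the state difference amount has been normalized to a weight t in [0, 1] (the function name and values are illustrative, not from the patent):

```python
import numpy as np

def interpolate_filter(taps_a, taps_b, t):
    """Linearly interpolate corresponding tap coefficients of two
    image restoration filters of the same size; t in [0, 1] is a
    normalized weight derived from the state difference amount."""
    taps_a = np.asarray(taps_a, dtype=float)
    taps_b = np.asarray(taps_b, dtype=float)
    if taps_a.shape != taps_b.shape:
        raise ValueError("filters must have the same tap layout")
    return (1.0 - t) * taps_a + t * taps_b

# Two stored 3x3 filters (toy values) and an imaging state halfway between them.
f_near = np.array([[0., -1., 0.], [-1., 5., -1.], [0., -1., 0.]])
f_far  = np.array([[0.,  0., 0.], [ 0., 1.,  0.], [0.,  0., 0.]])
f_mid  = interpolate_filter(f_near, f_far, 0.5)
print(f_mid[1, 1])  # 3.0
```

Polynomial or spline interpolation over more than two stored states would follow the same per-tap pattern.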
  • The optical transfer function (OTF) used to generate the image restoration filter can be obtained by calculation using an optical design tool or an optical analysis tool, or it can be obtained by measuring the imaging optical system alone, or the imaging apparatus, in its actual state.
  • FIG. 7 shows a specific flowchart of the image restoration processing of this embodiment executed by the image processing unit 104.
  • the mark ⁇ in the figure represents the step of storing pixel data such as an image at least temporarily.
  • The image processing unit 104 first acquires an input image in the image acquisition step. Next, imaging state information is obtained from the state detection unit 107 (step S72). Then, an image restoration filter corresponding to the imaging state is selected from the storage unit 108 (step S73), and in the image restoration processing step (correction step) the restoration processing is applied to the input image using that filter (step S74).
  • Next, other processing necessary for image formation is performed, and the recovered image is output (step S76).
  • If the image being corrected is a mosaic image, this other processing includes color interpolation processing (demosaicing); it also includes shading correction (peripheral light amount correction), distortion aberration correction, and the like. Various image processes, including those mentioned here, can be inserted before, after, or in the middle of the above flow as necessary.
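The core of the flow of FIG. 7 — select the stored filter nearest to the actual imaging state, then convolve it with the input image — can be sketched as follows, assuming imaging states are (focal length, F-number, subject distance) triples and filters are small 2-D tap arrays (all names and values are illustrative, not from the patent):

```python
import numpy as np

def select_filter(state, stored):
    """Nearest-neighbour selection in the imaging state space
    (focal length, F-number, subject distance)."""
    key = min(stored, key=lambda s: np.linalg.norm(np.subtract(s, state)))
    return stored[key]

def apply_filter(img, taps):
    """'Same'-size correlation with edge padding; equal to convolution
    for the symmetric kernels used here (sketch only, not optimized)."""
    kh, kw = taps.shape
    pad = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
    out = np.empty(img.shape, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.sum(pad[y:y + kh, x:x + kw] * taps)
    return out

identity = np.zeros((3, 3)); identity[1, 1] = 1.0
sharpen = np.array([[0., -1., 0.], [-1., 5., -1.], [0., -1., 0.]])
stored = {(50.0, 2.8, 1.0): sharpen, (50.0, 8.0, 3.0): identity}

taps = select_filter((50.0, 3.5, 1.2), stored)   # nearest stored state: f/2.8
image = np.ones((4, 4))                          # flat toy image
restored = apply_filter(image, taps)             # taps sum to 1 -> stays flat
```

In the actual flow the selected filter would additionally be corrected by the state difference amounts before being applied.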
  • FIG. 8 shows changes in the MTF before and after the recovery process.
  • The broken line (a) and the solid line (b) are the MTFs in the first and second azimuth directions before the image restoration processing, respectively; the broken line (c) and the solid line (d) are the MTFs in the first and second azimuth directions after the restoration processing.
  • Image restoration processing is performed with a low degree of restoration on the two pre-restoration azimuth-direction MTFs (a) and (b). After the processing, the MTFs remain low, but the difference between the azimuth directions is corrected: the phase component and the asymmetry of the aberration are corrected, although the sharpness is still low.
  • The degree of restoration obtained when the image restoration filter is an inverse filter is the maximum degree of restoration; here, the restored image has a degree of restoration lower than this maximum. Specifically, the average of the post-restoration MTFs in the two azimuth directions is preferably at most 1.5 times, and more preferably at most 1.2 times, the maximum pre-restoration MTF.
  • For the first azimuth direction, which has the higher MTF, only the phase is recovered and the MTF is left substantially unchanged. For the second azimuth direction, which has the lower MTF, the phase is recovered and the MTF is preferably brought into line with that of the first azimuth direction.
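The behaviour just described — correct the phase and pull the lower azimuth MTF up toward the higher one — can be illustrated in the frequency domain. Assuming sampled 1-D transfer functions H1 and H2 along the two azimuth directions, the filter M_i = T·conj(H_i)/|H_i|² yields a restored transfer function H_i·M_i = T that is real (zero phase) and identical for both azimuths; taking T as the higher of the two pre-restoration MTFs realizes the alignment described (this construction is an illustration, not a formula from the patent):

```python
import numpy as np

# Sampled 1-D transfer functions along two azimuth directions (toy values);
# the complex values encode both MTF (magnitude) and PTF (phase).
H1 = np.array([1.0, 0.80 * np.exp(1j * 0.1), 0.50 * np.exp(1j * 0.3)])
H2 = np.array([1.0, 0.55 * np.exp(-1j * 0.2), 0.20 * np.exp(-1j * 0.5)])

# Target MTF: the higher of the two pre-restoration MTFs at each frequency.
T = np.maximum(np.abs(H1), np.abs(H2))

# Restoration filters giving zero phase and a common target MTF per azimuth.
M1 = T * np.conj(H1) / np.abs(H1) ** 2
M2 = T * np.conj(H2) / np.abs(H2) ** 2

after1, after2 = H1 * M1, H2 * M2
print(np.max(np.abs(after1 - after2)))   # close to 0: azimuth difference removed
print(np.max(np.abs(after1.imag)))       # close to 0: phase corrected to zero
```

In practice the target T would be kept below the inverse-filter level, as described above, to limit noise amplification.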
  • Edge enhancement processing is performed on the recovered image that has undergone such recovery processing.
  • An example of the edge enhancement filter is shown in FIG. 9.
  • a filter for performing edge enhancement can be generated by a difference between a filter that directly outputs an input image and a differential filter.
  • a differential filter a Sobel filter that performs primary differentiation, a Laplacian filter that performs secondary differentiation, and the like are well known.
  • the differential filter in FIG. 9 is a Laplacian filter. Since the edge enhancement filter performs processing based on the relationship between the pixel values of adjacent pixels, a filter having about 3 ⁇ 3 taps as shown in the figure is often used.
  • FIG. 10 shows the edge enhancement effect when the edge enhancement filter shown in FIG. 9 is used.
  • 10A, 10B, and 10C are diagrams when the luminance of the edge portion in the image is viewed in a certain cross section.
  • the horizontal axis represents coordinates, and the vertical axis represents amplitude.
  • FIG. 10(A) is the luminance cross-section of the original edge. FIG. 10(B) is the edge component extracted by the differential filter, with its sign inverted. Adding (B) to (A) sharply enhances the slope of the edge, as shown in (C).
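The construction described above — an identity (pass-through) filter minus a differential filter — together with the 1-D edge behaviour of FIG. 10 can be sketched as follows (the 4-neighbour Laplacian values are the common textbook ones; FIG. 9's actual values may differ):

```python
import numpy as np

identity = np.zeros((3, 3)); identity[1, 1] = 1.0   # passes the input through

laplacian = np.array([[0.,  1., 0.],                 # common 4-neighbour Laplacian
                      [1., -4., 1.],
                      [0.,  1., 0.]])

edge_enhance = identity - laplacian                  # difference of the two filters
# centre tap becomes 5, the four neighbours -1: a 3x3 sharpening kernel

# 1-D cross-section of an edge (cf. FIG. 10): the same construction with the
# 1-D Laplacian [1, -2, 1] steepens the step and adds over/undershoot.
edge = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
enhanced = edge - np.convolve(edge, [1., -2., 1.], mode="same")
print(enhanced[1:-1])   # [ 0.  0. -1.  2.  1.  1.]
```

The undershoot (-1) and overshoot (2) around the step are what make the edge appear sharper.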
  • Edge enhancement acts only on the steep slopes at edges and sharpens them, so the image as a whole is little affected by noise amplification, and because the number of filter taps is relatively small, high-speed processing is possible. It is therefore more preferable to perform the edge enhancement processing after performing image restoration processing with a low degree of restoration. When combined in this way, the edge enhancement processing may be included in the other necessary processing described above. Sharpness processing is another process that can enhance the edge portions of an image.
  • FIG. 11A shows a configuration diagram of an image processing system that is Embodiment 2 of the present invention.
  • the image processing apparatus 111 includes an information processing apparatus, and is loaded with image processing software (image processing program) 112 for causing the information processing apparatus to execute the image processing method described in the first embodiment.
  • the imaging device 113 includes a camera, a microscope, an endoscope, a scanner, and the like.
  • The storage medium 114, such as a semiconductor memory, a hard disk, or a server on a network, stores images (captured image data) generated by imaging.
  • The image processing device 111 acquires image data from the imaging device 113 or the storage medium 114, performs predetermined image processing, and outputs the resulting output image (corrected image) data to at least one of the output device 116, the imaging device 113, and the storage medium 114. The output destination can also be a storage unit built into the image processing device 111, where the output image data is stored. An example of the output device 116 is a printer. A display device 115 serving as a monitor is connected to the image processing device 111; through it, the user can carry out image processing operations and evaluate the recovery-adjusted image (output image).
  • the image processing software 112 has a development function and other image processing functions as needed in addition to the image recovery processing function and the recovery degree adjustment function.
  • FIG. 11B shows the configuration of another image processing system.
  • the recovery adjustment image can be output directly from the imaging device 118 to the output device 119.
  • It is also possible for the output device 119 to set an adjustment coefficient according to the feature amounts of the image and adjust the degree of recovery. Furthermore, by adjusting the degree of recovery in accordance with the degradation characteristics of the output device 119, an even higher-quality image can be provided.
  • FIG. 12 shows an example of the correction information; a set of multiple pieces of correction information is referred to as a correction information set. Each piece of correction information is described below.
  • The correction control information includes setting information indicating which of the imaging device 113, the image processing device 111, and the output device 116 performs the recovery processing and the recovery degree adjustment processing, and selection information for choosing the data to be transmitted to the other devices in accordance with that setting. For example, when only the restoration processing is performed by the imaging device 113 and the restoration degree is adjusted by the image processing device 111, the image restoration filter need not be transmitted to the image processing device 111, but at least the captured image and either the restored image or the restoration component information (difference information) must be transmitted.
  • Imaging device information is identification information of the imaging device 113 corresponding to the product name. If the lens and the camera body are interchangeable, the identification information includes the combination.
  • Imaging status information is information relating to the state of the imaging device 113 at the time of shooting. For example, the focal length (zoom position), aperture value, subject distance (focusing distance), ISO sensitivity, white balance setting, etc.
  • Imaging device individual information is identification information of each imaging device with respect to the above imaging device information. Since the optical transfer function (OTF) of the imaging apparatus has individual variations due to variations in manufacturing errors, the individual imaging apparatus information is effective information for setting an optimum recovery degree adjustment parameter individually.
  • the restoration degree adjustment parameter is a restoration strength adjustment coefficient ⁇ and a color composition ratio adjustment coefficient ⁇ .
  • The image restoration filter group is a set of image restoration filters used in the image restoration processing. When the device that performs the image restoration processing does not hold an image restoration filter, the filter must be transmitted from another device (apparatus).
  • the user setting information is an adjustment parameter for adjusting the recovery degree according to the user's preference or a correction function of the adjustment parameter.
  • the user can variably set the adjustment parameter, but if user setting information is used, a desired output image can always be obtained as an initial value.
  • The user setting information is updated by a learning function to the sharpness the user most prefers, based on the history of the adjustment parameters the user has chosen.
  • the imaging device provider can also provide preset values according to some sharpness patterns via a network.
  • the above correction information set is preferably attached to individual image data.
  • an image restoration process can be performed by any device equipped with the image processing apparatus of the second embodiment.
  • the contents of the correction information set can be selected automatically and manually as necessary.
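The correction information set of FIG. 12, attached to individual image data, can be sketched as a simple record; all field names here are illustrative paraphrases of the items above, not identifiers from the patent:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CorrectionInfoSet:
    """Correction information attached to individual image data (cf. FIG. 12).
    Field names are illustrative, not the patent's identifiers."""
    correction_control: dict            # which device performs recovery / adjustment
    imaging_device: str                 # product-level identification (lens + body)
    imaging_state: dict                 # focal length, F-number, distance, ISO, WB, ...
    device_individual_id: str = ""      # per-unit ID for manufacturing variation
    recovery_strength: float = 1.0      # restoration strength adjustment coefficient
    color_composite_ratio: float = 1.0  # color composition ratio adjustment coefficient
    restoration_filters: Optional[dict] = None  # filter group, if transmitted
    user_settings: dict = field(default_factory=dict)  # user-preferred adjustments

info = CorrectionInfoSet(
    correction_control={"recovery": "camera", "adjustment": "pc"},
    imaging_device="camera-body+lens-A",
    imaging_state={"focal_length_mm": 50, "f_number": 2.8, "distance_m": 1.0},
)
print(info.recovery_strength)  # 1.0
```

Per the correction control information above, a record like this would tell each device which fields it must forward (e.g., no filter group when only the adjustment is delegated).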


Abstract

This invention is directed to the provision of an image processing apparatus capable of obtaining high-definition images while correcting the asymmetry of aberration. The image processing apparatus comprises: image acquiring means for acquiring an input image; and image recovering means for recovering the input image using an image recovery filter, generated or selected based on the transfer function of the image capturing system used to form a subject image as the input image, thereby generating a recovered image. The image recovery filter makes the difference in the absolute value of the transfer function between two azimuth directions, when the recovered image is obtained from the subject, smaller than the difference in the absolute value of the transfer function between the two azimuth directions of the image capturing system.

Description

Image processing apparatus and imaging apparatus using the same

The present invention relates to an image processing apparatus that performs image processing, and more particularly to an image processing apparatus that performs image restoration (recovery).

An image obtained by an imaging apparatus such as a digital camera is degraded by blur. This blur is caused by the spherical aberration, coma aberration, field curvature, astigmatism, and the like of the imaging optical system. These aberrations can be expressed by a point spread function (PSF). The optical transfer function (OTF) obtained by Fourier-transforming the point spread function (hereinafter, PSF) is the frequency-space representation of the aberration and is expressed as a complex number. The absolute value of the optical transfer function (hereinafter, OTF), i.e., its amplitude component, is called the MTF (Modulation Transfer Function), and its phase component is called the PTF (Phase Transfer Function). The OTF of the imaging optical system affects (degrades) both the amplitude component and the phase component of the image. Consequently, an image degraded by the influence of the OTF (hereinafter, a degraded image) is one in which each point of the subject is asymmetrically blurred, as with coma aberration.

This will be explained with reference to FIG. 13. FIGS. 13(A), 13(B), and 13(C) are schematic diagrams showing the spread of the point spread function (PSF) on a plane perpendicular to the principal ray (the ray passing through the center of the pupil of the optical system). In the plane shown in FIG. 13, two mutually perpendicular lines passing through the optical axis are taken as axis x1 and axis x2, and the angle θ formed between axis x1 and an arbitrary straight line passing through the optical axis in that plane is called the azimuth angle. With the origin of the coordinate axes in FIG. 13 taken as the imaging position of the principal ray, the direction indicated by the azimuth angle θ is called the azimuth direction. The azimuth direction is a general term for all directions, including the sagittal and meridional directions as well as every other angle θ.

As already mentioned, degradation of the phase component (PTF) of the aberration makes the PSF asymmetric, and degradation of the amplitude component (MTF) affects how widely the PSF spreads in each azimuth direction. FIG. 13(A) schematically shows a PSF in which coma aberration occurs. In an optical system composed of rotationally symmetric lenses with no decentering of the optical axis, the PSF at any angle of view off the optical axis is symmetric with respect to the straight line on the image plane that passes through the optical axis and the principal ray, and therefore has a line-symmetric shape. In FIG. 13(A), the PSF is line-symmetric with respect to axis x2.

FIG. 13(B) shows a PSF with no phase shift. Its shape is symmetric in each azimuth direction, but because of the difference in amplitude (MTF), the spread of the PSF differs between the x1 and x2 directions, making the PSF asymmetric overall. Incidentally, the PSF on the optical axis has no phase shift (ignoring manufacturing errors) and no azimuth dependence of the amplitude degradation, so it takes the rotationally symmetric shape shown in FIG. 13(C). In short, as shown in FIGS. 13(A) and 13(B), the PSF becomes asymmetric because of the phase (PTF) shift in each azimuth direction and the difference in amplitude (MTF) between azimuth directions, and this blur is a factor that hinders the generation of high-definition images.

As a technique for correcting such image blur, Patent Document 1 introduces a parameter α used when designing an image restoration filter, as shown in Equation 1. By adjusting α, the image restoration filter changes from a filter that does nothing (α = 0) to an inverse filter (α = 1). This makes it possible to adjust the degree of image restoration with a single parameter, over the range from the original captured image to the maximally restored image.

Figure JPOXMLDOC01-appb-M000001

Here, F(u,v) and G(u,v) are the Fourier transforms of the restored image and the degraded image, respectively.
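Equation 1 itself appears above only as a figure placeholder, but the behaviour described — a filter that is the do-nothing filter at α = 0 and the inverse filter at α = 1 — can be illustrated with a simple linear blend in the frequency domain; this sketch only reproduces the stated endpoints and is not the patent document's actual formula:

```python
import numpy as np

def adjustable_filter(H, alpha):
    """Frequency response blending the do-nothing filter (alpha = 0)
    with the inverse filter 1/H (alpha = 1). Illustrative only; the
    actual Equation 1 of the cited document may differ."""
    return (1.0 - alpha) * 1.0 + alpha / H

H = np.array([1.0, 0.5, 0.25])     # sample OTF magnitudes (toy values)
print(adjustable_filter(H, 0.0))   # [1. 1. 1.]  -> leaves the image unchanged
print(adjustable_filter(H, 1.0))   # [1. 2. 4.]  -> full inverse filter
```

Intermediate α values give restoration filters between these two extremes, which is the single-parameter adjustability described above.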

In addition, the Wiener filter is known as a filter that corrects image blur and improves sharpness. Equation 2 shows the frequency characteristic M(u,v) of the Wiener filter.

Figure JPOXMLDOC01-appb-M000002

Here, H(u,v) is the optical transfer function (OTF), |H(u,v)| is the absolute value (MTF) of the OTF, and SNR is the signal-to-noise intensity ratio.

Hereinafter, processing that restores an image using an image restoration filter based on the optical transfer function (OTF), such as the Wiener filter or the filter described in Patent Document 1, is referred to as image restoration processing.
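Image restoration with a Wiener-type filter can be sketched with FFTs. Since Equation 2 appears above only as a figure placeholder, the code assumes the widely used form M(u,v) = conj(H)/(|H|² + SNR term); this is an assumption, not a reproduction of the patent's equation:

```python
import numpy as np

def wiener_restore(image, H, snr):
    """Restore `image` given the OTF `H` (same shape, FFT layout).
    Uses the common Wiener form M = conj(H) / (|H|^2 + snr^2);
    Equation 2 is assumed to be of this type."""
    M = np.conj(H) / (np.abs(H) ** 2 + snr ** 2)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * M))

# Toy example: blur a small image with a known OTF, then restore it.
rng = np.random.default_rng(0)
img = rng.random((8, 8))
psf = np.zeros((8, 8)); psf[0, 0] = 0.6; psf[0, 1] = 0.2; psf[0, 7] = 0.2
H = np.fft.fft2(psf)                      # OTF = FFT of the PSF
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
restored = wiener_restore(blurred, H, snr=1e-3)
print(np.max(np.abs(restored - img)))     # small: near-exact at this noise level
```

With larger `snr`, the filter backs off from the inverse filter 1/H at frequencies where |H| is small, which is the noise-suppressing behaviour described above.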

JP 2007-183842 A

However, although the Wiener filter and the image restoration filter of Patent Document 1 can correct the degradation of the amplitude and phase components caused by the imaging optical system, they cannot correct the difference in the amplitude component between azimuth directions. With the Wiener filter, if the pre-restoration MTF differs between azimuth directions, the difference between the azimuth directions of the post-restoration MTF is enlarged. This will be explained with reference to FIG. 14.

FIG. 14 shows the MTF before image restoration processing and the MTF after image restoration processing using a Wiener filter. The broken line (a) and the solid line (b) are the pre-restoration MTFs in the first and second azimuth directions, respectively; the broken line (c) and the solid line (d) are the corresponding post-restoration MTFs. The first and second azimuth directions are, for example, the sagittal and meridional directions. The Wiener filter is an image restoration filter whose restoration gain (degree of restoration) is lowered where the MTF is high and raised where the MTF is low. Nevertheless, the broken line (a), the azimuth direction with the lower MTF, ends up with a lower restoration gain than the azimuth direction (b) with the higher MTF. As a result, the difference between the post-restoration MTFs (c) and (d) in the first and second azimuth directions is larger than the difference between the pre-restoration MTFs (a) and (b). In other words, asymmetric aberration appears in the image even though image restoration processing has been performed. The same applies to the image restoration filter disclosed in Patent Document 1.
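The widening of the azimuth difference can be checked with a small calculation. Assuming a standard Wiener form, the MTF of the combined system after restoration is |H|²/(|H|² + s), where s is the noise term; for two azimuth directions with low pre-restoration MTFs the gap grows (the numbers are illustrative, not taken from FIG. 14):

```python
# Restored MTF under a Wiener-type filter: |H|^2 / (|H|^2 + s), where s is
# the noise term (assumed form; Equation 2 appears above only as a figure).
def restored_mtf(mtf, s=0.01):
    return mtf ** 2 / (mtf ** 2 + s)

m1, m2 = 0.3, 0.1              # pre-restoration MTFs in two azimuth directions
r1, r2 = restored_mtf(m1), restored_mtf(m2)
print(round(m1 - m2, 2))       # 0.2 : azimuth difference before restoration
print(round(r1 - r2, 2))       # 0.4 : the difference has doubled after restoration
```

This is the effect the present invention aims to avoid by equalizing the azimuth MTFs instead of restoring each direction independently.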

The present invention has been made in view of the above problems, and its object is to provide an image processing apparatus capable of reducing the asymmetric aberration that can arise in image restoration processing and of obtaining a higher-definition image.

To solve the above problems, the present invention provides an image processing apparatus comprising:
image acquisition means for acquiring an input image; and
image recovery means for recovering the input image using an image recovery filter generated or selected based on the transfer function of the imaging system used to form a subject image as the input image, thereby generating a recovered image,
wherein the image recovery filter makes the difference in the absolute value of the transfer function between two azimuth directions, when the recovered image is obtained from the subject, smaller than the difference in the absolute value of the transfer function between those two azimuth directions of the imaging system.

An effect of the present invention is that a high-definition recovered image with reduced asymmetric aberration can be generated.

Explanatory diagram of an image processing method according to an embodiment of the present invention
Explanatory diagram of the image restoration filter used in the image processing method of the embodiment
First explanatory diagram of the MTF correction amount between azimuth directions
Second explanatory diagram of the MTF correction amount between azimuth directions
Block diagram showing the basic configuration of the imaging apparatus
Explanatory diagram of image restoration filter selection and correction in Embodiment 1
Flowchart showing the image processing procedure of Embodiment 1
Diagram showing the MTF change before and after the image processing of Embodiment 1
Explanatory diagram of the edge enhancement filter
Edge cross-sectional diagram when the edge enhancement filter is applied
Explanatory diagram of the image processing system of Embodiment 2 of the present invention
Explanatory diagram of the correction information of Embodiment 2 of the present invention
Explanatory diagram of the azimuth direction
Diagram showing the MTF change before and after conventional image restoration processing
Explanatory diagram of the point spread function (PSF)
Explanatory diagram of the amplitude component (MTF) and phase component (PTF) of an image

First, the flow of image processing according to the present invention will be described with reference to FIG. 1, which shows the processing steps from image input to image output. First, in the image acquisition step S11, an image generated by the imaging system is acquired. Hereinafter, the image acquired in the image acquisition step is referred to as the input image.

Next, in step S12, an image restoration filter corresponding to the conditions under which the input image acquired in step S11 was captured is generated. Step S12 may instead be a step of selecting an appropriate filter from among a plurality of image restoration filters prepared in advance, or a step of correcting the selected filter as appropriate after selection. In the image restoration processing step S13, image restoration processing (correction processing) is executed using the image restoration filter generated or selected in step S12 based on the transfer function of the imaging system (the optical transfer function of the imaging optical system). More specifically, the phase component (PTF) of the input image is corrected with zero as the target value, and the amplitude component (MTF) is corrected so that the difference between two azimuth directions decreases. Then, in step S14, the image corrected in step S13 is output as the output image.

Note that steps for other image processing may be inserted before, after, or in the middle of the flow of FIG. 1. Examples of such other image processing are electronic aberration corrections such as distortion correction and peripheral light amount correction, as well as demosaicing, gamma conversion, and image compression. Next, each step shown in FIG. 1 will be described in more detail.

Note that the MTF is the amplitude component (absolute value) of the transfer function of the imaging system (the optical transfer function of the imaging optical system); when the subject (an object in the object field) is a white point light source, the MTF can be regarded as the spectrum of the image.

(Image acquisition step)
The image acquired in the image acquisition step (hereinafter, the input image) is a digital image obtained by capturing an image with an image sensor via an imaging optical system. The digital image obtained here is degraded, relative to the object in the object field, by an optical transfer function (OTF) arising from the aberrations of the imaging optical system, which includes the lens and various optical filters. This optical transfer function (OTF) is preferably a transfer function based on the aberrations of the optical elements of the imaging optical system as described above and on other characteristics of the imaging apparatus. Besides lenses, the imaging optical system may also use mirrors (reflective surfaces) having curvature.

 The input image is represented in a color space. One example is RGB; besides RGB there are lightness, hue, and chroma as expressed in LCH, and luminance and color-difference signals as expressed in YCbCr. Other color spaces include XYZ, Lab, Yuv, and JCh, as well as color temperature. The present invention can be applied to values expressed in any of these commonly used color spaces as color components.

 The input image may be a mosaic image in which each pixel has a signal value of a single color component, or a demosaiced image in which each pixel has signal values of a plurality of color components obtained by applying color interpolation (demosaicing) to such a mosaic image. A mosaic image, being an image prior to processing such as color interpolation (demosaicing), gamma conversion, and image compression such as JPEG, is also called a RAW image. For example, when information on a plurality of color components is obtained with a single-chip image sensor, a color filter with a different spectral transmittance is arranged on each pixel, yielding a mosaic image in which each pixel has a signal value of one color component. Applying color interpolation to this mosaic image yields an image in which each pixel has signal values of a plurality of color components. When a multi-chip image sensor, for example a three-chip sensor, is used, a color filter with a different spectral transmittance is arranged on each sensor, and a demosaiced image having image signal values of a different color component is obtained from each sensor. In this case, since the sensors provide the signal value of each color component for mutually corresponding pixels, an image in which each pixel has signal values of a plurality of color components can be obtained without any particular color interpolation processing.
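As a toy illustration of the single-chip case described above, a missing color sample can be filled in from neighboring samples of the same color. This is a deliberately simplified one-dimensional sketch with made-up values, not a real Bayer demosaicing algorithm:

```python
# Toy 1D illustration: a single-chip sensor records alternating R and G
# samples; color interpolation fills in each missing sample from its
# neighbors. Real demosaicing works on 2D color filter arrays and is
# considerably more sophisticated.
raw = [("R", 10), ("G", 20), ("R", 14), ("G", 24), ("R", 18)]

def interpolate(raw, color):
    # Collect the positions that carry the wanted color, then linearly
    # interpolate the gaps between them (edges repeat the nearest value).
    known = {i: float(v) for i, (c, v) in enumerate(raw) if c == color}
    out = []
    for i in range(len(raw)):
        if i in known:
            out.append(known[i])
            continue
        left = max((j for j in known if j < i), default=None)
        right = min((j for j in known if j > i), default=None)
        if left is None:
            out.append(known[right])
        elif right is None:
            out.append(known[left])
        else:
            t = (i - left) / (right - left)
            out.append(known[left] + t * (known[right] - known[left]))
    return out

red = interpolate(raw, "R")    # full-resolution R plane
green = interpolate(raw, "G")  # full-resolution G plane
```
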

 Various correction information for correcting the input image can also be attached to it. The correction information includes information on the imaging state (imaging state information) such as the focal length (zoom position) of the lens, the aperture value, the shooting distance (focus distance), the exposure time, and the ISO sensitivity. When the entire sequence from image capture to image output is performed by a single imaging apparatus, the imaging state information and correction information can be obtained within the apparatus without attaching them to the input image. However, when a RAW image is obtained from the imaging apparatus and image restoration, development, and similar processing are performed by an image processing apparatus separate from the imaging apparatus, it is preferable to attach the imaging state information and correction information to the image as described above. This is not a strict requirement: if the system stores the correction information in the image processing apparatus in advance and can select the appropriate correction information from the imaging state information attached to the input image, the correction information need not be attached to the image.

 Although the input image has been described as a digital image obtained by capturing with an image sensor via an imaging optical system, the input image may also be a digital image obtained by an imaging system that includes no imaging optical system. For example, a scanner (reading apparatus) or an X-ray imaging apparatus that captures an image with the image sensor in close contact with the subject surface has no imaging optical system such as a lens, yet the image generated by sampling with the image sensor is still degraded to some extent. The degradation characteristic in this case is not due to the optical transfer function of an imaging optical system (the optical transfer function in the narrow sense) but to the system transfer function of the imaging system, and this system transfer function can be regarded as corresponding to an optical transfer function. Accordingly, the term "optical transfer function" in the embodiments of the present invention is an optical transfer function in the broad sense, including the system transfer function of an imaging system that contains no imaging optical system.

 (Image restoration filter generation step)
 Next, generation of the image restoration filter is described with reference to FIGS. 2(A) and 2(B). FIG. 2(A) is a schematic diagram of an image restoration filter that is convolved with the pixels of the input image in real space. The number of taps (cells) of the image restoration filter can be determined according to the aberration characteristics of the imaging system and the required restoration accuracy; when the image is two-dimensional, the filter is generally a two-dimensional filter whose taps correspond to the pixels of the image. FIG. 2(A) shows an 11 × 11-tap two-dimensional image restoration filter as an example. In general, the more taps the filter has, the higher the restoration accuracy, so the number of taps is set according to the required image quality, the available image processing capability, the aberration characteristics, and so on.

 Although the values within the taps are omitted in FIG. 2(A), one cross section of this image restoration filter is shown in FIG. 2(B). The distribution of the values (coefficient values) of the taps of the image restoration filter plays the role, during convolution, of ideally returning signal values that have been spatially spread by aberration to their original single point.

 To generate this image restoration filter, first the optical transfer function (OTF) of the imaging optical system is calculated or measured. When the original (degraded) image was obtained by a system without an imaging optical system, its degradation characteristic can be expressed by the transfer function of that imaging system, so the image restoration filter may be generated using the imaging system's transfer function as the OTF. The expression "optical transfer function (OTF)" used below thus also covers the transfer function of an imaging system that has no imaging optical system.

 Unlike conventional image restoration filters, the image restoration filter used in the present invention has the function of correcting the difference in MTF between azimuth directions. Before describing the process of creating the image restoration filter of the present invention, the conventional Wiener filter is described with reference to FIG. 15.

 FIG. 15(A) shows a meridional cross section of the point spread function (PSF) of one color component at a certain position in the image, and FIG. 16 shows its frequency characteristics. FIG. 16(M) shows the MTF, which is the amplitude component, and FIG. 16(P) shows the PTF, which is the phase component. The frequency characteristics corresponding to the PSFs of FIGS. 15(A), (B), and (C) are the broken line (a), the two-dot chain line (b), and the one-dot chain line (c) in FIG. 16.

 The uncorrected PSF shown in FIG. 15(A) has an asymmetric shape due to coma and other aberrations; its amplitude response falls off toward higher frequencies, as shown by broken line (a) in FIG. 16(M), and a phase shift occurs, as shown by broken line (a) in FIG. 16(P). When this is corrected using an image restoration filter created by inverse Fourier transforming the reciprocal of the optical transfer function (1/OTF(u,v)), the PSF is ideally corrected into a delta function with no spread, as in FIG. 15(B).

 Note that the reciprocal of the OTF is called the inverse filter, and the degree of restoration achieved by this inverse filter (image restoration filter) is defined herein as the maximum degree of restoration.

 The MTF corresponding to FIG. 15(B) is 1 over all frequencies, as shown by two-dot chain line (b) in FIG. 16(M), and the PTF is 0 over all frequencies, as shown by two-dot chain line (b) in FIG. 16(P).

 As described above, however, the influence of noise amplification must be controlled when creating the image restoration filter. When the PSF of FIG. 15(A) is restored with the Wiener filter shown in Equation 1, the PSF becomes symmetric through the phase correction and becomes sharper, with a smaller spread, through the amplitude enhancement, as shown in FIG. 15(C). The MTF corresponding to FIG. 15(C) has a suppressed restoration gain, as shown by one-dot chain line (c) in FIG. 16(M), while the PTF is 0 over all frequencies, as shown by one-dot chain line (c) in FIG. 16(P). The reason why the PTF is corrected to 0 even though the restoration gain is suppressed is explained using Equation 3.

 Assume a white point light source as the subject. The frequency characteristic of the subject then has no phase shift, and its amplitude characteristic is 1 over all frequencies. The frequency characteristic of the image of this subject obtained through the imaging optical system is the optical transfer function (OTF) itself, and the image is a pixel-value distribution with the shape of the PSF. That is, taking the OTF as the frequency characteristic of the input image and multiplying it by the frequency characteristic of the image restoration filter gives the frequency characteristic of the restored image. Expressed as an equation, the OTF H(u,v) cancels as in Equation 3, and the frequency characteristic of the restored image becomes the right-hand side.

 [Equation 3]
 H(u,v) · M(u,v) = |H(u,v)|^2 / (|H(u,v)|^2 + SNR)
 Since |H(u,v)| on the right-hand side is the absolute value of the OTF (the MTF), the phase component vanishes regardless of the value of the parameter SNR, which determines the degree of restoration. Correcting the phase degradation component in this way corrects the PSF into a symmetric shape.

 However, although correcting the phase degradation component for each azimuth direction makes the PSF symmetric within each azimuth direction, the amplitude component differs between azimuth directions, so the PSF is not corrected into a rotationally symmetric shape. For example, the line-symmetric coma of FIG. 13(A) is corrected into point symmetry, resulting in a PSF like the astigmatism of FIG. 13(B). In other words, a PSF whose phase has been corrected in each azimuth direction but whose amplitude differs between azimuth directions is corrected only as far as a point-symmetric state. As described above with reference to FIG. 14, the conventional image restoration method does not correct the MTF difference between azimuth directions but rather enlarges it. The conventional image restoration method therefore cannot sufficiently correct asymmetric aberrations.

 Next, the image restoration filter of the present invention, which has the function of correcting asymmetric aberrations, is described. As can be seen from Equations 1 and 3, the rOTF part of Equation 4 is the post-restoration frequency characteristic of an image obtained by photographing a white point light source.

 [Equation 4]
 M(u,v) = (1/H(u,v)) · rOTF(u,v)
 Here, rOTF is an arbitrary function. Since the phase degradation component of the restored image should be zero, it suffices for rOTF to have no phase component; because rOTF then has only a real part, it is substantially equal to rMTF. rOTF preferably has only a real part, but it goes without saying that giving the imaginary part a value within an allowable range is still within the scope of variations of the present invention. In other words, with the image restoration filter expressed by Equation 4, an image of any subject, not only a point light source, can be made to appear as if it had been captured by an imaging optical system whose optical transfer function (OTF) has the characteristic rOTF.

 Accordingly, by using an OTF common to the azimuth directions, rH(u,v), as in Equation 5, an image can be obtained as if it had been captured by an imaging optical system with no MTF difference between the azimuth directions. In other words, the image restoration filter used in this embodiment is a filter that reduces the difference in MTF between two azimuth directions of the imaging system.

 [Equation 5]
 M(u,v) = (1/H(u,v)) · (|rH(u,v)|^2 / (|rH(u,v)|^2 + SNR))

 This is explained with reference to FIG. 3. FIG. 3 shows the change in MTF in two azimuth directions, before and after restoration, when a point-light-source subject is processed with the image restoration filter of the present invention. Broken line (a) and solid line (b) are the MTFs before restoration in the first and second azimuth directions, respectively, and broken line (c) and solid line (d) are the MTFs after restoration in the first and second azimuth directions, respectively. Before restoration, the MTF differs between the azimuth directions, as in (a) and (b) of FIG. 3, but after restoration the MTFs agree between the azimuth directions, as in (c) and (d). Curves (a) and (b) of FIG. 3 correspond, for example, to the MTFs in the meridional and sagittal directions. In this way, the image restoration filter makes it possible to restore the image while correcting the MTF difference between azimuth directions.
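The behavior shown in FIG. 3 can be illustrated numerically. In the sketch below, all MTF values are made up, the phase is assumed already corrected, and SNR is taken as 0 so that the restored MTF equals the common target |rH| itself; restoring each azimuth direction with a gain derived from the shared target brings the two directions into agreement:

```python
# Two azimuth directions (e.g. meridional / sagittal) with different
# MTFs at a few sampled frequencies; all values are illustrative only.
mtf_az1 = [1.0, 0.80, 0.50, 0.20]   # first azimuth direction
mtf_az2 = [1.0, 0.60, 0.30, 0.10]   # second azimuth direction

# Common target MTF |rH| shared by both directions (cf. Equation 5,
# taking SNR = 0 for simplicity).
r_mtf = [1.0, 0.90, 0.70, 0.40]

# Per-direction restoration gain = target / measured, applied per
# frequency to each direction's MTF.
restored_az1 = [m * (r / m) for m, r in zip(mtf_az1, r_mtf)]
restored_az2 = [m * (r / m) for m, r in zip(mtf_az2, r_mtf)]

# Before restoration the azimuth directions differ; afterwards they
# agree, so the azimuthal asymmetry of the PSF is removed.
diff_before = max(abs(a - b) for a, b in zip(mtf_az1, mtf_az2))
diff_after = max(abs(a - b) for a, b in zip(restored_az1, restored_az2))
```
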

 Although Equation 5 uses an OTF rH(u,v) common to the azimuth directions, the rotational symmetry can also be controlled by correcting rH(u,v) so that the difference in OTF between the azimuth directions is reduced relative to the difference before restoration. An example is shown in FIG. 4. Even if the restored MTFs of the two azimuth directions do not coincide, as in FIGS. 4(c) and (d), the MTF difference between the azimuth directions is reduced relative to FIGS. 14(c) and (d), so the asymmetry of the PSF is reduced. To obtain the asymmetry correction effect, it is desirable that the filter restore the image so that the difference becomes at least smaller than the MTF difference between the azimuth directions before restoration.

 The image restoration filter of this embodiment is thus configured so that the difference in MTF (the absolute value of the transfer function) between two azimuth directions when a restored image is obtained from the subject is smaller than the difference in MTF between those two azimuth directions in the imaging system.

 In other words, when the subject is a white point light source, the image restoration filter of this embodiment restores the image so that the difference between the spectra in the two azimuth directions in the restored image is smaller than the difference between the spectra in those two azimuth directions in the input image.

 In still other words, the image restoration filter of this embodiment is generated on the basis of a transfer function having different frequency characteristics in two azimuth directions and a corrected transfer function obtained by correcting it so that the difference between the absolute values of the transfer function in the two azimuth directions is reduced.

 As described above, executing image restoration processing with the image restoration filter of the present invention makes it possible to correct the phase component of the aberration in each azimuth direction and the difference in the amplitude component between azimuth directions, thereby reducing the asymmetry of the aberration and obtaining a higher-definition image.

 Note that because the H(u,v) part of Equation 5 differs for each azimuth direction, the image restoration filter has an asymmetric coefficient array whether rH(u,v) is common to the azimuth directions or differs between them. That is, the cross section of FIG. 2 differs for each azimuth direction.

 The optical transfer function (OTF) can include not only the imaging optical system but also factors that degrade the OTF during the imaging process. For example, an optical low-pass filter having birefringence suppresses the high-frequency components of the frequency characteristic of the OTF. The shape and aperture ratio of the pixel apertures of the image sensor also affect the frequency characteristic, as do the spectral characteristics of the light source and of the various wavelength filters. It is desirable to create the image restoration filter on the basis of an OTF in this broad sense that includes these factors.

 (Image restoration processing step (correction step))
 Next, a method for obtaining a corrected image using the generated image restoration filter is described.

 As already described, in the correction step a corrected image (restored image) can be obtained by convolving the image restoration filter with the degraded image. Here, a convolution (convolution integral, sum of products) is performed over the pixels covered by the taps of the image restoration filter. To improve the signal value of a given pixel, that pixel is aligned with the center of the image restoration filter; the product of the image signal value and the filter coefficient value is taken for each mutually corresponding pixel of the image and the filter, and the sum of these products replaces the signal value of the center pixel.
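The sum-of-products operation described above can be written out directly. The sketch below uses zero padding at the image borders as one possible boundary treatment (the text does not specify one), and for the symmetric example kernel the result is the same whether or not the kernel is flipped as in a strict mathematical convolution:

```python
def convolve2d(img, kern):
    """Replace each pixel by the sum of products of the filter
    coefficients and the pixels covered by the filter taps, with the
    filter centered on that pixel (zero padding outside the image)."""
    h, w = len(img), len(img[0])
    kh, kw = len(kern), len(kern[0])
    cy, cx = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for j in range(kh):
                for i in range(kw):
                    yy, xx = y + j - cy, x + i - cx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx] * kern[j][i]
            out[y][x] = acc
    return out

# A 3x3 delta kernel leaves the image unchanged, matching the ideal of
# returning the spread signal to a single point.
identity = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
restored = convolve2d(img, identity)
```
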

 The advantage of applying the image restoration filter to the input image by convolution is that the image can be restored without performing a Fourier transform or inverse Fourier transform of the image during the restoration processing. The computational load of convolution is generally smaller than that of a Fourier transform, so the processing burden of image restoration can be reduced.

 Note that although the numbers of vertical and horizontal taps of the image restoration filter were mentioned above, they need not be equal and can be changed arbitrarily, provided this is taken into account when performing the convolution.

 In addition, because the inverse process of restoring the original, pre-degradation image can be handled with higher accuracy when the image degradation process is linear, the input image preferably has not undergone various adaptive nonlinear processing. In other words, the image restoration processing of the present invention is more preferably applied to a mosaic image (RAW image). Nevertheless, it can be applied whether the input image is a mosaic image or a demosaiced image: if the degradation process introduced by color interpolation is linear, the restoration can be performed by taking this degradation function into account when generating the image restoration filter. Also, when the required restoration accuracy is low, or when only images that have already undergone various image processing are available, applying the restoration processing to a demosaiced image still provides the effect of reducing the asymmetry of the blur.

 (Image output step)
 The corrected image (restored image) obtained by the above processing is output to the desired device. In an imaging apparatus, it is output to a display unit, a recording medium, or the like. If further image processing is to be performed on the restored image, the image may be output to the device that executes the subsequent step.

 The image processing of the present invention has been described above step by step; however, steps that can be processed simultaneously may be processed together, and necessary processing steps may be added before or after each step as appropriate. Furthermore, the equations and equality signs used in the description do not limit the concrete algorithm of the image processing of the present invention, which can be modified as needed within the scope in which the object can be achieved.

 Embodiments applying the image processing described above are explained below with reference to the drawings.

 FIG. 5 is a schematic diagram of the configuration of the imaging apparatus in Embodiment 1. An image of a subject (not shown) is formed on the image sensor 102 by the imaging optical system 101. The image sensor 102 converts the formed light into an electrical signal (photoelectric conversion), and the A/D converter 103 converts the electrical signal into a digital signal. The image processing unit 104 then performs image processing on this digital signal (the input image) together with predetermined processing. The predetermined processing here includes, for example, electronic aberration corrections such as lateral chromatic aberration correction, distortion correction, and peripheral illumination correction, as well as demosaicing, gamma conversion, and image compression.

 First, imaging state information on the imaging apparatus is obtained from the state detection unit 107. The state detection unit 107 may obtain the imaging state information directly from the system controller 110; for example, imaging state information on the imaging optical system 101 can also be obtained from the imaging system control unit 106. Next, an image restoration filter corresponding to the imaging state is selected from the storage unit 108, and image restoration processing is performed on the image input to the image processing unit 104. The image restoration filter selected from the storage unit 108 according to the imaging state may be used as-is, or a filter prepared in advance may be corrected into an image restoration filter better suited to the imaging state.

 The output image processed by the image processing unit 104 is then stored in the image recording medium 109 in a predetermined format. This output image is an image in which the asymmetry of the aberration has been corrected and the sharpness improved. The display unit 105 may display an image obtained by applying predetermined display processing to the image after restoration, or, for high-speed display, an image to which no correction processing or only simple correction processing has been applied.

 The series of control operations described above is performed by the system controller 110, and mechanical driving of the imaging system is performed by the imaging system control unit 106 in response to instructions from the system controller 110. The aperture diameter of the diaphragm 101a is controlled as the F-number shooting-state setting. The position of the focus lens 101b is controlled by an autofocus (AF) mechanism (not shown) or a manual focus mechanism to adjust the focus according to the shooting distance.

 Optical elements such as a low-pass filter and an infrared cut filter may be inserted into this imaging system. When an element that affects the characteristics of the optical transfer function (OTF), such as a low-pass filter, is used, the restoration can be performed with higher accuracy if the influence of that element is taken into account when the image restoration filter is created. An infrared cut filter likewise affects each of the RGB-channel PSFs (each being the integral of the point spread functions over the spectral wavelengths), particularly the PSF of the R channel, so it is more preferable to take it into account when creating the image restoration filter.

 Although the imaging optical system 101 is configured here as part of the image pickup apparatus, it may instead be an interchangeable lens such as that of a single-lens reflex camera. Functions such as aperture diameter control and manual focus need not be used, depending on the purpose of the image pickup apparatus.

 Furthermore, since the optical transfer function (OTF) varies with the image height (position in the image) of the imaging system even under a single shooting condition, the image restoration processing of the present invention is desirably varied according to the image height.

 The image processing unit 104 has at least a computation unit and a temporary storage unit (buffer). In each step of the image processing described above, images are written to (stored in) and read from the storage as necessary. The storage used for temporary storage is not limited to the temporary storage unit (buffer); the storage unit 108 may also be used, and a suitable storage can be selected according to the data capacity and communication speed of the storage unit in question. The storage unit 108 also stores data such as the image restoration filters and correction information.

 The selection and correction of the image restoration filter will be described with reference to FIG. 6. FIG. 6 schematically shows imaging state information and a plurality of image restoration filters (black circles) stored in the storage unit 108 based on that imaging state information. The image restoration filters stored in the storage unit 108 are arranged discretely in an imaging state space whose axes are three imaging states: focal position (state A), aperture value (state B), and subject distance (in-focus distance) (state C). The coordinates of each point (black circle) in the imaging state space indicate an image restoration filter stored in the storage unit 108. Although in FIG. 6 the image restoration filters are placed at grid points on lines orthogonal to the imaging state axes, they may also be placed off the grid points. Moreover, the types of imaging state are not limited to focal length, aperture value, and subject distance, nor is their number limited to three; an imaging state space of four or more dimensions may be formed from four or more imaging states, with the image restoration filters arranged discretely within it.

 In FIG. 6, assume that the imaging state indicated by the large white circle is the actual imaging state detected by the state detection unit 107. If an image restoration filter stored in advance exists at, or near, the position corresponding to the actual imaging state, that filter can be selected and used for the image restoration processing. One method of selecting an image restoration filter near the position corresponding to the actual imaging state is to compute, in the imaging state space, the distances (amounts of imaging state difference) between the actual imaging state and the imaging states for which filters are stored, and then select the image restoration filter at the shortest distance. With this method, the image restoration filter at the position indicated by the small white circle in FIG. 6 is selected.
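The shortest-distance selection just described can be sketched as a nearest-neighbor search in the imaging state space. The stored states, their numeric values, and the optional per-axis weights below are hypothetical illustrations, not values from the patent; the weights merely hint at how a direction-dependent weighting could be folded into the distance.

```python
import numpy as np

# Hypothetical stored imaging states: (focal length, F-number, subject distance).
# One image restoration filter is assumed to be stored per row.
stored_states = np.array([
    [35.0, 2.8,  1.0],
    [50.0, 4.0,  3.0],
    [85.0, 5.6, 10.0],
])

def select_nearest_filter(actual_state, states, weights=(1.0, 1.0, 1.0)):
    """Return the index of the stored imaging state closest to the actual
    one, using an (optionally axis-weighted) Euclidean distance."""
    diff = (states - np.asarray(actual_state)) * np.asarray(weights)
    distances = np.linalg.norm(diff, axis=1)  # state difference amounts
    return int(np.argmin(distances))

idx = select_nearest_filter([52.0, 4.0, 2.5], stored_states)
```

Here the second stored state is the closest, so the filter stored for it would be selected.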

 As another method, the selection of the image restoration filter can be weighted by direction in the imaging state space: the product of the distance in the imaging state space and the directional weight is used as an evaluation function, and the image restoration filter with the highest value of this evaluation function is selected.

 Next, a method of generating a new image restoration filter by correcting the selected one will be described. To correct an image restoration filter, first the distance (state difference amount) in the imaging state space between the actual imaging state and the imaging states for which filters are stored is computed, and the image restoration filter at the shortest distance (smallest state difference amount) is selected. Selecting the filter with the smallest state difference amount keeps the required correction small, so an image restoration filter close to the true filter for the imaging state can be generated.

 In FIG. 6, the image restoration filter at the position indicated by the small white circle is selected. The state difference amounts ΔA, ΔB, and ΔC between the imaging state corresponding to the selected filter and the actual imaging state are computed, state correction coefficients are calculated from these difference amounts, and the selected filter is corrected using the coefficients. In this way an image restoration filter corresponding to the actual imaging state can be generated.

 As yet another method, a plurality of image restoration filters located near the actual imaging state can be selected and interpolated according to the state difference amounts between those filters and the actual imaging state, generating an image restoration filter suited to the imaging state. The interpolation here may be performed by interpolating the coefficient values of corresponding taps of the two-dimensional image restoration filters using linear interpolation, polynomial interpolation, spline interpolation, or the like.
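Tap-wise linear interpolation between two stored filters, the simplest of the interpolation options just listed, might look like the following sketch. The interpolation weight `t` is assumed to be already derived (and normalized to [0, 1]) from the state difference amounts; the filter values are illustrative.

```python
import numpy as np

def interpolate_filters(f_a, f_b, t):
    """Linearly interpolate corresponding tap coefficients of two 2-D
    image restoration filters. t = 0 yields f_a, t = 1 yields f_b."""
    return (1.0 - t) * f_a + t * f_b

# Two hypothetical 3x3 restoration filters stored for neighboring states.
f_a = np.full((3, 3), 0.0)
f_b = np.full((3, 3), 1.0)

# A state one quarter of the way from f_a's state to f_b's state.
mid = interpolate_filters(f_a, f_b, 0.25)
```

Polynomial or spline interpolation would replace the linear blend with a higher-order fit over more than two stored filters, applied independently per tap.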

 The optical transfer function (OTF) used to generate the image restoration filter can be obtained by calculation using an optical design tool or an optical analysis tool. Furthermore, the OTF of the imaging optical system alone, or of the imaging apparatus in its actual state, can also be obtained by measurement.

 FIG. 7 shows a specific flowchart of the image restoration processing of this embodiment executed by the image processing unit 104. The filled circles (●) in the figure represent steps in which pixel data, such as an image, is at least temporarily stored.

 The image processing unit 104 acquires an input image in an image acquisition step. Next, it obtains imaging state information from the state detection unit 107 (step S72). Then, an image restoration filter corresponding to the imaging state is selected from the storage unit 108 (step S73), and in the image restoration processing step (correction step) restoration processing is performed on the input image using this image restoration filter (step S74).

 Next, other processing necessary for image formation is performed, and the restored image is output (step S76). If the corrected image is still a mosaic image, the other processing here includes color interpolation (demosaicing), shading correction (peripheral illumination correction), distortion correction, and the like. The various image processes, including the other processes described here, can also be inserted before, after, or in the middle of the above flow as necessary.

 Here, a more preferable example of the flow of the image restoration processing will be described with reference to FIG. 8. FIG. 8 shows the change in the MTF before and after the restoration processing. The broken line (a) and solid line (b) are the MTFs in the first and second azimuth directions before the image restoration processing, and the broken line (c) and solid line (d) are the MTFs in the first and second azimuth directions after the restoration processing. As shown in FIG. 8, the image restoration processing is applied at a low degree of restoration to the MTFs (a) and (b) of the two azimuth directions before restoration. As a result, as shown in (c) and (d), the MTF remains low while the difference between the azimuth directions is corrected. In this state, the phase component of the aberration and the asymmetry of the aberration have been corrected, but the sharpness is still low. In other words, if the degree of restoration obtained when the image restoration filter is an inverse filter is defined as the maximum degree of restoration, the restored image here has a degree of restoration lower than this maximum. Preferably, within the Nyquist frequency, the frequency average of the restored MTF in the two azimuth directions is at most 1.5 times, and more preferably at most 1.2 times, the maximum MTF before restoration. More preferably still, of the two azimuth directions, only the phase is restored in the first azimuth direction, which has the higher MTF, leaving its MTF substantially unchanged, while in the second azimuth direction, which has the lower MTF, the phase is restored and the MTF is brought up to match that of the first azimuth direction. Edge enhancement processing is then applied to the restored image obtained by such restoration processing.
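The low-degree, azimuth-equalizing recovery described above can be sketched numerically: rather than boosting both azimuth directions toward an MTF of 1 (as a full inverse filter would), only the weaker azimuth is raised until it matches the stronger one. The sampled MTF values below are invented for illustration and are not taken from FIG. 8.

```python
import numpy as np

# Illustrative sampled MTF curves for two azimuth directions (values invented).
mtf_a1 = np.array([1.0, 0.8, 0.6, 0.4])   # first azimuth: higher MTF
mtf_a2 = np.array([1.0, 0.6, 0.3, 0.1])   # second azimuth: lower MTF

# Low degree of restoration: leave the stronger azimuth untouched and raise
# the weaker azimuth until it matches, removing the azimuthal MTF difference
# without amplifying the image (and its noise) more than necessary.
gain_a2 = mtf_a2 * 0 + mtf_a1 / np.maximum(mtf_a2, 1e-6)
recovered_a2 = mtf_a2 * gain_a2
```

After this equalization the frequency average of the recovered MTF stays well below the 1.5× bound on the pre-recovery maximum stated above, while the azimuthal difference is removed.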

 This improves the sharpness only at edge portions, so noise amplification can be suppressed better than when restoration processing is applied to the entire image.

 Edge enhancement processing will be described with reference to FIG. 9, which shows an example of an edge enhancement filter. As shown in FIG. 9, a filter for edge enhancement can be generated as the difference between a filter that outputs the input image unchanged and a differential filter. Well-known differential filters include the Sobel filter, which performs first-order differentiation, and the Laplacian filter, which performs second-order differentiation; the differential filter in FIG. 9 is a Laplacian filter. Because the edge enhancement filter operates on the relationship between the values of adjacent pixels, a filter with about 3 × 3 taps, as shown in the figure, is commonly used.

 FIG. 10 shows the edge enhancement effect obtained with the edge enhancement filter of FIG. 9. FIGS. 10(A), (B), and (C) show the luminance of an edge portion of an image along a certain cross section; the horizontal axis represents position and the vertical axis represents amplitude. FIG. 10(A) is the luminance cross section of an edge portion of the image, and (B) is the result of extracting the edge portion with the differential filter and inverting its sign. Adding (B) to the original image (A) sharply steepens the edge slope, as shown in (C). Edge enhancement acts only on steep edge portions and sharpens them, so it has the advantages that noise amplification has little effect on the image as a whole and that high-speed processing is possible because the number of filter taps is relatively small. It is therefore more preferable to perform edge enhancement after image restoration processing with a low degree of restoration. When combined with edge enhancement in this way, the edge enhancement may be included in the "other necessary processing" of FIG. 7. Sharpness processing is another example of processing that can enhance the edge portions of an image.
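The identity-minus-Laplacian construction of FIG. 9 and the step-edge behavior of FIG. 10 can be reproduced in a few lines. The kernel values follow the common 4-neighbor Laplacian, and the image values are illustrative; neither is copied from the patent figures.

```python
import numpy as np

# Edge-enhancement kernel: identity filter minus a Laplacian (cf. FIG. 9).
identity = np.zeros((3, 3))
identity[1, 1] = 1.0
laplacian = np.array([[0.0,  1.0, 0.0],
                      [1.0, -4.0, 1.0],
                      [0.0,  1.0, 0.0]])
sharpen = identity - laplacian

def convolve2d_same(img, kernel):
    """Naive same-size 2-D convolution with zero padding."""
    k = kernel[::-1, ::-1]                  # flip for true convolution
    pad = k.shape[0] // 2
    padded = np.pad(img, pad)
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * k)
    return out

# A step edge: enhancement steepens the transition, producing the
# undershoot/overshoot pair sketched in FIG. 10(C).
edge = np.tile([0.0, 0.0, 1.0, 1.0], (4, 1))
enhanced = convolve2d_same(edge, sharpen)
```

At the edge, the pixel just before the step is pushed below its original value and the pixel just after is pushed above it, which is exactly the steepened profile of FIG. 10(C).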

 The preferred ordering of the processing steps and the processing to be considered have been described above. However, when there are constraints on the order of the processing steps from another viewpoint, the order is not limited to the above and may be changed according to processing constraints or required image quality. Although an embodiment relating to an image pickup apparatus has been described, various modifications and changes are possible within the scope of the invention.

 FIG. 11(A) shows the configuration of an image processing system that is Embodiment 2 of the present invention. The image processing apparatus 111 is constituted by an information processing apparatus and is loaded with image processing software (an image processing program) 112 for causing the information processing apparatus to execute the image processing method described in Embodiment 1.

 The imaging device 113 includes a camera, a microscope, an endoscope, a scanner, and the like. The storage medium 114, such as a semiconductor memory, a hard disk, or a server on a network, stores images (captured image data) generated by imaging.

 The image processing apparatus 111 acquires image data from the imaging device 113 or the storage medium 114, performs predetermined image processing, and outputs the resulting output image (corrected image) data to at least one of the output device 116, the imaging device 113, and the storage medium 114. The output destination may also be a storage unit built into the image processing apparatus 111, in which the output image data is saved. An example of the output device 116 is a printer. A display device 115 (a monitor) is connected to the image processing apparatus 111; through it the user can perform image processing operations and evaluate the restoration-adjusted image (output image). In addition to the image restoration processing function and the restoration degree adjustment function, the image processing software 112 has a development function and other image processing functions as needed.

 FIG. 11(B) shows the configuration of another image processing system. When the imaging device 118 alone performs the image processing of Embodiment 1, as in that embodiment, the restoration-adjusted image can be output directly from the imaging device 118 to the output device 119.

 Alternatively, by equipping the output device 119 with an image processing apparatus that executes the image processing method of Embodiment 1, the output device 119 can set the adjustment coefficients according to the feature amounts of the image and adjust the degree of restoration. Furthermore, by adjusting the degree of restoration according to the degradation characteristics of the output device 119, an image of even higher quality can be provided.

 Here, the contents of the correction information used to perform image processing, including image restoration processing and adjustment of the degree of restoration, and its exchange between devices will be described. FIG. 12 shows an example of the correction information; this collection of correction information items is referred to as a correction information set. Each item is described below.

 "Correction control information"
 The correction control information consists of setting information indicating which of the imaging device 113, the image processing apparatus 111, and the output device 116 performs the restoration processing and the restoration degree adjustment processing, and selection information for choosing the data to be transmitted to other devices according to that setting information. For example, when only the restoration processing is performed in the imaging device 113 and the degree of restoration is adjusted in the image processing apparatus 111, it is not necessary to transmit the image restoration filter to the image processing apparatus 111, but it is necessary to transmit at least the captured image together with either the restored image or the restoration component information (difference information).

 "Imaging device information"
 The imaging device information is identification information of the imaging device 113, corresponding to the product name. When the lens and the camera body are interchangeable, it is identification information that includes the combination of the two.

 "Imaging state information"
 The imaging state information is information on the state of the imaging device 113 at the time of shooting, for example the focal length (zoom position), aperture value, subject distance (in-focus distance), ISO sensitivity, and white balance setting.

 "Individual imaging device information"
 The individual imaging device information is identification information of an individual imaging device, as opposed to the imaging device information above. Because the optical transfer function (OTF) of an imaging device varies from unit to unit due to manufacturing tolerances, the individual imaging device information is useful for setting the optimum restoration degree adjustment parameters for each unit. The restoration degree adjustment parameters are the restoration strength adjustment coefficient μ and the color synthesis ratio adjustment coefficient ω.

 "Image restoration filter group"
 The image restoration filter group is the set of image restoration filters used in the image restoration processing. When the device that performs the image restoration processing does not have the image restoration filters, the filters must be transmitted from another device.

 "User setting information"
 The user setting information consists of adjustment parameters, or correction functions for the adjustment parameters, for tuning the degree of restoration to the user's preference. Although the user can set the adjustment parameters freely, using the user setting information always yields the preferred output image as the initial value. It is also preferable that the user setting information be updated by a learning function, based on the history of adjustment parameters chosen by the user, toward the sharpness the user likes best.

 Furthermore, the provider (manufacturer) of the imaging device can supply preset values corresponding to several sharpness patterns over a network.

 The correction information set described above is preferably attached to each piece of image data. By attaching the necessary correction information to the image data, any device equipped with the image processing apparatus of Embodiment 2 can perform the image restoration processing. The contents of the correction information set can be selected automatically or manually as necessary.

 The present invention is not limited to the above embodiments, and various changes and modifications are possible without departing from the spirit and scope of the present invention. Accordingly, the following claims are appended to make the scope of the present invention public.

 101 Imaging optical system
 102 Image sensor
 104 Image processing unit
 106 Imaging system control unit
 108 Storage unit
 110 System controller

Claims (6)

1. An image processing apparatus comprising: image acquisition means for acquiring an input image; and image restoration means for restoring the input image using an image restoration filter generated or selected based on a transfer function of an imaging system used to form a subject image as the input image, thereby generating a restored image, wherein the image restoration filter makes a difference between absolute values of transfer functions in two azimuth directions, in obtaining the restored image from the subject, smaller than a difference between absolute values of transfer functions of the imaging system in the two azimuth directions.

2. The image processing apparatus according to claim 1, wherein the image restoration means performs the image restoration by convolving the image restoration filter with the pixels of the acquired image.

3. The image processing apparatus according to claim 1 or 2, wherein the image restoration filter is generated based on a transfer function having different frequency characteristics in the two azimuth directions and on a corrected transfer function corrected so that the difference between the absolute values of the transfer functions in the two azimuth directions is reduced.

4. An image pickup apparatus comprising: an image sensor that photoelectrically converts a subject image formed by an imaging system to generate a captured image; and the image processing apparatus according to any one of claims 1 to 3, which processes the captured image as the input image.

5. An image processing method comprising: a step of acquiring an image generated by an imaging system as an input image; and a step of restoring the input image using an image restoration filter generated or selected based on a transfer function of the imaging system, thereby generating a restored image, wherein the image restoration filter makes a difference between absolute values of transfer functions in two azimuth directions, in obtaining the restored image from the subject, smaller than a difference between the transfer functions of the imaging system in the two azimuth directions.

6. An image processing program that causes an information processing apparatus to execute: an image acquisition step of acquiring an image generated by an imaging system as an input image; and an image restoration step of restoring the input image using an image restoration filter generated or selected based on a transfer function of the imaging system, thereby generating a restored image, wherein the image restoration filter makes a difference between absolute values of transfer functions in two azimuth directions, in obtaining the restored image from the subject, smaller than a difference between absolute values of the transfer functions of the imaging system in the two azimuth directions.
PCT/JP2010/055865 2010-03-31 2010-03-31 Image processing apparatus and image capturing apparatus using same Ceased WO2011121763A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
PCT/JP2010/055865 WO2011121763A1 (en) 2010-03-31 2010-03-31 Image processing apparatus and image capturing apparatus using same
CN201180017173.9A CN102822863B (en) 2010-03-31 2011-03-10 The image pick up equipment of image processing equipment and this image processing equipment of use
PCT/JP2011/055610 WO2011122284A1 (en) 2010-03-31 2011-03-10 Image processing device and image capturing apparatus using same
JP2012508186A JP5188651B2 (en) 2010-03-31 2011-03-10 Image processing apparatus and imaging apparatus using the same
US13/204,453 US8514304B2 (en) 2010-03-31 2011-08-05 Image processing device and image pickup device using the same
US13/849,781 US8692909B2 (en) 2010-03-31 2013-03-25 Image processing device and image pickup device using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/055865 WO2011121763A1 (en) 2010-03-31 2010-03-31 Image processing apparatus and image capturing apparatus using same

Publications (1)

Publication Number Publication Date
WO2011121763A1 true WO2011121763A1 (en) 2011-10-06

Family

ID=44711549

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/055865 Ceased WO2011121763A1 (en) 2010-03-31 2010-03-31 Image processing apparatus and image capturing apparatus using same

Country Status (1)

Country Link
WO (1) WO2011121763A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003078812A (en) * 2001-05-29 2003-03-14 Hewlett Packard Co <Hp> Method for reducing motion blur in digital images
JP2007036799A (en) * 2005-07-28 2007-02-08 Kyocera Corp Image processing device

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012073691A (en) * 2010-09-28 2012-04-12 Canon Inc Image processing system, imaging apparatus, image processing method, and program
JP2013038563A (en) * 2011-08-08 2013-02-21 Canon Inc Image processing method, image processing apparatus, image pickup apparatus, and image processing program
JP2014003550A (en) * 2012-06-20 2014-01-09 Fujitsu Ltd Image processing apparatus and program
JP2014006667A (en) * 2012-06-22 2014-01-16 Fujitsu Ltd Image processing apparatus, information processing method, and program
USRE47947E1 (en) 2012-06-22 2020-04-14 Fujitsu Limited Image processing apparatus and information processing method for improving resolution of direction exhibiting degraded resolution
US9633417B2 (en) 2013-02-15 2017-04-25 Fujifilm Corporation Image processing device and image capture device performing restoration processing using a restoration filter based on a point spread function
WO2014125659A1 (en) * 2013-02-15 2014-08-21 Fujifilm Corp Image processing device, image capture device, filter generating device, image restoration method, and program
JPWO2014125659A1 (en) * 2013-02-15 2017-02-02 Fujifilm Corp Image processing apparatus, imaging apparatus, filter generation apparatus, image restoration method, and program
WO2015064264A1 (en) * 2013-10-31 2015-05-07 Fujifilm Corp Image processing device, image capture device, parameter generating method, image processing method, and program
JP5992633B2 (en) * 2013-10-31 2016-09-14 Fujifilm Corp Image processing apparatus and imaging apparatus
JP2015103902A (en) * 2013-11-22 2015-06-04 Canon Inc Imaging device
US10559068B2 (en) 2015-09-29 2020-02-11 Fujifilm Corporation Image processing device, image processing method, and program processing image which is developed as a panorama
WO2017056787A1 (en) * 2015-09-29 2017-04-06 Fujifilm Corp Image processing device, image processing method and program

Similar Documents

Publication Publication Date Title
JP5188651B2 (en) Image processing apparatus and imaging apparatus using the same
JP5284537B2 (en) Image processing apparatus, image processing method, image processing program, and imaging apparatus using the same
JP5546229B2 (en) Image processing method, image processing apparatus, imaging apparatus, and image processing program
JP5441652B2 (en) Image processing method, image processing apparatus, imaging apparatus, and image processing program
JP5409589B2 (en) Image processing method, image processing program, image processing apparatus, and imaging apparatus
US9167216B2 (en) Image processing apparatus, image capture apparatus and image processing method
JP2011123589A5 (en)
JP2011124692A5 (en)
JP2013051524A (en) Image processing device and method
CN106162132A (en) Image processing equipment and control method thereof
CN104519329B (en) Image processing apparatus, image pickup apparatus, image pickup system and image processing method
WO2011121763A1 (en) Image processing apparatus and image capturing apparatus using same
JP5344648B2 (en) Image processing method, image processing apparatus, imaging apparatus, and image processing program
JP5479187B2 (en) Image processing apparatus and imaging apparatus using the same
JP5730036B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program.
JP5425135B2 (en) Image processing method, image processing apparatus, imaging apparatus, and image processing program
JP6415108B2 (en) Image processing method, image processing apparatus, imaging apparatus, image processing program, and storage medium
JP5425136B2 (en) Image processing method, image processing apparatus, imaging apparatus, and image processing program
WO2011121761A1 (en) Image processing apparatus and image capturing apparatus using same
JP2016201600A (en) Image processing apparatus, imaging apparatus, image processing method, image processing program, and storage medium
JP6238673B2 (en) Image processing apparatus, imaging apparatus, imaging system, image processing method, image processing program, and storage medium
JP2012156714A (en) Program, image processing device, image processing method, and imaging device
JP2022161527A (en) Image processing method, image processing apparatus, imaging device, and program
JP2016201601A (en) Image processing apparatus, imaging apparatus, image processing method, image processing program, and storage medium
JP2012156710A (en) Image processing device, imaging device, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 10848941; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 10848941; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)