US20130021484A1 - Dynamic computation of lens shading - Google Patents
Dynamic computation of lens shading
- Publication number
- US20130021484A1 (U.S. application Ser. No. 13/330,047)
- Authority
- US
- United States
- Prior art keywords
- lens shading
- image
- captured
- captured image
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/136—Incoming video signal characteristics or properties
- H04N19/137—Motion inside a coding unit, e.g. average field, frame or block difference
- H04N19/139—Analysis of motion vectors, e.g. their magnitude, direction, variance or reliability
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/189—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
- H04N19/192—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding the adaptation method, adaptation tool or adaptation type being iterative or recursive
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/42—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
- H04N19/436—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation using parallelised computational arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
- H04N19/537—Motion estimation other than block-based
- H04N19/54—Motion estimation other than block-based using feature points or meshes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
- H04N19/56—Motion estimation with initialisation of the vector search, e.g. estimating a good candidate to initiate a search
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/683—Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20201—Motion blur correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/634—Warning indications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
Definitions
- the present disclosure is generally related to lens shading correction for imaging devices.
- An increasing number of devices are being produced that are enabled to capture and display images.
- mobile devices, such as cell phones, are increasingly being equipped with digital cameras to capture images, including still snapshots and motion video images.
- one of the critical problems in small form factor cameras, such as those in cellular phones, is lens shading: the variation in light transmission through the opto-electrical system of the camera, such that the same light source, imaged by the camera at different angles or at different places in the image, is read by the camera as different values rather than the same value.
- lens shading can cause pixel cells in a pixel array of an image sensor located farther away from the center of the pixel array to have a lower pixel signal value when compared to pixel cells located closer to the center of the pixel array even when all pixel cells are exposed to the same illuminant condition.
- pixels with different spectral characteristics have different responses to the lens shading, which may cause the appearance of color patches even if the scene is monochromatic.
- in order to correct the lens shading, a long and expensive calibration process is performed for each camera or mobile product. In many cases, calibration errors are the main source of reduced image quality in these devices.
- FIG. 1 is a block diagram of an exemplary mobile device with image capture and processing capability in accordance with embodiments of the present disclosure.
- FIG. 2 is a diagram representation of an intensity profile under uniform illumination in accordance with embodiments of the present disclosure.
- FIG. 3 is a diagram representation of intensity profiles of multiple images under non-uniform illumination in accordance with embodiments of the present disclosure.
- FIGS. 4-6 are flow chart diagrams depicting exemplary processes of estimating lens shading in accordance with the disclosed embodiments.
- FIG. 7 is a diagram representation of a surface profile that may be created depending on the scene being photographed and the particular illumination characteristics in accordance with the disclosed embodiments.
- FIGS. 8-11 are block diagrams illustrating examples of a mobile device employing the image processing circuitry of FIG. 1 .
- Embodiments of the present disclosure utilize captured image information to determine a lens shading surface being experienced by an imaging device or camera under current conditions (e.g., optical conditions, lighting conditions, etc.). The lens shading surface is then used to apply a correction to the pixels of captured images to compensate for effects of lens shading.
- Embodiments of the present disclosure relate to image processing performed in devices.
- mobile devices where image processing must be performed with limited resources.
- Types of such mobile devices include mobile phones (e.g., cell phones), handheld computing devices (e.g., personal digital assistants (PDAs), BLACKBERRY devices, PALM devices, etc.), handheld music players (e.g., APPLE IPODs, MP3 players, etc.), and further types of mobile devices.
- Such mobile devices may include a camera or image sensor used to capture images, such as still images and video images. The captured images are processed internal to the mobile device.
- FIG. 1 shows a block diagram of an exemplary mobile device 100 with image capture and processing capability.
- Mobile device 100 may be a mobile phone, a handheld computing device, a music player, etc.
- the implementation of mobile device 100 shown in FIG. 1 is provided for purposes of illustration, and is not intended to be limiting. Embodiments of the present disclosure are intended to cover mobile devices having additional and/or alternative features to those shown for mobile device 100 in FIG. 1 .
- mobile device 100 includes, but is not limited to including, an image sensor device 102 , an analog-to-digital (A/D) 104 , an image processor 106 , a speaker 108 , a microphone 110 , an audio codec 112 , a central processing unit (CPU) 114 , a radio frequency (RF) transceiver 116 , an antenna 118 , a display 120 , a battery 122 , a storage 124 , and a keypad 126 . These components are typically mounted to or contained in a housing, which may further contain a circuit board mounting integrated circuit chips and/or other electrical devices corresponding to these components. Each of these components of mobile device 100 is described as follows.
- Battery 122 provides power to the components of mobile device 100 that require power.
- Battery 122 may be any type of battery, including one or more rechargeable and/or non-rechargeable batteries.
- Keypad 126 is a user interface device that includes a plurality of keys enabling a user of mobile device 100 to enter data, commands, and/or to otherwise interact with mobile device 100 .
- Mobile device 100 may include additional and/or alternative user interface devices to keypad 126 , such as a touch pad, a roller ball, a stick, a click wheel, and/or voice recognition technology.
- Image sensor device 102 is an image capturing device.
- image sensor device 102 may include an array of photoelectric light sensors, such as a charge coupled device (CCD) or a CMOS (complementary metal-oxide-semiconductor) sensor device.
- Image sensor device 102 typically includes a two-dimensional array of sensor elements or pixel sensors organized into rows and columns. Each pixel sensor may be identified using pixel sensor coordinates, where “x” is a row number, and “y” is a column number, for any pixel sensor in the array of sensor elements.
- each pixel sensor of image sensor device 102 is configured to be sensitive to a specific color, or color range. In one example, three types of pixel sensors are present, including a first set of pixel sensors that are sensitive to the color red, a second set of pixel sensors or photo-detectors that are sensitive to green, and a third set of pixel sensors that are sensitive to blue.
- Image sensor device 102 receives light (from optical system 101 ) corresponding to an image, and generates an analog image signal corresponding to the captured image.
- Analog image signal includes analog values for each of the pixel sensors.
- Optical system 101 can be a single lens, as shown, but may also be a set of lenses.
- An image of a scene is formed in visible optical radiation through a shutter onto a two-dimensional surface of the image sensor 102 .
- An electrical output of the sensor carries an analog signal resulting from scanning individual photo-detectors of the surface of the sensor 102 onto which the image is projected.
- Signals proportional to the intensity of light striking the individual photo-detectors or pixel sensors are obtained in the output in time sequence, typically by scanning them in a raster pattern, where the rows of photo-detectors are scanned one at a time from left to right, beginning at the top row, to generate a frame of video data from which the image may be reconstructed.
- A/D 104 receives analog image signal, converts analog image signal to digital form, and outputs a digital image signal.
- Digital image signal includes digital representations of each of the analog values generated by the pixel sensors or photo-detectors, and thus includes a digital representation of the captured image.
- Image processor 106 performs image processing of the digital pixel sensor data received in digital image signal. For example, image processor 106 may be used to generate pixels of all three colors at all pixel positions when a Bayer pattern image is output by image sensor device 102 .
- two or more of image sensor device 102 , A/D 104 , and image processor 106 may be included together in a single IC chip, such as a CMOS chip, particularly when image sensor device 102 is a CMOS sensor, or may be in two or more separate chips.
- CPU 114 is shown in FIG. 1 as coupled to each of image processor 106 , audio codec 112 , RF transceiver 116 , display 120 , storage 124 , and keypad 126 .
- CPU 114 may be individually connected to these components, or one or more of these components may be connected to CPU 114 in a common bus structure.
- Microphone 110 and audio CODEC 112 may be present in some applications of mobile device 100 , such as mobile phone applications and video applications (e.g., where audio corresponding to the video images is recorded). Microphone 110 captures audio, including any sounds such as voice, etc. Microphone 110 may be any type of microphone. Microphone 110 generates an audio signal that is received by audio codec 112 . The audio signal may include a stream of digital data, or analog information that is converted to digital form by an analog-to-digital (A/D) converter of audio codec 112 . Audio codec 112 encodes (e.g., compresses) the received audio of the received audio signal. Audio codec 112 generates an encoded audio data stream that is received by CPU 114 .
- CPU 114 receives image processor output signal from image processor 106 and receives the audio data stream from audio codec 112 .
- CPU 114 may include an additional image processor.
- the additional image processor performs image processing (e.g., image filtering) functions for CPU 114 .
- CPU 114 includes a digital signal processor (DSP), which may be included in the additional image processor. When present, the DSP may apply special effects to the received audio data (e.g., an equalization function) and/or to the video data.
- CPU 114 may store and/or buffer video and/or audio data in storage 124 .
- Storage 124 may include any suitable type of storage, including one or more hard disc drives, optical disc drives, FLASH memory devices, etc.
- CPU 114 may stream the video and/or audio data to RF transceiver 116 , to be transmitted from mobile device 100 .
- RF transceiver 116 is configured to enable wireless communications for mobile device 100 .
- RF transceiver 116 may enable telephone calls, such as telephone calls according to a cellular protocol.
- RF transceiver 116 may include a frequency up-converter (transmitter) and down-converter (receiver).
- RF transceiver 116 may transmit RF signals to antenna 118 containing audio information corresponding to voice of a user of mobile device 100 .
- RF transceiver 116 may receive RF signals from antenna 118 corresponding to audio information received from another device in communication with mobile device 100 .
- RF transceiver 116 provides the received audio information to CPU 114 .
- RF transceiver 116 may be configured to receive television signals for mobile device 100 , to be displayed by display 120 .
- RF transceiver 116 may transmit images captured by image sensor device 102 , including still and/or video images, from mobile device 100 .
- RF transceiver 116 may enable a wireless local area network (WLAN) link (including an IEEE 802.11 WLAN standard link), and/or other type of wireless communication link.
- CPU 114 provides audio data received by RF transceiver 116 to audio codec 112 .
- Audio codec 112 performs bit stream decoding of the received audio data (if needed) and converts the decoded data to an analog signal.
- Speaker 108 receives the analog signal, and outputs corresponding sound.
- Image processor 106 , audio codec 112 , and CPU 114 may be implemented in hardware, software, firmware, and/or any combination thereof.
- CPU 114 may be implemented as a proprietary or commercially available processor that executes code to perform its functions.
- Audio codec 112 may be configured to process proprietary and/or industry standard audio protocols.
- Image processor 106 may be a proprietary or commercially available image signal processing chip, for example.
- Display 120 receives image data from CPU 114 , such as image data generated by image processor 106 .
- display 120 may be used to display images captured by image sensor device 102 .
- Display 120 may include any type of display mechanism, including an LCD (liquid crystal display) panel or other display mechanism.
- the display may show a preview of images currently being received by the sensor 102 , whereby a user may select a control (e.g., shutter button) to begin saving captured image(s) to storage 124 .
- image processor 106 formats the image data output in image processor output signal according to a proprietary or known video data format.
- Display 120 is configured to receive the formatted data, and to display a corresponding captured image.
- image processor 106 may output a plurality of data words, where each data word corresponds to an image pixel.
- a data word may include multiple data portions that correspond to the various color channels for an image pixel. Any number of bits may be used for each color channel, and the data word may have any length.
- display 120 has a display screen that is not capable of displaying the full resolution of the images captured by image sensor device 102 .
- Image sensor devices 102 may have various sizes, including numbers of pixels in the hundreds of thousands or millions, such as 1 megapixel (Mpel), 2 Mpels, 4 Mpels, 8 Mpels, etc.
- Display 120 may be capable of displaying relatively smaller image sizes.
- CPU 114 may down-size a captured image received from image processor 106 before providing the image to display 120 , in some embodiments.
- image downsizing may be performed by a subsampling process.
- subsampling is a process used to reduce an image size.
- Subsampling is a type of image scaling, and may alter the appearance of an image or reduce the quantity of information required to store an image.
- Two types of subsampling are replacement and interpolation.
- the replacement technique selects a single pixel from a group and uses it to represent the entire group.
- the interpolation technique uses a statistical sample of the group (such as a mean) to create a new representation of the entire group.
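- As a rough illustration of the two subsampling approaches just described (not taken from the disclosure; a single-channel numpy array and an integer scale factor are assumed), the following sketch keeps one pixel per block or averages each block:

```python
import numpy as np

def subsample_replacement(img: np.ndarray, factor: int) -> np.ndarray:
    """Replacement: one pixel (here the top-left) represents each factor x factor group."""
    return img[::factor, ::factor]

def subsample_interpolation(img: np.ndarray, factor: int) -> np.ndarray:
    """Interpolation: the mean of each factor x factor group represents the group."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor            # crop so the blocks divide evenly
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))
```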
- image processor 106 performs processing of digital image signal and generates an image processor output signal.
- Image processing may include lens shading correction processes performed by a lens shading sub-module 107 of an image processor 106 , in one embodiment.
- A function that maps pixel position into a desired correction amount is referred to herein as a lens shading gain adjustment surface.
- the surface may consist of an interleaving of several smooth surfaces, each for every pixel type or color. Such a surface may be generated in a CMOS circuit, in one embodiment, and then used for correction of spatial non-uniformity sensitivity of the lens shading correction across pixel positions in the sensor array.
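- One way to picture the interleaving of several smooth surfaces, one per pixel type or color, is to merge per-channel gain surfaces back into the mosaic layout of the sensor. The sketch below assumes an RGGB Bayer pattern and quarter-resolution per-channel surfaces; both are illustrative assumptions rather than details from the disclosure:

```python
import numpy as np

def interleave_bayer_gains(g_r, g_gr, g_gb, g_b):
    """Interleave four per-channel gain surfaces into one full-resolution RGGB gain map.

    Each input has half the sensor resolution in each dimension; the output holds one
    gain per photosite, matching the RGGB mosaic layout assumed here.
    """
    h, w = g_r.shape
    gain = np.empty((2 * h, 2 * w), dtype=np.float64)
    gain[0::2, 0::2] = g_r   # R sites
    gain[0::2, 1::2] = g_gr  # Gr sites
    gain[1::2, 0::2] = g_gb  # Gb sites
    gain[1::2, 1::2] = g_b   # B sites
    return gain
```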
- Conventionally, shading correction factors for an optical photo system (e.g., lens, image sensor, and/or housing) are derived by prior measurement: data of the resulting circular, hyperbolic, or other variation across the image sensor device are derived from measured image sensor photo detector signals, and a compensating mathematical function or functions are calculated and stored under optimal lab conditions, where imaging a scene of uniform intensity is possible.
- an actual response may be measured from the image.
- a response may be measured in each of the color planes—red, green, blue. The response is strongest at the center of the pixel array and weaker at its edges. Accordingly, pixels corresponding to the edges of the pixel array may be multiplied by a relative corrective factor so that the corrected response is flat. These correction factors may be used for captured image(s) acquired under similar illumination conditions.
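- A minimal sketch of such per-plane correction factors, assuming a capture of a uniformly lit flat field for one color plane (the function name is illustrative): dividing the peak response by each pixel's response yields the multiplier that flattens the corrected response.

```python
import numpy as np

def flat_field_gain(flat_plane: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Per-pixel corrective factors from a flat-field capture of one color plane.

    The response is strongest at the center, so pixels toward the edges receive
    gains greater than 1; multiplying a captured plane by these factors flattens it.
    """
    plane = flat_plane.astype(np.float64)
    return plane.max() / np.maximum(plane, eps)

# usage (illustrative): corrected_plane = captured_plane * flat_field_gain(flat_capture)
```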
- illumination conditions change as the environment of the mobile device changes.
- different lens positions within the optical system 101 produce different lens shading effects.
- correction factors may need to be adjusted to compensate for different light sources within a scene being photographed and/or lens positioning or qualities (e.g., changes in zoom or focus, particular manufacturing accuracies, mounting of the lens, filter consistencies, etc.). Therefore, a tuning process that premeasures all the potential combinations of lens positions and light sources is complicated and inaccurate, since for each combination the actual response measured from a captured image is different.
- in embodiments of the present disclosure, estimation of shading correction factors for an optical photo system is instead performed by capturing multiple images of a scene in succession.
- the lens shading effect on the camera may be better understood and represented. Therefore, in one embodiment, capturing two images of a same scene with slight camera shift between image captures provides the reference from which the lens shading gain adjustment can be estimated, as represented in FIG. 3 .
- a lens shading curve or surface may be determined that caused the differences between the captured images.
- preview images captured for display on a viewfinder of a mobile device 100 may be used to determine the lens shading effect, where upon capturing of an image (e.g., after selecting a shutter button or control), the captured image may be corrected to compensate for the current lens shading effect. For example, a series of low resolution images may be used to preview the image to the photographer before actually taking a high resolution image. Then, data of the resulting circular, hyperbolic or other variation across the image sensor 102 are derived by dynamic measurement of image sensor photo detector signals and a compensating mathematical function or functions are calculated.
- lens shading phenomenon may be estimated dynamically and on the fly. Accordingly, during manufacturing and assembly of a camera or mobile device equipped with a camera, resources used for corrective lens shading calibration may be eliminated or significantly reduced.
- an image may be captured where an object in a scene is moving (as opposed to the camera moving). Accordingly, a subsequent capturing of the scene is going to be quite dissimilar, since the scene is not static.
- the lens shading sub-module 107 and/or the image processor 106 may detect that illumination types for the captured images are not the same. Accordingly, the lens shading sub-module 107 and/or image processor 106 may attempt to detect that lighting conditions are stable during capturing of the images that are used to derive the lens shading surface.
- lens shading correction factors may be preloaded or stored in the reference database 125 at a manufacturing facility so that the camera is equipped with preliminary lens shading correction factors that can be used, as needed.
- the correction factors may be initially generated responsive to capturing a scene in a flat field (e.g., a white wall with desired illumination) within a closed environment.
- the scene does not necessarily need to be a perfectly flat field, in one embodiment. Therefore, the motion based calculation may be performed in the manufacturing stage with relatively flat surfaces which make measurements faster to obtain and less dependent on measurement conditions. In other words, a wider range of manufacturing conditions are available to be used with systems and processes of the present disclosure.
- the lens shading sub-module 107 and/or the image processor 106 may detect conditions that do not allow for sufficient measuring of lens shading. Accordingly, in a case where good conditions are not present to measure lens shading, the mobile device 100 takes advantage of lens shading correction factors stored in the reference database 125 . Alternatively, in a case where good conditions are present, the lens shading sub-module 107 and/or the image processor 106 compares the differences between recently captured images and defines a lens shading surface (on the fly) by considering the differences between the captured images. Then, a lens surface gain adjustment surface may be chosen that matches the intensity variation or variations reflected by the lens shading surface across the captured images that are to be corrected. The mobile device 100 may also store the lens surface gain adjustment surface in the reference database 125 for later use, as needs arise.
- one embodiment of dynamic lens shading calculation utilizes gradients or differences between captured image areas to define the lens shading surface. From the lens shading surface, corrections may be prepared to compensate for the effects of lens shading. To determine the lens shading surface, inter image consideration and/or intra image consideration are evaluated. For inter image consideration, in one embodiment, two images are captured and a ratio is calculated between a pixel value in an object in one image and the pixel value of the same place on the same object in a second image (that was taken after camera motion). The calculated ratio represents a local gradient of the lens shading at the direction of the camera or mobile device movement.
- for intra image consideration, a second ratio is calculated between a pixel value of an object in an image and the pixel value of another object that has similar luminosity in the same image.
- the calculated second ratio represents the gradient of the lens shading between these two points.
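- The inter image ratio described above can be sketched as follows, assuming the global motion has already been estimated as a non-negative integer pixel shift (dx, dy) and ignoring the noise and local-motion handling discussed below; each overlapping pixel pair then yields one sample of the shading gradient along the motion direction:

```python
import numpy as np

def inter_image_ratio(img_a: np.ndarray, img_b: np.ndarray,
                      dx: int, dy: int, eps: float = 1e-6) -> np.ndarray:
    """Ratio between the same scene point seen at two sensor positions.

    img_b is assumed to show the same scene as img_a, captured after a camera shift
    so that img_b[y + dy, x + dx] views the point imaged at img_a[y, x].
    """
    h, w = img_a.shape
    a = img_a[:h - dy, :w - dx].astype(np.float64)
    b = img_b[dy:, dx:].astype(np.float64)
    return a / np.maximum(b, eps)
```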
- the images are geometrically matched with one another.
- global motion is detected from the two images, where motion parameters may include translation and transformation (e.g., affine or perspective). Then, areas having local motion that is different from the global motion are determined. These areas may have had an object moving in the camera field or scene being photographed. In one embodiment, areas having local motion are not analyzed for gradients. In such a situation, a correction factor may be determined by extrapolation from other areas (not subject to the local motion) in single image data.
- a captured image may be compensated using stored lens shading correction factors in the reference database 125 instead of determining correction factors dynamically, in one embodiment.
- the foregoing processes may be iterative in some embodiments, where after an estimation of lens shading, processes may be repeated to determine a new estimation of lens shading.
- the motion detection is performed on full resolution image(s). After the motion is estimated, a first image is transformed to match the second image geometrically, and matching pixel values are then sought.
- Gradients and pixel ratios may be affected with noise and inaccuracies.
- possible sources of noise include pixel noise (i.e., electronic and photonic noise in the process of converting luminance to a digital luminance count), errors in the estimation of motion, changing light conditions between the two images, an object reflecting differently into the camera at different positions, an incorrect assumption of similar luminance where in reality the two objects being compared have different luminance, etc.
- measures may be taken to calculate the gradients in ‘flat’ areas where there are no rapid changes in luminance (e.g., edges). For example, areas near edges in the captured images may be masked out.
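- A simple way to restrict measurements to such flat areas is to threshold the local luminance gradient; the sketch below is illustrative, and the threshold value is an arbitrary choice:

```python
import numpy as np

def flat_area_mask(plane: np.ndarray, grad_thresh: float = 0.02) -> np.ndarray:
    """Boolean mask of 'flat' pixels, where the local luminance gradient is small,
    so shading-gradient measurements are less disturbed by scene edges."""
    p = plane.astype(np.float64) / max(float(plane.max()), 1.0)   # normalize to [0, 1]
    gy, gx = np.gradient(p)
    return np.hypot(gx, gy) < grad_thresh
```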
- the lens shading surface may be calculated from the local gradients, in some embodiments.
- a model of a lens shading surface may be computed or estimated that matches the measured gradients in the captured images. Accordingly, parameters of the model may be adjusted until an optimal result is determined from all the tested results or trials.
- one possible technique determines an optimized analytical parametric surface by selecting a surface model equation (e.g., polynomial, Gaussian, etc.) and calculating the parameters of the lens shading surface model that yield minimal difference between the surface gradient (according to the model) and measured gradients.
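- A sketch of this first technique, assuming a radially symmetric polynomial model S(x, y) = 1 + a·r² + b·r⁴ about a known optical center (the model choice and the use of scipy's least_squares solver are illustrative assumptions, not details from the disclosure):

```python
import numpy as np
from scipy.optimize import least_squares

def fit_radial_shading(xy: np.ndarray, measured_grad: np.ndarray, center: tuple) -> np.ndarray:
    """Fit S(x, y) = 1 + a*r^2 + b*r^4 so that its gradient best matches measured local gradients.

    xy            : (N, 2) pixel coordinates of the measurement points
    measured_grad : (N, 2) measured shading gradients (dS/dx, dS/dy) at those points
    center        : (cx, cy) optical center, assumed known here
    """
    dx = xy[:, 0] - center[0]
    dy = xy[:, 1] - center[1]
    r2 = dx ** 2 + dy ** 2

    def residuals(p):
        a, b = p
        common = 2.0 * a + 4.0 * b * r2          # shared factor of dS/dx and dS/dy
        model = np.stack([common * dx, common * dy], axis=1)
        return (model - measured_grad).ravel()

    return least_squares(residuals, x0=[0.0, 0.0]).x   # -> array([a, b])
```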
- Another possible technique determines an optimized singular value decomposition (SVD) surface composition by selecting the largest surface eigenvectors and calculating the coefficients for the surface composition that yield minimal difference between the surface gradient (according to the model) and measured gradients.
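- A sketch of this second technique. For brevity it fits the coefficients directly to a noisy shading estimate rather than to its gradients, and it assumes a stack of reference surfaces (e.g., measured on sample units) from which the surface eigenvectors are taken; all names are illustrative:

```python
import numpy as np

def fit_svd_surface(reference_surfaces: np.ndarray, measured: np.ndarray, k: int = 3) -> np.ndarray:
    """Compose a shading surface from the k largest SVD eigen-surfaces of a reference set.

    reference_surfaces : (M, H, W) stack of reference shading surfaces (k <= M assumed)
    measured           : (H, W) noisy shading estimate to approximate
    """
    m, h, w = reference_surfaces.shape
    A = reference_surfaces.reshape(m, h * w).T           # columns are flattened references
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    basis = U[:, :k]                                     # k largest surface eigenvectors
    coeffs = basis.T @ measured.ravel()                  # least-squares coefficients (orthonormal basis)
    return (basis @ coeffs).reshape(h, w)
```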
- a Gaussian model may be used to model the lens shading being experienced by the mobile device 100 and values of parameters for the model may be adjusted until an optimal match is found between the model and the measured values.
- other models may be considered, such as a polynomial surface model, in some embodiments.
- an SVD process may be used.
- color layers may be estimated using a variety of techniques, including direct layer estimation, independent color layer estimation, and normalized color domain estimation. For instance, measurements may be made in normalized color domain(s) where possible, since, typically, luminosity changes more rapidly than normalized color in images. Additional measures include calculating a small number of surface model parameters from a large number of measurements; limiting the parameter space to a predefined space according to measurements of sample units under different light (spectra and intensity) conditions; averaging measurements before calculating gradients (e.g., by down sampling the image); calculating global gradients rather than using only local gradients; and segmenting the image into a small number of color segments and estimating global gradients on each one. Also, in some embodiments, possible effects of light flickering during capturing of the images may be addressed and removed from the images.
- one technique of matching pixels involves direct layer estimation, where local gradients are calculated.
- for inter image consideration, differences between the images represent the local gradients.
- for intra image consideration, color segments are derived, and differences between like color segments are representative of local gradients.
- An optimized lens shading surface is modeled which matches with local gradients at measured points. Accordingly, a model surface may be computed that fits the local gradients of each of the color segments.
- information may be obtained on the gradients at each corresponding sensor point of the image, where the gradients are representative of the lens shading phenomenon.
- lens shading correction for color image sensors may be defined for each of a plurality of color channels in order to correct for lens shading variations across color channels.
- different techniques or models may be used by the lens shading sub-module 107 and/or the image processor 106 for the different color channels.
- a green channel may determine a best fit of color plane parameters for an SVD model and a red/blue channel may utilize direct layer optimization.
- lens shading can be corrected using standard correction methods.
- a white balance may be selected that is appropriate to the illuminant that generated the estimated lens shading curve, where different illuminants have different optical wavelength responses and hence may result in different lens shading surfaces.
- the image processor 106 or an auto-white-balance (AWB) sub-module of the image processor 106 may then determine the type of illuminant used to generate the lens shading curve that has been estimated and subsequently use this information to correct white balance levels in captured image(s).
- more robust motion estimation may also be implemented responsive to the lens shading estimation by the lens shading sub-module 107 and/or the image processor 106 .
- global motion can be estimated by calculating a mean difference between image areas in the two images captured in a sequence, where the difference corresponds to the same object moving across one image to a different place in the second image. Since the second image has different lens shading characteristics as compared to the first, it also has a different mean brightness as compared to the first image. Accordingly, instead of examining correlations between the images in order to determine a motion vector that can be used to estimate camera motion, statistics used to determine the lens shading can also be used to estimate the camera motion. Therefore, differences in the statistics between the images may be used to calculate the camera or global motion.
- the first image may feature a white ball at a left corner of the frame.
- the second image may feature the ball at a position to the right of the left corner, where the ball has a brighter intensity than in the first frame.
- the lens shading for the mobile device 100 has been determined, where the lens shading is found to vary from one side of the image sensor 102 to the other. Accordingly, at a pixel sensor corresponding to the left corner of the image, the average intensity value is going to be lower than an average intensity value at a pixel sensor to the right. Therefore, based on the lens shading statistics, it is expected that the intensity values of pixels corresponding to the ball will change based on the lens shading as the ball moves to the right in subsequent images.
- an object having a different intensity value than a prior value in a prior frame may be determined to be the same object in motion due to the lens shading phenomenon (that has been previously computed). As a result, motion can be analyzed and determined.
- FIG. 4 illustrates a flow chart depicting a process of estimating lens shading in accordance with the disclosed embodiments.
- Lens shading estimation in accordance with FIG. 4 is performed dynamically by a pixel processing pipeline of image processor 106 ( FIG. 1 ) (e.g., lens shading module 107 ) and, if necessary, uses stored reference surface(s) acquired during a calibration operation.
- the image processor 106 has access to the stored gain adjustment surface(s) and scene adjustment surface(s) in, for example, reference database 125 ( FIG. 1 ) or other memory storage.
- an image is generally captured by a digital camera
- the image is not captured in a known illumination type or a reference is not available for the current illumination type.
- the captured image is a natural image, where the lens shading sub-module 107 of an image processor 106 does not have any preset knowledge of the illumination type and may not therefore have a reference correction surface prestored according to the current illumination type.
- while lens shading correction factors may be derived solely from capturing a scene of a flat field to create an image that contains even color and intensity values except for effects from lens shading, natural images taken by the camera normally have no such flat areas in the image. Accordingly, embodiments of the present disclosure analyze the differences in light transition from natural images captured by the mobile device 100 .
- embodiments of the present disclosure take advantage of capturing multiple images in succession and determining a lens shading correction or gain adjustment surface for the present illumination conditions.
- since the images are captured by the same image sensor 102 of the mobile device 100 , they are captured using the same optics. Accordingly, intensity values of pixels for the multiple images should ideally be the same, and illumination levels for the captured images should also be the same, since the images are captured within fractions of a second of one another, in some embodiments.
- however, the mobile device 100 may move or shift during the capturing of one image to the next. Also, due to lens shading, the intensity values of the pixels may not be exactly the same.
- the lens shading effect on the mobile device 100 may be better understood and represented. Therefore, in one embodiment, capturing two images of a same scene with slight camera shift between image captures provides the reference from which the corrective lens shading surface can be estimated. In particular, a lens shading curve or surface may be determined that caused the differences between the captured images. In some embodiments, preview images captured for display on a viewfinder of a camera may be used to determine the lens shading effect, where upon capturing of an image (e.g., after selecting a shutter button or control), the captured image may be corrected to compensate for the current lens shading effect.
- Lens shading estimation begins with capturing a sequence of images at step 402 .
- local gradients of the captured images are determined.
- the local gradients may be determined in a number of different ways.
- techniques estimate the local gradients from inter image consideration and/or intra image consideration.
- multiple images may be captured and inter image analysis may be performed on the captured images.
- intra image analysis may be performed on each captured image.
- the intra image analysis may be performed in concert with the inter image analysis or apart from the inter image analysis, in some embodiments, based on recognition of a particular condition. For instance, a sequence of images may have been subjected to a level of local motion in the scene being photographed that does not allow for adequate statistics to be obtained. Alternatively, adequate statistics for global motion may not be able to be obtained which prohibits one or both approaches from being used or causes prestored statistics or factors in the reference database 125 to be used instead. As an example, a single image may not contain multiple areas with similar colors or intensity.
- a model of a lens shading surface is compared to the measured gradients from the captured images and the deviation between the two is saved for later comparison.
- the process proceeds to step 408 where the model is adjusted and compared again with the measured gradients and new deviation(s) are computed and compared with the saved values.
- the model having the set of smallest deviation values is maintained as the optimum model for the trials previously computed. The process then repeats until an optimum model is determined.
- a lens shading gain adjustment surface is calculated from the lens shading surface.
- the lens shading gain adjustment surface may also be determined for each color channel.
- lens shading correction for color image sensors may be defined for each of a plurality of color channels in order to correct for lens shading variations across color channels.
- the lens shading gain adjustment surface is applied to the pixels of the corresponding color channel during post-image capture processing to correct for variations in pixel value due to the spatial location of the pixels in the pixel array.
- monochrome image sensors, on the other hand, apply a single gain adjustment surface to all pixels of a pixel array.
- color image sensors may use a single lens shading gain adjustment surface across all color channels, in some embodiments.
- a pixel value located at x, y pixel coordinates may be multiplied by the lens surface gain adjustment values at the x, y pixel coordinates on the lens surface gain adjustment surface. Accordingly, at step 412 , lens shading correction is performed on the pixel values of the captured image using the lens surface gain adjustment surface(s).
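- In code, the correction step amounts to a per-pixel multiply; the sketch below also shows one common way to derive the gain surface from an estimated shading surface. The reciprocal-of-shading convention and the white level are assumptions for illustration:

```python
import numpy as np

def gain_from_shading(shading: np.ndarray) -> np.ndarray:
    """Gain adjustment surface as the reciprocal of the estimated shading surface,
    normalized so the brightest (typically central) region receives gain 1."""
    return shading.max() / np.maximum(shading, 1e-6)

def correct_plane(plane: np.ndarray, gain: np.ndarray, white_level: int = 1023) -> np.ndarray:
    """Multiply every pixel value at (x, y) by the gain value at the same (x, y)."""
    out = plane.astype(np.float64) * gain
    return np.clip(out, 0, white_level).astype(plane.dtype)
```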
- a lens shading module 107 is provided to estimate the effects of lens shading and to possibly correct the gain of individual pixels in captured images.
- the lens shading module 107 may, for example, be implemented as software or firmware.
- the lens shading module 107 may be implemented in image processor 106 as software designed to implement lens shading correction, in one embodiment. Alternatively, lens shading module 107 may be implemented in image sensor 102 , in one embodiment.
- the lens shading module 107 utilizes lens shading correction surfaces to determine gain correction for individual pixels to account for lens shading.
- An individual correction or gain adjustment surface may, for example, comprise parameters to calculate gain correction although it will also be understood that in some cases a correction table may be stored.
- Positional gain adjustments across the pixel array can be provided as digital gain values, one corresponding to each of the pixels. It may happen that the further away a pixel is from the center of the pixel array, the more gain needs to be applied to the pixel value. The set of digital gain values for the entire pixel array forms a lens shading gain adjustment surface.
- only relatively few gain values are preferably stored, in order to minimize the amount of memory required to store correction data, and a determination of values between the stored values is obtained, during the image modification process, by a form of interpolation.
- these few data values are preferably fit to a smooth curve or curves that are chosen to match the intensity variation or variations across the image that are to be corrected.
- the digital gain values are computed from an expression that approximates the desired lens shading gain adjustment surface, since the number of parameters needed to generate an approximate surface is generally significantly lower than the numbers of parameters needed to store the digital gain values for every pixel location.
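- The idea of storing only a coarse grid of gain values and interpolating between them at correction time can be sketched as follows (bilinear interpolation is assumed here; the actual interpolation scheme and grid size are implementation choices):

```python
import numpy as np

def gain_from_sparse_grid(sparse_gain: np.ndarray, out_shape: tuple) -> np.ndarray:
    """Expand a coarse grid of stored gain values to full resolution by bilinear interpolation.

    sparse_gain : (GH, GW) coarse grid of gains (GH, GW >= 2 assumed)
    out_shape   : (H, W) of the pixel array to be corrected
    """
    gh, gw = sparse_gain.shape
    h, w = out_shape
    ys = np.linspace(0.0, gh - 1.0, h)            # output rows in coarse-grid coordinates
    xs = np.linspace(0.0, gw - 1.0, w)
    y0 = np.clip(np.floor(ys).astype(int), 0, gh - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, gw - 2)
    fy = (ys - y0)[:, None]
    fx = (xs - x0)[None, :]
    g00 = sparse_gain[y0[:, None], x0[None, :]]
    g01 = sparse_gain[y0[:, None], x0[None, :] + 1]
    g10 = sparse_gain[y0[:, None] + 1, x0[None, :]]
    g11 = sparse_gain[y0[:, None] + 1, x0[None, :] + 1]
    top = g00 * (1 - fx) + g01 * fx
    bottom = g10 * (1 - fx) + g11 * fx
    return top * (1 - fy) + bottom * fy
```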
- FIG. 5 is a flowchart representation of a method in accordance with one embodiment of the present disclosure. In particular, a method is presented for use in conjunction with one or more of the functions and features described in conjunction with FIGS. 1-3 .
- a lens shading surface is continually calculated for preview images being displayed by a mobile device 100 .
- the lens shading surface is stored in a reference database 125 , in step 504 .
- in step 506 , a newly calculated lens shading surface is used to compensate for lens shading effects in the captured image if the newly calculated lens shading surface was determined to be satisfactory. Otherwise, in step 508 , a lens shading surface prestored in the reference database 125 is used to compensate for lens shading effects in the captured image.
- FIG. 6 is a flowchart representation of a method in accordance with one embodiment of the present disclosure.
- a method is presented for use in conjunction with one or more of the functions and features described in conjunction with FIGS. 1-3 .
- in step 602 , two or more images of the same scene, containing objects, are captured, where the images have some displacement between them.
- in step 604 , the relative displacement between the images is analyzed based on tracking of image areas with details or discernable objects.
- in step 606 , for each point and for each color plane in the image, a ratio between the intensity level at the first image and the level of the same object point at the second image is calculated. If there were no lens shading, the values would be the same.
- in step 608 , the differences between the values are normalized, and in step 610 , from the calculated ratios, a difference surface profile is created by possibly filtering the results and interpolating data points or values, as needed, to produce a smooth lens shading surface profile.
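- As a concrete picture of steps 608 and 610, the sketch below turns noisy per-pixel ratio samples into a smooth surface profile by averaging the valid samples in a local window (a masked box filter using scipy.ndimage); the window size and the validity mask are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def smooth_ratio_surface(ratio: np.ndarray, valid: np.ndarray, win: int = 31) -> np.ndarray:
    """Masked local average of per-pixel ratio samples, yielding a smooth surface profile.

    ratio : intensity ratios between the two displaced captures (one color plane)
    valid : boolean mask of usable samples (e.g., flat areas with no local motion)
    win   : box-filter window size used for the masked averaging
    """
    num = uniform_filter(np.where(valid, ratio, 0.0), size=win, mode="nearest")
    den = uniform_filter(valid.astype(np.float64), size=win, mode="nearest")
    return num / np.maximum(den, 1e-6)
```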
- FIG. 7 is a representative lens shading surface profile that may be created depending on the scene being photographed and the particular illumination characteristics.
- in step 612 , after the lens shading surface is extracted or generated from a current scene, the lens shading surface is used to indicate the light source illuminating the scene and to supply an unbiased measurement for the white balance.
- Mobile device 100 may comprise a variety of platforms in various embodiments.
- a smart phone electronic device 100 a is represented in FIG. 8 , where the smart phone 100 a includes an optical system 101 , at least one imaging device or sensor 102 , at least one image processor 106 with lens shading sub-module 107 , a power source 122 , among other components (e.g., display 120 , processor 114 , etc.).
- a tablet electronic device 100 b is represented in FIG. 9 , where the tablet 100 b includes an optical system 101 , at least one imaging device or sensor 102 , at least one image processor 106 with lens shading sub-module 107 , a power source 122 , among other components (e.g., display 120 , processor 114 , etc.).
- a laptop computer 100 c is represented in FIG. 10 , where the laptop computer 100 c includes an optical system 101 , at least one imaging device or sensor 102 , at least one image processor 106 with lens shading sub-module 107 , a power source 122 , among other components (e.g., display 120 , processor 114 , etc.).
- a digital camera electronic device 100 d is represented in FIG. 11 , where the digital camera 100 d includes an optical system 101 , at least one imaging device or sensor 102 , at least one image processor 106 with lens shading sub-module 107 , a power source 122 , among other components (e.g., display 120 , processor 114 , etc.). Therefore, a variety of platforms of electronic mobile devices may be integrated with the image processor 106 and/or lens shading sub-module 107 of the various embodiments.
- Embodiments of the present disclosure can be implemented in hardware, software, firmware, or a combination thereof.
- the lens shading sub-module 107 is implemented in software or firmware that is stored in a memory and that is executed by a suitable instruction execution system.
- the lens shading sub-module 107 comprises an ordered listing of executable instructions for implementing logical functions and can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
- a “computer-readable medium” can be any means that can contain, store, communicate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- the computer readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
- the computer-readable medium would include the following: an electrical connection (electronic) having one or more wires, a portable computer diskette (magnetic), a random access memory (RAM) (electronic), a read-only memory (ROM) (electronic), an erasable programmable read-only memory (EPROM or Flash memory) (electronic), an optical fiber (optical), and a portable compact disc read-only memory (CDROM) (optical).
- the lens shading sub-module 107 can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.
- each block represents a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the blocks may occur out of the order noted in FIGS. 4-6 .
- two blocks shown in succession in FIGS. 4-6 may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved, as will be further clarified hereinbelow.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Description
- This application claims priority to copending U.S. provisional application entitled, “Image Capture Device Systems and Methods,” having Ser. No. 61/509,747, filed Jul. 20, 2011, which is entirely incorporated herein by reference.
- Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
audio CODEC 112 may be present in some applications ofmobile device 100, such as mobile phone applications and video applications (e.g., where audio corresponding to the video images is recorded). Microphone 110 captures audio, including any sounds such as voice, etc. Microphone 110 may be any type of microphone.Microphone 110 generates an audio signal that is received byaudio codec 112. The audio signal may include a stream of digital data, or analog information that is converted to digital form by an analog-to-digital (ND) converter ofaudio codec 112.Audio codec 112 encodes (e.g., compresses) the received audio of the received audio signal.Audio codec 112 generates an encoded audio data stream that is received byCPU 114. -
CPU 114 receives image processor output signal fromimage processor 106 and receives the audio data stream fromaudio codec 112. In some embodiments,CPU 114 may include an additional image processor. In one embodiment, the additional image processor performs image processing (e.g., image filtering) functions forCPU 114. In an embodiment,CPU 114 includes a digital signal processor (DSP), which may be included in the additional image processor. When present, the DSP may apply special effects to the received audio data (e.g., an equalization function) and/or to the video data.CPU 114 may store and/or buffer video and/or audio data instorage 124.Storage 124 may include any suitable type of storage, including one or more hard disc drives, optical disc drives, FLASH memory devices, etc. In an embodiment,CPU 114 may stream the video and/or audio data toRF transceiver 116, to be transmitted frommobile device 100. - When present,
RF transceiver 116 is configured to enable wireless communications for mobile device 100. For example, RF transceiver 116 may enable telephone calls, such as telephone calls according to a cellular protocol. RF transceiver 116 may include a frequency up-converter (transmitter) and down-converter (receiver). For example, RF transceiver 116 may transmit RF signals to antenna 118 containing audio information corresponding to voice of a user of mobile device 100. RF transceiver 116 may receive RF signals from antenna 118 corresponding to audio information received from another device in communication with mobile device 100. RF transceiver 116 provides the received audio information to CPU 114. In another example, RF transceiver 116 may be configured to receive television signals for mobile device 100, to be displayed by display 120. In another example, RF transceiver 116 may transmit images captured by image sensor device 102, including still and/or video images, from mobile device 100. In another example, RF transceiver 116 may enable a wireless local area network (WLAN) link (including an IEEE 802.11 WLAN standard link), and/or other type of wireless communication link. -
CPU 114 provides audio data received byRF transceiver 116 toaudio codec 112.Audio codec 112 performs bit stream decoding of the received audio data (if needed) and converts the decoded data to an analog signal.Speaker 108 receives the analog signal, and outputs corresponding sound. -
Image processor 106,audio codec 112, andCPU 114 may be implemented in hardware, software, firmware, and/or any combination thereof. For example,CPU 114 may be implemented as a proprietary or commercially available processor that executes code to perform its functions.Audio codec 112 may be configured to process proprietary and/or industry standard audio protocols.Image processor 106 may be a proprietary or commercially available image signal processing chip, for example. -
Display 120 receives image data fromCPU 114, such as image data generated byimage processor 106. For example,display 120 may be used to display images captured byimage sensor device 102.Display 120 may include any type of display mechanism, including an LCD (liquid crystal display) panel or other display mechanism. In some embodiments, the display may show a preview of images currently being received by thesensor 102, whereby a user may select a control (e.g., shutter button) to begin saving captured image(s) tostorage 124. - Depending on the particular implementation,
image processor 106 formats the image data output in image processor output signal according to a proprietary or known video data format.Display 120 is configured to receive the formatted data, and to display a corresponding captured image. In one example,image processor 106 may output a plurality of data words, where each data word corresponds to an image pixel. A data word may include multiple data portions that correspond to the various color channels for an image pixel. Any number of bits may be used for each color channel, and the data word may have any length. - In some implementations,
display 120 has a display screen that is not capable of viewing the full resolution of the images captured byimage sensor device 102.Image sensor devices 102 may have various sizes, including numbers of pixels in the hundreds of thousand, or millions, such as 1 megapixel (Mpel), 2 Mpels, 4 Mpels, 8 Mpels, etc.).Display 120 may be capable of displaying relatively smaller image sizes. - To accommodate such differences between a size of
display 120 and a size of captured images,CPU 114 may down-size a captured image received fromimage processor 106 before providing the image to display 120, in some embodiments. Such image downsizing may be performed by a subsampling process. In computer graphics, subsampling is a process used to reduce an image size. Subsampling is a type of image scaling, and may alter the appearance of an image or reduce the quantity of information required to store an image. Two types of subsampling are replacement and interpolation. The replacement technique selects a single pixel from a group and uses it to represent the entire group. The interpolation technique uses a statistical sample of the group (such as a mean) to create a new representation of the entire group. - As stated above,
image processor 106 performs processing of digital image signal and generates an image processor output signal. Image processing may include lens shading correction processes performed by alens shading sub-module 107 of animage processor 106, in one embodiment. - Due to lens shading, along a given radius, the farther away from the center of the image sensor, the more attenuated the signal from a given pixel circuit becomes. Moreover, pixels with different spectral characteristics have different responses to the lens shading, which may cause appearance of color patches even if the scene is monochromatic. As such, correction is applied in order to reduce the spatial variation. By applying a gain to attenuated signals according to position, embodiments of the present disclosure perform positional gain adjustment. A function that maps pixel position into a desired correction amount is referred to herein as a lens shading gain adjustment surface. In one embodiment, the surface may consist of an interleaving of several smooth surfaces, each for every pixel type or color. Such a surface may be generated in a CMOS circuit, in one embodiment, and then used for correction of spatial non-uniformity sensitivity of the lens shading correction across pixel positions in the sensor array.
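- As a concrete picture of the positional gain adjustment just described, the following Python/NumPy sketch builds a toy radial gain adjustment surface and multiplies one channel plane by it. The function names and the quadratic radial model are illustrative assumptions, not the disclosed circuit implementation:

```python
import numpy as np

def radial_gain_surface(height, width, edge_gain=1.6):
    """Toy lens shading gain adjustment surface: gain 1.0 at the
    array center, rising smoothly toward the corners."""
    y, x = np.mgrid[0:height, 0:width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    r = np.hypot(y - cy, x - cx)
    r_norm = r / r.max()                    # 0 at center, 1 at far corner
    return 1.0 + (edge_gain - 1.0) * r_norm ** 2

def apply_positional_gain(channel, gain_surface, white_level=1023):
    """Multiply each pixel by its positional gain and clip to the
    sensor's white level (10-bit data assumed here)."""
    corrected = channel.astype(np.float32) * gain_surface
    return np.clip(corrected, 0, white_level)

# Example: correct one 8x12 patch of 10-bit pixel data.
raw = np.full((8, 12), 600, dtype=np.uint16)
gain = radial_gain_surface(*raw.shape)
flat = apply_positional_gain(raw, gain)
```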
- In conventional processes, shading correction factors for an optical photo system (e.g., lens, image sensor, and/or housing) of a
mobile device 100 are performed by imaging a scene of uniform intensity onto theimage sensor 102 employed by the device being calibrated. Data of the resulting circular, hyperbolic or other variation across the image sensor device (seeFIG. 2 ) are derived by prior measurement of image sensor photo detector signals and a compensating mathematical function or functions are calculated and stored under optimal lab conditions, where imaging a scene of uniform intensity is possible. - Accordingly, by capturing an image of a scene that is known to be a flat illumination field, an actual response may be measured from the image. In some embodiments, a response may be measured in each of the color planes—red, green, blue. The response will show that at its center, the response is strongest and weaker at its edges. Accordingly, pixels corresponding to the edges of the pixel sensor may be multiplied by a relative corrective factor so that the corrected response is flat, after correction. These correction factors may be used for captured image(s) acquired under similar illumination conditions.
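- A minimal sketch of that conventional flat-field calibration is shown below, assuming a single color plane and a small center patch as the reference level; the helper name and details are hypothetical:

```python
import numpy as np

def flat_field_gains(flat_capture, eps=1e-6):
    """Derive per-pixel correction factors from an image of a uniform
    (flat) scene: the gain at each position is the ratio of the
    response at the array center to the local response, so that
    gain * response is flat after correction."""
    plane = flat_capture.astype(np.float32)
    cy, cx = plane.shape[0] // 2, plane.shape[1] // 2
    center = plane[cy - 2:cy + 3, cx - 2:cx + 3].mean()   # small center patch
    return center / np.maximum(plane, eps)

# One gain map per color plane (R, G, B) under a given illuminant.
flat_green = np.random.default_rng(0).normal(800, 5, (120, 160)).astype(np.float32)
flat_green *= np.linspace(0.6, 1.0, 160)[None, :]         # simulate shading fall-off
gain_green = flat_field_gains(flat_green)
```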
- However, illumination conditions change as the environment of the mobile device changes. Further, different lens positions within the
optical system 101 produce different lens shading effects. Accordingly, different correction factors may need to be adjusted to compensate for different light sources within a scene being photographed and/or lens positioning or qualities (e.g., changes in zoom or focus, particular manufacturing accuracies, mounting of lens, filter consistencies, etc.). Therefore, a tuning process to premeasure all the different combinations of potential different positions and light sources will be complicated and not accurate, since for each combination, the actual response measured from a captured image is different. - In contrast, with embodiments of the present disclosure, shading correction factors for an optical photo system, (e.g., the lens, image sensor, and/or housing) of a digital camera or other imaging device, are performed by capturing multiple images of a scene in succession. By analyzing the differences between intensity values of the captured images and the shift in detected intensity values with respect to the pixels, the lens shading effect on the camera may be better understood and represented. Therefore, in one embodiment, capturing two images of a same scene with slight camera shift between image captures provides the reference from which the lens shading gain adjustment can be estimated, as represented in
FIG. 3 . In particular, a lens shading curve or surface may be determined that caused the differences between the captured images. In some embodiments, preview images captured for display on a viewfinder of amobile device 100 may be used to determine the lens shading effect, where upon capturing of an image (e.g., after selecting a shutter button or control), the captured image may be corrected to compensate for the current lens shading effect. For example, a series of low resolution images may be used to preview the image to the photographer before actually taking a high resolution image. Then, data of the resulting circular, hyperbolic or other variation across theimage sensor 102 are derived by dynamic measurement of image sensor photo detector signals and a compensating mathematical function or functions are calculated. - By implementing such a process, lens shading phenomenon may be estimated dynamically and on the fly. Accordingly, during manufacturing and assembly of a camera or mobile device equipped with a camera, resources used for corrective lens shading calibration may be eliminated or significantly reduced.
- While conditions may often exist that allow for capturing of images that can be used to estimate lens shading, in some situations, conditions may not be present to capture an image that allows for sufficient estimation of lens shading. As an example, an image may be captured where an object in a scene is moving (as opposed to the camera moving). Accordingly, a subsequent capturing of the scene is going to be quite dissimilar, since the scene is not static. Further, the
lens shading sub-module 107 and/or theimage processor 106 may detect that illumination types for the captured images are not the same. Accordingly, thelens shading sub-module 107 and/orimage processor 106 may attempt to detect that lighting conditions are stable during capturing of the images that are used to derive the lens shading surface. For example, in a subsequent image, maybe someone turned out the lights in a room where a scene is being captured. In such a situation, themobile device 100 may rely on prestored lens shading correction factors that are suited for a similar illumination type. Thelens shading sub-module 107 and/or theimage processor 106 may add lens shading correction factors to areference database 125 ofstorage 124 periodically as factors are dynamically generated that are determined to be suitable to be used in future uses, where conditions do not allow for suitable correction factors to be newly generated. Also, in some embodiments, lens shading correction factors may be preloaded or stored in thereference database 125 at a manufacturing facility so that the camera is equipped with preliminary lens shading correction factors that can be used, as needed. - In one embodiment, the correction factors may be initially generated responsive to capturing a scene in a flat field (e.g., a white wall with desired illumination) within a closed environment. Also, since the correction factors captured at the manufacturing facility is used as a secondary measure and is not intended to be used as a primary tool for estimating the lens shading, the scene does not necessarily need to be a perfectly flat field, in one embodiment. Therefore, the motion based calculation may be performed in the manufacturing stage with relatively flat surfaces which make measurements faster to obtain and less dependent on measurement conditions. In other words, a wider range of manufacturing conditions are available to be used with systems and processes of the present disclosure.
- As stated above, the
lens shading sub-module 107 and/or theimage processor 106 may detect conditions that do not allow for sufficient measuring of lens shading. Accordingly, in a case where good conditions are not present to measure lens shading, themobile device 100 takes advantage of a lens shading correction factors stored in thereference database 125. Alternatively, in a case where good conditions are present, thelens shading sub-module 107 and/or theimage processor 106 compares the differences between recently captured images and defines a lens shading surface (on the fly) by considering the differences between the captured images. Then, a lens surface gain adjustment surface may be chosen that matches the intensity variation or variations reflected by the lens shading surface across the captured images that are to be corrected. Themobile device 100 may also store the lens surface gain adjustment surface in thereference database 125 for later use, as needs arise. - Additionally, one embodiment of dynamic lens shading calculation utilizes gradients or differences between captured image areas to define the lens shading surface. From the lens shading surface, corrections may be prepared to compensate for the effects of lens shading. To determine the lens shading surface, inter image consideration and/or intra image consideration are evaluated. For inter image consideration, in one embodiment, two images are captured and a ratio is calculated between a pixel value in an object in one image and the pixel value of the same place on the same object in a second image (that was taken after camera motion). The calculated ratio represents a local gradient of the lens shading at the direction of the camera or mobile device movement.
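- The following sketch illustrates the inter image ratio described above, assuming the second capture has already been warped onto the first capture's geometry; the log-ratio form and the names are illustrative choices, not the patented formulation:

```python
import numpy as np

def inter_image_log_gradients(img_a, img_b_aligned, shift_px, eps=1e-6):
    """Ratio between the same scene point seen in two captures that
    differ only by a small camera shift.  With img_b already warped
    onto img_a's geometry, the per-pixel ratio depends only on how the
    shading surface changes over 'shift_px' pixels, i.e. it samples the
    local gradient of the shading in the shift direction."""
    a = np.maximum(img_a.astype(np.float64), eps)
    b = np.maximum(img_b_aligned.astype(np.float64), eps)
    ratio = a / b
    # log-ratio per pixel of shift ~ directional derivative of log(shading)
    return np.log(ratio) / float(shift_px)
```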
- For intra image consideration, areas with similar colors in the image and/or similar intensities may be used to estimate a portion of lens shading surface from differences in the areas of the same image. Accordingly, in one embodiment, via the
lens shading sub-module 107 and/or theimage processor 106, a second ratio is calculated, between a pixel value of an object in an image and the pixel value of another object that has similar luminosity in the image. The calculated second ratio represents the gradient of the lens shading between these two points. - For inter image consideration, in order to find matching pixel values in the two images, the images are geometrically matched with one another. In one embodiment, global motion is detected from the two images, where motion parameters may include translation and transformation (e.g., affine or perspective). Then, areas having local motion that is different from the global motion are determined. These areas may have had an object moving in the camera field or scene being photographed. In one embodiment, areas having local motion are not analyzed for gradients. In such a situation, a correction factor may be determined by extrapolation from other areas (not subject to the local motion) in single image data. Also, if intra image analysis is not available on the single image (e.g., the size of the image exceeds a threshold), a captured image may be compensated using stored lens shading correction factors in the
reference database 125 instead of determining correction factors dynamically, in one embodiment. - The foregoing processes may be iterative in some embodiments, where after an estimation of lens shading, processes may be repeated to determine a new estimation of lens shading. In some embodiments, the motion detection is performed on full resolution image(s). After the motion detection is estimated, then a first image is transformed to match the second image geometrically and matching pixel values are attempted to be found.
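- A sketch of the geometric matching step is given below. Phase correlation is used here purely as one common way to estimate a translational global motion (the disclosure also contemplates affine or perspective parameters), and the block-residual test for local motion is an assumed heuristic:

```python
import numpy as np

def estimate_global_shift(img_a, img_b):
    """Estimate a purely translational global motion via phase correlation."""
    fa, fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
    cross = fa * np.conj(fb)
    corr = np.fft.ifft2(cross / np.maximum(np.abs(cross), 1e-12))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    if dy > img_a.shape[0] // 2:
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return dy, dx

def local_motion_mask(img_a, img_b_aligned, block=16, thresh=12.0):
    """Flag blocks whose residual after global alignment is large; such
    blocks (moving objects) are skipped when gradients are taken."""
    h, w = img_a.shape
    mask = np.ones((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            sl = np.s_[by*block:(by+1)*block, bx*block:(bx+1)*block]
            if np.mean(np.abs(img_a[sl].astype(float) -
                              img_b_aligned[sl].astype(float))) > thresh:
                mask[by, bx] = False        # local motion: exclude
    return mask
```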
- Gradients and pixel ratios may be affected with noise and inaccuracies. For example, possible sources of noise include pixel noise (i.e., electronic and photonic noise in the process of converting luminance to digital luminance count), errors in estimation of motion, changing light condition between the two images, an object has different reflection into the camera in different positions, an incorrect assumption on similar luminance, where in reality two objects being compared have different luminance, etc. To avoid or reduce such inaccuracies, measures may be taken to calculate the gradients in ‘flat’ areas where there are no rapid changes in luminance (e.g., edges). For example, areas near edges in the captured images may be masked out.
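- One simple way to restrict measurements to 'flat' areas is a gradient-magnitude mask, sketched below with an assumed threshold; the exact masking rule is not specified by the disclosure:

```python
import numpy as np

def flat_area_mask(img, grad_thresh=8.0):
    """Keep only 'flat' pixels (no rapid luminance changes) so that the
    shading ratios are not dominated by scene edges; pixels near edges
    are masked out."""
    img = img.astype(np.float32)
    gy, gx = np.gradient(img)
    edge = np.hypot(gy, gx) > grad_thresh
    # Grow the edge region by one pixel so near-edge samples drop too.
    grown = edge.copy()
    grown[1:, :] |= edge[:-1, :]
    grown[:-1, :] |= edge[1:, :]
    grown[:, 1:] |= edge[:, :-1]
    grown[:, :-1] |= edge[:, 1:]
    return ~grown
```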
- Then, the lens shading surface may be calculated from the local gradients, in some embodiments. In one embodiment, a model of a lens shading surface may be computed or estimated that matches the measured gradients in the captured images. Accordingly, parameters of the model may be adjusted until an optimal result is determined from all the tested results or trials.
- For example, one possible technique determines an optimized analytical parametric surface by selecting a surface model equation (e.g., polynomial, Gaussian, etc.) and calculating the parameters of the lens shading surface model that yield minimal difference between the surface gradient (according to the model) and the measured gradients. Another possible technique, among others, determines an optimized singular value decomposition (SVD) surface composition by selecting the largest surface eigenvectors and calculating the coefficients for the surface composition that yield minimal difference between the surface gradient (according to the model) and the measured gradients.
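- As an illustration of fitting an analytical parametric surface to the measured gradients, the sketch below assumes a radial polynomial model S = 1 + a1*r^2 + a2*r^4, whose gradients are linear in the coefficients, so the best match can be found with ordinary least squares; the model choice and helper names are assumptions:

```python
import numpy as np

def fit_radial_shading(points, meas_gx, meas_gy, shape):
    """Fit S(x, y) = 1 + a1*r^2 + a2*r^4 (r = normalized distance from
    the array center) so that the model's spatial gradients best match
    the measured gradients at the sampled points."""
    h, w = shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    scale = np.hypot(cy, cx)
    ys, xs = points[:, 0], points[:, 1]
    u, v = (xs - cx) / scale, (ys - cy) / scale
    r2 = u**2 + v**2
    # d(r^2)/dx = 2u/scale, d(r^4)/dx = 4*r2*u/scale (similarly for y)
    A = np.concatenate([
        np.stack([2*u/scale, 4*r2*u/scale], axis=1),
        np.stack([2*v/scale, 4*r2*v/scale], axis=1)])
    b = np.concatenate([meas_gx, meas_gy])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs            # (a1, a2); smallest residual wins the trial

# points: N x (row, col) sample locations; meas_gx / meas_gy: measured gradients there.
```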
- To illustrate, a Gaussian model may be used to model the lens shading being experienced by the
mobile device 100 and values of parameters for the model may be adjusted until an optimal match is found between the model and the measured values. Instead of a Gaussian model, other models may be considered, such as a polynomial surface model, in some embodiments. Alternatively, an SVD process may be used. - To match the pixel values, color layers may be estimated using a variety of techniques, including direct layer estimation, independent color layer estimation, and normalized color domain estimation. For instance, measurements may be made in normalized color domain(s) where possible, since, typically, luminosity changes more rapidly than normalized color in images. Additional measurements include calculating small number of surface model parameters with large number of measurements; limiting parameter space to a predefined space according to measurements of sample units in different light (spectra and intensity) conditions; averaging measurements before calculating gradients (e.g., by down sampling the image); calculating global gradients rather than using only local gradients; and segmenting the image to a small number of color segments, and estimating global gradients on each one. Also, in some embodiments, possible effects of light flickering during capturing of the images may be addressed and removed from the images.
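- Two of the measurement-hygiene steps listed above, working in a normalized color domain and averaging before gradients are taken, might look like the following; both helpers are illustrative assumptions:

```python
import numpy as np

def normalized_color(rgb, eps=1e-6):
    """Work in a normalized color domain (e.g. r = R/(R+G+B)), which
    typically varies more slowly across a natural scene than raw
    luminosity and so gives steadier shading measurements."""
    s = np.maximum(rgb.sum(axis=-1, keepdims=True), eps)
    return rgb / s

def downsample_mean(img, factor=8):
    """Average measurements into coarse tiles before taking gradients,
    which suppresses pixel noise in the ratio estimates."""
    h, w = (img.shape[0] // factor) * factor, (img.shape[1] // factor) * factor
    tiles = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return tiles.mean(axis=(1, 3))
```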
- Accordingly, in one embodiment, one technique of matching pixels involves direct layer estimation, where local gradients are calculated. In particular, in the inter image consideration, differences between the images represents the local gradients. In the intra image consideration, color segments are derived and differences between like color segments are representative of local gradients. An optimized lens shading surface is modeled which matches with local gradients at measured points. Accordingly, a model surface may be computed that fits the local gradients of each of the color segments. From inter and/or intra image considerations, information may be obtained on the gradients at each corresponding sensor point of the image, where the gradients are representative of the lens shading phenomenon. By taking gradients from inter image and/or intra image calculations and optimizing according to the two respective sets, a lens shading surface can be estimated and applied to a captured image.
- In some embodiments, lens shading correction for color image sensors may be defined for each of a plurality of color channels in order to correct for lens shading variations across color channels. Further, different techniques or models may be used by the
lens shading sub-module 107 and/or the image processor 106 for the different color channels. As an example, the green channel may use a best fit of color plane parameters for an SVD model while the red/blue channels may utilize direct layer optimization. In general, once a lens shading surface has been determined, lens shading can be corrected using standard correction methods. - Further, with estimation of the lens shading surface or curve, other image quality processes may benefit. For example, by knowing the lens shading surface, accurate white balancing may be computed. As discussed above, different light sources create different lens shading. Therefore, by determining the lens shading correctly, an unbiased measurement for the white balance can be provided by the
mobile device 100. - To illustrate, a white balance may be selected that is appropriate to generate the estimated lens shading curve, where different illuminants have different optical wavelength responses and hence may result in different lens shading surfaces. The
image processor 106 or an auto-white-balance (AWB) sub-module of theimage processor 106 may then determine the type of illuminant used to generate the lens shading curve that has been estimated and subsequently use this information to correct white balance levels in captured image(s). - In addition to performing accurate white balancing, more robust motion estimation may also be implemented responsive to the lens shading estimation by the
lens shading sub-module 107 and/or theimage processor 106. For example, from analysis performed in determining the lens shading phenomenon, global motion can be estimated by calculating a mean difference between image areas in the two images captured in a sequence, where the difference corresponds to the same object moving across one image to a different place in the second image. Since the second image has different lens shading characteristics as compared to the first, it also has a different mean brightness as compared to the first image. Accordingly, instead of examining correlations between the images in order to determine a motion vector that can be used to estimate camera motion, statistics used to determine the lens shading can also be used to estimate the camera motion. Therefore, differences in the statistics between the images may be used to calculate the camera or global motion. - As an example, the first image may feature a white ball at a left corner of the frame. The second image may feature the ball at a position to the right of the left corner, where the ball has a brighter intensity than in the first frame. The lens shading for the
mobile device 100 has been determined, where the lens shading is found to traverse along one side of theimage sensor 102 to the other side. Accordingly, at a pixel sensor corresponding to the left corner of the image, the average intensity value is going to be lower than an average intensity value at a pixel sensor to the right. Therefore, based on the lens shading statistics, it is expected that the intensity values of pixels corresponding to the ball will change based on the lens shading as the ball moves to the right in subsequent images. Therefore, by considering the global and local statistics compiled on the captured images, an object having a different intensity value than a prior value in a prior frame may be determined to be the same object in motion due to the lens shading phenomenon (that has been previously computed). As a result, motion can be analyzed and determined. -
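- A sketch of how the previously computed shading surface can remove the brightness bias when deciding whether two patches show the same moving object is given below; the helper name and the plain mean-absolute-difference criterion are assumptions:

```python
import numpy as np

def shading_compensated_difference(patch_a, patch_b, shading_a, shading_b):
    """When the same object appears at two positions, its mean intensity
    differs simply because the shading surface differs there.  Dividing
    each patch by the local shading value removes that bias, so a small
    residual difference suggests the same object in motion rather than
    a genuinely different object."""
    norm_a = patch_a.astype(np.float32) / shading_a
    norm_b = patch_b.astype(np.float32) / shading_b
    return float(np.mean(np.abs(norm_a - norm_b)))
```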
FIG. 4 illustrates a flow chart depicting a process of estimating lens shading in accordance with the disclosed embodiments. Lens shading estimation, in accordance with FIG. 4, is performed by a pixel processing pipeline of image processor 106 (FIG. 1) (e.g., lens shading module 107) dynamically and, if necessary, using stored reference surface(s) acquired during a calibration operation. The image processor 106 has access to the stored gain adjustment surface(s) and scene adjustment surface(s) in, for example, reference database 125 (FIG. 1) or other memory storage. - When an image is captured by a digital camera, the image generally is not captured in a known illumination type, and a reference is not available for the current illumination type. The captured image is a natural image, where the
lens shading sub-module 107 of animage processor 106 does not have any preset knowledge of the illumination type and may not therefore have a reference correction surface prestored according to the current illumination type. While in conventional processes, lens shading correction factors may be solely derived from capturing a scene of a flat field to create an image that contains even color and intensity values except for effects from lens shading, natural images taken by the camera normally have no such flat areas in the image. Accordingly, embodiments of the present disclosure analyze the differences in light transition from natural images captured by themobile device 100. - As such, embodiments of the present disclosure take advantage of capturing multiple images in succession and determining a lens shading correction or gain adjustment surface for the present illumination conditions. In particular, since the images are captured by the
same image sensor 102 of themobile device 100, the images are captured using the same optics. Accordingly, intensity values of pixels for the multiple images should ideally be the same, and illumination levels for the captured images should also be the same, since the images are captured within parts of a second from one another, in some embodiments. In practice, themobile device 100 may move or shift during the capturing of one image to the next. Also, due to lens shading, the intensity values of the pixels may not be exactly the same. - Accordingly, by analyzing the differences between intensity values of the captured images and the shift in detected intensity values with respect to the pixels of the captured images, the lens shading effect on the
mobile device 100 may be better understood and represented. Therefore, in one embodiment, capturing two images of a same scene with slight camera shift between image captures provides the reference from which the corrective lens shading surface can be estimated. In particular, a lens shading curve or surface may be determined that caused the differences between the captured images. In some embodiments, preview images captured for display on a viewfinder of a camera may be used to determine the lens shading effect, where upon capturing of an image (e.g., after selecting a shutter button or control), the captured image may be corrected to compensate for the current lens shading effect. - Lens shading estimation begins with capturing a sequence of images at
step 402. Atstep 404, local gradients of the captured images are determined. As noted above, the local gradients may be determined in a number of different ways. In some embodiments, techniques estimate the local gradients from inter image consideration and/or intra image consideration. - For example, multiple images may be captured and inter image analysis may be performed on the captured images. In addition, intra image analysis may be performed on each captured image. The intra image analysis may be performed in concert with the inter image analysis or apart from the inter image analysis, in some embodiments, based on recognition of a particular condition. For instance, a sequence of images may have been subjected to a level of local motion in the scene being photographed that does not allow for adequate statistics to be obtained. Alternatively, adequate statistics for global motion may not be able to be obtained which prohibits one or both approaches from being used or causes prestored statistics or factors in the
reference database 125 to be used instead. As an example, a single image may not contain multiple areas with similar colors or intensity. - Referring back to
FIG. 4 , atstep 406, a model of a lens shading surface is compared to the measured gradients from the captured images and the deviation between the two is saved for later comparison. The process proceeds to step 408 where the model is adjusted and compared again with the measured gradients and new deviation(s) are computed and compared with the saved values. The model having the set of smallest deviation values is maintained as the optimum model for the trials previously computed. The process then repeats until an optimum model is determined. - At
step 410, a lens shading gain adjustment surface is calculated from the lens shading surface. For embodiments that derived a lens shading surface for each color channel of the image sensor, the lens shading gain adjustment surface may also be determined for each color channel. In other words, lens shading correction for color image sensors may be defined for each of a plurality of color channels in order to correct for lens shading variations across color channels. For these color image sensors, the lens shading gain adjustment surface is applied to the pixels of the corresponding color channel during post-image capture processing to correct for variations in pixel value due to the spatial location of the pixels in the pixel array. In some embodiments, monochrome image sensors, on the other hand, apply a single gain adjustment surface to all pixels of a pixel array. Likewise, color image sensors may use a single lens shading gain adjustment surface across all color channels, in some embodiments. - To illustrate, a pixel value located at x, y pixel coordinates may be multiplied by the lens surface gain adjustment values at the x, y pixel coordinates on the lens surface gain adjustment surface. Accordingly, at step 412, lens shading correction is performed on the pixel values of the captured image using the lens surface gain adjustment surface(s).
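- For a color sensor, applying one gain surface per channel might look like the sketch below, which assumes an RGGB Bayer mosaic and one full-resolution gain surface per channel; the layout and names are illustrative, not the disclosed pipeline:

```python
import numpy as np

def correct_bayer(raw, gains, white_level=1023):
    """Apply one gain adjustment surface per color channel of an RGGB
    Bayer mosaic: each pixel is multiplied by the gain of its own
    channel at its (x, y) position, then clipped to the white level."""
    out = raw.astype(np.float32).copy()
    channel_slices = {                       # RGGB pattern offsets
        'R':  (slice(0, None, 2), slice(0, None, 2)),
        'Gr': (slice(0, None, 2), slice(1, None, 2)),
        'Gb': (slice(1, None, 2), slice(0, None, 2)),
        'B':  (slice(1, None, 2), slice(1, None, 2)),
    }
    for name, (ys, xs) in channel_slices.items():
        out[ys, xs] *= gains[name][ys, xs]   # gains: dict of full-size surfaces
    return np.clip(out, 0, white_level)
```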
- In some embodiments discussed above, a
lens shading module 107 is provided to estimate the effects of lens shading and to possibly correct the gain of individual pixels in captured images. Thelens shading module 107 may, for example, be implemented as software or firmware. - The
lens shading module 107 may be implemented inimage processor 106 as software designed to implement lens shading correction, in one embodiment. Alternatively,lens shading module 107 may be implemented inimage sensor 102, in one embodiment. - In some embodiments, the
lens shading module 107 utilizes lens shading correction surfaces to determine gain correction for individual pixels to account for lens shading. An individual correction or gain adjustment surface may, for example, comprise parameters to calculate gain correction although it will also be understood that in some cases a correction table may be stored. Positional gain adjustments across the pixel array can be provided as digital gain values, one corresponding to each of the pixels. It may happen that the further away a pixel is from the center of the pixel array, the more gain is needed to be applied to the pixel value. The set of digital gain values for the entire pixel array forms a lens shading gain adjustment surface. - In some embodiments, only a relatively few gain values are preferably stored, in order to minimize the amount of memory required to store correction data, and a determination of values between the stored values is obtained, during the image modification process, by a form of interpolation. In order to avoid noticeable discontinuities in the image intensity, these few data values are preferably fit to a smooth curve or curves that are chosen to match the intensity variation or variations across the image that are to be corrected.
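- The interpolation of a few stored gain values into a full, smooth per-pixel surface can be sketched as a bilinear expansion; the grid size and function name below are assumptions:

```python
import numpy as np

def expand_gain_grid(coarse, out_shape):
    """Bilinearly interpolate a small stored grid of gain values up to
    the full pixel array, so only a few values need to be kept in
    memory while the per-pixel gain stays smooth."""
    gh, gw = coarse.shape
    h, w = out_shape
    ys = np.linspace(0, gh - 1, h)
    xs = np.linspace(0, gw - 1, w)
    y0 = np.clip(np.floor(ys).astype(int), 0, gh - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, gw - 2)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    c00 = coarse[np.ix_(y0, x0)];     c01 = coarse[np.ix_(y0, x0 + 1)]
    c10 = coarse[np.ix_(y0 + 1, x0)]; c11 = coarse[np.ix_(y0 + 1, x0 + 1)]
    top = c00 * (1 - wx) + c01 * wx
    bot = c10 * (1 - wx) + c11 * wx
    return top * (1 - wy) + bot * wy

# e.g. a 9x13 stored grid expanded into a 1080x1920 gain surface
full_gain = expand_gain_grid(np.ones((9, 13)), (1080, 1920))
```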
- Also, in some embodiments, the digital gain values are computed from an expression that approximates the desired lens shading gain adjustment surface, since the number of parameters needed to generate an approximate surface is generally significantly lower than the numbers of parameters needed to store the digital gain values for every pixel location. Some
image sensors 102 have built-in lens shading operation on-chip, while other image sensors rely on a separate image processing imaging chip for this operation. -
FIG. 5 is a flowchart representation of a method in accordance with one embodiment of the present disclosure. In particular, a method is presented for use in conjunction with one or more of the functions and features described in conjunction withFIGS. 1-3 . Instep 502, a lens shading surface is continually calculated for preview images being displayed by amobile device 100. When the calculated lens shading surface is determined to be satisfactory (e.g., no local motion detected, illumination of scene deemed to be stable, etc.), the lens shading surface is stored in areference database 125, instep 504. Accordingly, upon selection to capture an image, a newly calculated lens shading surface is used to compensate for lens shading effects in the captured image, instep 506 if the newly calculated lens shading surface was determined to be satisfactory. Otherwise, a lens shading surface prestored in thereference database 125 is used to compensate for lens shading effects in the captured image, instep 508. - Next,
FIG. 6 is a flowchart representation of a method in accordance with one embodiment of the present disclosure. In particular, a method is presented for use in conjunction with one or more of the functions and features described in conjunction withFIGS. 1-3 . Instep 602, two or more images of the same scene, containing objects, are captured, where the images have some displacement between themselves. Instep 604, the relative displacement between the images is analyzed based on tracking of image areas with details or discernable objects. Instep 606, for each point and for each color plane in the image, a ratio between an intensity level at the first image and the level of the same object point at the second image is calculated. If there were no lens shading, the values would be the same. Instep 608, the differences to the values are normalized, and instep 610, from the calculated ratios, a difference surface profile is created by possibly filtering the results and interpolating data points or values, as needed, to produce a smooth lens shading surface profile. As a point of reference,FIG. 7 is a representative lens shading surface profile that may be created depending on the scene being photographed and the particular illumination characteristics. Instep 612, after the lens shading surface is extracted or generated from a current scene, the lens shading surface is used to indicate the light source illuminating the scene and supply an unbiased measurement for the white balance. - While conventional lens correction processes are characterized by poor performances; inaccurate estimation of spectra from white balance (e.g., different spectra can have same white balance but different lens shading); inaccurate measurement extrapolation during manufacturing; costly tuning or calibration process, limited in applicability to fixed focus lenses (e.g., fixed optical patterns), etc., dynamic lens shading estimation and correction methods disclosed herein improve upon the foregoing drawbacks. As flex focusing or zoom controls gain popularity with digital cameras and become more complicated, dynamic estimation of lens shading based on current image captures and not preset measurements will provide improved accuracy over current conventional processes. Contemplated advantages include improved image quality with low quality lenses in cellular phones and other camera applications; shorter time to market; shorter calibration process of the camera in the product development stage; and reduction of manufacturing cost to the camera vendor due to shorter or non-calibration process per sensor.
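- A rough sketch of steps 606-610 is given below: sparse per-point ratios are normalized and then relaxed into a smooth surface profile by iterated neighbourhood averaging while re-imposing the measured samples. The smoothing scheme and the normalization to the center value are assumptions; the disclosure only calls for filtering and interpolating the results:

```python
import numpy as np

def smooth_ratio_surface(ratio_samples, valid_mask, iterations=50):
    """Turn sparse, noisy per-point ratios into a smooth surface
    profile: normalize so the center equals 1, then repeatedly replace
    every cell by the average of itself and its neighbours, keeping the
    measured values fixed where they exist."""
    surf = np.where(valid_mask, ratio_samples, 1.0).astype(np.float64)
    cy, cx = surf.shape[0] // 2, surf.shape[1] // 2
    surf /= surf[cy, cx]                      # normalize to the center value
    meas = surf.copy()
    for _ in range(iterations):
        padded = np.pad(surf, 1, mode='edge')
        surf = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                padded[1:-1, :-2] + padded[1:-1, 2:] +
                padded[1:-1, 1:-1]) / 5.0
        surf[valid_mask] = meas[valid_mask]   # re-impose measured points
    return surf
```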
-
Mobile device 100 may comprise a variety of platforms in various embodiments. To illustrate, a smart phoneelectronic device 100 a is represented inFIG. 8 , where thesmart phone 100 a includes anoptical system 101, at least one imaging device orsensor 102, at least oneimage processor 106 withlens shading sub-module 107, apower source 122, among other components (e.g.,display 120,processor 114, etc.). Further, a tabletelectronic device 100 b is represented inFIG. 9 , where thetablet 100 b includes anoptical system 101, at least one imaging device orsensor 102, at least oneimage processor 106 withlens shading sub-module 107, apower source 122, among other components (e.g.,display 120,processor 114, etc.). Then, alaptop computer 100 c is represented inFIG. 10 , where thelaptop computer 100 c includes anoptical system 101, at least one imaging device orsensor 102, at least oneimage processor 106 withlens shading sub-module 107, apower source 122, among other components (e.g.,display 120,processor 114, etc.). Also, a digital cameraelectronic device 100 d is represented inFIG. 11 , where thedigital camera 100 d includes anoptical system 101, at least one imaging device orsensor 102, at least oneimage processor 106 withlens shading sub-module 107, apower source 122, among other components (e.g.,display 120,processor 114, etc.). Therefore, a variety of platforms of electronic mobile devices may be integrated with theimage processor 106 and/orlens shading sub-module 107 of the various embodiments. - Embodiments of the present disclosure can be implemented in hardware, software, firmware, or a combination thereof. In some embodiments, the
lens shading sub-module 107 is implemented in software or firmware that is stored in a memory and that is executed by a suitable instruction execution system. In some embodiments, thelens shading sub-module 107 comprises an ordered listing of executable instructions for implementing logical functions and can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “computer-readable medium” can be any means that can contain, store, communicate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a nonexhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic) having one or more wires, a portable computer diskette (magnetic), a random access memory (RAM) (electronic), a read-only memory (ROM) (electronic), an erasable programmable read-only memory (EPROM or Flash memory) (electronic), an optical fiber (optical), and a portable compact disc read-only memory (CDROM) (optical). - If implemented in hardware, as in an alternative embodiment, the
lens shading sub-module 107 can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc. - The flow chart of
FIGS. 4-6 shows the architecture, functionality, and operation of a possible implementation of theimage processor 106 and relevant sub-modules. In this regard, each block represents a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the blocks may occur out of the order noted inFIGS. 4-6 . For example, two blocks shown in succession inFIGS. 4-6 may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved, as will be further clarified hereinbelow. - It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiments without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/330,047 US20130021484A1 (en) | 2011-07-20 | 2011-12-19 | Dynamic computation of lens shading |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201161509747P | 2011-07-20 | 2011-07-20 | |
| US13/330,047 US20130021484A1 (en) | 2011-07-20 | 2011-12-19 | Dynamic computation of lens shading |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20130021484A1 true US20130021484A1 (en) | 2013-01-24 |
Family
ID=47555520
Family Applications (9)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/232,052 Abandoned US20130021512A1 (en) | 2011-07-20 | 2011-09-14 | Framing of Images in an Image Capture Device |
| US13/232,045 Abandoned US20130021488A1 (en) | 2011-07-20 | 2011-09-14 | Adjusting Image Capture Device Settings |
| US13/235,975 Abandoned US20130021504A1 (en) | 2011-07-20 | 2011-09-19 | Multiple image processing |
| US13/245,941 Abandoned US20130021489A1 (en) | 2011-07-20 | 2011-09-27 | Regional Image Processing in an Image Capture Device |
| US13/281,521 Abandoned US20130021490A1 (en) | 2011-07-20 | 2011-10-26 | Facial Image Processing in an Image Capture Device |
| US13/313,352 Active 2032-01-11 US9092861B2 (en) | 2011-07-20 | 2011-12-07 | Using motion information to assist in image processing |
| US13/313,345 Abandoned US20130022116A1 (en) | 2011-07-20 | 2011-12-07 | Camera tap transcoder architecture with feed forward encode data |
| US13/330,047 Abandoned US20130021484A1 (en) | 2011-07-20 | 2011-12-19 | Dynamic computation of lens shading |
| US13/413,863 Abandoned US20130021491A1 (en) | 2011-07-20 | 2012-03-07 | Camera Device Systems and Methods |
Family Applications Before (7)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/232,052 Abandoned US20130021512A1 (en) | 2011-07-20 | 2011-09-14 | Framing of Images in an Image Capture Device |
| US13/232,045 Abandoned US20130021488A1 (en) | 2011-07-20 | 2011-09-14 | Adjusting Image Capture Device Settings |
| US13/235,975 Abandoned US20130021504A1 (en) | 2011-07-20 | 2011-09-19 | Multiple image processing |
| US13/245,941 Abandoned US20130021489A1 (en) | 2011-07-20 | 2011-09-27 | Regional Image Processing in an Image Capture Device |
| US13/281,521 Abandoned US20130021490A1 (en) | 2011-07-20 | 2011-10-26 | Facial Image Processing in an Image Capture Device |
| US13/313,352 Active 2032-01-11 US9092861B2 (en) | 2011-07-20 | 2011-12-07 | Using motion information to assist in image processing |
| US13/313,345 Abandoned US20130022116A1 (en) | 2011-07-20 | 2011-12-07 | Camera tap transcoder architecture with feed forward encode data |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/413,863 Abandoned US20130021491A1 (en) | 2011-07-20 | 2012-03-07 | Camera Device Systems and Methods |
Country Status (1)
| Country | Link |
|---|---|
| US (9) | US20130021512A1 (en) |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120249799A1 (en) * | 2011-03-30 | 2012-10-04 | Yukiko Shibata | Image capture device, pixel output level compensation method for same, infrared camera system, and interchangeable lens system |
| US9066015B2 (en) | 2011-04-28 | 2015-06-23 | Nippon Avionics Co., Ltd. | Image capture device, method for generating image, infrared camera system, and interchangeable lens system |
| US9270959B2 (en) | 2013-08-07 | 2016-02-23 | Qualcomm Incorporated | Dynamic color shading correction |
| US9973672B2 (en) | 2013-12-06 | 2018-05-15 | Huawei Device (Dongguan) Co., Ltd. | Photographing for dual-lens device using photographing environment determined using depth estimation |
| JPWO2021256504A1 (en) * | 2020-06-17 | 2021-12-23 | ||
| US20220051369A1 (en) * | 2020-08-12 | 2022-02-17 | Realtek Semiconductor Corp. | Method and system for compensating image having fixed pattern noise |
| US20220053144A1 (en) * | 2020-08-14 | 2022-02-17 | Raytheon Company | Parallelization technique for gain map generation using overlapping sub-images |
| CN114079735A (en) * | 2020-08-19 | 2022-02-22 | 瑞昱半导体股份有限公司 | Image compensation system for fixed image noise |
| US20230188860A1 (en) * | 2021-12-09 | 2023-06-15 | Fotonation Limited | Vehicle occupant monitoring system including an image acquisition device with a rolling shutter image sensor |
Families Citing this family (79)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10116839B2 (en) * | 2014-08-14 | 2018-10-30 | Atheer Labs, Inc. | Methods for camera movement compensation for gesture detection and object recognition |
| DE69735727T2 (en) | 1997-01-27 | 2007-01-04 | Peter D. Louisville Haaland | PROCESS FOR REDUCING THE REFLECTION OF OPTICAL SUBSTRATES |
| KR101796481B1 (en) * | 2011-11-28 | 2017-12-04 | 삼성전자주식회사 | Method of eliminating shutter-lags with low power consumption, camera module, and mobile device having the same |
| US9118876B2 (en) * | 2012-03-30 | 2015-08-25 | Verizon Patent And Licensing Inc. | Automatic skin tone calibration for camera images |
| US9472005B1 (en) * | 2012-04-18 | 2016-10-18 | Amazon Technologies, Inc. | Projection and camera system for augmented reality environment |
| US9619036B2 (en) | 2012-05-11 | 2017-04-11 | Comcast Cable Communications, Llc | System and methods for controlling a user experience |
| US9438805B2 (en) * | 2012-06-08 | 2016-09-06 | Sony Corporation | Terminal device and image capturing method |
| US8957973B2 (en) * | 2012-06-11 | 2015-02-17 | Omnivision Technologies, Inc. | Shutter release using secondary camera |
| US20130335587A1 (en) * | 2012-06-14 | 2013-12-19 | Sony Mobile Communications, Inc. | Terminal device and image capturing method |
| TWI498771B (en) * | 2012-07-06 | 2015-09-01 | Pixart Imaging Inc | Glasses that can recognize gestures |
| KR101917650B1 (en) * | 2012-08-03 | 2019-01-29 | 삼성전자 주식회사 | Method and apparatus for processing a image in camera device |
| US9554042B2 (en) * | 2012-09-24 | 2017-01-24 | Google Technology Holdings LLC | Preventing motion artifacts by intelligently disabling video stabilization |
| US9286509B1 (en) * | 2012-10-19 | 2016-03-15 | Google Inc. | Image optimization during facial recognition |
| JP2014086849A (en) * | 2012-10-23 | 2014-05-12 | Sony Corp | Content acquisition device and program |
| US9060127B2 (en) * | 2013-01-23 | 2015-06-16 | Orcam Technologies Ltd. | Apparatus for adjusting image capture settings |
| JP2014176034A (en) * | 2013-03-12 | 2014-09-22 | Ricoh Co Ltd | Video transmission device |
| US9552630B2 (en) * | 2013-04-09 | 2017-01-24 | Honeywell International Inc. | Motion deblurring |
| US9595083B1 (en) * | 2013-04-16 | 2017-03-14 | Lockheed Martin Corporation | Method and apparatus for image producing with predictions of future positions |
| US9916367B2 (en) | 2013-05-03 | 2018-03-13 | Splunk Inc. | Processing system search requests from multiple data stores with overlapping data |
| US8738629B1 (en) * | 2013-05-03 | 2014-05-27 | Splunk Inc. | External Result Provided process for retrieving data stored using a different configuration or protocol |
| WO2014190468A1 (en) | 2013-05-27 | 2014-12-04 | Microsoft Corporation | Video encoder for images |
| US10796617B2 (en) * | 2013-06-12 | 2020-10-06 | Infineon Technologies Ag | Device, method and system for processing an image data stream |
| US9529513B2 (en) * | 2013-08-05 | 2016-12-27 | Microsoft Technology Licensing, Llc | Two-hand interaction with natural user interface |
| DE112014004664T5 (en) * | 2013-10-09 | 2016-08-18 | Magna Closures Inc. | DISPLAY CONTROL FOR VEHICLE WINDOW |
| US10931866B2 (en) | 2014-01-05 | 2021-02-23 | Light Labs Inc. | Methods and apparatus for receiving and storing in a camera a user controllable setting that is used to control composite image generation performed after image capture |
| US9245347B2 (en) * | 2014-01-30 | 2016-01-26 | Adobe Systems Incorporated | Image Cropping suggestion |
| US9251594B2 (en) * | 2014-01-30 | 2016-02-02 | Adobe Systems Incorporated | Cropping boundary simplicity |
| US10121060B2 (en) * | 2014-02-13 | 2018-11-06 | Oath Inc. | Automatic group formation and group detection through media recognition |
| KR102128468B1 (en) * | 2014-02-19 | 2020-06-30 | 삼성전자주식회사 | Image Processing Device and Method including a plurality of image signal processors |
| CN103841328B (en) * | 2014-02-27 | 2015-03-11 | 深圳市中兴移动通信有限公司 | Low-speed shutter shooting method and device |
| US10136140B2 (en) | 2014-03-17 | 2018-11-20 | Microsoft Technology Licensing, Llc | Encoder-side decisions for screen content encoding |
| US20150297986A1 (en) * | 2014-04-18 | 2015-10-22 | Aquifi, Inc. | Systems and methods for interactive video games with motion dependent gesture inputs |
| US10104316B2 (en) * | 2014-05-08 | 2018-10-16 | Sony Corporation | Information processing device and information processing method |
| US10051196B2 (en) * | 2014-05-20 | 2018-08-14 | Lenovo (Singapore) Pte. Ltd. | Projecting light at angle corresponding to the field of view of a camera |
| US10460544B2 (en) * | 2014-07-03 | 2019-10-29 | Brady Worldwide, Inc. | Lockout/tagout device with non-volatile memory and related system |
| WO2016019450A1 (en) * | 2014-08-06 | 2016-02-11 | Warrian Kevin J | Orientation system for image recording devices |
| KR102225947B1 (en) * | 2014-10-24 | 2021-03-10 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
| CN105549302B (en) | 2014-10-31 | 2018-05-08 | 国际商业机器公司 | The coverage suggestion device of photography and vedio recording equipment |
| US10334158B2 (en) * | 2014-11-03 | 2019-06-25 | Robert John Gove | Autonomous media capturing |
| US20160148648A1 (en) * | 2014-11-20 | 2016-05-26 | Facebook, Inc. | Systems and methods for improving stabilization in time-lapse media content |
| US10924743B2 (en) | 2015-02-06 | 2021-02-16 | Microsoft Technology Licensing, Llc | Skipping evaluation stages during media encoding |
| US11721414B2 (en) | 2015-03-12 | 2023-08-08 | Walmart Apollo, Llc | Importing structured prescription records from a prescription label on a medication package |
| US12169944B2 (en) | 2015-03-21 | 2024-12-17 | Mine One Gmbh | Image reconstruction for virtual 3D |
| US12322071B2 (en) | 2015-03-21 | 2025-06-03 | Mine One Gmbh | Temporal de-noising |
| US10853625B2 (en) | 2015-03-21 | 2020-12-01 | Mine One Gmbh | Facial signature methods, systems and software |
| WO2016154123A2 (en) | 2015-03-21 | 2016-09-29 | Mine One Gmbh | Virtual 3d methods, systems and software |
| US20160316220A1 (en) * | 2015-04-21 | 2016-10-27 | Microsoft Technology Licensing, Llc | Video encoder management strategies |
| EP3295372A4 (en) * | 2015-05-12 | 2019-06-12 | Mine One GmbH | Facial signature methods, systems and software |
| US10165186B1 (en) * | 2015-06-19 | 2018-12-25 | Amazon Technologies, Inc. | Motion estimation based video stabilization for panoramic video from multi-camera capture device |
| US10447926B1 (en) | 2015-06-19 | 2019-10-15 | Amazon Technologies, Inc. | Motion estimation based video compression and encoding |
| US10136132B2 (en) | 2015-07-21 | 2018-11-20 | Microsoft Technology Licensing, Llc | Adaptive skip or zero block detection combined with transform size decision |
| EP3136726B1 (en) * | 2015-08-27 | 2018-03-07 | Axis AB | Pre-processing of digital images |
| US9648223B2 (en) * | 2015-09-04 | 2017-05-09 | Microvision, Inc. | Laser beam scanning assisted autofocus |
| US9456195B1 (en) | 2015-10-08 | 2016-09-27 | Dual Aperture International Co. Ltd. | Application programming interface for multi-aperture imaging systems |
| US9578221B1 (en) * | 2016-01-05 | 2017-02-21 | International Business Machines Corporation | Camera field of view visualizer |
| JP6514140B2 (en) * | 2016-03-17 | 2019-05-15 | Toshiba Corporation | Imaging support apparatus, method and program |
| EP3466051A1 (en) | 2016-05-25 | 2019-04-10 | GoPro, Inc. | Three-dimensional noise reduction |
| US9639935B1 (en) | 2016-05-25 | 2017-05-02 | Gopro, Inc. | Apparatus and methods for camera alignment model calibration |
| WO2017205597A1 (en) * | 2016-05-25 | 2017-11-30 | Gopro, Inc. | Image signal processing-based encoding hints for motion estimation |
| US10140776B2 (en) * | 2016-06-13 | 2018-11-27 | Microsoft Technology Licensing, Llc | Altering properties of rendered objects via control points |
| US9851842B1 (en) * | 2016-08-10 | 2017-12-26 | Rovi Guides, Inc. | Systems and methods for adjusting display characteristics |
| US10366122B2 (en) * | 2016-09-14 | 2019-07-30 | Ants Technology (Hk) Limited. | Methods circuits devices systems and functionally associated machine executable code for generating a searchable real-scene database |
| CN110084712A (en) * | 2016-10-26 | 2019-08-02 | 奥康科技有限公司 | Wearable device and method for analyzing images and providing feedback |
| CN106550227B (en) * | 2016-10-27 | 2019-02-22 | 成都西纬科技有限公司 | Image saturation adjustment method and device |
| US10477064B2 (en) | 2017-08-21 | 2019-11-12 | Gopro, Inc. | Image stitching with electronic rolling shutter correction |
| US10791265B1 (en) | 2017-10-13 | 2020-09-29 | State Farm Mutual Automobile Insurance Company | Systems and methods for model-based analysis of damage to a vehicle |
| US11587046B1 (en) | 2017-10-25 | 2023-02-21 | State Farm Mutual Automobile Insurance Company | Systems and methods for performing repairs to a vehicle |
| CN111345036A (en) * | 2017-10-26 | 2020-06-26 | 京瓷株式会社 | Image processing device, imaging device, driving assistance device, moving body, and image processing method |
| KR20190087977A (en) * | 2017-12-25 | 2019-07-25 | 저텍 테크놀로지 컴퍼니 리미티드 | Laser beam scanning display and augmented reality glasses |
| KR102683294B1 (en) | 2018-09-10 | 2024-07-10 | Samsung Electronics Co., Ltd. | Electronic apparatus for recognizing an object and controlling method thereof |
| WO2020084999A1 (en) * | 2018-10-25 | 2020-04-30 | Sony Corporation | Image processing device, image processing method, and program |
| US10771696B2 (en) * | 2018-11-26 | 2020-09-08 | Sony Corporation | Physically based camera motion compensation |
| WO2020142471A1 (en) * | 2018-12-30 | 2020-07-09 | Sang Chul Kwon | Foldable mobile phone |
| US11289078B2 (en) * | 2019-06-28 | 2022-03-29 | Intel Corporation | Voice controlled camera with AI scene detection for precise focusing |
| US10861127B1 (en) | 2019-09-17 | 2020-12-08 | Gopro, Inc. | Image and video processing using multiple pipelines |
| US11064118B1 (en) | 2019-12-18 | 2021-07-13 | Gopro, Inc. | Systems and methods for dynamic stabilization adjustment |
| US11006044B1 (en) * | 2020-03-03 | 2021-05-11 | Qualcomm Incorporated | Power-efficient dynamic electronic image stabilization |
| US11284157B2 (en) * | 2020-06-11 | 2022-03-22 | Rovi Guides, Inc. | Methods and systems facilitating adjustment of multiple variables via a content guidance application |
| WO2023150800A1 (en) * | 2022-02-07 | 2023-08-10 | Gopro, Inc. | Methods and apparatus for real-time guided encoding |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030063816A1 (en) * | 1998-05-27 | 2003-04-03 | Industrial Technology Research Institute, A Taiwanese Corporation | Image-based method and system for building spherical panoramas |
| US20060268131A1 (en) * | 2002-06-21 | 2006-11-30 | Microsoft Corporation | System and method for camera calibration and images stitching |
| US20100194851A1 (en) * | 2009-02-03 | 2010-08-05 | Aricent Inc. | Panorama image stitching |
Family Cites Families (45)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR100325253B1 (en) * | 1998-05-19 | 2002-03-04 | Miyazu Junichiro | Motion vector search method and apparatus |
| US20010047517A1 (en) * | 2000-02-10 | 2001-11-29 | Charilaos Christopoulos | Method and apparatus for intelligent transcoding of multimedia data |
| JP2001245303A (en) * | 2000-02-29 | 2001-09-07 | Toshiba Corp | Moving picture coding apparatus and moving picture coding method |
| US6407680B1 (en) * | 2000-12-22 | 2002-06-18 | Generic Media, Inc. | Distributed on-demand media transcoding system and method |
| US7034848B2 (en) * | 2001-01-05 | 2006-04-25 | Hewlett-Packard Development Company, L.P. | System and method for automatically cropping graphical images |
| JP4205574B2 (en) * | 2001-05-31 | 2009-01-07 | Canon Inc. | Image processing apparatus and control method thereof |
| US7801215B2 (en) * | 2001-07-24 | 2010-09-21 | Sasken Communication Technologies Limited | Motion estimation technique for digital video encoding applications |
| US20030126622A1 (en) * | 2001-12-27 | 2003-07-03 | Koninklijke Philips Electronics N.V. | Method for efficiently storing the trajectory of tracked objects in video |
| KR100850705B1 (en) * | 2002-03-09 | 2008-08-06 | Samsung Electronics Co., Ltd. | Method for adaptive encoding of a motion image based on temporal and spatial complexity, and apparatus thereof |
| JP4275358B2 (en) * | 2002-06-11 | 2009-06-10 | Hitachi, Ltd. | Image information conversion apparatus, bit stream converter, and image information conversion transmission method |
| US20040131276A1 (en) * | 2002-12-23 | 2004-07-08 | John Hudson | Region-based image processor |
| AU2003296127A1 (en) * | 2002-12-25 | 2004-07-22 | Nikon Corporation | Blur correction camera system |
| KR100566290B1 (en) * | 2003-09-18 | 2006-03-30 | Samsung Electronics Co., Ltd. | Image Scanning Method Using Scan Table and Discrete Cosine Converter |
| JP4123171B2 (en) * | 2004-03-08 | 2008-07-23 | Sony Corporation | Method for manufacturing vibration type gyro sensor element, vibration type gyro sensor element, and method for adjusting vibration direction |
| WO2005094270A2 (en) * | 2004-03-24 | 2005-10-13 | Sharp Laboratories Of America, Inc. | Methods and systems for A/V input device to display networking |
| US8315307B2 (en) * | 2004-04-07 | 2012-11-20 | Qualcomm Incorporated | Method and apparatus for frame prediction in hybrid video compression to enable temporal scalability |
| US20060109900A1 (en) * | 2004-11-23 | 2006-05-25 | Bo Shen | Image data transcoding |
| JP2006203682A (en) * | 2005-01-21 | 2006-08-03 | Nec Corp | Converting device of compression encoding bit stream for moving image at syntax level and moving image communication system |
| WO2007044556A2 (en) * | 2005-10-07 | 2007-04-19 | Innovation Management Sciences, L.L.C. | Method and apparatus for scalable video decoder using an enhancement stream |
| TW200816798A (en) * | 2006-09-22 | 2008-04-01 | Altek Corp | Method of automatic shooting by using an image recognition technology |
| US7843824B2 (en) * | 2007-01-08 | 2010-11-30 | General Instrument Corporation | Method and apparatus for statistically multiplexing services |
| US7924316B2 (en) * | 2007-03-14 | 2011-04-12 | Aptina Imaging Corporation | Image feature identification and motion compensation apparatus, systems, and methods |
| KR101128165B1 (en) * | 2007-05-23 | 2012-03-23 | NEC Corporation | Dynamic image distribution system, conversion device, and dynamic image distribution method |
| KR20100031755A (en) * | 2007-07-30 | 2010-03-24 | NEC Corporation | Connection terminal, distribution system, conversion method, and program |
| US20090060039A1 (en) * | 2007-09-05 | 2009-03-05 | Yasuharu Tanaka | Method and apparatus for compression-encoding moving image |
| US8098732B2 (en) * | 2007-10-10 | 2012-01-17 | Sony Corporation | System for and method of transcoding video sequences from a first format to a second format |
| US8063942B2 (en) * | 2007-10-19 | 2011-11-22 | Qualcomm Incorporated | Motion assisted image sensor configuration |
| US8170342B2 (en) * | 2007-11-07 | 2012-05-01 | Microsoft Corporation | Image recognition of content |
| JP2009152672A (en) * | 2007-12-18 | 2009-07-09 | Samsung Techwin Co Ltd | Recording apparatus, reproducing apparatus, recording method, reproducing method, and program |
| JP5242151B2 (en) * | 2007-12-21 | 2013-07-24 | Semiconductor Components Industries, LLC | Vibration correction control circuit and imaging apparatus including the same |
| JP2009159359A (en) * | 2007-12-27 | 2009-07-16 | Samsung Techwin Co Ltd | Moving picture data encoding apparatus, moving picture data decoding apparatus, moving picture data encoding method, moving picture data decoding method, and program |
| US20090217338A1 (en) * | 2008-02-25 | 2009-08-27 | Broadcom Corporation | Reception verification/non-reception verification of base/enhancement video layers |
| US20090323810A1 (en) * | 2008-06-26 | 2009-12-31 | Mediatek Inc. | Video encoding apparatuses and methods with decoupled data dependency |
| US7990421B2 (en) * | 2008-07-18 | 2011-08-02 | Sony Ericsson Mobile Communications Ab | Arrangement and method relating to an image recording device |
| JP2010039788A (en) * | 2008-08-05 | 2010-02-18 | Toshiba Corp | Image processing apparatus and method thereof, and image processing program |
| JP2010147808A (en) * | 2008-12-18 | 2010-07-01 | Olympus Imaging Corp | Imaging apparatus and image processing method in same |
| US8311115B2 (en) * | 2009-01-29 | 2012-11-13 | Microsoft Corporation | Video encoding using previously calculated motion information |
| US9009338B2 (en) * | 2009-03-03 | 2015-04-14 | Viasat, Inc. | Space shifting over return satellite communication channels |
| US8520083B2 (en) * | 2009-03-27 | 2013-08-27 | Canon Kabushiki Kaisha | Method of removing an artefact from an image |
| US20100309987A1 (en) * | 2009-06-05 | 2010-12-09 | Apple Inc. | Image acquisition and encoding system |
| JP5473536B2 (en) * | 2009-10-28 | 2014-04-16 | Kyocera Corporation | Portable imaging device with projector function |
| US20110170608A1 (en) * | 2010-01-08 | 2011-07-14 | Xun Shi | Method and device for video transcoding using quad-tree based mode selection |
| US8681255B2 (en) * | 2010-09-28 | 2014-03-25 | Microsoft Corporation | Integrated low power depth camera and projection device |
| US9007428B2 (en) * | 2011-06-01 | 2015-04-14 | Apple Inc. | Motion-based image stitching |
| US8554011B2 (en) * | 2011-06-07 | 2013-10-08 | Microsoft Corporation | Automatic exposure correction of images |
2011
- 2011-09-14 US US13/232,052 patent/US20130021512A1/en not_active Abandoned
- 2011-09-14 US US13/232,045 patent/US20130021488A1/en not_active Abandoned
- 2011-09-19 US US13/235,975 patent/US20130021504A1/en not_active Abandoned
- 2011-09-27 US US13/245,941 patent/US20130021489A1/en not_active Abandoned
- 2011-10-26 US US13/281,521 patent/US20130021490A1/en not_active Abandoned
- 2011-12-07 US US13/313,352 patent/US9092861B2/en active Active
- 2011-12-07 US US13/313,345 patent/US20130022116A1/en not_active Abandoned
- 2011-12-19 US US13/330,047 patent/US20130021484A1/en not_active Abandoned

2012
- 2012-03-07 US US13/413,863 patent/US20130021491A1/en not_active Abandoned
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030063816A1 (en) * | 1998-05-27 | 2003-04-03 | Industrial Technology Research Institute, A Taiwanese Corporation | Image-based method and system for building spherical panoramas |
| US20080074500A1 (en) * | 1998-05-27 | 2008-03-27 | Transpacific Ip Ltd. | Image-Based Method and System for Building Spherical Panoramas |
| US20060268131A1 (en) * | 2002-06-21 | 2006-11-30 | Microsoft Corporation | System and method for camera calibration and images stitching |
| US20100194851A1 (en) * | 2009-02-03 | 2010-08-05 | Aricent Inc. | Panorama image stitching |
Cited By (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9170161B2 (en) * | 2011-03-30 | 2015-10-27 | Nippon Avionics Co., Ltd. | Image capture device, pixel output level compensation method for same, infrared camera system, and interchangeable lens system |
| US20120249799A1 (en) * | 2011-03-30 | 2012-10-04 | Yukiko Shibata | Image capture device, pixel output level compensation method for same, infrared camera system, and interchangeable lens system |
| US9066015B2 (en) | 2011-04-28 | 2015-06-23 | Nippon Avionics Co., Ltd. | Image capture device, method for generating image, infrared camera system, and interchangeable lens system |
| US9270959B2 (en) | 2013-08-07 | 2016-02-23 | Qualcomm Incorporated | Dynamic color shading correction |
| US9973672B2 (en) | 2013-12-06 | 2018-05-15 | Huawei Device (Dongguan) Co., Ltd. | Photographing for dual-lens device using photographing environment determined using depth estimation |
| JP7679832B2 (en) | 2020-06-17 | 2025-05-20 | Sony Group Corporation | Information processing device and information processing method |
| JPWO2021256504A1 (en) * | 2020-06-17 | 2021-12-23 | | |
| US20220051369A1 (en) * | 2020-08-12 | 2022-02-17 | Realtek Semiconductor Corp. | Method and system for compensating image having fixed pattern noise |
| US11875481B2 (en) * | 2020-08-12 | 2024-01-16 | Realtek Semiconductor Corp. | Method and system for compensating image having fixed pattern noise |
| US11563899B2 (en) * | 2020-08-14 | 2023-01-24 | Raytheon Company | Parallelization technique for gain map generation using overlapping sub-images |
| US20220053144A1 (en) * | 2020-08-14 | 2022-02-17 | Raytheon Company | Parallelization technique for gain map generation using overlapping sub-images |
| CN114079735A (en) * | 2020-08-19 | 2022-02-22 | 瑞昱半导体股份有限公司 | Image compensation system for fixed image noise |
| US20230188860A1 (en) * | 2021-12-09 | 2023-06-15 | Fotonation Limited | Vehicle occupant monitoring system including an image acquisition device with a rolling shutter image sensor |
| US11902671B2 (en) * | 2021-12-09 | 2024-02-13 | Fotonation Limited | Vehicle occupant monitoring system including an image acquisition device with a rolling shutter image sensor |
Also Published As
| Publication number | Publication date |
|---|---|
| US20130021491A1 (en) | 2013-01-24 |
| US20130021483A1 (en) | 2013-01-24 |
| US20130021488A1 (en) | 2013-01-24 |
| US20130021489A1 (en) | 2013-01-24 |
| US20130021490A1 (en) | 2013-01-24 |
| US20130021512A1 (en) | 2013-01-24 |
| US20130021504A1 (en) | 2013-01-24 |
| US9092861B2 (en) | 2015-07-28 |
| US20130022116A1 (en) | 2013-01-24 |
Similar Documents
| Publication | Title |
|---|---|
| US20130021484A1 (en) | Dynamic computation of lens shading |
| US10542243B2 (en) | Method and system of light source estimation for image processing | |
| KR101154136B1 (en) | White balance calibration for digital camera device | |
| US9451187B2 (en) | Lens shading calibration for cameras | |
| US10070042B2 (en) | Method and system of self-calibration for phase detection autofocus | |
| RU2537038C2 (en) | Automatic white balance processing with flexible colour space selection | |
| US8996072B2 (en) | Method and apparatus for controlling light emitting elements in terminal device and terminal device | |
| US9247153B2 (en) | Image processing apparatus, method and imaging apparatus | |
| US9238377B1 (en) | Method and system of lens shading color correction using block matching | |
| US8619153B2 (en) | Radiometric calibration using temporal irradiance mixtures | |
| US8537264B2 (en) | Image capturing apparatus, method, and program for performing an auto focus operation using invisible and visible light | |
| US11503262B2 (en) | Image processing method and device for auto white balance | |
| CN102883107A (en) | Imaging apparatus capable of controlling exposure including flash amount control of flash apparatus, and control method thereof | |
| CN106211804A (en) | Automatic white balance using colour measurement of raw image data |
| US20090040371A1 (en) | Methods, systems and apparatuses for pixel value correction using multiple vertical and/or horizontal correction curves | |
| US20160171697A1 (en) | Method and system of run-time self-calibrating lens shading correction | |
| US9282235B2 (en) | Focus score improvement by noise correction | |
| CN114584700B (en) | Focus marking method, marking device and electronic equipment | |
| US10455169B2 (en) | Method and apparatus for correcting vignetting effect caused on an image captured by lightfield cameras | |
| US20130229530A1 (en) | Spectral calibration of imaging devices | |
| US9787893B1 (en) | Adaptive output correction for digital image capture processing | |
| KR100566571B1 (en) | Method and apparatus for automatically correcting lens shading phenomenon in an image sensor | |
| KR20140071877A (en) | Image processing apparatus and image processing method |
Legal Events
| Code | Title | Description |
|---|---|---|
| AS | Assignment | Owner name: BROADCOM CORPORATION, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: SOREK, NOAM; VITSNUDEL, ILIA; REEL/FRAME: 027410/0777. Effective date: 20111218 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| AS | Assignment | Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA. Free format text: PATENT SECURITY AGREEMENT; ASSIGNOR: BROADCOM CORPORATION; REEL/FRAME: 037806/0001. Effective date: 20160201 |
| AS | Assignment | Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: BROADCOM CORPORATION; REEL/FRAME: 041706/0001. Effective date: 20170120 |
| AS | Assignment | Owner name: BROADCOM CORPORATION, CALIFORNIA. Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS; ASSIGNOR: BANK OF AMERICA, N.A., AS COLLATERAL AGENT; REEL/FRAME: 041712/0001. Effective date: 20170119 |