US20190089944A1 - Imaging pixels with depth sensing capabilities - Google Patents
- Publication number
- US20190089944A1 (U.S. patent application Ser. No. 16/174,558)
- Authority
- United States (US)
- Prior art keywords
- photosensitive
- image
- photosensitive region
- image sensor
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/218—Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
- H04N13/286—Image signal generators having separate monoscopic and stereoscopic modes
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/78—Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters
- H04N5/378
Definitions
- Depth sensing pixel 100 may be formed having the same color filter for both photosensitive regions or having multiple color filters as shown in FIG. 6 .
- Color filter 214A may cover photosensitive region 110A, whereas color filter 214B may cover photosensitive region 110B.
- Color filter 214B may allow light of a first color to reach photosensitive region 110B, whereas color filter 214A may allow light of a second color to reach photosensitive region 110A. For example, color filter 214B may allow green light to pass and color filter 214A may allow red light to pass.
- Depth sensing pixels may be formed with red color filters, blue color filters, green color filters, or color filters that pass other desirable wavelengths of light such as infrared and ultraviolet light wavelengths.
- Depth sensing pixels may be formed with color filters that pass multiple wavelengths of light, with a color filter that passes many wavelengths of light, or without a color filter (sometimes referred to as a clear pixel).
- Pixel 100 may be formed with any desired number of color filters.
- color filters may be provided for and substantially cover respective photosensitive regions of pixel 100 .
- color filters may be provided for respective groups of photosensitive regions of pixel 100 .
- an optional opaque layer 216 may be provided that covers an intermediate region between photosensitive areas 110 A and 110 B.
- Optional opaque layer 216 may be formed from metals, dielectric materials, or any other desirable opaque material (e.g., a material that prevents light from passing through to substrate 108 ).
- Opaque layer 216 may be formed to directly contact substrate 108 or, if desired, may be formed at a distance from substrate 108 .
- Opaque layer 216 helps to isolate photosensitive region 110 A from photosensitive region 110 B, which may help to improve depth sensing operations.
- FIG. 7A is a simplified top-down view of photosensitive areas 110 A and 110 B of a pixel 100 (e.g., FIG. 7A may represent the footprint of pixel 100 within a pixel array). Areas 110 A and 110 B may be formed horizontally, with photosensitive area 110 B on the left and photosensitive area 110 A on the right.
- A camera lens (e.g., camera lens 202) used to focus light onto a pixel array including pixels with horizontally formed photosensitive areas may be horizontally partitioned (e.g., the resulting lens pupil division may be horizontal).
- the distance to an object may be determined from light originating from either the left side of the lens pupil or the right side of the lens pupil.
- Pixel 100 of an image sensor may be used for two-dimensional image capture; charges acquired in photosensitive areas 110A and 110B can be summed together using a common floating diffusion node.
- FIG. 7B is a simplified top-down view of a pixel having photosensitive areas 110 A and 110 B that are formed vertically.
- a lens 202 (not shown) used to focus light onto a pixel array formed from vertically split pixels 100 may be vertically partitioned into a top portion and a bottom portion (e.g., the resulting lens pupil division may be vertical).
- the distance to an object may be determined from light originating from either the top half of the lens pupil or the bottom half of the lens pupil.
- The example of FIGS. 7A and 7B in which pixels 100 are formed from two photosensitive regions is merely illustrative.
- a pixel 100 may be formed having multiple photosensitive regions as shown in FIG. 7C .
- pixel 100 may include photosensitive areas 110 A, 110 B, 110 C, and 110 D.
- a camera lens used to focus light onto an array of the pixels may be both vertically and horizontally partitioned.
- image signal outputs from photosensitive areas 110 A and 110 B may be summed to form a first combined signal output and image signal outputs from photosensitive areas 110 C and 110 D may be summed to form a second combined signal output.
- the first combined signal output and the second combined signal output may be used to determine image depth signals that horizontally partition the camera lens (e.g., into left and right portions).
- Vertical partitioning of the camera lens may be performed by summing image signals from photosensitive areas 110 B and 110 C to form a third combined signal output and summing image signals from photosensitive areas 110 A and 110 D to form a fourth combined signal output.
- the third and fourth combined signal outputs may be used to determine image depth signals that vertically partition the camera lens.
- image signals from pixel 100 may be processed to perform vertical and horizontal camera lens splitting.
- Pixel 100 may be used for two-dimensional image capture.
- signals from photosensitive areas 110 A, 110 B, 110 C, 110 D can be summed together by binning charges on the common floating diffusion node.
- pixels 100 may be split into photosensitive regions along any desired axis (e.g., horizontally, vertically, diagonally, etc.). Pixel arrays may be formed having only one type of pixels 100 , two types (e.g., horizontally and vertically split pixels), or more types of lens division.
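- To make the signal combinations described above concrete, the sketch below combines the four sub-pixel outputs of a pixel like the one in FIG. 7C into horizontal and vertical pupil-division signals and a binned two-dimensional value. This is a minimal illustration assuming one numeric sample per photosensitive area; the function and variable names are invented for the example and are not taken from the patent.

```python
def pupil_division_signals(sig_a, sig_b, sig_c, sig_d):
    """Combine the outputs of photosensitive areas 110A-110D of one pixel.

    Following the pairings described above:
      * 110A + 110B and 110C + 110D give the horizontal (left/right) split.
      * 110B + 110C and 110A + 110D give the vertical (top/bottom) split.
    The difference of each pair of combined outputs serves as a depth signal
    for that split; the sum of all four is the binned two-dimensional value.
    """
    first_combined = sig_a + sig_b       # first combined signal output
    second_combined = sig_c + sig_d      # second combined signal output
    third_combined = sig_b + sig_c       # third combined signal output
    fourth_combined = sig_a + sig_d      # fourth combined signal output

    horizontal_depth = first_combined - second_combined
    vertical_depth = third_combined - fourth_combined
    image_value = sig_a + sig_b + sig_c + sig_d
    return horizontal_depth, vertical_depth, image_value


# Example with arbitrary illustrative signal levels:
print(pupil_division_signals(0.8, 0.7, 0.3, 0.4))
```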
- FIG. 8 illustrates a simplified block diagram of an imager 14 , for example a CMOS imager, employing a pixel array 301 having depth sensing pixels.
- Pixel array 301 includes a plurality of pixels (e.g., depth sensing pixels and/or regular pixels) arranged in a predetermined number of columns and rows.
- the row lines are selectively activated by the row driver 302 in response to row address decoder 303 and the column select lines are selectively activated by the column driver 304 in response to column address decoder 305 .
- a row and column address is provided for each pixel.
- Imager 14 is operated by a timing and control circuit 306 , which controls decoders 303 and 305 for selecting the appropriate row and column lines for pixel readout, and row and column driver circuitry 302 , 304 , which apply driving voltages to the drive transistors of the selected row and column lines.
- the pixel signals which typically include a pixel reset signal Vrst and a pixel image signal Vsig for each pixel (or each photosensitive region of each pixel) are sampled by sample and hold circuitry 307 associated with the column driver 304 .
- A differential signal Vrst−Vsig is produced for each pixel (or each photosensitive area of each pixel), which is amplified by an amplifier 308 and digitized by analog-to-digital converter 309.
- the analog to digital converter 309 converts the analog pixel signals to digital signals, which are fed to an image processor 310 which forms a digital image.
- Image processor 310 may, for example, be provided as part of image processing and data formatting circuitry 16 of FIG. 1.
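- As a rough sketch of the readout chain just described, the snippet below forms the differential signal Vrst−Vsig for one photosensitive region and digitizes it. It models only the arithmetic, not the actual sample-and-hold or converter hardware; the gain, reference voltage, and bit depth are made-up parameters for illustration.

```python
def cds_readout(v_rst, v_sig, gain=1.0, v_ref=1.0, bits=12):
    """Correlated double sampling followed by A/D conversion (illustrative).

    v_rst is the sampled pixel reset level and v_sig the sampled image level;
    taking their difference cancels offsets common to both samples.
    """
    v_diff = gain * (v_rst - v_sig)              # differential signal Vrst - Vsig
    code = round((v_diff / v_ref) * (2 ** bits - 1))
    return max(0, min(code, 2 ** bits - 1))      # clamp to the ADC output range


# A brighter region discharges further, so Vrst - Vsig (and the code) is larger:
print(cds_readout(v_rst=0.9, v_sig=0.3))
print(cds_readout(v_rst=0.9, v_sig=0.8))
```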
- FIG. 9 is a simplified diagram of an illustrative processor system 400 , such as a digital camera, which includes an imaging device 12 (e.g., the camera module of FIG. 1 ) employing an imager having depth sensing pixels as described above.
- the processor system 400 is exemplary of a system having digital circuits that could include imaging device 12 . Without being limiting, such a system could include a computer system, still or video camera system, scanner, machine vision system, vehicle navigation system, video phone, surveillance system, auto focus system, star tracker system, motion detection system, image stabilization system, and other systems employing an imaging device.
- Processor system 400, for example a digital still or video camera system, generally includes a lens 202 for focusing an image on pixel array 301 when a shutter release button 497 is pressed, and a central processing unit (CPU) 495, such as a microprocessor, which controls camera and one or more image flow functions and which communicates with one or more input/output (I/O) devices 491 over a bus 493.
- Imaging device 12 also communicates with CPU 495 over bus 493 .
- System 400 also includes random access memory (RAM) 492 and can optionally include removable memory 494 , such as flash memory, which also communicates with CPU 495 over the bus 493 .
- Imaging device 12 may be combined with the CPU, with or without memory storage on a single integrated circuit or on a different chip.
- Although bus 493 is illustrated as a single bus, it may be one or more busses, bridges, or other communication paths used to interconnect system components of system 400.
- FIG. 10 is a diagram of an illustrative pixel array 301 including a group of pixels 100 A and 100 B that share readout circuitry 522 .
- the photosensitive areas 110 of pixels 100 A and 100 B are formed horizontally for each pixel (e.g., as shown in FIG. 7A ).
- the photosensitive areas of pixels 100 A and 100 B may be formed in any desired arrangement (e.g., the arrangements of FIGS. 7A, 7B, 7C , any desired combination of these arrangements, or other arrangements).
- Photosensitive regions 110 may form photodiodes that are read out using shared readout circuitry 522 .
- Photosensitive regions 110 of pixel 100A may form photodiodes PD1L and PD1R, whereas photosensitive regions 110 of pixel 100B may form photodiodes PD2L and PD2R.
- Each photodiode may be coupled to shared readout circuitry 522 via a respective transfer gate 524 .
- photodiodes PD 1 L , PD 1 R , PD 2 L , and PD 2 R may be coupled to shared readout circuitry 522 via transfer gates TX 1 L , TX 1 R , TX 2 L , and TX 2 R , respectively.
- Shared readout circuitry 522 may include a common storage region (e.g., a floating diffusion region) to which readout circuitry such as readout transistors and reset transistors are coupled. Image signals (i.e., acquired charge) from photosensitive regions 110 may be transferred to the common storage region by controlling transfer gates 524 (e.g., by activating or deactivating transfer gates 524 ).
- An imager such as imager 14 of FIG. 8 may be configured to selectively drive transfer gates 524 so that image signals are read sequentially from photodiodes 110 or in parallel.
- row driver circuitry 302 may be configured to selectively activate transfer gates 524 to transfer image signals to shared readout circuitry 522 (e.g., image signals from only photodiode PD 1 L may be read by activating transfer gate TX 1 L while deactivating transfer gates TX 1 R , TX 2 L , and TX 2 R , whereas image signals from photodiodes PD 1 L and PD 1 R may be read in parallel by activating transfer gates TX 1 L and TX 1 R while deactivating transfer gates TX 2 L and TX 2 R ).
- FIG. 11 is an illustrative circuit diagram of an illustrative pixel array 301 including depth sensing pixels 100A and 100B that share readout circuitry 522 (e.g., pixel array 301 of FIG. 10).
- depth sensing pixels 100 A and 100 B may include photodiodes (e.g., formed from corresponding photosensitive regions) that are each coupled to shared floating diffusion region 532 (sometimes referred to as a charge storage node or charge detection node, because charge from the photodiodes is detected and at least temporarily stored at the charge storage node).
- a positive power supply voltage (e.g., voltage V AA ) may be supplied at positive power supply terminal 540 .
- reset control signal RST may be asserted, which enables reset transistor 534 .
- reset transistor 534 resets charge storage node 532 .
- Reset control signal RST may then be de-asserted to disable reset transistor 534 (e.g., thereby disconnecting supply voltage VAA from floating diffusion region 532).
- image signals from pixels 100 A and 100 B may be transferred to charge storage node 532 via transfer gates TX 1 L , TX 1 R , TX 2 L , and TX 2 R (e.g., as described in connection with FIG. 10 ).
- Transfer gates TX 1 L , TX 1 R , TX 2 L , and TX 2 R may be controlled via respective control signals C 1 , C 2 , C 3 , and C 4 that are provided to gate terminals of the transfer gates.
- Row select signal RS may be provided using row driver circuitry such as row driver circuit 302 of FIG. 8 . The row select signal may be asserted to transfer the stored image signals from charge storage node 532 to sample and hold circuitry such as circuitry 307 of FIG. 8 .
- The example of FIG. 11 in which two adjacent pixels are grouped to share readout circuitry 522 is merely illustrative. If desired, any number of adjacent pixels may be grouped to share readout circuitry. For example, four pixels (or more) may be grouped to share readout circuitry. If desired, each pixel may be provided with respective readout circuitry 522. If desired, each photosensitive region may be provided with respective readout circuitry 522 (e.g., readout circuitry 522 may be provided for each photosensitive region 110).
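- The toy model below sketches the shared-readout sequence of FIGS. 10 and 11 in software: reset the shared floating diffusion, assert a chosen subset of transfer gates to move (and optionally bin) charge, then assert row select to sample the result. It is an illustrative model under assumed charge values, not a description of the actual circuit; the class and method names are invented for the example.

```python
class SharedReadoutModel:
    """Toy model of two depth sensing pixels sharing one floating diffusion."""

    def __init__(self, charges):
        # Charge accumulated in photodiodes PD1L, PD1R, PD2L, PD2R,
        # keyed by the transfer gate that empties each photodiode.
        self.charges = dict(charges)
        self.floating_diffusion = 0

    def reset(self):
        # Asserting RST connects the floating diffusion to VAA and clears it.
        self.floating_diffusion = 0

    def transfer(self, gates):
        # Asserting a transfer gate moves that photodiode's charge to the
        # shared node; asserting several gates bins their charges together.
        for gate in gates:
            self.floating_diffusion += self.charges[gate]
            self.charges[gate] = 0

    def row_select(self):
        # Asserting RS makes the stored signal available to sample-and-hold circuitry.
        return self.floating_diffusion


charges = {"TX1L": 600, "TX1R": 200, "TX2L": 500, "TX2R": 300}  # assumed electron counts

# Read photodiode PD1L alone: assert TX1L while the other gates stay off.
single = SharedReadoutModel(charges)
single.reset()
single.transfer(["TX1L"])
print(single.row_select())            # 600

# Read PD1L and PD1R in parallel: assert TX1L and TX1R together (charge binning).
binned = SharedReadoutModel(charges)
binned.reset()
binned.transfer(["TX1L", "TX1R"])
print(binned.row_select())            # 800
```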
- Two-dimensional image information may be obtained from a pixel 100 by summing the output signals from photosensitive regions of a pixel.
- image processing and data formatting circuitry 16 of FIG. 1 or image processor 310 of FIG. 8 may be used to combine the output signals from the photosensitive areas of each pixel 100 in a pixel array 14 to obtain a regular two-dimensional color image.
- Charge summing (e.g., as described in connection with FIG. 11) between the photosensitive areas of each pixel 100 may be used to combine the light collected by each photosensitive area to form a two-dimensional image.
- FIG. 12 is a diagram of an illustrative depth sensing pixel 100 C having four photosensitive regions 110 that form respective photodiodes PD 1 , PD 2 , PD 3 , and PD 4 .
- the photodiodes of depth sensing pixel 100 C may share readout circuitry 522 (e.g., to a shared floating diffusion region of circuitry 522 ).
- Each photodiode may be coupled to shared readout circuitry 522 via a respective transfer gate 524 (e.g., photodiode PD 1 may be coupled to readout circuitry 522 via transfer gate TX 1 , etc.).
- each photodiode of depth sensing pixel 100 C may be read separately by activating the corresponding transfer gate (e.g., while the remaining transfer gates are de-activated).
- the image signals may be combined to perform horizontal lens splitting, vertical lens splitting, and diagonal lens splitting.
- groups of photodiodes may be read together by selectively activating transfer gates. For example, photodiodes PD 1 and PD 2 may be read together, photodiodes PD 1 and PD 3 may be read together, etc.
- all transfer gates may be activated to transfer charge from photodiodes PD 1 , PD 2 , PD 3 , and PD 4 simultaneously.
- FIG. 13 is an illustrative cross-sectional side view of pixel array 301 including pixel 100 C ( FIG. 12 ) with a corresponding graph 600 of photo-diode potential across axis 602 .
- Axis 602 may sometimes be referred to as a potential cutline.
- Pixel 100C may be implemented as a back-side illuminated pixel in which photosensitive regions (e.g., photodiodes) are interposed between light-receiving surfaces (e.g., microlens 102 and color filters 104) and circuitry such as transfer gates 524, metal interconnects, shared readout circuitry, etc.
- the photodiode potential across axis 602 may include barriers 608 and 606 (e.g., because photodiode potential at barriers 608 and 606 is lower than at other locations along axis 602 of pixel array 301 ).
- Barriers 608 and 606 may be formed partially by physical separation between photodiodes of pixel 100 C and between neighboring pixels.
- barrier 608 may be formed via physical separation between photodiodes PD 1 and PD 2 .
- Barriers 606 may be formed partially by differing pixel attributes.
- pixel 100 C may include color filters 104 that pass a first color (e.g., green), whereas, neighboring pixels may include color filters 104 that pass other colors (e.g., blue, red).
- the physical distance D 1 separating pixel 100 C and neighboring pixels may combine with the different color filter attributes to provide increased inter-pixel isolation relative to intra-pixel isolation.
- inter-pixel barrier 606 that isolates neighboring pixels may be greater than intra-pixel barrier 608 that isolates neighboring photodiodes within a pixel.
- Due to barriers 606 and 608, electrical cross-talk between pixels and between photodiodes may be reduced, which helps to improve captured image quality.
- charge summing between photodiodes PD 1 and PD 2 may be performed to obtain two-dimensional images.
- An imager may include depth sensing pixels that receive incident light and convert the received light into electrical signals.
- the imager may have an associated imaging lens that focuses incident light onto the imager.
- Each of the depth sensing pixels may include a microlens that focuses incident light from the imaging lens through a color filter onto a substrate region.
- Each depth sensing pixel may include first and second photosensitive regions in the substrate region that receive incident light from the microlens. The first and second photosensitive regions may provide different and asymmetrical angular responses to incident light. The angular response of the first photosensitive region may be substantially inverted from the angular response of the second photosensitive region.
- the first and second photosensitive regions of a given depth sensing pixel may effectively divide the corresponding imaging lens pupil into separate portions.
- the first photosensitive region may receive incident light from a first portion of the corresponding imaging lens pupil.
- the second photosensitive region may receive incident light from a second portion of the corresponding imaging lens pupil.
- the photosensitive regions may be configured to divide the imaging lens pupil along a horizontal axis, vertical axis, or any desired axis.
- Depth information for each depth sensing pixel may be determined based on the difference between output signals of the first and second photosensitive regions of that depth sensing pixel. Color information for each depth sensing pixel may be determined from a summation of output signals of the first and second photosensitive regions.
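- As a compact recap of the two outputs just described, the sketch below takes whole-sensor arrays of first-region and second-region output signals and produces a depth-signal map (their difference) and an ordinary two-dimensional image (their sum). NumPy and the array shapes are conveniences assumed for this illustration only.

```python
import numpy as np


def depth_and_image_maps(first_region, second_region):
    """first_region, second_region: 2D arrays of per-pixel sub-region outputs."""
    depth_map = first_region - second_region   # sign indicates near/far of the focused plane
    image_map = first_region + second_region   # summed outputs approximate a normal 2D capture
    return depth_map, image_map


# Tiny 2x2 sensor with made-up values; a zero depth signal means "in focus".
a = np.array([[0.5, 0.7], [0.2, 0.6]])
b = np.array([[0.5, 0.3], [0.6, 0.6]])
depth, image = depth_and_image_maps(a, b)
print(depth)
print(image)
```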
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Solid State Image Pick-Up Elements (AREA)
Abstract
Description
- This application is a continuation of patent application Ser. No. 15/375,654, filed Dec. 12, 2016, which is a continuation of patent application Ser. No. 13/728,086, filed Dec. 27, 2012, which claims the benefit of provisional patent application No. 61/603,855, filed Feb. 27, 2012 which are hereby incorporated by reference herein in their entireties. This application claims the benefit of and claims priority to patent application Ser. No. 15/375,654, filed Dec. 12, 2016, patent application Ser. No. 13/728,086, filed Dec. 27, 2012, and provisional patent application No. 61/603,855, filed Feb. 27, 2012.
- This relates generally to imaging systems, and more particularly to imaging systems with depth sensing capabilities.
- Modern electronic devices such as cellular telephones, cameras, and computers often use digital image sensors. Imagers (i.e., image sensors) may be formed from a two-dimensional array of image sensing pixels. Each pixel receives incident photons (light) and converts the photons into electrical signals. Image sensors are sometimes designed to provide images to electronic devices using a Joint Photographic Experts Group (JPEG) format.
- Some applications such as three-dimensional (3D) imaging may require electronic devices to provide stereo and/or depth sensing capabilities. For example, to properly generate a 3D image for a given scene, an electronic device may need to identify the distances between the electronic device and objects in the scene. To identify distances, conventional electronic devices use complex arrangements. Some arrangements require the use of multiple image sensors and camera lenses that capture images from various viewpoints. Other arrangements require the addition of lenticular arrays that focus incident light on sub-regions of a two-dimensional pixel array. Due to the addition of components such as additional image sensors or complex lens arrays, these arrangements lead to reduced spatial resolution, increased cost, and increased complexity.
- FIG. 1 is an illustrative schematic diagram of an electronic device with a camera sensor that may include depth sensing pixels in accordance with an embodiment of the present invention.
- FIG. 2A is an illustrative cross-sectional view of a depth sensing pixel having photosensitive regions with different and asymmetric angular responses in accordance with an embodiment of the present invention.
- FIGS. 2B and 2C are illustrative cross-sectional views of a depth sensing pixel having photosensitive regions that may be asymmetrically sensitive to incident light at negative and positive angles of incidence in accordance with an embodiment of the present invention.
- FIG. 3 is an illustrative diagram of illustrative signal outputs of photosensitive regions of a depth sensing pixel for incident light striking the depth sensing pixel at varying angles of incidence in accordance with an embodiment of the present invention.
- FIG. 4A is an illustrative diagram of a depth sensing imager having a lens and of an object located at a focal distance away from the lens showing how the lens focuses light from the object onto the depth sensing imager in accordance with an embodiment of the present invention.
- FIG. 4B is an illustrative diagram of a depth sensing imager having a lens and of an object located at more than a focal distance away from the lens showing how the lens focuses light from the object onto the depth sensing imager in accordance with an embodiment of the present invention.
- FIG. 4C is an illustrative diagram of a depth sensing imager having a lens and of an object located less than a focal distance away from the imaging lens showing how the lens focuses light from the object onto the depth sensing imager in accordance with an embodiment of the present invention.
- FIG. 5 is an illustrative diagram of illustrative depth signal values produced using output signals from a depth sensing pixel for an object at varying distances from the depth sensing pixel in accordance with an embodiment of the present invention.
- FIG. 6 is a diagram of an illustrative depth sensing pixel with multiple color filters in accordance with an embodiment of the present invention.
- FIG. 7A is an illustrative diagram of a depth sensing pixel having horizontally arranged photosensitive regions in accordance with an embodiment of the present invention.
- FIG. 7B is an illustrative diagram of a depth sensing pixel having vertically arranged photosensitive regions in accordance with an embodiment of the present invention.
- FIG. 7C is an illustrative diagram of a depth sensing pixel having multiple photosensitive regions in accordance with an embodiment of the present invention.
- FIG. 8 is a block diagram of an imager employing depth sensing pixels in accordance with an embodiment of the present invention.
- FIG. 9 is a block diagram of a processor system employing the imager of FIG. 8 in accordance with an embodiment of the present invention.
- FIG. 10 is an illustrative diagram showing how multiple depth sensing pixels may share readout circuitry in accordance with an embodiment of the present invention.
- FIG. 11 is an illustrative circuit diagram showing how multiple depth sensing pixels may have shared readout circuitry in accordance with an embodiment of the present invention.
- FIG. 12 is a diagram of an illustrative depth sensing pixel having four photosensitive regions that form respective photodiodes in accordance with an embodiment of the present invention.
- FIG. 13 is an illustrative cross-sectional side view of a pixel array with a corresponding graph of photo-diode potential across an axis of the pixel array in accordance with an embodiment of the present invention.
- Embodiments of the present invention relate to image sensors with depth sensing capabilities. An electronic device with a digital camera module is shown in FIG. 1. Electronic device 10 may be a digital camera, a computer, a cellular telephone, a medical device, or other electronic device. Camera module 12 may include image sensor 14 and one or more lenses. During operation, the lenses focus light onto image sensor 14. Image sensor 14 includes photosensitive elements (e.g., pixels) that convert the light into digital data. Image sensors may have any number of pixels (e.g., hundreds, thousands, millions, or more). A typical image sensor may, for example, have millions of pixels (e.g., megapixels). As examples, image sensor 14 may include bias circuitry (e.g., source follower load circuits), sample and hold circuitry, correlated double sampling (CDS) circuitry, amplifier circuitry, analog-to-digital converter (ADC) circuitry, data output circuitry, memory (e.g., buffer circuitry), address circuitry, etc.
- Still and video image data from camera sensor 14 may be provided to image processing and data formatting circuitry 16 via path 26. Image processing and data formatting circuitry 16 may be used to perform image processing functions such as three-dimensional depth sensing, data formatting, adjusting white balance and exposure, implementing video image stabilization, face detection, etc. Image processing and data formatting circuitry 16 may also be used to compress raw camera image files if desired (e.g., to Joint Photographic Experts Group or JPEG format). In a typical arrangement, which is sometimes referred to as a system on chip (SOC) arrangement, camera sensor 14 and image processing and data formatting circuitry 16 are implemented on a common integrated circuit. The use of a single integrated circuit to implement camera sensor 14 and image processing and data formatting circuitry 16 can help to reduce costs.
- Camera module 12 may convey acquired image data to host subsystems 20 over path 18 (e.g., image processing and data formatting circuitry 16 may convey image data to subsystems 20). Electronic device 10 typically provides a user with numerous high-level functions. In a computer or advanced cellular telephone, for example, a user may be provided with the ability to run user applications. To implement these functions, host subsystem 20 of electronic device 10 may include storage and processing circuitry 24 and input-output devices 22 such as keypads, input-output ports, joysticks, and displays. Storage and processing circuitry 24 may include volatile and nonvolatile memory (e.g., random-access memory, flash memory, hard drives, solid state drives, etc.). Storage and processing circuitry 24 may also include microprocessors, microcontrollers, digital signal processors, application specific integrated circuits, or other processing circuits.
- It may be desirable to provide image sensors with depth sensing capabilities (e.g., for use in 3D imaging applications such as machine vision applications and other three-dimensional imaging applications). To provide depth sensing capabilities, camera sensor 14 may include pixels such as pixel 100 shown in FIG. 2A.
- FIG. 2A is an illustrative cross-sectional view of pixel 100. Pixel 100 may include microlens 102, color filter 104, stack of dielectric layers 106, and substrate layer 108. Photosensitive regions (areas) such as photosensitive regions 110A and 110B may be formed in substrate layer 108. In the example of FIG. 2A, photosensitive regions 110A and 110B are formed at a distance from each other. If desired, photosensitive regions 110A and 110B may be formed adjacent to each other (e.g., directly in contact). Pixel separating regions 112 may also be formed in substrate layer 108.
- Microlens 102 may direct incident light towards a substrate area between pixel separators 112. Color filter 104 may filter the incident light by only allowing predetermined wavelengths to pass through color filter 104 (e.g., color filter 104 may only be transparent to the wavelengths corresponding to a green color). Photosensitive areas 110A and 110B may serve to absorb incident light focused by microlens 102 and produce image signals that correspond to the amount of incident light absorbed.
- Photosensitive areas 110A and 110B may each cover approximately half of the substrate area between pixel separators 112 (as an example). By only covering half of the substrate area, each photosensitive region may be provided with an asymmetric angular response (e.g., photosensitive region 110A may produce different image signals based on the angle at which incident light reaches pixel 100). The angle at which incident light reaches pixel 100 relative to a normal axis 116 may be herein referred to as the incident angle or angle of incidence.
- In the example of FIG. 2B, incident light 113 may originate from the left of normal axis 116 and may reach pixel 100 with an angle 114 relative to normal axis 116. Angle 114 may be a negative angle of incident light. Incident light 113 that reaches microlens 102 at a negative angle such as angle 114 may be focused towards photosensitive area 110A. In this scenario, photosensitive area 110A may produce relatively high image signals, whereas photosensitive area 110B may produce relatively low image signals (e.g., because incident light 113 is not focused towards photosensitive area 110B).
- An image sensor can be formed using front side illumination imager arrangements (e.g., when circuitry such as metal interconnect circuitry is interposed between the microlens and photosensitive regions) or back side illumination imager arrangements (e.g., when photosensitive regions are interposed between the microlens and the metal interconnect circuitry). In both cases, the stack of dielectric layers and metal routing layers need to be optimized to couple light effectively from the microlens to the photosensitive areas, for example using light guide structures as a part of dielectric layers 106.
- In the example of FIG. 2C, incident light 113 may originate from the right of normal axis 116 and reach pixel 100 with an angle 118 relative to normal axis 116. Angle 118 may be a positive angle of incident light. Incident light that reaches microlens 102 at a positive angle such as angle 118 may be focused towards photosensitive area 110B (e.g., the light is not focused towards photosensitive area 110A). In this scenario, photosensitive area 110A may produce an image signal output that is relatively low, whereas photosensitive area 110B may produce an image signal output that is relatively high.
- Due to the asymmetric formation of individual photosensitive areas 110A and 110B in substrate 108, each photosensitive area may have an asymmetric angular response (e.g., photosensitive area 110A may produce different signal outputs for incident light with a given intensity based on an angle of incidence). In the diagram of FIG. 3, an example of the image signal outputs of photosensitive areas (regions) 110A and 110B of a pixel 100 in response to varying angles of incident light is shown.
- Line 160 may represent the output image signal for photosensitive area 110A, whereas line 162 may represent the output image signal for photosensitive area 110B. For negative angles of incidence, the output image signal for photosensitive area 110A may increase (e.g., because incident light is focused onto photosensitive area 110A) and the output image signal for photosensitive area 110B may decrease (e.g., because incident light is focused away from photosensitive area 110B). For positive angles of incidence, the output image signal for photosensitive area 110A may be relatively small and the output image signal for photosensitive area 110B may be relatively large.
- Line 164 of FIG. 3 may reflect the sum of the output signals for pixel 100 (e.g., the sum of lines 160 and 162). As shown in FIG. 3, line 164 may remain relatively constant regardless of the angle of incidence (e.g., for any given angle of incidence, the total amount of light that is absorbed by the combination of photosensitive areas 110A and 110B may be substantially constant).
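- The behavior plotted in FIG. 3 can be summarized with a simple numerical model: two complementary angular responses whose sum stays roughly constant. The linear ramp below is only an assumed stand-in for the real response curves (lines 160 and 162) and is not taken from the patent.

```python
def sub_pixel_responses(angle_deg, max_angle=20.0, total=1.0):
    """Illustrative angular responses of photosensitive areas 110A and 110B.

    Negative angles of incidence steer light toward area 110A and positive
    angles toward area 110B, while the two outputs always sum to `total`
    (the roughly flat line 164 of FIG. 3).
    """
    # Clamp to the resolvable range of angles, then map it to a 0..1 fraction.
    a = max(-max_angle, min(angle_deg, max_angle))
    fraction_to_b = (a + max_angle) / (2.0 * max_angle)
    signal_b = total * fraction_to_b
    signal_a = total - signal_b
    return signal_a, signal_b


for theta in (-20, -10, 0, 10, 20):
    print(theta, sub_pixel_responses(theta))
```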
- The size and location of photosensitive areas 110 within a pixel (e.g., as shown in FIGS. 2A, 2B, and 2C) are merely illustrative. As examples, the edges of photosensitive areas 110A and 110B may be located at the center of pixel 100 (as shown in FIG. 2A) or may be shifted slightly away from the center of pixel 100 in any direction. If desired, photosensitive areas 110 may be decreased in size to cover less than half of the pixel area.
- Depth sensing pixels 100 may be used to form imagers with depth sensing capabilities. FIGS. 4A, 4B, and 4C show illustrative image sensors 14 with depth sensing capabilities. As shown in FIGS. 4A, 4B, and 4C, image sensor 14 may contain an array of pixels 201 formed from pixels 100 (e.g., pixels 100A, 100B, 100C, etc.). Image sensor 14 may have an associated camera lens 202 that focuses light originating from a scene of interest (e.g., a scene that includes an object 204) onto the array of pixels. Camera lens 202 may be located at a distance DF from image sensor 14. Distance DF may correspond to the focal length of camera lens 202.
- In the arrangement of FIG. 4A, object 204 may be located at distance D0 from camera lens 202. Distance D0 may correspond to a focused object plane of camera lens 202 (e.g., a plane located at a distance D0 from camera lens 202). The focused object plane and a plane corresponding to image sensor 14 may sometimes be referred to as conjugate planes. In this case, light from object 204 may be focused onto pixel 100A at an angle θ0 and an angle −θ0, and the image signal outputs of photosensitive regions 110A and 110B of pixel 100A may be equal (e.g., most of the light is absorbed by photosensitive region 110B for the positive angle and most of the light is absorbed by photosensitive region 110A for the negative angle).
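- For reference, the conjugate planes mentioned above obey the standard thin-lens relation; this equation is textbook optics rather than language from the patent:

$$\frac{1}{D_0} + \frac{1}{D_I} = \frac{1}{f}$$

- Here D0 is the focused object distance, DI is the lens-to-sensor distance (approximately the focal distance DF when D0 is much larger than f), and f is the focal length of camera lens 202. Objects farther than D0 come to a sharp focus in front of the sensor and objects closer than D0 behind it, which is why their light reaches pixels 100B and 100C at the signed angles shown in FIGS. 4B and 4C.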
FIG. 4B, object 204 may be located at a distance D1 from camera lens 202. Distance D1 may be larger than the distance of the focused object plane (e.g., the focused object plane corresponding to distance D0) of camera lens 202. In this case, some of the light from object 204 may be focused onto pixel 100B at a negative angle −θ1 (e.g., the light focused by the bottom half of camera lens 202) and some of the light from object 204 may be focused onto pixel 100C at a positive angle θ1 (e.g., the light focused by the top half of camera lens 202). - In the arrangement of
FIG. 4C, object 204 may be located at a distance D2 from camera lens 202. Distance D2 may be smaller than the distance of the focused object plane (e.g., the focused object plane corresponding to distance D0) of camera lens 202. In this case, some of the light from object 204 may be focused by the top half of camera lens 202 onto pixel 100B at a positive angle θ2 and some of the light from object 204 may be focused by the bottom half of camera lens 202 onto pixel 100C at a negative angle −θ2. - The arrangements of
FIGS. 4A, 4B, and 4C may effectively partition the light focused by camera lens 202 into two halves split by a center plane at a midpoint between the top of the lens pupil and the bottom of the lens pupil (e.g., split into a top half and a bottom half). Each photosensitive region in pixel array 201 other than photosensitive regions of the center pixel may only receive substantial light from one of the two halves of lens 202 for objects at any distance. For example, for an object at distance D1, pixel 100B only receives substantial light from the bottom half of lens 202. For an object at distance D2, pixel 100B only receives light from the top half of lens 202. The partitioning of the light focused by camera lens 202 may be referred to herein as lens partitioning or lens pupil division. -
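One way to picture this lens pupil division is with a paraxial thin-lens ray trace. The sketch below is not part of the disclosure; the focal length, distances, helper name, and sign convention (a positive incidence angle meaning the ray arrives from the top half of the lens) are assumptions chosen only to mirror the qualitative behavior of FIGS. 4A, 4B, and 4C.

```python
import math

def ray_at_sensor(u, f, s, h):
    """Trace a ray from an on-axis object point through aperture height h.

    u: object distance, f: focal length, s: lens-to-sensor spacing (meters).
    Returns (landing_height, incidence_angle_deg) at the sensor plane,
    using a thin-lens, paraxial model for illustration only.
    """
    v = 1.0 / (1.0 / f - 1.0 / u)        # image distance (thin-lens equation)
    slope = -h / v                        # ray slope after the lens
    landing_height = h + slope * s        # where the ray meets the sensor plane
    incidence_angle = math.degrees(math.atan(-slope))  # >0: ray from the top half of the lens
    return landing_height, incidence_angle

if __name__ == "__main__":
    f = 0.05                              # assumed 50 mm lens
    d0 = 2.0                              # assumed focused object distance D0
    s = 1.0 / (1.0 / f - 1.0 / d0)        # sensor placed so D0 is in focus
    for u in (2.0, 4.0, 1.0):             # in focus, farther than D0, closer than D0
        top = ray_at_sensor(u, f, s, h=+0.01)      # ray through the top of the pupil
        bottom = ray_at_sensor(u, f, s, h=-0.01)   # ray through the bottom of the pupil
        print(f"object at {u} m: top-half ray {top}, bottom-half ray {bottom}")
```

For the in-focus distance both rays land on the same spot with opposite incidence angles; for nearer or farther objects the two pupil halves land on different pixels, which is the behavior exploited for depth sensing.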
The output image signals of each pixel 100 of image sensor 14 may depend on the distance from camera lens 202 to object 204. The angle at which incident light reaches depth sensing pixels of image sensor 14 depends on the distance between lens 202 and objects in a given scene (e.g., the distance between objects such as object 204 and device 10). - An image depth signal may be calculated from the difference between the output image signals of the photosensitive areas of each
pixel 100. The diagram of FIG. 5 shows an image depth signal that may be calculated for pixel 100B by subtracting the image signal output of photosensitive area 110B from the image signal output of photosensitive area 110A (e.g., by subtracting line 162 from line 160 of FIG. 3). As shown in FIG. 5, for an object at a distance that is less than distance D0 (e.g., the focused object distance), the image depth signal may be negative. For an object at a distance that is greater than the focused object distance D0, the image depth signal may be positive. - For distances greater than D4 or less than D3, the image depth signal may remain substantially constant
(e.g., photosensitive regions 110A and 110B may be unable to resolve incident angles with magnitudes larger than the magnitudes of angles provided by objects at distances greater than D4 or at distances less than D3). In other words, it may be difficult for a depth sensing imager to accurately measure depth information for objects at distances greater than D4 or at distances less than D3. As an example, the depth sensing imager may have difficulty distinguishing whether an object is at a distance D4 or a distance D5. If desired, the depth sensing imager may assume that all objects that result in an image depth signal equivalent to that of distance D3 or D4 are at a distance of D3 or D4, respectively (e.g., the imager may identify objects located at distances such as D5 as being at distance D4 and objects located closer than distance D3 as being at distance D3). -
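The arithmetic described in the last few paragraphs can be sketched in a few lines of code. The snippet below is only an illustration, not the disclosed method: the linear calibration between the saturation points and the numeric limits are assumptions, since the patent does not specify how the depth signal is mapped to distance.

```python
def image_depth_signal(signal_110a, signal_110b):
    """Image depth signal for one pixel: output of area 110A minus output of area 110B."""
    return signal_110a - signal_110b

def depth_from_signal(depth_signal, near_limit_m=0.3, far_limit_m=5.0,
                      signal_at_near=-1.0, signal_at_far=1.0):
    """Map a depth signal to an object distance in meters (illustrative calibration).

    Signals outside the resolvable range are clamped, so objects beyond the
    far limit or closer than the near limit are reported at those limits,
    mirroring the behavior described for distances D3 and D4.
    """
    s = max(signal_at_near, min(signal_at_far, depth_signal))
    frac = (s - signal_at_near) / (signal_at_far - signal_at_near)
    return near_limit_m + frac * (far_limit_m - near_limit_m)

if __name__ == "__main__":
    print(depth_from_signal(image_depth_signal(0.8, 0.2)))   # positive signal: farther than D0
    print(depth_from_signal(image_depth_signal(0.2, 0.8)))   # negative signal: closer than D0
    print(depth_from_signal(image_depth_signal(1.5, 0.0)))   # saturated: clamped to the far limit
```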
Depth sensing pixel 100 may be formed having the same color filter for both photosensitive regions or having multiple color filters as shown in FIG. 6. Color filter 214A may cover photosensitive region 110A, whereas color filter 214B may cover photosensitive region 110B. Color filter 214B may allow light of a first color to reach photosensitive region 110B, whereas color filter 214A may allow light of a second color to reach photosensitive region 110A. For example, color filter 214B may allow green light to pass, whereas color filter 214A may allow red light to pass. This example is merely illustrative. Depth sensing pixels may be formed with red color filters, blue color filters, green color filters, or color filters that pass other desirable wavelengths of light such as infrared and ultraviolet light wavelengths. If desired, depth sensing pixels may be formed with color filters that pass multiple wavelengths of light. For example, to increase the amount of light absorbed by a depth sensing pixel, the depth sensing pixel may be formed with a color filter that passes many wavelengths of light. As another example, the depth sensing pixel may be formed without a color filter (sometimes referred to as a clear pixel). Pixel 100 may be formed with any desired number of color filters. For example, color filters may be provided for and substantially cover respective photosensitive regions of pixel 100. Alternatively, color filters may be provided for respective groups of photosensitive regions of pixel 100. - As shown in
FIG. 6, an optional opaque layer 216 may be provided that covers an intermediate region between photosensitive areas 110A and 110B. Optional opaque layer 216 may be formed from metals, dielectric materials, or any other desirable opaque material (e.g., a material that prevents light from passing through to substrate 108). Opaque layer 216 may be formed to directly contact substrate 108 or, if desired, may be formed at a distance from substrate 108. Opaque layer 216 helps to isolate photosensitive region 110A from photosensitive region 110B, which may help to improve depth sensing operations. -
FIG. 7A is a simplified top-down view of photosensitive areas 110A and 110B of a pixel 100 (e.g., FIG. 7A may represent the footprint of pixel 100 within a pixel array). Areas 110A and 110B may be formed horizontally, with photosensitive area 110B on the left and photosensitive area 110A on the right. A camera lens (e.g., camera lens 202) used to focus light onto a pixel array including pixels with horizontally formed photosensitive areas may be horizontally partitioned (e.g., the resulting lens pupil division may be horizontal). In this scenario, the distance to an object may be determined from light originating from either the left side of the lens pupil or the right side of the lens pupil. -
In addition to the depth sensing mode of operation, pixel 100 of an image sensor may be used for two-dimensional image capture. In this case, charges acquired in photosensitive areas 110A and 110B can be summed together using a common floating diffusion node. -
Pixels 100 of an image sensor may be formed along any desired axis. FIG. 7B is a simplified top-down view of a pixel having photosensitive areas 110A and 110B that are formed vertically. In the arrangement of FIG. 7B, a lens 202 (not shown) used to focus light onto a pixel array formed from vertically split pixels 100 may be vertically partitioned into a top portion and a bottom portion (e.g., the resulting lens pupil division may be vertical). In this scenario, the distance to an object may be determined from light originating from either the top half of the lens pupil or the bottom half of the lens pupil. - The examples of
FIGS. 7A and 7B in which pixels 100 are formed from two photosensitive regions are merely illustrative. A pixel 100 may be formed having multiple photosensitive regions as shown in FIG. 7C. In the arrangement of FIG. 7C, pixel 100 may include photosensitive areas 110A, 110B, 110C, and 110D. In this scenario, a camera lens used to focus light onto an array of the pixels may be both vertically and horizontally partitioned. - As an example, image signal outputs from
photosensitive areas 110A and 110B may be summed to form a first combined signal output and image signal outputs from photosensitive areas 110C and 110D may be summed to form a second combined signal output. In this scenario, the first combined signal output and the second combined signal output may be used to determine image depth signals that horizontally partition the camera lens (e.g., into left and right portions). Vertical partitioning of the camera lens may be performed by summing image signals from photosensitive areas 110B and 110C to form a third combined signal output and summing image signals from photosensitive areas 110A and 110D to form a fourth combined signal output. The third and fourth combined signal outputs may be used to determine image depth signals that vertically partition the camera lens. In the example of FIG. 7C, image signals from pixel 100 may be processed to perform vertical and horizontal camera lens splitting. -
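A small sketch of these signal combinations (the quadrant naming and data structure are assumptions made for illustration, not part of the disclosure) might look like the following, where the four sub-region outputs of a pixel such as the one in FIG. 7C are summed in pairs for horizontal and vertical pupil division.

```python
from typing import NamedTuple

class QuadPixelSignals(NamedTuple):
    """Image signal outputs of the four photosensitive areas of one pixel."""
    a: float  # photosensitive area 110A
    b: float  # photosensitive area 110B
    c: float  # photosensitive area 110C
    d: float  # photosensitive area 110D

def horizontal_depth_signal(p: QuadPixelSignals) -> float:
    """Depth signal from left/right pupil division: (110A + 110B) minus (110C + 110D)."""
    return (p.a + p.b) - (p.c + p.d)

def vertical_depth_signal(p: QuadPixelSignals) -> float:
    """Depth signal from top/bottom pupil division: (110B + 110C) minus (110A + 110D)."""
    return (p.b + p.c) - (p.a + p.d)

def two_dimensional_value(p: QuadPixelSignals) -> float:
    """Ordinary image value: the sum of all four photosensitive areas."""
    return p.a + p.b + p.c + p.d

if __name__ == "__main__":
    pix = QuadPixelSignals(a=0.30, b=0.35, c=0.20, d=0.15)
    print(horizontal_depth_signal(pix), vertical_depth_signal(pix), two_dimensional_value(pix))
```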
Pixel 100 may be used for two-dimensional image capture. In this case, signals from photosensitive areas 110A, 110B, 110C, and 110D can be summed together by binning charges on the common floating diffusion node. - If desired,
pixels 100 may be split into photosensitive regions along any desired axis (e.g., horizontally, vertically, diagonally, etc.). Pixel arrays may be formed having only one type of pixels 100, two types (e.g., horizontally and vertically split pixels), or more types of lens division. -
FIG. 8 illustrates a simplified block diagram of an imager 14, for example a CMOS imager, employing a pixel array 301 having depth sensing pixels. Pixel array 301 includes a plurality of pixels (e.g., depth sensing pixels and/or regular pixels) arranged in a predetermined number of columns and rows. The row lines are selectively activated by the row driver 302 in response to row address decoder 303 and the column select lines are selectively activated by the column driver 304 in response to column address decoder 305. Thus, a row and column address is provided for each pixel. -
Imager 14 is operated by a timing and control circuit 306, which controls address decoders 303 and 305 for selecting the appropriate row and column lines for pixel readout, and row and column driver circuitry 302, 304, which apply driving voltages to the drive transistors of the selected row and column lines. The pixel signals, which typically include a pixel reset signal Vrst and a pixel image signal Vsig for each pixel (or each photosensitive region of each pixel), are sampled by sample and hold circuitry 307 associated with the column driver 304. A differential signal Vrst−Vsig is produced for each pixel (or each photosensitive area of each pixel), which is amplified by an amplifier 308 and digitized by analog-to-digital converter 309. The analog-to-digital converter 309 converts the analog pixel signals to digital signals, which are fed to an image processor 310 that forms a digital image. Image processor 310 may, for example, be provided as part of image processing and data formatting circuitry 16 of FIG. 1. -
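The reset-minus-signal readout described above is essentially a correlated double sampling step followed by digitization. The sketch below only illustrates that arithmetic; the full-scale voltage, bit depth, and function names are assumptions and do not describe circuits 307, 308, and 309 themselves.

```python
def correlated_double_sample(v_rst, v_sig):
    """Differential signal Vrst - Vsig for one pixel or photosensitive region.

    Subtracting the sampled reset level removes the reset offset so the
    remaining value tracks only the light-induced signal swing.
    """
    return v_rst - v_sig

def digitize(voltage, full_scale=1.0, bits=12):
    """Convert the differential voltage to an ADC code (illustrative 12-bit converter)."""
    clamped = max(0.0, min(full_scale, voltage))
    return round(clamped / full_scale * (2 ** bits - 1))

if __name__ == "__main__":
    v_rst, v_sig = 1.80, 1.35                      # sampled reset and image levels in volts
    diff = correlated_double_sample(v_rst, v_sig)  # 0.45 V differential signal
    print(diff, digitize(diff))                    # differential voltage and its digital code
```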
FIG. 9 is a simplified diagram of an illustrative processor system 400, such as a digital camera, which includes an imaging device 12 (e.g., the camera module of FIG. 1) employing an imager having depth sensing pixels as described above. The processor system 400 is exemplary of a system having digital circuits that could include imaging device 12. Without being limiting, such a system could include a computer system, still or video camera system, scanner, machine vision system, vehicle navigation system, video phone, surveillance system, auto focus system, star tracker system, motion detection system, image stabilization system, and other systems employing an imaging device. -
Processor system 400, for example a digital still or video camera system, generally includes a lens 202 for focusing an image on pixel array 301 when a shutter release button 497 is pressed, and a central processing unit (CPU) 495, such as a microprocessor that controls camera functions and one or more image flow functions, which communicates with one or more input/output (I/O) devices 491 over a bus 493. Imaging device 12 also communicates with CPU 495 over bus 493. System 400 also includes random access memory (RAM) 492 and can optionally include removable memory 494, such as flash memory, which also communicates with CPU 495 over the bus 493. Imaging device 12 may be combined with the CPU, with or without memory storage, on a single integrated circuit or on a different chip. Although bus 493 is illustrated as a single bus, it may be one or more busses, bridges or other communication paths used to interconnect system components of system 400. - Image signal outputs from photosensitive regions of one or more pixels may be combined using shared readout circuitry.
FIG. 10 is a diagram of an illustrative pixel array 301 including a group of pixels 100A and 100B that share readout circuitry 522. In the example of FIG. 10, the photosensitive areas 110 of pixels 100A and 100B are formed horizontally for each pixel (e.g., as shown in FIG. 7A). However, if desired, the photosensitive areas of pixels 100A and 100B may be formed in any desired arrangement (e.g., the arrangements of FIGS. 7A, 7B, 7C, any desired combination of these arrangements, or other arrangements). -
Photosensitive regions 110 may form photodiodes that are read out using shared readout circuitry 522. -
Photosensitive regions 110 of pixel 100A may form photodiodes PD1 L and PD1 R, whereas photosensitive regions 110 of pixel 100B may form photodiodes PD2 L and PD2 R. Each photodiode may be coupled to shared readout circuitry 522 via a respective transfer gate 524. In the example of FIG. 10, photodiodes PD1 L, PD1 R, PD2 L, and PD2 R may be coupled to shared readout circuitry 522 via transfer gates TX1 L, TX1 R, TX2 L, and TX2 R, respectively. - Shared
readout circuitry 522 may include a common storage region (e.g., a floating diffusion region) to which readout circuitry such as readout transistors and reset transistors are coupled. Image signals (i.e., acquired charge) from photosensitive regions 110 may be transferred to the common storage region by controlling transfer gates 524 (e.g., by activating or deactivating transfer gates 524). - An imager such as
imager 14 of FIG. 8 may be configured to selectively drive transfer gates 524 so that image signals are read from photodiodes 110 sequentially or in parallel. For example, row driver circuitry 302 may be configured to selectively activate transfer gates 524 to transfer image signals to shared readout circuitry 522 (e.g., image signals from only photodiode PD1 L may be read by activating transfer gate TX1 L while deactivating transfer gates TX1 R, TX2 L, and TX2 R, whereas image signals from photodiodes PD1 L and PD1 R may be read in parallel by activating transfer gates TX1 L and TX1 R while deactivating transfer gates TX2 L and TX2 R). -
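The transfer-gate selection described above can be pictured as a small control sketch. The gate names below follow the figure, but the data structure and the idea of a simple pattern generator are assumptions made for illustration, not a description of row driver circuitry 302.

```python
GATES = ("TX1_L", "TX1_R", "TX2_L", "TX2_R")

def gate_pattern(active_gates):
    """Return a dict mapping each transfer gate to True (asserted) or False (deasserted).

    Only the photodiodes behind asserted gates dump their charge onto the
    shared floating diffusion node, so the same structure supports both
    sequential (one gate) and parallel, charge-summed (several gates) reads.
    """
    active = set(active_gates)
    unknown = active - set(GATES)
    if unknown:
        raise ValueError(f"unknown transfer gates: {sorted(unknown)}")
    return {gate: gate in active for gate in GATES}

if __name__ == "__main__":
    print(gate_pattern(["TX1_L"]))            # sequential read: only photodiode PD1 L
    print(gate_pattern(["TX1_L", "TX1_R"]))   # parallel read: both photodiodes of pixel 100A
```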
FIG. 11 is a circuit diagram of an illustrative pixel array 301 including depth sensing pixels 100A and 100B that share readout circuitry 522 (e.g., pixel array 301 of FIG. 10). As shown in FIG. 11, depth sensing pixels 100A and 100B may include photodiodes (e.g., formed from corresponding photosensitive regions) that are each coupled to shared floating diffusion region 532 (sometimes referred to as a charge storage node or charge detection node, because charge from the photodiodes is detected and at least temporarily stored at the charge storage node). -
power supply terminal 540. Before an image is acquired, reset control signal RST may be asserted, which enables resettransistor 534. When enabled,reset transistor 534 resetscharge storage node 532. Reset control signal RST may then be de-asserted to disable reset signal RST (e.g., thereby disconnecting supply voltage VAA from floating diffusion region 532). - During signal readout operations, image signals from
pixels 100A and 100B may be transferred to charge storage node 532 via transfer gates TX1 L, TX1 R, TX2 L, and TX2 R (e.g., as described in connection with FIG. 10). Transfer gates TX1 L, TX1 R, TX2 L, and TX2 R may be controlled via respective control signals C1, C2, C3, and C4 that are provided to gate terminals of the transfer gates. Row select signal RS may be provided using row driver circuitry such as row driver circuit 302 of FIG. 8. The row select signal may be asserted to transfer the stored image signals from charge storage node 532 to sample and hold circuitry such as circuitry 307 of FIG. 8. - The example of
FIG. 11 in which two adjacent pixels are grouped to share readout circuitry 522 is merely illustrative. If desired, any number of adjacent pixels may be grouped to share readout circuitry. For example, four pixels (or more) may be grouped to share readout circuitry. If desired, each pixel may be provided with respective readout circuitry 522. If desired, each photosensitive region may be provided with respective readout circuitry 522 (e.g., readout circuitry 522 may be provided for each photosensitive region 110). - Two-dimensional image information may be obtained from a
pixel 100 by summing the output signals from the photosensitive regions of that pixel. For example, image processing and data formatting circuitry 16 of FIG. 1 or image processor 310 of FIG. 8 may be used to combine the output signals from the photosensitive areas of each pixel 100 in image sensor 14 to obtain a regular two-dimensional color image. As another example, charge summing (e.g., as described in connection with FIG. 11) between the photosensitive areas of each pixel 100 may be used to combine the light collected by each photosensitive area to form a two-dimensional image. -
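For a whole array, the same summation (for the two-dimensional image) and subtraction (for depth) can be applied element-wise. The following sketch assumes the two sub-pixel images are available as equally sized lists of rows; it illustrates the post-processing arithmetic only and is not a description of circuitry 16 or image processor 310.

```python
def combine_half_images(half_a, half_b):
    """Element-wise sum and difference of two sub-pixel images.

    half_a, half_b: lists of rows of image signals from photosensitive
    areas 110A and 110B of every pixel. Returns (two_d_image, depth_map).
    """
    two_d_image, depth_map = [], []
    for row_a, row_b in zip(half_a, half_b):
        two_d_image.append([a + b for a, b in zip(row_a, row_b)])
        depth_map.append([a - b for a, b in zip(row_a, row_b)])
    return two_d_image, depth_map

if __name__ == "__main__":
    half_a = [[0.6, 0.5], [0.4, 0.3]]   # signals from areas 110A
    half_b = [[0.2, 0.5], [0.6, 0.3]]   # signals from areas 110B
    image, depth = combine_half_images(half_a, half_b)
    print(image)   # two-dimensional image (sums)
    print(depth)   # image depth signals (differences)
```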
FIG. 12 is a diagram of an illustrative depth sensing pixel 100C having four photosensitive regions 110 that form respective photodiodes PD1, PD2, PD3, and PD4. The photodiodes of depth sensing pixel 100C may share readout circuitry 522 (e.g., the photodiodes may be coupled to a shared floating diffusion region of circuitry 522). Each photodiode may be coupled to shared readout circuitry 522 via a respective transfer gate 524 (e.g., photodiode PD1 may be coupled to readout circuitry 522 via transfer gate TX1, etc.). - During depth sensing operations, each photodiode of
depth sensing pixel 100C may be read separately by activating the corresponding transfer gate (e.g., while the remaining transfer gates are de-activated). The image signals may be combined to perform horizontal lens splitting, vertical lens splitting, and diagonal lens splitting. If desired, groups of photodiodes may be read together by selectively activating transfer gates. For example, photodiodes PD1 and PD2 may be read together, photodiodes PD1 and PD3 may be read together, etc. During two-dimensional imaging operations, all transfer gates may be activated to transfer charge from photodiodes PD1, PD2, PD3, and PD4 simultaneously. -
FIG. 13 is an illustrative cross-sectional side view of pixel array 301 including pixel 100C (FIG. 12) with a corresponding graph 600 of photodiode potential across axis 602. Axis 602 may sometimes be referred to as a potential cutline. In the example of FIG. 13, pixel 100C may be implemented as a back-side illuminated pixel in which photosensitive regions (e.g., photodiodes) are interposed between light-receiving surfaces (e.g., microlens 102 and color filters 104) and circuitry such as transfer gates 524, metal interconnects, shared readout circuitry, etc. - As shown by
graph 600, the photodiode potential across axis 602 may include barriers 608 and 606 (e.g., because photodiode potential at barriers 608 and 606 is lower than at other locations along axis 602 of pixel array 301). Barriers 608 and 606 may be formed partially by physical separation between photodiodes of pixel 100C and between neighboring pixels. For example, barrier 608 may be formed via physical separation between photodiodes PD1 and PD2. Barriers 606 may be formed partially by differing pixel attributes. For example, pixel 100C may include color filters 104 that pass a first color (e.g., green), whereas neighboring pixels may include color filters 104 that pass other colors (e.g., blue, red). In this scenario, the physical distance D1 separating pixel 100C and neighboring pixels may combine with the different color filter attributes to provide increased inter-pixel isolation relative to intra-pixel isolation. In other words, inter-pixel barrier 606 that isolates neighboring pixels may be greater than intra-pixel barrier 608 that isolates neighboring photodiodes within a pixel. By providing barriers 606 and 608, electrical cross-talk between pixels and between photodiodes may be reduced, which helps to improve captured image quality. In general, it may be desirable to have increased isolation between neighboring pixels (inter-pixel isolation). - It may be desirable to have reduced intra-pixel isolation relative to inter-pixel isolation to balance sensitivity loss with depth sensing image quality and two-dimensional image quality, as shown by
barrier difference 604 between barriers 608 and 606. For example, charge summing between photodiodes PD1 and PD2 may be performed to obtain two-dimensional images. In this scenario, it may be desirable to have a reduced barrier 608 to help maximize the total amount of combined light absorbed by photodiodes PD1 and PD2. - Various embodiments have been described illustrating imagers with depth sensing capabilities.
- An imager may include depth sensing pixels that receive incident light and convert the received light into electrical signals. The imager may have an associated imaging lens that focuses incident light onto the imager. Each of the depth sensing pixels may include a microlens that focuses incident light from the imaging lens through a color filter onto a substrate region. Each depth sensing pixel may include first and second photosensitive regions in the substrate region that receive incident light from the microlens. The first and second photosensitive regions may provide different and asymmetrical angular responses to incident light. The angular response of the first photosensitive region may be substantially inverted from the angular response of the second photosensitive region.
- The first and second photosensitive regions of a given depth sensing pixel may effectively divide the corresponding imaging lens pupil into separate portions. The first photosensitive region may receive incident light from a first portion of the corresponding imaging lens pupil. The second photosensitive region may receive incident light from a second portion of the corresponding imaging lens pupil. The photosensitive regions may be configured to divide the imaging lens pupil along a horizontal axis, vertical axis, or any desired axis.
- Depth information for each depth sensing pixel may be determined based on the difference between output signals of the first and second photosensitive regions of that depth sensing pixel. Color information for each depth sensing pixel may be determined from a summation of output signals of the first and second photosensitive regions.
- The foregoing is merely illustrative of the principles of this invention and various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention.
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/174,558 US20190089944A1 (en) | 2012-02-27 | 2018-10-30 | Imaging pixels with depth sensing capabilities |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201261603855P | 2012-02-27 | 2012-02-27 | |
| US13/728,086 US9554115B2 (en) | 2012-02-27 | 2012-12-27 | Imaging pixels with depth sensing capabilities |
| US15/375,654 US10158843B2 (en) | 2012-02-27 | 2016-12-12 | Imaging pixels with depth sensing capabilities |
| US16/174,558 US20190089944A1 (en) | 2012-02-27 | 2018-10-30 | Imaging pixels with depth sensing capabilities |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/375,654 Continuation US10158843B2 (en) | 2012-02-27 | 2016-12-12 | Imaging pixels with depth sensing capabilities |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20190089944A1 true US20190089944A1 (en) | 2019-03-21 |
Family
ID=49002442
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/728,086 Active 2033-08-29 US9554115B2 (en) | 2012-02-27 | 2012-12-27 | Imaging pixels with depth sensing capabilities |
| US15/375,654 Active US10158843B2 (en) | 2012-02-27 | 2016-12-12 | Imaging pixels with depth sensing capabilities |
| US16/174,558 Abandoned US20190089944A1 (en) | 2012-02-27 | 2018-10-30 | Imaging pixels with depth sensing capabilities |
Family Applications Before (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/728,086 Active 2033-08-29 US9554115B2 (en) | 2012-02-27 | 2012-12-27 | Imaging pixels with depth sensing capabilities |
| US15/375,654 Active US10158843B2 (en) | 2012-02-27 | 2016-12-12 | Imaging pixels with depth sensing capabilities |
Country Status (1)
| Country | Link |
|---|---|
| US (3) | US9554115B2 (en) |
Families Citing this family (34)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8742309B2 (en) | 2011-01-28 | 2014-06-03 | Aptina Imaging Corporation | Imagers with depth sensing capabilities |
| US10015471B2 (en) | 2011-08-12 | 2018-07-03 | Semiconductor Components Industries, Llc | Asymmetric angular response pixels for single sensor stereo |
| US9554115B2 (en) * | 2012-02-27 | 2017-01-24 | Semiconductor Components Industries, Llc | Imaging pixels with depth sensing capabilities |
| JP6075646B2 (en) * | 2014-03-17 | 2017-02-08 | ソニー株式会社 | Solid-state imaging device, driving method thereof, and electronic apparatus |
| US9491442B2 (en) * | 2014-04-28 | 2016-11-08 | Samsung Electronics Co., Ltd. | Image processing device and mobile computing device having the same |
| US9445018B2 (en) | 2014-05-01 | 2016-09-13 | Semiconductor Components Industries, Llc | Imaging systems with phase detection pixels |
| US9888198B2 (en) | 2014-06-03 | 2018-02-06 | Semiconductor Components Industries, Llc | Imaging systems having image sensor pixel arrays with sub-pixel resolution capabilities |
| US9338380B2 (en) | 2014-06-30 | 2016-05-10 | Semiconductor Components Industries, Llc | Image processing methods for image sensors with phase detection pixels |
| US9432568B2 (en) | 2014-06-30 | 2016-08-30 | Semiconductor Components Industries, Llc | Pixel arrangements for image sensors with phase detection pixels |
| JP6448289B2 (en) * | 2014-10-07 | 2019-01-09 | キヤノン株式会社 | Imaging apparatus and imaging system |
| US9741755B2 (en) | 2014-12-22 | 2017-08-22 | Google Inc. | Physical layout and structure of RGBZ pixel cell unit for RGBZ image sensor |
| JP6218799B2 (en) | 2015-01-05 | 2017-10-25 | キヤノン株式会社 | Imaging device and imaging apparatus |
| US10070088B2 (en) * | 2015-01-05 | 2018-09-04 | Canon Kabushiki Kaisha | Image sensor and image capturing apparatus for simultaneously performing focus detection and image generation |
| US9749556B2 (en) | 2015-03-24 | 2017-08-29 | Semiconductor Components Industries, Llc | Imaging systems having image sensor pixel arrays with phase detection capabilities |
| US10694169B1 (en) | 2015-08-14 | 2020-06-23 | Apple Inc. | Depth mapping with polarization and focus pixels |
| JP7005125B2 (en) | 2016-04-22 | 2022-01-21 | キヤノン株式会社 | Image sensor, image sensor, and method for manufacturing the image sensor |
| JP6738200B2 (en) * | 2016-05-26 | 2020-08-12 | キヤノン株式会社 | Imaging device |
| KR102391632B1 (en) * | 2016-06-07 | 2022-04-27 | 애어리3디 인크. | Light field imaging device and depth acquisition and three-dimensional imaging method |
| US10033949B2 (en) | 2016-06-16 | 2018-07-24 | Semiconductor Components Industries, Llc | Imaging systems with high dynamic range and phase detection pixels |
| US10764515B2 (en) * | 2016-07-05 | 2020-09-01 | Futurewei Technologies, Inc. | Image sensor method and apparatus equipped with multiple contiguous infrared filter elements |
| US10205937B2 (en) * | 2016-08-02 | 2019-02-12 | Apple Inc. | Controlling lens misalignment in an imaging system |
| US10110840B2 (en) | 2016-10-25 | 2018-10-23 | Semiconductor Components Industries, Llc | Image sensor pixels with overflow capabilities |
| US10574872B2 (en) | 2016-12-01 | 2020-02-25 | Semiconductor Components Industries, Llc | Methods and apparatus for single-chip multispectral object detection |
| US10271037B2 (en) | 2017-01-20 | 2019-04-23 | Semiconductor Components Industries, Llc | Image sensors with hybrid three-dimensional imaging |
| US10075663B2 (en) * | 2017-01-20 | 2018-09-11 | Semiconductor Components Industries, Llc | Phase detection pixels with high speed readout |
| KR102354991B1 (en) | 2017-05-24 | 2022-01-24 | 삼성전자주식회사 | Pixel circuit and image sensor including thereof |
| US10313613B2 (en) | 2017-10-24 | 2019-06-04 | Semiconductor Components Industries, Llc | High dynamic range image sensors with flicker and fixed pattern noise mitigation |
| US10536652B2 (en) | 2018-01-08 | 2020-01-14 | Semiconductor Components Industries, Llc | Image sensors with split photodiodes |
| KR102523281B1 (en) | 2018-03-09 | 2023-04-18 | 삼성전자주식회사 | 3D image sensor |
| DE102018216199A1 (en) * | 2018-09-24 | 2020-03-26 | Robert Bosch Gmbh | Image sensor element for outputting an image signal and method for producing an image sensor element for outputting an image signal |
| US10410368B1 (en) | 2018-09-27 | 2019-09-10 | Qualcomm Incorporated | Hybrid depth processing |
| CN110108283B (en) * | 2019-05-10 | 2020-11-17 | 成都四相致新科技有限公司 | High-precision positioning method based on multi-two-dimension code vision |
| KR20220019895A (en) * | 2020-08-10 | 2022-02-18 | 삼성전자주식회사 | Image sensor |
| KR102856413B1 (en) * | 2020-11-19 | 2025-09-05 | 에스케이하이닉스 주식회사 | Image sensing device |
Citations (84)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6166768A (en) * | 1994-01-28 | 2000-12-26 | California Institute Of Technology | Active pixel sensor array with simple floating gate pixels |
| US6396873B1 (en) * | 1999-02-25 | 2002-05-28 | Envision Advanced Medical Systems | Optical device |
| US20020117605A1 (en) * | 2001-01-08 | 2002-08-29 | Alden Ray M. | Three-dimensional receiving and displaying process and apparatus with military application |
| US20030211405A1 (en) * | 2002-05-13 | 2003-11-13 | Kartik Venkataraman | Color filter imaging array and method of formation |
| US20040012698A1 (en) * | 2001-03-05 | 2004-01-22 | Yasuo Suda | Image pickup model and image pickup device |
| US6714240B1 (en) * | 1998-06-23 | 2004-03-30 | Boeing North American, Inc. | Optical sensor employing motion compensated integration-device and process |
| US6856407B2 (en) * | 2000-09-13 | 2005-02-15 | Nextengine, Inc. | Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels |
| US20050051860A1 (en) * | 2003-09-10 | 2005-03-10 | Fuji Photo Film Co., Ltd. | Solid state image pickup device |
| US20050057655A1 (en) * | 2003-09-17 | 2005-03-17 | Kevin Duesman | Method for automated testing of the modulation transfer function in image sensors |
| US6933978B1 (en) * | 1999-10-28 | 2005-08-23 | Canon Kabushiki Kaisha | Focus detecting device with photoelectric conversion portion having microlens and with light blocking portion having first and second openings |
| US20050190453A1 (en) * | 2004-03-01 | 2005-09-01 | Hideki Dobashi | Image sensor |
| US20060066739A1 (en) * | 2004-09-24 | 2006-03-30 | Fuji Photo Film Co., Ltd. | Image pickup apparatus including photosensitive cells each having photosensitive regions partitioned |
| US20060249804A1 (en) * | 2004-07-08 | 2006-11-09 | Chandra Mouli | Photonic crystal-based lens elements for use in an image sensor |
| US20070023801A1 (en) * | 2005-07-27 | 2007-02-01 | Magnachip Semiconductor Ltd. | Stacked pixel for high resolution CMOS image sensor |
| US7290880B1 (en) * | 2005-07-27 | 2007-11-06 | Visionsense Ltd. | System and method for producing a stereoscopic image of an eye fundus |
| US20080018662A1 (en) * | 2006-07-21 | 2008-01-24 | Gazeley William George | Method and apparatus for preventing or reducing color cross-talk between adjacent pixels in an image sensor device |
| US20080080028A1 (en) * | 2006-10-02 | 2008-04-03 | Micron Technology, Inc. | Imaging method, apparatus and system having extended depth of field |
| US20080180558A1 (en) * | 2007-01-16 | 2008-07-31 | Sharp Kabushiki Kaisha | Amplification-type solid-state image capturing apparatus and electronic information device |
| US20080217718A1 (en) * | 2007-03-06 | 2008-09-11 | Micron Technology, Inc. | Method, apparatus, and system to reduce ground resistance in a pixel array |
| US20080259202A1 (en) * | 2006-11-28 | 2008-10-23 | Sony Corporation | Imaging device |
| US20080274581A1 (en) * | 2007-05-03 | 2008-11-06 | Park Jin-Ho | Method for manufacturing image sensor |
| US20080278820A1 (en) * | 2007-05-08 | 2008-11-13 | Micron Technology, Inc. | Tetraform microlenses and method of forming the same |
| US20090200589A1 (en) * | 2008-02-08 | 2009-08-13 | Omnivision Technologies, Inc. | Backside illuminated imaging sensor with improved infrared sensitivity |
| US20090230394A1 (en) * | 2008-03-12 | 2009-09-17 | Omnivision Technologies, Inc. | Image sensor array with conformal color filters |
| US20090244514A1 (en) * | 2008-03-26 | 2009-10-01 | Samsung Electronics Co., Ltd. | Distance measuring sensors including vertical photogate and three-dimensional color image sensors including distance measuring sensors |
| US20090284731A1 (en) * | 2008-05-13 | 2009-11-19 | Samsung Electronics Co., Ltd. | Distance measuring sensor including double transfer gate and three dimensional color image sensor including the distance measuring sensor |
| US7646943B1 (en) * | 2008-09-04 | 2010-01-12 | Zena Technologies, Inc. | Optical waveguides in image sensors |
| US20100020209A1 (en) * | 2008-07-25 | 2010-01-28 | Samsung Electronics Co., Ltd. | Imaging method and apparatus |
| US20100033829A1 (en) * | 2006-10-10 | 2010-02-11 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Device for Homogenizing Radiation by Means of Irregular Microlens Arrays |
| US20100060717A1 (en) * | 2006-12-04 | 2010-03-11 | Koninklijke Philips Electronics N.V. | Image processing system for processing combined image data and depth data |
| US20100091161A1 (en) * | 2007-06-16 | 2010-04-15 | Nikon Corporation | Solid-state image sensor and imaging apparatus equipped with solid-state image sensor |
| US20100117177A1 (en) * | 2008-11-07 | 2010-05-13 | Young Je Yun | Image Sensor and Method of Manufacturing the Same |
| US20100123771A1 (en) * | 2008-11-14 | 2010-05-20 | Samsung Electronics Co., Ltd. | Pixel circuit, photoelectric converter, and image sensing system including the pixel circuit and the photoelectric converter |
| US20100128109A1 (en) * | 2008-11-25 | 2010-05-27 | Banks Paul S | Systems And Methods Of High Resolution Three-Dimensional Imaging |
| US20100150538A1 (en) * | 2008-12-15 | 2010-06-17 | Sony Corporation | Image Pickup apparatus and focus control method |
| US20100238330A1 (en) * | 2007-09-13 | 2010-09-23 | Sony Corporation | Solid-state imaging device, signal processing method thereof and image capturing apparatus |
| US20100245656A1 (en) * | 2009-03-31 | 2010-09-30 | Sony Corporation | Imaging device and focus detecting method |
| US20100265381A1 (en) * | 2009-04-16 | 2010-10-21 | Sony Corporation | Imaging device |
| US20100290674A1 (en) * | 2009-05-14 | 2010-11-18 | Samsung Electronics Co., Ltd. | 3D image processing apparatus improving depth accuracy of region of interest and method |
| US20110007135A1 (en) * | 2009-07-09 | 2011-01-13 | Sony Corporation | Image processing device, image processing method, and program |
| US20110019184A1 (en) * | 2007-10-02 | 2011-01-27 | Nikon Corporation | Light receiving device, focus detection device and imaging device |
| US20110019049A1 (en) * | 2009-07-27 | 2011-01-27 | Samsung Electronics Co., Ltd. | Photo detecting apparatus and unit pixel thereof |
| US20110018974A1 (en) * | 2009-07-27 | 2011-01-27 | Sen Wang | Stereoscopic imaging using split complementary color filters |
| US20110025904A1 (en) * | 2008-03-11 | 2011-02-03 | Canon Kabushiki Kaisha | Focus detection device and imaging apparatus having the same |
| US20110042552A1 (en) * | 2009-08-19 | 2011-02-24 | Furuya Shogo | Solid-state imaging device and method of manufacturing the same |
| US7935560B2 (en) * | 2007-09-06 | 2011-05-03 | International Business Machines Corporation | Imagers having electrically active optical elements |
| US20110109776A1 (en) * | 2009-11-10 | 2011-05-12 | Fujifilm Corporation | Imaging device and imaging apparatus |
| US20110199506A1 (en) * | 2008-11-11 | 2011-08-18 | Canon Kabushiki Kaisha | Focus detection apparatus and control method therefor |
| US20110199602A1 (en) * | 2010-02-17 | 2011-08-18 | Suk Pil Kim | Sensor and method using the same |
| US20110249161A1 (en) * | 2010-04-12 | 2011-10-13 | Canon Kabushiki Kaisha | Solid-state imaging device |
| US8049801B2 (en) * | 2006-09-14 | 2011-11-01 | Nikon Corporation | Image sensor and imaging apparatus |
| US20110309236A1 (en) * | 2007-04-18 | 2011-12-22 | Invisage Technologies, Inc. | Materials, systems and methods for optoelectronic devices |
| US20120019695A1 (en) * | 2010-07-26 | 2012-01-26 | Omnivision Technologies, Inc. | Image sensor having dark sidewalls between color filters to reduce optical crosstalk |
| US20120043634A1 (en) * | 2010-08-17 | 2012-02-23 | Canon Kabushiki Kaisha | Method of manufacturing microlens array, method of manufacturing solid-state image sensor, and solid-state image sensor |
| US20120056073A1 (en) * | 2010-09-03 | 2012-03-08 | Jung Chak Ahn | Pixel, method of manufacturing the same, and image processing devices including the same |
| US20120133809A1 (en) * | 2010-11-29 | 2012-05-31 | Canon Kabushiki Kaisha | Solid state image sensor |
| US20120173184A1 (en) * | 2011-01-05 | 2012-07-05 | Samsung Electronics Co., Ltd. | Depth sensor, defect correction method thereof, and signal processing system including the depth sensor |
| US20120175501A1 (en) * | 2005-09-27 | 2012-07-12 | Omnivision Technologies, Inc. | Image sensing device and manufacture method thereof |
| US20120193515A1 (en) * | 2011-01-28 | 2012-08-02 | Gennadiy Agranov | Imagers with depth sensing capabilities |
| US20120212581A1 (en) * | 2011-02-17 | 2012-08-23 | Canon Kabushiki Kaisha | Image capture apparatus and image signal processing apparatus |
| US20120212654A1 (en) * | 2011-02-18 | 2012-08-23 | Canon Kabushiki Kaisha | Image pickup apparatus, focus detection method, image generation method, and storage medium |
| US20120267747A1 (en) * | 2009-12-04 | 2012-10-25 | Canon Kabushiki Kaisha | Solid-state image pickup device and method for manufacturing the same |
| US20120268634A1 (en) * | 2011-04-20 | 2012-10-25 | Canon Kabushiki Kaisha | Image sensor and image capturing apparatus |
| US20130020620A1 (en) * | 2008-09-04 | 2013-01-24 | Zena Technologies, Inc. | Optical waveguides in image sensors |
| US20130038691A1 (en) * | 2011-08-12 | 2013-02-14 | Aptina Imaging Corporation | Asymmetric angular response pixels for single sensor stereo |
| US20130128087A1 (en) * | 2010-08-27 | 2013-05-23 | Todor G. Georgiev | Methods and Apparatus for Super-Resolution in Integral Photography |
| US20130181309A1 (en) * | 2012-01-18 | 2013-07-18 | Canon Kabushiki Kaisha | Image pickup apparatus and image pickup system |
| US20130182158A1 (en) * | 2012-01-18 | 2013-07-18 | Canon Kabushiki Kaisha | Image pickup apparatus and image pickup system |
| US20130222553A1 (en) * | 2010-09-24 | 2013-08-29 | Fujifilm Corporation | Image pickup device and image pickup apparatus |
| US20130222552A1 (en) * | 2012-02-27 | 2013-08-29 | Aptina Imaging Corporation | Imaging pixels with depth sensing capabilities |
| US20130222662A1 (en) * | 2012-02-28 | 2013-08-29 | Canon Kabushiki Kaisha | Imaging device, imaging system, and method for driving imaging device |
| US8525906B2 (en) * | 2008-07-18 | 2013-09-03 | Sony Corporation | Solid-state imaging element and camera system |
| US20130256510A1 (en) * | 2012-03-29 | 2013-10-03 | Omnivision Technologies, Inc. | Imaging device with floating diffusion switch |
| US20130271646A1 (en) * | 2012-04-11 | 2013-10-17 | Canon Kabushiki Kaisha | Image capture apparatus and control method therefor |
| US20140103410A1 (en) * | 2012-10-11 | 2014-04-17 | Omnivision Technologies, Inc. | Partial buried channel transfer device in image sensors |
| US8730545B2 (en) * | 2011-03-24 | 2014-05-20 | Fujifilm Corporation | Color imaging element, imaging device, and storage medium storing a control program for imaging device |
| US20140192248A1 (en) * | 2013-01-07 | 2014-07-10 | Canon Kabushiki Kaisha | Imaging apparatus and method for controlling same |
| US20140218580A1 (en) * | 2011-08-26 | 2014-08-07 | E2V Semiconductors | Pixel-grouping image sensor |
| US20140253905A1 (en) * | 2013-03-06 | 2014-09-11 | Samsung Electronics Co., Ltd | Depth pixel and image pick-up apparatus including the same |
| US20150001589A1 (en) * | 2013-06-28 | 2015-01-01 | Canon Kabushiki Kaisha | Photoelectric conversion device and imaging system |
| US8947572B2 (en) * | 2010-05-24 | 2015-02-03 | Omnivision Technologies, Inc. | Dual-sided image sensor |
| US20150062422A1 (en) * | 2013-08-27 | 2015-03-05 | Semiconductor Components Industries, Llc | Lens alignment in camera modules using phase detection pixels |
| US9106826B2 (en) * | 2013-09-20 | 2015-08-11 | Fujifilm Corporation | Image capturing apparatus and focusing control method |
| US20150312461A1 (en) * | 2014-04-28 | 2015-10-29 | Tae Chan Kim | Image sensor including a pixel having photoelectric conversion elements and image processing device having the image sensor |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4826152B2 (en) * | 2005-06-23 | 2011-11-30 | 株式会社ニコン | Image composition method and imaging apparatus |
| KR100690169B1 (en) * | 2005-10-25 | 2007-03-08 | 매그나칩 반도체 유한회사 | CMOS image sensor |
- 2012-12-27: US application 13/728,086 (US9554115B2), status: Active
- 2016-12-12: US application 15/375,654 (US10158843B2), status: Active
- 2018-10-30: US application 16/174,558 (US20190089944A1), status: Abandoned
Also Published As
| Publication number | Publication date |
|---|---|
| US10158843B2 (en) | 2018-12-18 |
| US9554115B2 (en) | 2017-01-24 |
| US20170094260A1 (en) | 2017-03-30 |
| US20130222552A1 (en) | 2013-08-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10158843B2 (en) | Imaging pixels with depth sensing capabilities | |
| US10014336B2 (en) | Imagers with depth sensing capabilities | |
| US10498990B2 (en) | Imaging systems with high dynamic range and phase detection pixels | |
| US9445018B2 (en) | Imaging systems with phase detection pixels | |
| CN208014701U (en) | Imaging system and imaging sensor | |
| US20180288398A1 (en) | Asymmetric angular response pixels for singl sensor stereo | |
| US10015416B2 (en) | Imaging systems with high dynamic range and phase detection pixels | |
| US10797090B2 (en) | Image sensor with near-infrared and visible light phase detection pixels | |
| US10284769B2 (en) | Image sensor with in-pixel depth sensing | |
| US9883128B2 (en) | Imaging systems with high dynamic range and phase detection pixels | |
| US10593712B2 (en) | Image sensors with high dynamic range and infrared imaging toroidal pixels | |
| US9729806B2 (en) | Imaging systems with phase detection pixels | |
| US20180301484A1 (en) | Image sensors with high dynamic range and autofocusing hexagonal pixels | |
| US10419664B2 (en) | Image sensors with phase detection pixels and a variable aperture | |
| US20170374306A1 (en) | Image sensor system with an automatic focus function | |
| US20150281538A1 (en) | Multi-array imaging systems and methods | |
| US10075663B2 (en) | Phase detection pixels with high speed readout | |
| US20150244957A1 (en) | Backside illuminated imaging systems having auto-focus pixels | |
| US20210280623A1 (en) | Phase detection pixels with stacked microlenses |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: SEMICONDUCTOR COMPONENTS INDUSTRIES, LLC, ARIZONA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APTINA IMAGING CORPORATION;REEL/FRAME:047354/0072; Effective date: 20141217. Owner name: APTINA IMAGING CORPORATION, CAYMAN ISLANDS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AGRANOV, GENNADIY;KOMORI, HIROFUMI;CAO, DONGQING;SIGNING DATES FROM 20121220 TO 20121224;REEL/FRAME:047354/0018 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | AS | Assignment | Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT, NEW YORK; Free format text: SECURITY INTEREST;ASSIGNORS:SEMICONDUCTOR COMPONENTS INDUSTRIES, LLC;FAIRCHILD SEMICONDUCTOR CORPORATION;REEL/FRAME:048327/0670; Effective date: 20190122 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| | AS | Assignment | Owner name: FAIRCHILD SEMICONDUCTOR CORPORATION, ARIZONA; Owner name: SEMICONDUCTOR COMPONENTS INDUSTRIES, LLC, ARIZONA; Free format text: RELEASE OF SECURITY INTEREST IN PATENTS RECORDED AT REEL 048327, FRAME 0670;ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT;REEL/FRAME:064079/0001; Effective date: 20230622 |