
WO2015198876A1 - Imaging element and electronic device - Google Patents

Imaging element and electronic device

Info

Publication number
WO2015198876A1
WO2015198876A1 (PCT/JP2015/066830)
Authority
WO
WIPO (PCT)
Prior art keywords
light
diffusion layer
image sensor
gate voltage
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2015/066830
Other languages
English (en)
Japanese (ja)
Inventor
Shunsuke Furuse (駿介 古瀬)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of WO2015198876A1 publication Critical patent/WO2015198876A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F39/00 Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F39/80 Constructional details of image sensors
    • H10F39/802 Geometry or disposition of elements in pixels, e.g. address-lines or gate electrodes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 Photometry, e.g. photographic exposure meter
    • G01J1/02 Details
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/30 Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J3/36 Investigating two or more bands of a spectrum by separate detectors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10D INORGANIC ELECTRIC SEMICONDUCTOR DEVICES
    • H10D99/00 Subject matter not provided for in other groups of this subclass
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F30/00 Individual radiation-sensitive semiconductor devices in which radiation controls the flow of current through the devices, e.g. photodetectors
    • H10F30/20 Individual radiation-sensitive semiconductor devices in which radiation controls the flow of current through the devices, e.g. photodetectors, the devices having potential barriers, e.g. phototransistors
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F39/00 Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F39/10 Integrated devices
    • H10F39/12 Image sensors
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F99/00 Subject matter not provided for in other groups of this subclass

Definitions

  • This technology relates to image sensors and electronic devices. More specifically, the present invention relates to an image sensor and an electronic device that have improved performance for detecting the intensity of each wavelength of incident light.
  • Each pixel has three diffusion layers, at depths of 0.2 μm, 0.6 μm, and 2 μm, stacked in a silicon substrate.
  • Each pixel has a three-layer structure designed so that the three primary colors of light, red (R), green (G), and blue (B), which differ in wavelength, are transmitted and received at different depths according to the transmission characteristics of silicon.
  • RGB light enters from the surface of the silicon substrate; all of R, G, and B is captured in the uppermost layer, R and G (excluding the B component absorbed in the uppermost layer) are captured in the middle layer, and only the light that has passed through the uppermost and middle layers (i.e., R) is captured in the lowermost layer.
  • the G value is obtained by subtracting the R value captured in the lowermost layer from the RG value captured in the middle layer
  • the B value is obtained by subtracting the R and G values from the RGB values captured in the uppermost layer.
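The subtraction scheme in the two bullets above can be illustrated with a minimal arithmetic sketch; the per-layer charge values below are hypothetical, chosen only to show the bookkeeping:

```python
# Hypothetical per-layer charge readings (arbitrary units) for the stacked
# scheme above: the bottom layer sees only R, the middle sees R+G, and the
# top sees R+G+B.
top = 90      # R + G + B
middle = 50   # R + G
bottom = 20   # R

r = bottom             # R is read directly from the bottom layer
g = middle - bottom    # G = (R+G) - R
b = top - middle       # B = (R+G+B) - (R+G)
print(r, g, b)         # -> 20 30 40
```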
  • The photodiode configuration described above separates RGB in the depth direction, but the electron-capture position cannot be changed and the circuit is complicated, so the degree of design freedom is low. Furthermore, since independent color information is generated continuously within one pixel, it is very difficult to distinguish correct data from noise in the continuous portions, and complicated software is required to obtain the RGB characteristics.
  • A spectroscopic sensor has been proposed that has a structure in which a single photodiode receives the incident light and the potential depth of the photodiode can be controlled by changing the gate voltage (see, for example, Patent Document 2).
  • In this sensor, the wavelength and intensity of the incident light can be measured by changing the gate voltage, which changes the depth at which electrons generated by light incident on the photodiode are captured.
  • Patent Document 3 proposes a spectroscopic sensor, configured as a back-illuminated image sensor, in which the potential depth of the photodiode can be controlled by changing the gate voltage. In Patent Document 3 as well, however, the gate electrode is on the incident surface side, so the gate electrode absorbs light, and this absorption must be taken into account during analysis.
  • This technology has been made in view of such a situation, and makes it possible to perform highly sensitive spectroscopic measurement with a single element.
  • An imaging device according to the present technology includes: a diffusion layer formed on a semiconductor substrate; and an electrode that is formed on a side of the diffusion layer different from the light incident side and to which a gate voltage is applied. By changing the gate voltage, the depth from the surface of the diffusion layer at which charges generated by incident light are captured is changed, and, for each gate voltage, a current indicating the amount of charge generated in the region from the surface to that depth is measured, whereby the intensity of the incident light at each wavelength is analyzed.
  • the electrode can be made of metal.
  • the first substrate including the diffusion layer and the electrode and the second substrate including the circuit for performing the analysis may be joined.
  • The bonding can be bump bonding.
  • a floating diffusion may be further provided, and the floating diffusion may be shared by a plurality of imaging elements.
  • a photoelectric conversion film can be further provided on the side of the diffusion layer different from the side where the electrode is formed.
  • An electronic apparatus according to the present technology includes an image sensor and a processing unit that processes a signal from the image sensor. The image sensor includes a diffusion layer formed on a semiconductor substrate and an electrode that is formed on a side of the diffusion layer different from the light incident side and to which a gate voltage is applied; by changing the gate voltage, the depth from the surface of the diffusion layer at which charges generated by incident light are captured is changed, and, for each gate voltage, a current indicating the amount of charge generated in the region from the surface to that depth is measured, whereby the intensity of the incident light at each wavelength is analyzed.
  • An imaging device includes a diffusion layer formed on a semiconductor substrate and an electrode that is formed on a side of the diffusion layer different from the light incident side and to which a gate voltage is applied. By changing the gate voltage, the depth from the surface of the diffusion layer at which charge generated by incident light is captured is changed, and, for each gate voltage, a current indicating the amount of charge generated in the region from the surface to that depth is measured, whereby the intensity of the incident light at each wavelength is analyzed.
  • the electronic device includes the image sensor.
  • spectroscopic measurement can be performed with high sensitivity by a single element.
  • the present technology is an image sensor capable of performing spectroscopic measurement.
  • the wavelength of light that is photoelectrically converted from light incident on the semiconductor device changes depending on the depth from the surface of the semiconductor.
  • a spectral characteristic can be measured by providing a gate electrode in a spectroscopic sensor that constitutes an imaging device and variably controlling the potential of a charge well in which electrons converted from photons accumulate.
  • a solid-state image sensor that extracts a color image from the measured spectral characteristics can be configured.
  • the semiconductor type is silicon (Si)
  • the electrons are excited and light is converted into electric charges.
  • λ0 is approximately 1.0 μm.
  • FIG. 2 shows the wavelength dependence of the energy (eV) of light incident on the semiconductor and of the absorption coefficient α (cm⁻¹).
  • For short-wavelength light the effective absorption region is shallow; that is, short-wavelength light is almost entirely absorbed and photoelectrically converted at a position close to the semiconductor surface. In contrast, long-wavelength light reaches a deeper position below the semiconductor surface before it is photoelectrically converted.
  • the current generated by changing the potential depth at which electrons (or holes) generated by the incident light of the semiconductor device can be collected is measured.
  • wavelength information of incident light can be obtained.
  • the current generated up to the depth (position) W in the semiconductor can be obtained by calculation.
  • Inside the semiconductor, the light intensity attenuates exponentially. Therefore, the light intensity φ at a certain depth x is expressed by the following equation (2): φ(x) = φ0·e^(−αx), where φ0 is the intensity at the surface and α is the absorption coefficient.
  • Each parameter is as follows:
  • A1, A2: incident light intensity [W/cm²]
  • S: sensor area [cm²]
  • I1: measured current value when the electron capture position is W1 [A]
  • I2: measured current value when the electron capture position is W2 [A]
  • ν1: frequency, c/λ1
  • ν2: frequency, c/λ2
  • Each parameter in equation (6) is as follows.
  • When the incident light is separated into three wavelengths, a current I3 measured with the electron capture position at W3 is added to equation (4). The incident light can then be separated into three wavelengths by calculating in the same manner as in the two-wavelength case. Similarly, to disperse incident light into 100 wavelength components, the electron capture position may be changed and the current measured 100 times.
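In the two-wavelength case described above, the two measured currents I1 (capture depth W1) and I2 (capture depth W2) form a 2×2 linear system in the unknown intensities A1 and A2. The following Python sketch builds that system from the exponential absorption law of equation (2) and solves it; all numerical values (wavelengths, absorption coefficients, depths, sensor area) are illustrative assumptions, not values from the patent:

```python
import numpy as np

Q = 1.602e-19   # electron charge [C]
H = 6.626e-34   # Planck constant [J s]
C = 3.0e8       # speed of light [m/s]

def responsivity(alpha_cm, depth_cm, wavelength_m, area_cm2):
    """Current per unit intensity collected from the surface down to `depth`,
    assuming every absorbed photon yields one collected electron."""
    nu = C / wavelength_m                            # frequency nu = c / lambda
    absorbed_fraction = 1.0 - np.exp(-alpha_cm * depth_cm)
    return (Q / (H * nu)) * area_cm2 * absorbed_fraction

# Assumed illustrative values: a blue and a red component on silicon.
lam = np.array([450e-9, 650e-9])    # wavelengths [m]
alpha = np.array([2.5e4, 2.5e3])    # absorption coefficients [1/cm]
S = 1e-6                            # sensor area [cm^2]
A_true = np.array([1e-3, 2e-3])     # true intensities A1, A2 [W/cm^2]
W = np.array([0.3e-4, 2.0e-4])      # capture depths W1, W2 [cm], set by gate voltage

# Forward model: I_j = sum_i R[j, i] * A_i gives the measured currents.
R = np.array([[responsivity(alpha[i], W[j], lam[i], S) for i in range(2)]
              for j in range(2)])
I_meas = R @ A_true

# Inverse: recover the two intensities from the two measured currents.
A_est = np.linalg.solve(R, I_meas)
print(A_est)   # ~ [1e-3, 2e-3]
```

With three wavelengths the same code extends to a 3×3 system by adding a third capture depth W3 and measured current I3, as described above.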
  • FIGS. 3 and 4 are diagrams showing the configuration of an imaging apparatus equipped with a column-parallel analog-digital converter.
  • The solid-state imaging device 10 includes a pixel unit 11, a vertical scanning circuit 18, a column-parallel analog-digital converter (ADC) 15, a digital-analog converter (DAC) 19 that generates a ramp wave, a logic control circuit 20, and a digital-output low-voltage differential signaling (LVDS) interface (I/F) 16.
  • In the pixel unit 11, a plurality of pixels 12 are two-dimensionally arranged in a matrix, for example.
  • the pixel 12 has a function as a spectroscopic sensor, for example, and includes a photodiode, a polycrystalline silicon layer to which an impurity is added, and a gate electrode that applies a gate voltage to the polycrystalline silicon layer.
  • In the pixel unit 11, pixel drive wirings are arranged in units of pixel rows, and vertical signal lines 24 are arranged in units of pixel columns.
  • Each pixel 12 of the pixel unit 11 is driven by a pixel drive wiring 25 extending in the row direction.
  • The pixel signal is an analog signal and is output to the vertical signal line 24 extending in the column direction.
  • a column parallel ADC 15 which is a column parallel analog / digital conversion device includes a comparator 21 and a counter 23.
  • the comparator 21 compares the ramp wave generated from the DAC 19 that generates the ramp wave with the analog signal from each pixel 12.
  • the counter 23 is composed of, for example, an up / down counter that counts the comparison time until the comparison in the comparator 21 is completed and holds the result.
  • a phase locked loop (PLL: Phase Locked Loop) 17 is incorporated to generate a high speed count clock.
  • a logic control circuit 20 that generates an internal clock and a vertical scanning circuit 18 that controls row address and row scanning are arranged as a control circuit for sequentially reading out signals from the pixel unit 11.
  • the digital output LVDS interface (I / F) 16 processes and outputs a signal from the column parallel ADC 15. For example, only buffering may be performed, or black level adjustment, column variation correction, various digital signal processing, and the like may be performed before that. Further, although not particularly illustrated, other various signal processing circuits may be arranged.
  • In this configuration, the column-parallel ADC 15 is constituted by the comparator 21 and the counter (up/down counter) 23; the up/down counter is preferably an asynchronous up/down counter capable of high-speed operation with a single count control clock.
  • the up / down counter configuration of this embodiment is preferable because it has many advantages such as simplified circuit and high-speed operation.
  • Instead of the up/down counter, it is also possible to provide two counters, or, rather than arranging counters in parallel, to double the memory means.
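The comparator-plus-counter conversion described above can be sketched as a single-slope ADC model: the counter runs while the DAC ramp is below the pixel's analog level, and the count at the moment the comparator flips is the digital code. This is a simplified illustration of the principle, not the patent's exact circuit:

```python
def single_slope_adc(v_pixel, v_ref_max=1.0, n_bits=10):
    """Model of one column's single-slope conversion: count clock ticks
    while the DAC ramp is below the pixel's analog level."""
    n_steps = 1 << n_bits
    count = 0
    for step in range(n_steps):
        ramp = v_ref_max * step / n_steps  # DAC ramp voltage at this clock tick
        if ramp >= v_pixel:                # comparator output flips: stop counting
            break
        count += 1
    return count

# A mid-scale pixel level converts to roughly half of full scale.
print(single_slope_adc(0.5, n_bits=10))  # -> 512
```

In the device above, one such comparator/counter pair exists per column, so all columns convert in parallel during a single ramp.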
  • FIG. 5 is a cross-sectional view showing the configuration of the image sensor.
  • the image sensor 100 is composed of an upper part 101 and a lower part 102.
  • the upper part 101 is a photogate type CIS (CMOS Image Sensor) substrate, and the lower part 102 is a logic substrate.
  • the image sensor 100 is a back-illuminated image sensor in which a CIS substrate and a logic substrate are bonded.
  • The photogate is used to form a depletion layer so that one type of carrier (charge) among those generated by photoelectric conversion in the active layer is accumulated in the active layer.
  • The upper part 101 is provided with an on-chip lens (hereinafter referred to as a lens) 121, and an optical black region 122 is provided except in the portion where the lens 121 is provided.
  • the pixel region is defined.
  • a semiconductor substrate under the lens 121 is provided with a P-well region 123 through an n-region, and a back gate region 124 is also provided in a part thereof.
  • the P well region 123 which is a diffusion layer is also provided with a floating diffusion (FD) region 125 and a reset region (reset transistor) 126.
  • An n+ region 127 is also provided in the n− region. These regions are connected to the logic circuit in the lower part 102 by wiring.
  • the back gate region 124 is connected to the logic circuit in the lower portion 102 by the wiring 141
  • the floating diffusion region 125 is connected to the logic circuit in the lower portion 102 by the wiring 144
  • the reset region 126 is connected to the logic circuit in the lower portion 102 by the wiring 146.
  • the n + region 127 is connected to the logic circuit in the lower part 102 by a wiring 147.
  • a gate electrode 131, a transfer gate 132, and a reset gate 133 are also provided below the P well region 123 in the upper portion 101, and wirings are also connected to these gates.
  • the gate electrode 131 is connected to the logic circuit in the lower part 102 by the wiring 142
  • the transfer gate 132 is connected to the logic circuit in the lower part 102 by the wiring 143
  • the reset gate 133 is connected to the logic circuit in the lower part 102 by the wiring 145.
  • the gate electrode 131, the transfer gate 132, and the reset gate 133 are each formed in the polysilicon layer 130 to which impurities are added.
  • In the image sensor 100, the gate electrode 131 is provided in the spectroscopic sensor constituting the image sensor, and the potential of the charge well in which electrons converted from photons accumulate is variably controlled, so that the spectral characteristics can be measured.
  • the spectral characteristic can be measured by varying the voltage of the gate electrode 131. This will be described with reference to FIG.
  • In FIG. 6, the horizontal axis represents position along a cross section through the potential, and the vertical axis represents the potential. The cross section runs from the lens 121, through a predetermined position in the P well region 123, toward the gate electrode 131.
  • The thick solid line is the potential when the gate voltage is low, and the thin solid line is the potential when the gate voltage is high. From the graph of FIG. 6 it can be seen that, moving from the lens 121 side toward the gate electrode 131, the potential increases, reaches a maximum value at a predetermined position, and then decreases. That is, the potential reaches its maximum value at a predetermined position (depth) in the P well region 123.
  • the position where the potential becomes the maximum value differs depending on the gate voltage.
  • the dotted line indicates the position where the maximum value is obtained.
  • the image sensor 100 can be used as a spectroscopic sensor. Further, since it can be used as a spectroscopic sensor, a color image can be extracted from the measured spectral characteristics, and can be used as an image sensor for photographing a color image. Such gate voltage control and spectral processing (analysis) are performed by a logic circuit provided in the lower part 102.
  • The imaging element 100 shown in FIG. 5 has a structure in which the potential depth of the quantum well can be controlled by changing the gate voltage applied from the gate electrode 131, combining spectroscopic measurement with high sensitivity.
  • the driving of the image sensor 100 is performed by changing the gate voltage in accordance with the vertical readout.
  • An example of a method for spectroscopic measurement of incident light by driving the image sensor 100 will be described below.
  • incident light is incident and, for example, a gate voltage of 1 V is applied by the gate electrode 131, and a current flowing at that time is read.
  • a gate voltage of 2 V is applied to the P well region 123, and the current flowing at that time is read.
  • a gate voltage of 5 V is applied to the P well region 123, and the current flowing at that time is read.
  • The intensity of each wavelength of the incident light is calculated from the above equation (2) based on the current values measured in this way. For example, by raising the voltage of the gate electrode 131 as described above, the position where the potential reaches its maximum value moves deeper, as shown in FIG. 6. As a result, the captured electrons correspond, in order, to red → red + green → red + green + blue.
  • Visible light ranges in wavelength from 400 nm to 700 nm, a width of 300 nm. Therefore, if the imaging device 100 has a unit resolution of 10 nm and the voltage is stepped according to this resolution, the wavelength characteristics (spectral characteristics) can be obtained in 30 steps. If the resolution is increased, the accuracy of the spectral characteristics can be improved.
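The driving sequence described above (apply 1 V, 2 V, 5 V in turn and read the current each time) can be sketched as follows. The `read_current` callback, the voltage values, and the fake current readings are illustrative assumptions, not the patent's actual interface; the point is only that successive differences between readings isolate the charge generated in each newly added depth slice:

```python
def sweep_spectrum(read_current, gate_voltages):
    """Step the gate voltage, read the accumulated current at each step,
    and return the incremental current contributed by each step.

    `read_current(v)` is assumed to return the current measured with gate
    voltage v applied, integrating charge from the surface down to the
    capture depth selected by v."""
    totals = [read_current(v) for v in gate_voltages]
    # The first reading stands alone; later bands are successive differences.
    return [totals[0]] + [b - a for a, b in zip(totals, totals[1:])]

# Fake sensor for demonstration: each deeper capture depth adds one band
# (red, then +green, then +blue), mirroring the ordering described above.
fake_totals = {1.0: 5e-9, 2.0: 12e-9, 5.0: 20e-9}   # currents [A]
bands = sweep_spectrum(fake_totals.get, [1.0, 2.0, 5.0])
print(bands)   # successive increments, roughly [5e-9, 7e-9, 8e-9]
```

With a 10 nm unit resolution, the same loop would simply run over 30 voltage steps instead of 3.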
  • color images can be extracted by reproducing colors from the obtained spectral characteristics. Therefore, it is possible to configure the image sensor 100 that does not require a color filter.
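Reproducing colors from the measured spectral characteristics can be sketched by weighting each measured band with a per-channel sensitivity curve and summing. The Gaussian curves, channel centers, and flat test spectrum below are illustrative assumptions; the patent does not specify a particular color-reproduction pipeline:

```python
import math

def gaussian(x, mu, sigma):
    # Simple unnormalized Gaussian used as a stand-in sensitivity curve.
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def spectrum_to_rgb(wavelengths_nm, intensities):
    """Weight each measured spectral band by hypothetical R/G/B
    sensitivity curves and sum the contributions per channel."""
    centers = {"r": 610.0, "g": 540.0, "b": 450.0}  # assumed channel centers [nm]
    sigma = 50.0                                    # assumed curve width [nm]
    rgb = {}
    for ch, mu in centers.items():
        rgb[ch] = sum(i * gaussian(w, mu, sigma)
                      for w, i in zip(wavelengths_nm, intensities))
    return rgb["r"], rgb["g"], rgb["b"]

# 10 nm steps across 400-700 nm, matching the resolution example above.
wl = [400 + 10 * k for k in range(31)]
flat = [1.0] * len(wl)              # flat (white) input spectrum
r, g, b = spectrum_to_rgb(wl, flat)
print(r, g, b)                      # roughly equal channel responses for white
```

A production pipeline would use standardized color-matching functions rather than these ad hoc Gaussians, but the structure (spectrum in, weighted sums out) is the same.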
  • the image sensor 100 that can also be used as a spectroscopic sensor in the present embodiment is configured to include the gate electrode 131, the polysilicon layer 130, and the P well region 123.
  • the gate electrode 131 is provided below the lens 121 and the P well region 123.
  • the gate electrode 131 is provided at a position that is different from the light incident side and does not prevent the light from entering the photodiode.
  • If the gate electrode 131 were provided on the light incident side, for example between the lens 121 and the P well region 123, the gate electrode 131 would need to be a transparent electrode. Even in that case, part of the light incident on the gate electrode 131 would be absorbed, and the sensitivity could be lowered.
  • In the image sensor 100, however, the gate electrode 131 is provided below the P well region 123 and therefore does not absorb the light incident through the lens 121, so a decrease in sensitivity can be prevented.
  • the gate electrode 131 does not need to be a transparent electrode.
  • the gate electrode 131 can be formed of metal and can be used as a light-shielding film for reducing the influence of light on the logic circuit and the like of the lower part 102.
  • the upper part 101 and the lower part 102 can be designed and manufactured separately.
  • a logic circuit can be formed using a supporting substrate, temperature restrictions can be eliminated, and a profile of a light receiving portion (a photogate provided in the upper portion 101) can be easily created.
  • the image sensor 100 can be reduced in height.
  • Since an organic film such as a color filter can be eliminated, the device can be manufactured using a high-temperature reflow process, and the imaging element 100 can be used even under conditions of 240 °C or higher.
  • FIG. 7 shows the configuration of the image sensor according to the second embodiment.
  • the image pickup device 200 shown in FIG. 7 has the same basic configuration as the image pickup device 100 shown in FIG. 5 except that the upper part 101 and the lower part 102 are bump-bonded.
  • In the image sensor 200 shown in FIG. 7, parts similar to those of the image sensor 100 shown in FIG. 5 are denoted by the same reference numerals, and their description is omitted.
  • the upper part 101 and the lower part 102 of the image sensor 200 are joined by bumps 202-1 to 202-6.
  • The upper part 101 and the lower part 102 can also be configured to be joined by bonding the bumps 202 formed on the upper part 101 to the bumps 202 formed on the lower part 102.
  • Also in the image sensor 200, the gate electrode 131 is provided below the P well region 123 as in the image sensor 100 shown in FIG. 5, so the same effects as those of the image sensor 100 can be obtained.
  • FIG. 8 shows the configuration of the image sensor according to the third embodiment.
  • the image pickup device 300 shown in FIG. 8 has the same basic configuration as the image pickup device 100 shown in FIG. 5 except that a plurality of pixels share a floating diffusion or the like.
  • the image sensor 300 is also composed of an upper part 301 and a lower part 302, similar to the image sensor 100.
  • the upper portion 301 is a photogate type CIS substrate, and the lower portion 302 is a logic substrate.
  • the upper portion 301 is provided with a lens 321, and an optical black region 322 is formed except for a portion where the lens 321 is provided.
  • A P-well region 323 is provided below the lens 321, and a back gate region 324 is provided in part of the P-well region 323. Since the imaging element 300 shares a floating diffusion and the like with other pixels, unlike the imaging element 100 shown in FIG. 5, the P-well region 323 is not provided with a floating diffusion (FD) region or a reset region.
  • an n + region 325 is provided, and the n + region 325 is connected to an n + region 361 provided in the lower portion 302 via a wiring 343. With this n + region 325, wiring 343, and n + region 361, the current from the P well region 323 is read out to the lower portion 302.
  • the upper portion 301 is provided with a wiring 341 connected to the back gate region 324, and this wiring 341 is also connected to the wiring of the back gate region of another pixel.
  • In the figure, shared wirings are not drawn all the way from the upper portion 301 to the lower portion 302 but are shown only partway; a wiring drawn partway indicates that it is shared with other pixels.
  • a gate electrode 331 is formed below the P well region 323 via a polysilicon layer 330 to which impurities are added, as in the other embodiments.
  • a wiring 342 is connected to the gate electrode 331, and the wiring 342 is shared with other pixels. The same gate voltage is applied to the pixels sharing the wiring 342 at the same timing.
  • a floating diffusion region 363 and a reset region 365 are provided in the lower portion 302.
  • a transfer gate 362 for transferring charges from the n + region 361 to the floating diffusion region 363 and a reset gate 364 for resetting the floating diffusion region 363 are also provided in the lower portion 302.
  • the floating diffusion region 363 is shared with other pixels, and a wiring 344 is connected thereto.
  • The layout shown in FIG. 9 is obtained when the image sensor 300 is viewed from the upper (lens 321) side.
  • The layout shown in A of FIG. 9 has a configuration in which one floating diffusion is shared by four pixels.
  • the photogate contact 381 is connected to a gate electrode 331 provided below each photodiode.
  • the photogate contact 381 is shared by all pixels. With this configuration, the configuration of the imaging device including the imaging device 300 can be reduced in size.
  • an optical black region 322 is formed between each photodiode.
  • the transfer gate 362 and the like can be disposed below the P well region 323 (photodiode). That is, as shown in FIG. 9B, when the imaging device 300 is viewed from the lens 321 side, the transfer gate 362 of each photodiode is arranged at a position that overlaps the P well region 323 constituting the photodiode.
  • the imaging device including the imaging device 300 can be further reduced in size.
  • Also in the image sensor 300, the gate electrode 331 is provided below the P well region 323 as in the image sensor 100 shown in FIG. 5, so the same effects as those of the image sensor 100 can be obtained.
  • the image sensor 300 shown in FIG. 8 can be miniaturized by sharing the floating diffusion and the like with other pixels.
  • the image sensor 300 shown in FIG. 8 may be configured such that the upper portion 301 and the lower portion 302 are joined by bumps.
  • FIG. 10 shows the configuration of the image sensor according to the fourth embodiment.
  • The image pickup device 400 shown in FIG. 10 has the same basic configuration as the image pickup device 100 shown in FIG. 5, except that the image pickup device 400 has an organic photoelectric conversion film.
  • In FIG. 10, the same parts as those of the image sensor 100 shown in FIG. 5 are denoted by the same reference numerals, and their description is omitted.
  • the image sensor 400 has a configuration in which an organic photoelectric conversion film 451 is added to the image sensor 100 shown in FIG.
  • the organic photoelectric conversion film 451 is directly below the lens 121 and is provided between the lens 121 and the P well region 123.
  • an electrode 461 is provided on the upper side of the organic photoelectric conversion film 451, and an electrode 462 is provided on the lower side.
  • the organic photoelectric conversion film 451 is configured to be sandwiched between the electrode 461 and the electrode 462.
  • the electrode 461 is connected to the logic circuit in the lower portion 402 by a wiring 471.
  • the electrode 462 is connected to the logic circuit in the lower portion 402 by a wiring 472.
  • the organic photoelectric conversion film 451 is a film having sensitivity to a specific color.
  • the organic photoelectric conversion film 451 is a film that absorbs blue light and transmits light of other colors.
  • the description will be continued assuming that the film absorbs blue light.
  • When blue light is absorbed by the organic photoelectric conversion film 451, an electron-hole pair is generated. This electron-hole pair is separated into an electron and a hole by the electric field applied by the electrodes 461 and 462 in the organic photoelectric conversion film 451, and the charge is read out to the logic circuit.
  • the blue intensity is calculated according to the read charge amount.
  • By providing the organic photoelectric conversion film 451 in this way, the number of colors to be detected while changing the voltage of the gate electrode 131 can be reduced (the range of wavelengths to be detected can be narrowed). As a result, analysis becomes easier than with the imaging device 100 shown in FIG. 5, and since the amount of information to be analyzed is reduced, the time required for analysis can be shortened.
  • Also in the image sensor 400, the gate electrode 131 is provided below the P well region 123 as in the image sensor 100 shown in FIG. 5, so the same effects as those of the image sensor 100 can be obtained.
  • However, since the organic photoelectric conversion film 451 is present, the image sensor 400 is subject to the temperature limitations of the organic photoelectric conversion film 451.
  • the image sensor 400 shown in FIG. 10 can also be configured such that the upper portion 401 and the lower portion 402 are joined by bumps.
  • the image sensor 400 shown in FIG. 10 can also have a configuration in which a floating diffusion or the like is shared by a plurality of pixels.
  • FIG. 11 is a block diagram illustrating an example of a configuration of an electronic apparatus according to the present technology, for example, an imaging apparatus.
  • an imaging apparatus 1000 according to the present technology includes an optical system such as a lens group 1001, a solid-state imaging device (imaging device) 1002, a DSP (Digital Signal Processor) circuit 1003, a frame memory 1004, a display unit 1005, a recording unit 1006, an operation unit 1007, and a power supply unit 1008.
  • the DSP circuit 1003, the frame memory 1004, the display unit 1005, the recording unit 1006, the operation unit 1007, and the power supply unit 1008 are connected to one another via a bus line 1009.
  • the lens group 1001 takes in incident light (image light) from a subject and forms an image on the imaging surface of the solid-state imaging device 1002.
  • the solid-state imaging device 1002 converts the amount of incident light imaged on the imaging surface by the lens group 1001 into an electrical signal in units of pixels and outputs it as a pixel signal.
  • the DSP circuit 1003 processes a signal from the solid-state image sensor 1002.
  • the solid-state imaging device 1002 has pixels for constructing an image of the photographed subject; processing such as handling the signals from those pixels and developing the result into the frame memory 1004 is also performed.
  • the display unit 1005 includes a panel type display device such as a liquid crystal display device or an organic EL (electroluminescence) display device, and displays a moving image or a still image captured by the solid-state image sensor 1002.
  • the recording unit 1006 records a moving image or a still image captured by the solid-state imaging device 1002 on a recording medium such as a DVD (Digital Versatile Disk).
  • the operation unit 1007 issues operation commands for the various functions of the imaging apparatus in response to operations by the user.
  • the power supply unit 1008 appropriately supplies operating power to the DSP circuit 1003, the frame memory 1004, the display unit 1005, the recording unit 1006, and the operation unit 1007.
  • the CPU 1010 controls each unit in the imaging apparatus 1000.
  • the imaging apparatus having the above-described configuration can be used as a video camera, a digital still camera, or a camera module for mobile devices such as mobile phones.
  • the above-described imaging element can be used as the solid-state imaging element 1002.
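The signal flow described above (lens → sensor → DSP circuit → frame memory → display/recording) can be illustrated with a toy pipeline. The class, method names, and 8-bit scaling below are invented for illustration; the patent defines no such API:

```python
# Toy sketch of the signal flow in the imaging apparatus 1000:
# the sensor stage converts per-pixel light amounts into electrical signals,
# the DSP stage processes them, and the processed frame is developed into
# frame memory. All names and numeric choices are illustrative assumptions.
class ImagingPipeline:
    def __init__(self):
        self.frame_memory = []  # stands in for the frame memory 1004

    def capture(self, scene):
        """Sensor stage: convert per-pixel incident-light amounts
        (nominally 0.0-1.0, possibly over-range) into raw signal values."""
        return [light * 255 for light in scene]

    def process(self, raw_signals):
        """DSP stage: quantize and clamp the raw signals to a displayable
        8-bit range."""
        return [min(int(s), 255) for s in raw_signals]

    def run(self, scene):
        frame = self.process(self.capture(scene))
        self.frame_memory.append(frame)  # develop the frame into memory
        return frame
```

Over-range pixels (e.g. a light amount of 1.2) are clamped by the DSP stage, mirroring how the processed, displayable frame, not the raw sensor signal, is what reaches the frame memory and display unit.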
  • the present technology can also adopt the following configurations.
  • An image sensor that analyzes the intensity of each wavelength of the incident light by measuring a current indicating the amount of the charge generated in the region.
  • the imaging element according to any one of (1) to (4), further provided with a floating diffusion, wherein the floating diffusion is shared by a plurality of imaging elements.
  • an electronic apparatus comprising: an image sensor that analyzes the intensity for each wavelength of the incident light by measuring a current indicating the amount of the electric charge generated in the region; and a processing unit that processes a signal from the image sensor.
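The gate-voltage spectral analysis described in these configurations can be sketched numerically: each gate voltage sets a collection depth, the measured current corresponds to the charge generated between the surface and that depth, and the per-band intensities follow by inverting a Beer-Lambert absorption model. The absorption coefficients and depths below are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch of depth-resolved spectral recovery. Blue light is
# absorbed near the silicon surface (large alpha), red light deeper
# (small alpha); sweeping the gate voltage sweeps the collection depth.
import math

ALPHA = [2.5, 0.8, 0.2]   # illustrative absorption coefficients (1/um): blue, green, red
DEPTHS = [0.5, 1.5, 4.0]  # illustrative collection depth (um) at each gate voltage


def system_matrix(alphas, depths):
    """A[i][j] = fraction of band-j light absorbed above depth i (Beer-Lambert)."""
    return [[1.0 - math.exp(-d * a) for a in alphas] for d in depths]


def solve3(A, b):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x


def recover_spectrum(measured, alphas=ALPHA, depths=DEPTHS):
    """Estimate per-band intensities from the charges measured at each gate voltage."""
    return solve3(system_matrix(alphas, depths), measured)
```

Forward-simulating charges from a known spectrum and then calling `recover_spectrum` reproduces that spectrum, which is the sense in which measuring a current per gate-voltage value "analyzes the intensity for each wavelength".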

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Spectrometry And Color Measurement (AREA)

Abstract

 The present technique relates to an imaging element and an electronic device that make it possible to improve spectral (light-splitting) performance. The present invention comprises a diffusion layer formed on a semiconductor substrate, and an electrode formed on a side of the diffusion layer different from the side on which light is incident, the electrode being subjected to a gate voltage. The gate voltage is varied so as to vary the depth, from the surface of the diffusion layer, down to which a charge generated by the light incident on the diffusion layer is captured; for each value of the varied gate voltage, a current is measured which indicates the amount of charge generated in the region from the surface to the corresponding depth, and the intensity for each wavelength of the incident light is thereby analyzed. This technique can be applied to a spectral sensor, which is a light-splitting element, or to an imaging element that produces color information from the result of the light splitting.
PCT/JP2015/066830 2014-06-24 2015-06-11 Élément imageur, et dispositif électronique Ceased WO2015198876A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014128839A JP2016009739A (ja) 2014-06-24 2014-06-24 撮像素子、電子機器
JP2014-128839 2014-06-24

Publications (1)

Publication Number Publication Date
WO2015198876A1 true WO2015198876A1 (fr) 2015-12-30

Family

ID=54937967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/066830 Ceased WO2015198876A1 (fr) 2014-06-24 2015-06-11 Élément imageur, et dispositif électronique

Country Status (2)

Country Link
JP (1) JP2016009739A (fr)
WO (1) WO2015198876A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7005886B2 (ja) * 2016-03-31 2022-01-24 ソニーグループ株式会社 固体撮像素子、および電子機器
KR102681913B1 (ko) * 2018-09-11 2024-07-05 소니 세미컨덕터 솔루션즈 가부시키가이샤 고체 촬상 소자
JP2022163433A (ja) * 2021-04-14 2022-10-26 株式会社ニコン 撮像素子及び撮像装置

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004146816A (ja) * 2002-09-30 2004-05-20 Matsushita Electric Ind Co Ltd 固体撮像装置およびこれを用いた機器
JP2005010114A (ja) * 2003-06-23 2005-01-13 Japan Science & Technology Agency 入射光の測定方法及びそれを用いた分光機構を有するセンサー
JP2006120922A (ja) * 2004-10-22 2006-05-11 Fuji Film Microdevices Co Ltd 光電変換膜積層型カラー固体撮像装置
JP2007067075A (ja) * 2005-08-30 2007-03-15 Nippon Hoso Kyokai <Nhk> カラー撮像素子
JP2009014459A (ja) * 2007-07-03 2009-01-22 Hamamatsu Photonics Kk 裏面入射型測距センサ及び測距装置
JP2009168742A (ja) * 2008-01-18 2009-07-30 Sony Corp 分光センサ、固体撮像素子及び撮像装置
JP2009180512A (ja) * 2008-01-29 2009-08-13 Fujifilm Corp 分光センサー、分光センサーを利用した蛍光検出方法および蛍光検出装置
JP2010225927A (ja) * 2009-03-24 2010-10-07 Sony Corp 固体撮像装置、固体撮像装置の駆動方法、及び電子機器
JP2011029337A (ja) * 2009-07-23 2011-02-10 Sony Corp 固体撮像装置とその製造方法、及び電子機器
JP2011114324A (ja) * 2009-11-30 2011-06-09 Sony Corp 固体撮像装置及び電子機器
JP2011181595A (ja) * 2010-02-26 2011-09-15 Panasonic Corp 固体撮像装置およびカメラ
JP2012104704A (ja) * 2010-11-11 2012-05-31 Toshiba Corp 固体撮像装置およびその製造方法
JP2013197697A (ja) * 2012-03-16 2013-09-30 Sony Corp 固体撮像装置及び電子機器
JP2013214930A (ja) * 2012-04-04 2013-10-17 Nippon Hoso Kyokai <Nhk> 裏面照射型撮像素子、それを備えた駆動装置及び撮像装置並びに裏面照射型撮像素子の駆動方法


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2020170936A1 (ja) * 2019-02-20 2021-12-16 ソニーセミコンダクタソリューションズ株式会社 撮像装置
JP7541971B2 (ja) 2019-02-20 2024-08-29 ソニーセミコンダクタソリューションズ株式会社 撮像装置
US12389695B2 (en) 2019-02-20 2025-08-12 Sony Semiconductor Solutions Corporation Imaging device
JPWO2021153295A1 (fr) * 2020-01-31 2021-08-05
US12446344B2 (en) 2020-01-31 2025-10-14 Sony Semiconductor Solutions Corporation Solid-state image sensor

Also Published As

Publication number Publication date
JP2016009739A (ja) 2016-01-18

Similar Documents

Publication Publication Date Title
JP7264187B2 (ja) 固体撮像装置およびその駆動方法、並びに電子機器
US10903279B2 (en) Solid state image sensor pixel electrode below a photoelectric conversion film
US10942304B2 (en) Solid-state imaging element, manufacturing method of the same, and electronic device
JP6879919B2 (ja) 固体撮像素子、電子機器、及び、固体撮像素子の製造方法
JP5061915B2 (ja) 固体撮像素子及び撮像装置
WO2017130728A1 (fr) Dispositif d&#39;imagerie à semiconducteur et dispositif électronique
US9287302B2 (en) Solid-state imaging device
CN108369953B (zh) 成像装置和电子设备
CN107078147A (zh) 固态摄像元件、固态摄像元件的制造方法和电子装置
KR20220066188A (ko) 고체 촬상 소자 및 그 제조 방법, 및 전자 기기
WO2010004683A1 Dispositif d'imagerie à semi-conducteurs
US10536659B2 (en) Solid-state image capturing element, manufacturing method therefor, and electronic device
CN117673105A (zh) 光检测装置和电子设备
WO2016104177A1 Élément de capture d'image à semi-conducteur, son procédé de fabrication, et composant électronique
JPWO2017159362A1 (ja) 固体撮像素子およびその製造方法、並びに電子機器
WO2015198876A1 (fr) Élément imageur, et dispositif électronique
WO2016084629A1 Élément d'imagerie à semiconducteur et équipement électronique
WO2022149488A1 (fr) Dispositif de détection de lumière et appareil électronique
TW202245464A (zh) 攝像裝置及測距系統

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15811309

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15811309

Country of ref document: EP

Kind code of ref document: A1