WO2025191218A1 - Image sensor - Google Patents
Image sensor
Info
- Publication number
- WO2025191218A1 (PCT/FI2025/050131)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image sensor
- photodiode
- readout
- cmos substrate
- pixel region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/10—Integrated devices
- H10F39/12—Image sensors
- H10F39/18—Complementary metal-oxide-semiconductor [CMOS] image sensors; Photodiode array image sensors
- H10F39/182—Colour image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/10—Integrated devices
- H10F39/12—Image sensors
- H10F39/191—Photoconductor image sensors
- H10F39/193—Infrared image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/809—Constructional details of image sensors of hybrid image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/811—Interconnections
Definitions
- the present disclosure relates to image sensors, and particularly to the integration of infrared-sensing elements with CMOS image sensors.
- CMOS-based image sensors are abundant in consumer and industrial applications. CMOS image sensors offer high sensitivity and very high pixel resolution, with few-micrometer pixel sizes enabling imager arrays comprising several million pixels in a small sensor footprint.
- Colloidal quantum dot (CQD) image sensors have emerged as a competitive technology providing broadband photodetection extending into near-infrared (NIR), short-wave infrared (SWIR) and mid-wave infrared (MWIR) wavelengths, while offering the benefits of monolithic integration on CMOS wafer surfaces.
- Current state-of-the-art CQD-based image sensor and camera products demonstrate megapixel resolution, typically as monochromatic sensors (i.e. without any on-chip filtering).
- An object of the present disclosure is to integrate infrared sensing into a CMOS image sensor with less inactive area.
- the disclosure is based on the idea of utilizing the pixel area of one pixel to accommodate at least a part of the readout circuitry of another pixel.
- An advantage of this solution is that the inactive area of the latter pixel can be reduced.
- Figures 1a - 1b illustrate an image sensor with a first pixel and a second pixel.
- Figures 2a - 2f illustrate image sensors with multiple first pixels and filters.
- Figures 3a - 3b illustrate image sensors where the electrical circuit stack has been built on the non-illuminated side of the CMOS substrate.
- Figures 4a - 4d illustrate possible pixel geometries in the xy-plane.
- Figures 5a - 5e illustrate practical examples of pixel geometries in two and three dimensions.
- Figures 6a - 6e illustrate examples of practical implementations.
- This disclosure describes an image sensor which comprises a CMOS substrate which defines an xy-plane, wherein the xy-plane comprises a readout region.
- the image sensor also comprises a first photodiode in the CMOS substrate.
- the first photodiode defines a first pixel region in the xy-plane.
- the image sensor also comprises a first set of readout components in the CMOS substrate.
- the first set of readout components is electrically connected to the first photodiode.
- the image sensor also comprises an electrical circuit stack on the CMOS substrate.
- the image sensor also comprises a second photodiode which is configured to sense infrared radiation.
- the second photodiode comprises a photoactive layer, and the photoactive layer comprises colloidal quantum dots.
- the second photodiode defines a second pixel region in the xy-plane.
- the image sensor also comprises a second set of readout components in the electrical circuit stack.
- the second set of readout components is electrically connected to the second photodiode.
- the first and second sets of readout components are stacked on top of each other in the readout region.
- Figure 1a illustrates an image sensor 10 with a CMOS substrate 19.
- the substrate 19 may be a silicon substrate.
- the substrate has a first side 191 and a second side 192 which is opposite to the first side 191.
- the CMOS substrate 19 will be illustrated in configurations where the first side 191 has a greater z-coordinate than the second side 192.
- the xy- plane may for example be the plane of the first side 191 of the substrate.
- the z-axis is perpendicular to the xy-plane.
- a first photodiode 13 has been formed in the CMOS substrate 19.
- the first photodiode may comprise n- and p-doped layers (not separately illustrated), which may be formed in the CMOS substrate 19 by diffusion and/or implantation. These process steps may be performed on the first side 191 of the substrate 19, so that the first photodiode 13 lies just beneath the surface of the first side 191 of the CMOS substrate 19.
- the area in the xy-plane where the first photodiode 13 is located is the first pixel region 111. Additional layers, such as an oxide layer (not illustrated), may be deposited on the first side 191 of the CMOS substrate 19 in the first pixel region 111.
- the first photodiode 13 may be a pinned photodiode.
- the first photodiode 13 may, but does not necessarily have to, be configured to sense visible light. It could alternatively be configured to sense infrared radiation.
- the first photodiode may be an avalanche photodiode or a single-photon avalanche diode.
- any photodiode (first, second or additional) which is intended to sense infrared radiation may be configured to sense NIR, SWIR or MWIR wavelengths. Any such photodiode may be configured to respond to infrared radiation in just one of these wavelength ranges. Any such photodiode may alternatively be configured to respond to infrared radiation in all of these wavelength ranges.
- the image sensor may comprise a control circuit, or it may be connected to a control circuit.
- the first and second readout components may be connected to the control circuit.
- the control circuit may be configured to retrieve a measurement signal from any photodiode described in this disclosure. The control circuit will be described in more detail below.
- the image sensor may be operated when it is illuminated by incoming electromagnetic radiation 101.
- the image sensor 10 has an illuminated side (illustrated as the top side in the figures) which is configured to receive illumination and a non-illuminated side (illustrated as the bottom side) which is opposite to the illuminated side.
- the first side 191 of the CMOS substrate 19 faces toward the illuminated side of the image sensor.
- the second side 192 of the CMOS substrate 19 faces toward the non-illuminated side of the image sensor 10.
- the illuminated side of the image sensor 10 may for example comprise lenses and/or other at least partly transparent optical elements (not illustrated).
- the non-illuminated side may comprise a non-transparent main body (not illustrated) of the image sensor 10, which provides structural support and protection for the sensor.
- the image sensor in figure 1a also comprises an electrical circuit stack 15 on the CMOS substrate 19.
- the side of the CMOS substrate 19 where the stack 15 is built may be called the front side of the substrate 19, and the opposite side may be called the back side of the substrate 19.
- the stack 15 may be formed on the first side 191 (front-side illumination), but it may alternatively be formed on the second side 192 (back-side illumination). This will be described in more detail below.
- the stack 15 may comprise insulating layers 151 and electric conductors 152 which extend through the insulating layers in the x-, y- and/or z-directions.
- the electrical circuit stack 15 may be a multilayer electrical circuit which extends in the xy-plane and in the z-direction. The circuit may be much more complex than the schematic illustration in figure 1a indicates.
- the image sensor 10 also comprises a second photodiode 16.
- This second diode 16 is in figure 1a formed on top of the electrical circuit stack 15, but other options are also possible, as figure 3a for example illustrates.
- the second diode 16 may comprise a first electrode 161 and a second electrode 163, which may also be called an anode and a cathode.
- a photoactive layer 162 comprising colloidal quantum dots (CQD) is sandwiched between the two electrodes 161 and 163.
- the second diode 16 may also comprise either an electron transport layer (ETL, not illustrated) or a hole transport layer (HTL, not illustrated) between the photoactive layer 162 and the first electrode 161 and between the photoactive layer 162 and the second electrode 163.
- the photoactive layer 162 may for example comprise any of the following colloidal quantum dot materials: PbS, HgTe, InAs, Ag2Se, Ag2Te, Bi2S3.
- the photoactive layer 162 may comprise HgE (where E may stand for S, Se, Te or varying compositions of S, Se, Te), or alloys of CdE, ZnE and HgE.
- the photoactive layer 162 may comprise CuInE2, AgBiE2, CuZnSnE; Sb2E3, Bi2E3; Cu2E, Ag2E.
- the photoactive layer 162 may comprise Al, In or Ga in varying compositions combined with varying compositions of N, P, As or Sb.
- the photoactive layer 162 may comprise Pb or Sn in varying compositions combined with varying compositions of S, Se, Te.
- Other quantum dot materials may also be used.
- the photoactive layer may comprise particles of any material listed above, and the diameter of the particles may be in the range 2 nm - 20 nm, or in the range 2 nm - 25 nm, or in the range 1 nm - 50 nm.
- the photoactive layer 162 may also comprise ligand materials such as thiols, amines, or halides.
- Different functional groups may share the same R group to produce bifunctional ligands.
- the ligand materials may be inorganic ligands, and they may include SCN-, I-, Br-, Cl-, OH-. A combination of organic and inorganic ligands may also be used.
- the first electrode 161 of the second photodiode 16 is the electrode which is closer to the non-illuminated side of the image sensor, while the second electrode 163 is the electrode which is closer to the illuminated side. Consequently, the second electrode 163 should be transparent for the radiation which is detected by the second photodiode 16, while the first electrode 161 does not necessarily have to be transparent to that radiation.
- the electrode material may for example be ITO, AZO or graphene, IZO, FTO.
- the electrode may comprise a metal layer with a thickness below 15 nm. The transparency of the electrode material may for example be greater than 70% VLT.
- any electrode described in this disclosure may comprise a patterned metal grid. The metal may for example be gold or platinum. The patterning may allow a sufficient amount of radiation to pass through the electrode. Alternatively, in any embodiment the electrode may comprise a network of nanowires.
- the electrode material may for example be a metal. The metal may for example be gold or platinum.
- the area in the xy-plane where the second photodiode 16 is located is the second pixel region 121. This is the area where the photoactive layer 162 lies between both the first electrode 161 and the second electrode 163. It may in some embodiments be convenient to extend the photoactive layer 162 across a large area, while the first electrode 161 is patterned so that it only covers a smaller area, for example only the readout region. The photoactive layer 162 may in this case extend beyond the second pixel region 121, since the second pixel region 121 is limited to the area where the first electrode 161 is present. These options will be illustrated in some figures of this disclosure.
- the second photodiode and the second set of readout components may be configured for either a current-mode measurement or a voltage-mode measurement.
- in current-mode, the photodiode is typically reverse-biased (or in some cases zero-biased). The photogenerated charge may be collected into an integrating capacitor and converted to a voltage signal via a capacitive transimpedance amplifier (CTIA).
- in voltage-mode, the pixel electronics sample the photodiode voltage at the end of each exposure-time cycle. During exposure, the forward voltage across the photodiode increases as a function of photogenerated charge (eventually reaching open-circuit voltage in the case of sufficiently long exposure time).
- the pixel electronics may be designed to exhibit very high input impedance so that minimal charge leakage occurs. Between exposure cycles, the photodiode is reset by short-circuiting the photodiode stack thus discharging any photogenerated charge.
- the image sensor has a readout region 199 in the xy-plane.
- the readout region 199 may be at least partly covered by the electrical circuit stack 15. As figure 1a illustrates, the readout region 199 may be coextensive with the second pixel region 121, since the second photodiode 16 may cover the entire electrical circuit stack 15. However, the readout region 199 and second pixel region 121 may also be different, as other figures of this disclosure will illustrate.
- the readout region 199 may overlap with at least a part of the second pixel region 221 in the xy-plane, as the figures of this disclosure illustrate.
- the readout region does not overlap with any first pixel region (such as 211, 212 and 213) in the figures of this disclosure. This allows the active area of the first pixel regions to be maximized, since the first set of readout components 181 does not occupy the xy-plane in the first pixel regions.
- the image sensor 10 also comprises a first set of readout components 181 in the CMOS substrate 19.
- the first set of readout components 181 may be connected to the first photodiode 13 with an electrical connection 14.
- electrical connections such as 14 are for simplicity illustrated schematically with a black line. In practice, these connections may comprise regions in the CMOS substrate which have been doped and configured to perform charge transfer.
- the first set of readout components 181 may form a front-end-of-line (FEOL) block.
- the first set of readout components may for example comprise one or more of the following components: floating diffusion capacitors, pn-junction capacitors, depletion capacitors, transistor capacitors, MOS capacitors, varactor diodes, trench capacitors or deep trench capacitors.
- Some exemplary component architectures are presented toward the end of this disclosure.
- any readout components formed in the CMOS substrate 19 may comprise regions which have been doped and configured to perform the intended function.
- the first set of readout components 181 is for simplicity illustrated just as a single box in figure 1a, even though it may contain more than one readout component.
- the first set of readout components may be configured to generate a first output signal from the first photodiode 13.
- the first set of readout components 181 may be formed in the CMOS substrate 19 with the same processes that are used to form the first photodiode 13, although the process steps may be separate.
- the first set of readout components 181 may be connected to the electrical circuit stack 15 with an electrical connection 14.
- the first output signal may be transferred to external circuitry via a control circuit (not illustrated in figure 1a) in the circuit stack 15.
- the image sensor 10 also comprises a second set of readout components 189 in the electrical circuit stack 15.
- This second set of readout components 189 is in figure 1a connected to the second photodiode 16 with an electrical connection 169 which lies in the electrical circuit stack. More generally, as illustrated in the other figures of this disclosure, this electrical connection 169 may be a direct connection. It may be a via which extends in the z-direction from the second photodiode 16 to the second set of readout components 189. This via may extend through any intervening layers (for example filling layer 17 in figure 1b) between the second photodiode 16 and the second set of readout components 189.
- the second set of readout components 189 may form a back-end-of-line (BEOL) block.
- the second set of readout components 189 may for example comprise one or more metal-insulator-metal (MIM) capacitors, metal-oxide-metal (MOM) capacitors or metal-oxide-poly (MOP) capacitors.
- the second set of readout components 189 may be configured to generate a second output signal from the second photodiode 16.
- the second output signal may be transferred to external circuitry via the control circuit in the electrical circuit stack 15.
- the readout region 199 may be adjacent to the first pixel region 111 in the xy-plane. Both the first (181) and second (189) sets of readout components lie in the readout region 199.
- the term "on top of each other" means that the first and second readout components have different z-coordinates.
- the first set of readout components 181 is in the CMOS substrate 19 and the second set of readout components 189 is in the electrical circuit stack which lies on the CMOS substrate in the readout region.
- the second set of readout components 189 may have a greater z-coordinate than the first set of readout components 181 (when the positive z-direction points toward the illuminated side of the image sensor 10).
- the first set of readout components 181 may alternatively have a greater z-coordinate than the second set of readout components 189.
- the first set of readout components 181 may be at least partly aligned with the second set of readout components 189 in the z-direction, so that the projection to the xy-plane of at least one component in the second set 189 overlaps with at least one component of the first set 181 in the xy-plane.
- the first (181) and second (189) sets do not necessarily need to be aligned in this way.
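The stacking condition described above is purely geometric: it concerns only the xy-projections of the components, not their z-coordinates. The short sketch below expresses it as an overlap test; the axis-aligned footprints and all names and values are illustrative assumptions, not taken from the disclosure.

```python
# Minimal sketch (assumed axis-aligned footprints; all names are illustrative):
# "stacked on top of each other" only requires that the xy-projections of a
# component in the first set and a component in the second set share area,
# while their z-coordinates differ.
from typing import NamedTuple

class Footprint(NamedTuple):
    """Axis-aligned xy-projection of a readout component (arbitrary units)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

def projections_overlap(a: Footprint, b: Footprint) -> bool:
    """True if the two xy-projections share any area (z-positions may differ)."""
    return (a.x_min < b.x_max and b.x_min < a.x_max
            and a.y_min < b.y_max and b.y_min < a.y_max)

# one component of the first set (in the CMOS substrate) and one of the second
# set (in the electrical circuit stack), both placed in the readout region
first_set_component = Footprint(0.0, 2.0, 0.0, 2.0)
second_set_component = Footprint(1.0, 3.0, 0.5, 2.5)
print(projections_overlap(first_set_component, second_set_component))  # True
```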
- Since the first readout components 181 lie in the readout region 199, they do not occupy space in the first pixel region 111.
- the first set of readout components 181 is placed in the readout region, which is an optically inactive region in the CMOS substrate 19.
- the CMOS substrate 19 does not comprise any photodiodes in the readout region 199.
- the first (181) and second (189) sets of readout components are both connected to the electrical circuit stack 15.
- the electrical connections, such as 169, may extend through the electrical circuit stack. They may be located in the readout region 199 of the CMOS substrate 19.
- the image sensor 10 may comprise a filling layer 17 on the CMOS substrate 19, as figure 1b illustrates.
- the filling layer 17 may be electrically insulating.
- the filling layer 17 may be planarized, so that its top surface 171 is smooth.
- the second photodiode 16 may be built on the top surface 171 of the filling layer 17. In other words, the filling layer 17 may lie between the second photodiode 16 and the electrical circuit stack 15.
- Figures 1a - 1b illustrate very simple device geometries. In practice, the geometry can be more complex.
- the circuitry which is connected to the first and second photodiodes may also comprise other components which are not discussed in this disclosure.
- the image sensor 10 may be called a front-side illuminated image sensor when the electrical circuit stack 15 is built on the illuminated first side 191 of the CMOS image sensor.
- Figure 2a illustrates a front-side illuminated image sensor 10 where the electrical circuit stack lies between the CMOS substrate and the second photodiode.
- the image sensor 10 has an illuminated side which is configured to receive illumination and a non-illuminated side which is opposite to the illuminated side, and the second photodiode 16 is closer to the illuminated side of the image sensor 10 than the CMOS substrate 19.
- the second photodiode 16 has here been built on top of the filling layer 17.
- Reference numbers 211, 221, 231 and 281 in figures 2a - 2d correspond to reference numbers 111, 121, 131 and 181, respectively, in figures 1a - 1b.
- the image sensor may comprise one or more additional first photodiodes in the CMOS substrate 19.
- the additional first photodiodes define additional first pixel regions in the xy-plane.
- the image sensor 10 in figure 2a comprises two additional first photodiodes 232 and 233 in the CMOS substrate. All options listed above for the first photodiode 231 apply to these additional first photodiodes 232 - 233 as well.
- the additional first photodiodes form additional first pixel regions 212 and 213, respectively, in the xy-plane.
- the image sensor 10 in figure 2a comprises two additional sets of readout components 282 and 283.
- Both sets 282 and 283 may comprise any of the components that the first set of readout components 281 comprises.
- Sets 282 and 283 may perform the same functions for additional first photodiodes 232 - 233 as the first set of readout components 281 performs for the first photodiode 231.
- the additional sets of readout components 282 - 283 may lie in the readout region 199 of the xy-plane.
- the image sensor may comprise a colour filter in the first pixel region.
- the image sensor 10 in figure 2a comprises a first colour filter 241, a second colour filter 242 and a third colour filter 243.
- the first colour filter 241 lies in the first pixel region, while the second and third colour filters 242 and 243 lie in the additional first pixel regions 212 and 213, respectively.
- One of the three colour filters 241 - 243 may for example be a red (R) filter (which primarily transmits red wavelengths), another may be a green (G) filter and the third may be a blue (B) filter.
- the three pixels 211 - 213 may then form an R pixel, a G pixel, and a B pixel, respectively, while the second pixel 221 may form an infrared (IR) pixel.
- the pass wavelengths of the filters 241 - 243 can be freely selected.
- the corresponding filter 241 - 243 may alternatively be a UV and visible light filter which blocks all visible and UV wavelengths but allows infrared radiation to pass through.
- the image sensor may comprise a UV filter and/or a visible light filter in the readout region. This prevents photogeneration of charge carriers in the first (and third) set of readout components in the CMOS substrate. This is particularly important in backside-illuminated devices. Any filter described in this disclosure may lie on the illuminated side of the CMOS substrate, so that incoming radiation strikes the filter before it reaches the CMOS substrate and/or any photodiode.
- the image sensor may comprise a band-pass filter in the second pixel region or in any first pixel region or additional first pixel region.
- the colour filters 241 - 243 (which may, for example, allow the passage of 600-750 nm / 500-600 nm / 400-500 nm) could be designed as band-pass filters.
- the image sensor may comprise a band-pass filter for a particular infrared wavelength band in the second pixel region. Any band-pass filter described in this disclosure may be an interferometric filter.
- Band-pass filters may allow the second photodiode 16, and/or the possible additional second photodiodes (described below), to respond to the corresponding 2λ wavelength band (1200-1500 nm / 1000-1200 nm / 800-1000 nm) if they are placed on top of the underlying first photodiodes.
- a second photodiode which is not on top of a first photodiode could then be dedicated to the final SWIR band (1500-2000 nm).
- Any filters listed here, such as Bayer colour filters, UV filters or band-pass filters, can alternatively be constructed as meta-optics elements or metasurfaces.
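The band allocation just described can be summarised in a short sketch. The visible pass bands are the example values given above; the assignment of each 2λ band to a stacked second photodiode, and the names used, are only one possible illustrative arrangement rather than a prescription from the disclosure.

```python
# Minimal sketch (visible band limits taken from the example values above;
# the band-to-photodiode assignment is only one possible arrangement):
# each visible band-pass filter also passes a 2*lambda infrared band, which a
# second photodiode stacked with the corresponding first photodiode can use,
# while a non-stacked second photodiode covers the remaining SWIR band.

VISIBLE_BANDS_NM = {"R": (600, 750), "G": (500, 600), "B": (400, 500)}

def second_order_band(band_nm):
    """Return the 2*lambda pass band of an interferometric band-pass filter."""
    lo, hi = band_nm
    return (2 * lo, 2 * hi)

ir_bands = {name: second_order_band(band) for name, band in VISIBLE_BANDS_NM.items()}
ir_bands["SWIR"] = (1500, 2000)  # final band for a second photodiode that is
                                 # not stacked on top of a first photodiode

print(ir_bands)
# {'R': (1200, 1500), 'G': (1000, 1200), 'B': (800, 1000), 'SWIR': (1500, 2000)}
```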
- the image sensor 10 may comprise a set of RGB pixels 211 - 213 that are adjacent to the readout region 199 in the xy-plane. They may also be adjacent to the electrical circuit stack 15 in the xy-plane.
- the sets of readout components 281 - 283 which are connected to the first photodiodes 231 - 233 in these colour pixels may lie in the readout region 199 of the xy-plane.
- the optically active area of the pixels 211 - 213 can then be maximized.
- Figure 2a illustrates an embodiment where the second photodiode 16 has been built next to the colour filters 241 - 243, so that the second electrode 163 of this photodiode extends across the colour filters 241 - 243.
- the first electrode 161 and the photoactive layer 162 are here restricted to the second pixel region 221.
- the second electrode 163 should in this case be transparent for the wavelengths which are intended to be measured in each underlying pixel.
- the photoactive layer 162 does not have to be patterned so precisely. It may not need to be patterned at all.
- the photoactive layer 162 should in this case be at least partly transparent to the radiation wavelengths that are detected with the first photodiodes 231 - 233 located in the CMOS substrate 19. This option is illustrated in figure 2b, where the photoactive layer 162 extends across all of the illustrated pixel regions 211 - 213 and 221.
- the first electrode 161 of the second photodiode 16 is present only in the second pixel region 221, so it does not have to be transparent.
- the optically active area of the second photodiode 16 lies between the two electrodes 161 and 163, primarily in the second pixel region 221.
- the regions just next to the second pixel region 221 may also be optically active.
- the colour filters 241 - 243 are on top of the second electrode 163 in figure 2b.
- the image sensor 10 in this figure may also comprise an infrared filter (not illustrated), for example a short-wave infrared (SWIR) filter, on top of the second electrode 163 in the second pixel region 221.
- Figure 2c illustrates an option where a colour filter 242 extends from a neighboring pixel, so that the colour filter also extends to the second pixel region.
- the second pixel region 221 is the area where the photoactive layer 162 lies between both the first electrode 161 and the second electrode 163.
- the second pixel region 221 is not necessarily coextensive with the readout region 199.
- Figure 2d illustrates an option where the first electrode 161 of the second photodiode extends beyond the electrical circuit stack 15 and beyond the readout region 199.
- the second pixel region 221 is larger than the readout region 199.
- the first electrode 161 could alternatively be smaller than the electrical circuit stack, so that the second pixel region 221 would cover only a part of the readout region 199.
- the readout region 199 is not necessarily coextensive with the electrical circuit stack 15, either.
- Figure 2e illustrates an image sensor where the readout region 199 lies beneath the electrical circuit stack and the first electrode 161.
- the combined area of the electrical circuit stack and first electrode 161 is an optically inactive area in the CMOS substrate 19 if the first electrode 161 is not transparent to radiation. All of this area can therefore be utilized as the readout region, if necessary.
- Figure 2f illustrates an image sensor 10 where the first electrode 161 extends over multiple pixels, or even across the entire optically active area of the image sensor.
- the first electrode 161 should in this case be at least partly transparent to the radiation wavelengths that are detected with the first photodiodes 231 - 233 located in the CMOS substrate 19.
- the second pixel region 221 may in this configuration extend across the entire optically active area of the image sensor, as figure 2f illustrates.
- the readout region is here restricted to the area of the electrical circuit stack 15. It should be noted that, in any embodiment presented in this disclosure, only a part of the optically inactive area in the CMOS substrate 19 may be utilized as the readout region 199.
- the electrical circuit stack 15 in figure 2f may, for example, have a much larger area than what is needed to accommodate the sets of readout components that lie beneath it.
- the image sensor 10 may be called a back-side illuminated image sensor if the electrical circuit stack 15 is built on the non-illuminated second side 192 of the CMOS image sensor.
- This alternative is illustrated in figures 3a and 3b.
- the CMOS substrate 19 may be supported on the non-illuminated side by a handle wafer, which may comprise an image-processing IC chip or a PWB interposer.
- the CMOS substrate 19 lies between the electrical circuit stack 15 and the second photodiode 16.
- the electrical circuit stack 15 lies between the CMOS substrate 19 and the second photodiode 16.
- the image sensor 10 in figure 3a corresponds in some respects to figure 2a. All options relating to reference numbers 199, 161 - 163, 211 - 213, 221, 241 - 243 and 281 - 283 that were described with reference to figures 2a - 2f apply in figure 3a also.
- the photoactive layer 162 and/or one or both electrodes 161 and 163 in figure 3a may extend only across the second pixel region 221, or across multiple pixels.
- Any colour filter 241 - 243 may also in figures 3a - 3b extend into the second pixel region 221, and infrared filtering may be employed according to any of the schemes described above with reference to figures 2a - 2f.
- the image sensor 10 in figure 3a differs from the ones in figures 2a - 2f in that the electrical circuit stack has been built on the non-illuminated second side 192 of the CMOS substrate 19.
- the electrical circuit stack 15 may be present only in the readout region 199 (this option is not illustrated).
- the electrical circuit stack 15 may in this embodiment extend below multiple pixels since the first photodiodes 231 - 233 are illuminated from the opposite side of the CMOS substrate 19.
- the second photodiode 16 can be built on the illuminated side of the CMOS substrate 19, as figure 3a illustrates.
- the electrical connection 169 through the CMOS substrate 19 may in this case be a through-silicon via.
- the second photodiode 16 can alternatively be built on the nonilluminated side of the CMOS substrate 19, so that the electrical circuit stack 15 lies between the CMOS substrate 19 and the second photodiode 16.
- the CMOS substrate 19 and the electrical circuit stack 15 may be at least partly transparent to infrared radiation, so the second photodiode 16 can function even beneath the other layers illustrated in figure 3b.
- the configuration in figure 3b has the advantage that neither the first electrode 161 nor the second electrode 163 needs to be transparent to visible light.
- One or both of these electrodes may also function as a reflector for infrared wavelengths, to increase the efficiency of the second photodiode 16.
- this configuration can be used to minimize filter deposition steps because the substrate 19 can act as a visible light filter to the second photodiode 16.
- the configuration allows for the simultaneous uncoupled detection of SWIR and visible radiation (instead of VIS+SWIR and VIS), which is needed for deconvolution of incident radiation in the z-direction, as both photodiodes are stacked in increasing wavelength range from the illuminated side.
- the first electrode 161 may for example be made of gold.
- the first electrode 161 may be configured to be selectively reflective for selected wavebands with an anti-reflective coating or a transmissive thin-film gold layer on an anti-reflective coated glass substrate.
- a filling layer 17 may optionally lie between the second photodiode 16 and the electrical circuit stack 15, as figure 3b illustrates.
- the second pixel region 221 may extend beneath one or more of the first pixel 211 and the additional pixels 212 - 213.
- Figure 4a illustrates the first pixel region 211 in the xy-plane.
- the readout region 199 may (but does not necessarily have to) cover the same area in the xy-plane as the second pixel region 221.
- the second pixel region 221 may surround the first pixel region 211 in the xy-plane, as figure 4b illustrates.
- the photoactive layer 162, and possibly also the first and/or second electrodes 161 and 163, may be patterned so that they contain openings.
- the first pixel region 211 may be located in an opening in the photoactive layer 162.
- the image sensor may comprise one or more additional first photodiodes in the CMOS substrate, wherein the additional first photodiodes define additional first pixel regions 212 - 213 in the xy-plane.
- the second pixel region 221 may surround both the first pixel region 211 and the additional first pixel regions 212 - 213 in the xy-plane, as figure 4b shows.
- the photoactive layer 162 may extend across any of pixels 211 - 213, as in figure 2b - 2f.
- the second pixel region 221 may fully or partly overlap with the first pixel region 211.
- the second pixel region may also fully or partly overlap with the additional first pixel regions 212 - 213.
- Figure 4c illustrates a pixel geometry where the image sensor also comprises a first photodiode in the CMOS substrate for sensing infrared radiation.
- This first photodiode defines pixel region 214 in the xy-plane.
- the first photodiodes 231 - 233 may for example form R, G and B pixel regions 211 - 213, while the infrared-sensing first photodiode forms an IR pixel region 214.
- the second pixel region may surround both the RGB and IR pixel regions, as figure 4c illustrates.
- the image sensor may comprise an additional second photodiode which is configured to sense infrared radiation.
- the photoactive layer may extend to both the second photodiode and to the additional second photodiode.
- the additional second photodiode may be constructed in the same way as the second photodiode described in any preceding embodiment where the second photodiode does not extend across the entire optically active area of the image sensor.
- the second photodiode and the additional second photodiode may share the same photoactive layer 162. They may also share either the same second electrode 163 or the same first electrode 161.
- Each additional second photodiode is formed in the same way as the second photodiode, in an area where the photoactive layer 162 lies between two electrodes. Either the first electrodes or the second electrodes (or both) of two adjacent second photodiodes could be separated (for example by patterning the corresponding electrode layer) to create the separate pixels 221 and 222 seen in figure 4c.
- Figure 4d illustrates one possible geometry in the xy-plane.
- the additional second pixel region 222 formed by the additional second photodiode is adjacent to the second pixel region 221 formed by the second photodiode.
- the photoactive layer 162 extends across both of these pixel regions 221 and 222.
- the letters R, G, B indicate first photodiodes which may be built in the CMOS substrate in the same way as the first photodiodes which define first pixel regions 211 - 213 in the preceding figures.
- Figures 5a - 5b illustrate a practical example which corresponds at least to figures 2a and 4c.
- Reference number 284 indicates a first set of readout components dedicated to the first photodiode CI which senses infrared radiation. Reference number 9 will be explained below.
- the first electrode 161 in figure 5b has been patterned to allow light passage to the first pixels 211 - 214. This patterning determines the second pixel region 221 in figure 5a.
- the readout region is in this case coextensive with the second pixel region 221.
- Figure 5b illustrates how the readout components of the first photodiodes (R, G, B and CI) can be placed in the readout region in figure 5a. Many other geometries are also possible in the xy-plane.
- Figure 5c illustrates how the electrical circuit stack 15 and the various conductors (such as 169) and the second set of readout components 189 which are included in the stack 15 can also be arranged in the readout region without blocking the illumination of pixels 211 - 214.
- Figure 5d illustrates how the readout components of the first photodiodes (R, G, B and CI) can be placed in the readout region in an image sensor which corresponds to figure 3b.
- Reference number 384 indicates the readout components connected to pixel 214.
- Figure 5e illustrates how the electrical conductors in the electrical circuit stack may be placed in this configuration.
- the first photodiodes may be closer to the non-illuminated side 192 than to the illuminated side 191 in the CMOS substrate 19.
- the thickness of the CMOS substrate 19 may for example be less than 0.1 mm or less than 0.2 mm in any of these embodiments. This ensures that the incoming radiation is not attenuated too much in the CMOS substrate before it reaches the first photodiodes.
- the illuminated side 191 is in this case the back surface of the CMOS substrate 19, while the first photodiodes have been formed on the front surface.
- the thickness of the CMOS substrate 19 can be selected freely. It may for example be in the range 0.7 mm - 1 mm.
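A short, purely illustrative sketch of why the thin-substrate requirement matters in the back-illuminated case: transmission through the substrate falls off exponentially with its thickness (Beer-Lambert attenuation). The absorption coefficient used below is a hypothetical placeholder, since the disclosure gives no value and the real coefficient depends strongly on wavelength and material.

```python
# Purely illustrative sketch (the absorption coefficient is a hypothetical
# placeholder; the real value depends strongly on wavelength and material):
# Beer-Lambert attenuation of radiation crossing the substrate before it
# reaches the first photodiodes in a back-illuminated configuration.
import math

def transmitted_fraction(absorption_coeff_per_mm: float, thickness_mm: float) -> float:
    """Fraction of incident radiation remaining after traversing the substrate."""
    return math.exp(-absorption_coeff_per_mm * thickness_mm)

ALPHA_PER_MM = 5.0  # hypothetical placeholder value
for thickness_mm in (0.1, 0.2, 0.7):
    print(f"{thickness_mm} mm -> {transmitted_fraction(ALPHA_PER_MM, thickness_mm):.2f}")
# thinner substrates transmit exponentially more of the incoming radiation
```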
- Figure 6a illustrates in more detail one possible implementation of the device shown in figure 1a.
- Reference number 12 indicates an electron transport layer and 64 a hole transport layer.
- Reference number 22 indicates a metal via through the dielectric layers 65 from the first electrode 161 to the CMOS substrate.
- the image sensor in figure 6a comprises a third set of readout components 9 in the CMOS substrate 19.
- the second set of readout components 189 is in this case connected to the second photodiode via the third set of readout components 9.
- This third set of readout components 9 comprises in this case a switch transistor with a doped well 21, a source well 18, a drain well 69 and a gate 20.
- a third set of readout components such as the set 9, may be located in the CMOS substrate.
- This third set may, in any embodiment presented in this disclosure, connect the second set of readout components to the second photodiode.
- the third set of readout components may for example comprise floating diffusion capacitors, pn-junction capacitors, depletion capacitors, transistor capacitors, MOS capacitors, varactor diodes, trench capacitors or deep trench capacitors. Stacked and 3D transistor structures, including FinFET, Nanosheet and Forksheet structures, may also be included in the third set of readout components.
- the second set of readout components 189 comprises a metal-insulator-metal capacitor 60 formed by a top electrode 66, a bottom electrode 67 and an insulating layer 65 between these two electrodes.
- the first photodiode 13 in figure 6a comprises a P+ doped layer 2 and an n- type storage well 3.
- a p-doped well 5 in the substrate 19 extends from the first photodiode 13 to a floating diffusion well 4, which may be n+ doped.
- the set of first readout components 181 comprises in this case the floating diffusion well 4 and a preamplifier transistor 8 connected to the first photodiode 13.
- Reference number 6 indicates a transfer gate and 7 a gate control line.
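As a rough orientation for how charge-integrating readout components such as the floating diffusion well 4 or a MIM capacitor turn collected charge into a signal, the conversion follows V = Q/C, so the voltage step per photoelectron is q/C. The sketch below is illustrative only; the capacitance values are assumptions and are not taken from the disclosure.

```python
# Minimal sketch (capacitance values are assumptions, not from the disclosure):
# an integrating readout node converts collected charge to voltage as V = Q / C,
# so its conversion gain per photoelectron is q / C.

E_CHARGE_C = 1.602e-19  # elementary charge [C]

def conversion_gain_uv_per_electron(node_capacitance_f: float) -> float:
    """Output voltage step per collected electron, in microvolts."""
    return E_CHARGE_C / node_capacitance_f * 1e6

# e.g. a small floating diffusion node in the CMOS substrate vs. a larger
# metal-insulator-metal capacitor in the electrical circuit stack
print(f"{conversion_gain_uv_per_electron(1e-15):.0f} uV/e-")   # ~160 uV/e- at 1 fF
print(f"{conversion_gain_uv_per_electron(10e-15):.0f} uV/e-")  # ~16 uV/e- at 10 fF
```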
- Figure 6b illustrates a circuit diagram for the image sensor shown in figure 6a.
- Figure 6c illustrates in more detail one possible implementation of the device shown in figure 3b. Only one first photodiode 331 and its readout components 381 are illustrated for clarity reasons, but the additional photodiodes 332 - 333 in figure 3b and their readout could be implemented with the same configuration as 331. All other elements in figure 6c correspond to the ones illustrated in figure 6a, though all elements are not necessarily identical in these two figures. For example, as explained above with reference to figure 3b, the embodiment shown in figure 6c allows greater use of materials that are not transparent to visible wavelengths.
- Figure 6d illustrates one possible pixel configuration in the xy-plane for the image sensor shown in figure 6c.
- the first electrode 161 and the entire second photodiode may in this case extend across multiple adjacent pixel regions.
- Figure 6e illustrates one possible way to implement filtering in the image sensor shown in figure 6c.
- Figure 6e illustrates two adjacent pixels for sensing visible light, and a single infrared pixel which extends below them both.
- the image sensor may comprise colour filters 24 and 25 arranged above the first photodiodes.
- the image sensor may also comprise a light-shielding film 26 which covers the rest of the illuminated surface.
- the light-shielding film 26 may for example block UV wavelengths and visible wavelengths, and optionally also near-infrared wavelengths.
- the light-shielding film may be transparent to short-wave infrared and longer wavelengths.
- the image sensor may comprise a main body 27.
- the previously described elements of the image sensor may be attached to this main body.
- This main body may for example be a support wafer or a protective enclosure.
Landscapes
- Solid State Image Pick-Up Elements (AREA)
Abstract
This disclosure describes an image sensor comprising a CMOS substrate, a first photodiode in the CMOS substrate, a first set of readout components in the CMOS substrate and an electrical circuit stack on the CMOS substrate. The image sensor also comprises a second photodiode which is configured to sense infrared radiation. The second photodiode comprises a photoactive layer, and the photoactive layer comprises quantum dots. A second set of readout components in the electrical circuit stack is electrically connected to the second photodiode. The first and second sets of readout components are stacked on top of each other in a readout region.
Description
IMAGE SENSOR
FIELD OF THE DISCLOSURE
The present disclosure relates to image sensors, and particularly to the integration of infrared-sensing elements with CMOS image sensors.
BACKGROUND OF THE DISCLOSURE
CMOS-based image sensors are abundant in consumer and industrial applications. CMOS image sensors offer high sensitivity and very high pixel resolution, with few-micrometer pixel sizes enabling imager arrays comprising several million pixels in a small sensor footprint.
Colloidal quantum dot (CQD) image sensors have emerged as a competitive technology providing broadband photodetection extending into near-infrared (NIR), short-wave infrared (SWIR) and mid-wave infrared (MWIR) wavelengths, while offering the benefits of monolithic integration on CMOS wafer surfaces. Current state-of-the-art CQD-based image sensor and camera products demonstrate megapixel resolution, typically as monochromatic sensors (i.e. without any on-chip filtering).
Document US20220141400 discloses an image sensor with a substrate. Photodiodes for sensing visible light have been built in the substrate. An infrared photodetector formed by an active layer is built on top of a stack of insulating layers on the same substrate, next to the photodiodes. A disadvantage with the arrangement disclosed in this document is that a portion of each pixel for sensing visible light is occupied by a readout circuit. This portion is optically inactive.
BRIEF DESCRIPTION OF THE DISCLOSURE
An object of the present disclosure is to integrate infrared sensing into a CMOS image sensor with less inactive area.
The object of the disclosure is achieved by an image sensor which is characterized by what is stated in the independent claims. Some embodiments of the disclosure are disclosed in the dependent claims.
The disclosure is based on the idea of utilizing the pixel area of one pixel to accommodate at least a part of the readout circuitry of another pixel. An advantage of this solution is that the inactive area of the latter pixel can be reduced.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following the disclosure will be described in greater detail by means of preferred embodiments with reference to the accompanying drawings, in which:
Figures 1a - 1b illustrate an image sensor with a first pixel and a second pixel.
Figures 2a - 2f illustrate image sensors with multiple first pixels and filters.
Figures 3a - 3b illustrate image sensors where the electrical circuit stack has been built on the non-illuminated side of the CMOS substrate.
Figures 4a - 4d illustrate possible pixel geometries in the xy-plane.
Figures 5a - 5e illustrate practical examples of pixel geometries in two and three dimensions.
Figures 6a - 6e illustrate examples of practical implementations.
The figures are for illustrative purposes only and are not drawn to scale.
DETAILED DESCRIPTION OF THE DISCLOSURE
This disclosure describes an image sensor which comprises a CMOS substrate which defines an xy-plane, wherein the xy-plane comprises a readout region. The image sensor also comprises a first photodiode in the CMOS substrate. The first photodiode defines a first pixel region in the xy-plane. The image sensor also comprises a first set of readout components in the CMOS substrate. The first set of readout components is electrically connected to the first photodiode. The image sensor also comprises an electrical circuit stack on the CMOS substrate.
The image sensor also comprises a second photodiode which is configured to sense infrared radiation. The second photodiode comprises a photoactive layer, and the photoactive layer comprises colloidal quantum dots. The second photodiode defines a second pixel region in the xy-plane.
The image sensor also comprises a second set of readout components in the electrical circuit stack. The second set of readout components is electrically connected to the second photodiode. The first and second sets of readout components are stacked on top of each other in the readout region.
Figure 1a illustrates an image sensor 10 with a CMOS substrate 19. The substrate 19 may be a silicon substrate. The substrate has a first side 191 and a second side 192 which is opposite to the first side 191. Throughout this disclosure, the CMOS substrate 19 will be illustrated in configurations where the first side 191 has a greater z-coordinate than the second side 192. The xy-plane may for example be the plane of the first side 191 of the substrate. The z-axis is perpendicular to the xy-plane.
A first photodiode 13 has been formed in the CMOS substrate 19. The first photodiode may comprise n- and p-doped layers (not separately illustrated), which may be formed in the CMOS substrate 19 by diffusion and/or implantation. These process steps may be performed on the first side 191 of the substrate 19, so that the first photodiode 13 lies just beneath the surface of the first side 191 of the CMOS substrate 19. The area in the xy-plane where the first photodiode 13 is located is the first pixel region 111. Additional layers, such as an oxide layer (not illustrated), may be deposited on the first side 191 of the CMOS substrate 19 in the first pixel region 111.
In any embodiment presented in this disclosure, the first photodiode 13 may be a pinned photodiode. The first photodiode 13 may, but does not necessarily have to, be configured to sense visible light. It could alternatively be configured to sense infrared radiation. The first photodiode may be an avalanche photodiode or a single-photon avalanche diode.
In any embodiment of this disclosure, any photodiode (first, second or additional) which is intended to sense infrared radiation may be configured to sense NIR, SWIR or MWIR wavelengths. Any such photodiode may be configured to respond to infrared radiation in just one of these wavelength ranges. Any such photodiode may alternatively be configured to respond to infrared radiation in all of these wavelength ranges.
The image sensor may comprise a control circuit, or it may be connected to a control circuit. The first and second readout components may be connected to the control circuit. The control circuit may be configured to retrieve a measurement signal from any photodiode described in this disclosure. The control circuit will be described in more detail below.
The image sensor may be operated when it is illuminated by incoming electromagnetic radiation 101. In any embodiment presented in this disclosure, the image sensor 10 has an illuminated side (illustrated as the top side in the figures) which is configured to receive illumination and a non-illuminated side (illustrated as the bottom side) which is opposite to the illuminated side. The first side 191 of the CMOS substrate 19 faces toward the illuminated side of the image sensor. The second side 192 of the CMOS substrate 19 faces toward the non-illuminated side of the image sensor 10. The illuminated side of the image sensor 10 may for example comprise lenses and/or other at least partly transparent optical elements (not illustrated). The non-illuminated side may comprise a non-transparent main body (not illustrated) of the image sensor 10, which provides structural support and protection for the sensor.
The image sensor in figure 1a also comprises an electrical circuit stack 15 on the CMOS substrate 19. The side of the CMOS substrate 19 where the stack 15 is built may be called the front side of the substrate 19, and the opposite side may be called the back side of the substrate 19. The stack 15 may be formed on the first side 191 (front-side illumination), but it may alternatively be formed on the second side 192 (back-side illumination). This will be described in more detail below. The stack 15 may comprise insulating layers 151 and electric conductors 152 which extend through the insulating layers in the x-, y- and/or z-directions. In other words, the electrical circuit stack 15 may be a multilayer electrical circuit which extends in the xy-plane and in the z-direction. The circuit may be much more complex than the schematic illustration in figure 1a indicates.
The image sensor 10 also comprises a second photodiode 16. This second diode 16 is in figure 1a formed on top of the electrical circuit stack 15, but other options are also possible, as figure 3a for example illustrates. The second diode 16 may comprise a first electrode 161 and a second electrode 163, which may also be called an anode and a cathode. A photoactive layer 162 comprising colloidal quantum dots (CQD) is sandwiched between the two electrodes 161 and 163. The second diode 16 may also comprise either an electron transport layer (ETL, not illustrated) or a hole transport layer (HTL, not illustrated) between the photoactive layer 162 and the first electrode 161 and between the photoactive layer 162 and the second electrode 163.
In any embodiment presented in this disclosure, the photoactive layer 162 may for example comprise any of the following colloidal quantum dot materials: PbS, HgTe, InAs, Ag2Se, Ag2Te, Bi2S3. Alternatively, the photoactive layer 162 may comprise HgE (where E may stand for S, Se, Te or varying compositions of S, Se, Te), or alloys of CdE, ZnE and HgE. Alternatively, the photoactive layer 162 may comprise CuInE2, AgBiE2, CuZnSnE; Sb2E3, Bi2E3; Cu2E, Ag2E.
Alternatively, the photoactive layer 162 may comprise Al, In or Ga in varying compositions combined with varying compositions of N, P, As or Sb. Alternatively, the photoactive layer 162 may comprise Pb or Sn in varying compositions combined with varying compositions of S, Se, Te. Other quantum dot materials may also be used. The photoactive layer may comprise particles of any material listed above, and the diameter of the particles may be in the range 2 nm - 20 nm, or in the range 2 nm - 25 nm, or in the range 1 nm - 50 nm.
The photoactive layer 162 may also comprise ligand materials such as thiols, amines, or halides. The ligand materials may be organic ligands that contain functional groups: carboxylic acids RCOOH, amines R3N, phosphines R3P, phosphine oxides R3PO, phosphonic acids RPO3H, thiols RSH, including their deprotonated forms/salts, where R contains fewer than six carbon atoms and any number of hydrogen atoms to produce saturated or unsaturated hydrocarbons. Different functional groups may share the same R group to produce bifunctional ligands. Alternatively, the ligand materials may be inorganic ligands, and they may include SCN-, I-, Br-, Cl-, OH-. A combination of organic and inorganic ligands may also be used.
The first electrode 161 of the second photodiode 16 is the electrode which is closer to the non-illuminated side of the image sensor, while the second electrode 163 is the electrode which is closer to the illuminated side. Consequently, the second electrode 163 should be transparent for the radiation which is detected by the second photodiode 16, while the first electrode 161 does not necessarily have to be transparent to that radiation.
However, depending on the geometry of the electrodes and on the complexity of the manufacturing process, it may in some cases be beneficial to make the first electrode 161 transparent to some forms of radiation, too. Many different electrode configurations will be discussed in the embodiments below. In any embodiment where an electrode is transparent or partially transparent to visible radiation, the electrode material may for example be ITO, AZO or graphene, IZO, FTO. Alternatively, the electrode may comprise a metal layer with a thickness below 15 nm. The transparency of the electrode material may for example be greater than 70% VLT. Alternatively, any electrode described in this disclosure may comprise a patterned metal grid. The metal may for example be gold or platinum. The patterning may allow a sufficient amount of radiation to pass through the electrode. Alternatively, in any embodiment the electrode may comprise a network of nanowires. In any embodiment where an electrode does not have to be transparent to visible radiation, the electrode material may for example be a metal. The metal may for example be gold or platinum.
The area in the xy-plane where the second photodiode 16 is located is the second pixel region 121. This is the area where the photoactive layer 162 lies between both the first electrode 161 and the second electrode 163. It may in some embodiments be convenient to extend the photoactive layer 162 across a large area, while the first electrode 161 is patterned so that it only covers a smaller area, for example only the readout region. The photoactive layer 162 may in this case extend beyond the second pixel region 121, since the second pixel region 121 is limited to the area where the first electrode 161 is present. These options will be illustrated in some figures of this disclosure.
The second photodiode and the second set of readout components may be configured for either a current-mode measurement or a voltage-mode measurement. In current-mode, the photodiode is typically reverse-biased (or in some cases zero-biased). The photogenerated charge may be collected into an integrating capacitor and converted to a voltage signal via a Capacitive Transimpedance Amplifier (CTIA).
In voltage-mode, the pixel electronics sample the photodiode voltage at the end of each exposure-time cycle. During exposure, the forward voltage across the photodiode increases as a function of photogenerated charge (eventually reaching open-circuit voltage in the case of sufficiently long exposure time). The pixel electronics may be designed to exhibit very high input impedance so that minimal charge leakage occurs. Between exposure cycles, the photodiode
is reset by short-circuiting the photodiode stack thus discharging any photogenerated charge.
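The first-order signal arithmetic behind these two readout modes can be sketched as follows. The Python sketch below is illustrative only; the photocurrent, integration time, feedback capacitance and open-circuit voltage are assumed example values, not values from this disclosure.

```python
# Hedged first-order arithmetic for the two readout modes described above.
# All numeric values (photocurrent, feedback capacitance, exposure time,
# open-circuit voltage) are illustrative assumptions.

Q_E = 1.602e-19  # elementary charge, C

def ctia_output_voltage(photocurrent_a, t_int_s, c_feedback_f):
    """Current mode: photogenerated charge integrated on the CTIA feedback
    capacitor and converted to a voltage swing V = I * t / C."""
    charge = photocurrent_a * t_int_s
    return charge / c_feedback_f

def collected_electrons(photocurrent_a, t_int_s):
    """Number of photoelectrons collected during one exposure cycle."""
    return photocurrent_a * t_int_s / Q_E

# Example: 10 pA of photocurrent integrated for 1 ms on a 10 fF feedback capacitor.
i_photo, t_int, c_fb = 10e-12, 1e-3, 10e-15
print(f"CTIA output swing: {ctia_output_voltage(i_photo, t_int, c_fb):.3f} V")
print(f"Collected charge:  {collected_electrons(i_photo, t_int):.0f} e-")

# Voltage mode: the pixel samples the photodiode voltage at the end of the
# exposure; with a sufficiently long exposure the voltage saturates toward the
# open-circuit voltage, so the usable signal is bounded by V_oc (assumed value).
v_oc = 0.4  # assumed open-circuit voltage, V
print(f"Voltage-mode signal is bounded by V_oc ~ {v_oc} V")
```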
The image sensor has a readout region 199 in the xy-plane. The readout region 199 may be at least partly covered by the electrical circuit stack 15. As figure 1a illustrates, the readout region 199 may be coextensive with the second pixel region 121, since the second photodiode 16 may cover the entire electrical circuit stack 15. However, the readout region 199 and second pixel region 121 may also be different, as other figures of this disclosure will illustrate. The readout region 199 may overlap with at least a part of the second pixel region 221 in the xy-plane, as the figures of this disclosure illustrate. The readout region does not overlap with any first pixel region (such as 211, 212 and 213) in the figures of this disclosure. This allows the active area of the first pixel regions to be maximized, since the first set of readout components 181 does not occupy the xy-plane in the first pixel regions.
The image sensor 10 also comprises a first set of readout components 181 in the CMOS substrate 19. The first set of readout components 181 may be connected to the first photodiode 13 with an electrical connection 14. Throughout this disclosure, electrical connections such as 14 are for simplicity illustrated schematically with a black line. In practice, these connections may comprise regions in the CMOS substrate which have been doped and configured to perform charge transfer.
The first set of readout components 181 may form a front-end-of-line (FEOL) block. The first set of readout components may for example comprise one or more of the following components: floating diffusion capacitors, pn-junction capacitors, depletion capacitors, transistor capacitors, MOS capacitors, varactor diodes, trench capacitors or deep trench capacitors. Some exemplary component architectures are presented toward the end of this disclosure. Throughout this disclosure, any readout components formed in the CMOS substrate 19 may comprise regions which have been doped and configured to perform the intended function.
The first set of readout components 181 is for simplicity illustrated just as a single box in figure 1a, even though it may contain more than one readout component. The first set of readout components may be configured to generate a first output signal from the first photodiode 13. The first set of readout components 181 may be formed in the CMOS substrate 19 with the same processes that are used to form the first photodiode 13, although the process steps may be separate. The first set of readout components 181 may be connected to the electrical circuit stack 15 with an electrical connection 14. The first output signal may be transferred to external circuitry via a control circuit (not illustrated in figure 1a) in the circuit stack 15.
The image sensor 10 also comprises a second set of readout components 189 in the electrical circuit stack 15. This second set of readout components 189 is in figure 1a connected to the second photodiode 16 with an electrical connection 169 which lies in the electrical circuit stack. More generally, as illustrated in the other figures of this disclosure, this electrical connection 169 may be a direct connection. It may be a via which extends in the z-direction from the second photodiode 16 to the second set of readout components 189. This via may extend through any intervening layers (for example filling layer 17 in figure 1b) between the second photodiode 16 and the second set of readout components 189.
The second set of readout components 189 may form a back-end-of-line (BEOL) block. The second set of readout components 189 may for example comprise one or more metal-insulator-metal (MIM) capacitors, metal-oxide- metal capacitors (MOM) or metal-oxide-poly (MOP) capacitors.
The second set of readout components 189 may be configured to generate a second output signal from the second photodiode 16. The second output signal may be transferred to external circuitry via the control circuit in the electrical circuit stack 15.
In any embodiment presented in this disclosure, the readout region 199 may be adjacent to the first pixel region 111 in the xy-plane. Both the first (181)
and second (189) sets of readout components lie in the readout region 199.
They are stacked on top of each other in the readout region 199.
In this disclosure, the term "on top of each other" means that the first and second readout components have different z-coordinates. In practice, the first set of readout components 181 is in the CMOS substrate 19 and the second set of readout components 189 is in the electrical circuit stack which lies on the CMOS substrate in the readout region. As the figures of this disclosure illustrate, the second set of readout components 189 may have a greater z- coordinate than the first set of readout components 181 (when the positive z- direction points toward the illuminated side of the image sensor 10). However, as figures 3a - 3b for example illustrate, the first set of readout components 181 may alternatively have a greater z-coordinate than the second set of readout components 189.
The first set of readout components 181 may be at least partly aligned with the second set of readout components 189 in the z-direction, so that the projection to the xy-plane of at least one component in the second set 189 overlaps with at least one component of the first set 181 in the xy-plane. However, the first (181) and second (189) sets do not necessarily need to be aligned in this way.
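The alignment criterion can be expressed as an overlap test between the xy-plane projections of the components. The sketch below models each footprint as an axis-aligned rectangle, which is an assumption made purely for illustration and not a constraint from this disclosure.

```python
# Minimal sketch of the alignment criterion described above: two readout
# components are "at least partly aligned" when their footprints, projected
# onto the xy-plane, overlap. Footprints are modeled here as axis-aligned
# rectangles (x_min, y_min, x_max, y_max), purely for illustration.

def footprints_overlap(a, b):
    """Return True if two xy-plane rectangles share a non-zero overlap area."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

# Example: a component of the first set (in the CMOS substrate) and one of the
# second set (in the electrical circuit stack); dimensions in micrometres.
first_set_component = (0.0, 0.0, 2.0, 2.0)
second_set_component = (1.0, 0.5, 3.0, 2.5)
print(footprints_overlap(first_set_component, second_set_component))  # True
```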
Since the first readout components 181 lie in the readout region 199, they do not occupy space in the first pixel region 111. The optically active area of the first pixel region 111, where the first photodiode 13 receives radiation, can therefore be maximized. In other words, the first set of readout components 181 is placed in the readout region, which is an optically inactive region in the CMOS substrate 19. The CMOS substrate 19 does not comprise any photodiodes in the readout region 199.
The first (181) and second (189) sets of readout components are both connected to the electrical circuit stack 15. The electrical connections, such as 169, may extend through the electrical circuit stack. They may be located in the readout region 199 of the CMOS substrate 19.
In any embodiment presented in this disclosure, the image sensor 10 may comprise a filling layer 17 on the CMOS substrate 19, as figure 1b illustrates. The filling layer 17 may be electrically insulating. The filling layer 17 may be planarized, so that its top surface 171 is smooth. The second photodiode 16 may be built on the top surface 171 of the filling layer 17. In other words, the filling layer 17 may lie between the second photodiode 16 and the electrical circuit stack 15.
Figures 1a - 1b illustrate very simple device geometries. In practice, the geometry can be more complex. The circuitry which is connected to the first and second photodiodes may also comprise other components which are not discussed in this disclosure.
Front-side illumination
As mentioned above, the image sensor 10 may be called a front-side illuminated image sensor when the electrical circuit stack 15 is built on the illuminated first side 191 of the CMOS image sensor.
Figure 2a illustrates a front-side illuminated image sensor 10 where the electrical circuit stack lies between the CMOS substrate and the second photodiode. The image sensor 10 has an illuminated side which is configured to receive illumination and a non-illuminated side which is opposite to the illuminated side, and the second photodiode 16 is closer to the illuminated side of the image sensor 10 than the CMOS substrate 19. The second photodiode 16 has here been built on top of the filling layer 17.
Reference numbers 211, 221, 231 and 281 in figures 2a - 2d correspond to reference numbers 111, 121, 131 and 181, respectively, in figures 1a - 1b.
The image sensor may comprise one or more additional first photodiodes in the CMOS substrate 19. The additional first photodiodes define additional first pixel regions in the xy-plane.
The image sensor 10 in figure 2a comprises two additional first photodiodes 232 and 233 in the CMOS substrate. All options listed above for the first photodiode 231 apply to these additional first photodiodes 232 - 233 as well. The additional first photodiodes form additional first pixel regions 212 and 213, respectively, in the xy-plane.
The image sensor 10 in figure 2a comprises two additional sets of readout components 282 and 283. Both sets 282 and 283 may comprise any of the components that the first set of readout components 281 comprises. Sets 282 and 283 may perform the same functions for the additional first photodiodes 232 - 233 as the first set of readout components 281 performs for the first photodiode 231. As figure 2b illustrates, the additional sets of readout components 282 - 283 may lie in the readout region 199 of the xy-plane.
In any embodiment presented in this disclosure, the image sensor may comprise a colour filter in the first pixel region.
The image sensor 10 in figure 2a comprises a first colour filter 241, a second colour filter 242 and a third colour filter 243. The first colour filter 241 lies in the first pixel region, while the second and third colour filters 242 and 243 lie in the additional first pixel regions 212 and 213, respectively. One of the three colour filters 241 - 243 may for example be a red (R) filter (which primarily transmits red wavelengths), another may be a green (G) filter and the third may be a blue (B) filter. The three pixels 211 - 213 may then form an R pixel, a G pixel, and a B pixel, respectively, while the second pixel 221 may form an infrared (IR) pixel. Other options are also possible. The pass wavelengths of the filters 241 - 243 can be freely selected.
If any of the first photodiodes 231 - 233 is an infrared photodiode, then the corresponding filter 241 - 243 may alternatively be a UV and visible light filter which blocks all visible and UV wavelengths but allows infrared radiation to pass through.
In any embodiment in this disclosure the image sensor may comprise a UV filter and/or a visible light filter in the readout region. This prevents photogeneration of charge carriers in the first (and third) set of readout components in the CMOS substrate. This is particularly important in back-side illuminated devices. Any filter described in this disclosure may lie on the illuminated side of the CMOS substrate, so that incoming radiation strikes the filter before it reaches the CMOS substrate and/or any photodiode.
Alternatively, the image sensor may comprise a band-pass filter in the second pixel region or in any first pixel region or additional first pixel region. In particular, the colour filters 241 - 243 (which may, for example, allow the passage of 600-750 nm / 500-600 nm / 400-500 nm) could be designed as band-pass filters. Alternatively, the image sensor may comprise a band-pass filter for a particular infrared wavelength band in the second pixel region. Any band-pass filter described in this disclosure may be an interferometric filter.
Band-pass filters may allow the second photodiode 16, and/or the possible additional second photodiodes (described below), to respond to the corresponding 2λ wavelength band (1200-1500 nm / 1000-1200 nm / 800-1000 nm) if they are placed on top of the underlying first photodiodes. A second photodiode which is not on top of a first photodiode could then be dedicated to the final SWIR band (1500-2000 nm).
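The pairing between each visible pass band and its 2λ band can be reproduced by simply doubling the band edges, as the following illustrative sketch shows (the band limits are the values quoted above).

```python
# The passage above pairs each visible pass band with a band at twice the
# wavelength. A minimal sketch that reproduces that pairing by doubling the
# band edges; the band limits are the ones quoted in the text.

visible_bands_nm = [(600, 750), (500, 600), (400, 500)]  # R, G, B pass bands

def second_order_band(band_nm):
    """Return the 2*lambda band corresponding to a first-order pass band."""
    low, high = band_nm
    return (2 * low, 2 * high)

for band in visible_bands_nm:
    low2, high2 = second_order_band(band)
    print(f"{band[0]}-{band[1]} nm  ->  {low2}-{high2} nm")
# The remaining SWIR coverage (1500-2000 nm) would then fall to a second
# photodiode that is not stacked on top of a first photodiode, as noted above.
```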
Any filters listed here, such as Bayer colour filters, UV filters and band-pass filters, can alternatively be constructed as meta-optics elements or metasurfaces.
In other words, the image sensor 10 may comprise a set of RGB pixels 211 - 213 that are adjacent to the readout region 199 in the xy-plane. They may also be adjacent to the electrical circuit stack 15 in the xy-plane. The sets of readout components 281 - 283 which are connected to the first photodiodes 231 - 233 in these colour pixels may lie in the readout region 199 of the xy-plane. The optically active area of the pixels 211 - 213 can then be maximized.
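The benefit of moving the readout components out of the colour pixels can be illustrated with simple fill-factor arithmetic. The sketch below uses assumed example dimensions, not values from this disclosure.

```python
# Hedged fill-factor arithmetic for the layout described above: moving the
# readout components into the separate readout region frees area inside each
# colour pixel. All dimensions are assumed example values.

def fill_factor(pixel_pitch_um, readout_area_in_pixel_um2):
    """Fraction of the pixel area that remains optically active."""
    pixel_area = pixel_pitch_um ** 2
    return (pixel_area - readout_area_in_pixel_um2) / pixel_area

pitch = 2.0  # um pixel pitch (assumed)
print(f"Readout inside the pixel (1.2 um^2): {fill_factor(pitch, 1.2):.0%}")
print(f"Readout moved to readout region:     {fill_factor(pitch, 0.0):.0%}")
```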
Figure 2a illustrates an embodiment where the second photodiode 16 has been built next to the colour filters 241 - 243, so that the second electrode 163 of this photodiode extends across the colour filters 241 - 243. The first electrode 161 and the photoactive layer 162 are here restricted to the second pixel region 221. The second electrode 163 should in this case be transparent for the wavelengths which are intended to be measured in each underlying pixel
211 - 213 and 221.
However, it may in some applications be preferable to extend the photoactive layer 162 over multiple pixels, or even across the entire optically active area of the image sensor 10. In this case the photoactive layer 162 does not have to be patterned so precisely. It may not need to be patterned at all. The photoactive layer 162 should in this case be at least partly transparent to the radiation wavelengths that are detected with the first photodiodes 231 - 233 located in the CMOS substrate 19. This option is illustrated in figure 2b, where the photoactive layer 162 extends across all of the illustrated pixel regions 211 - 213 and 221. The first electrode 161 of the second photodiode 16 is present only in the second pixel region 221, so it does not have to be transparent. The optically active area of the second photodiode 16 lies between the two electrodes 161 and 163, primarily in the second pixel region 221. The regions just next to the second pixel region 221 may also be optically active.
The colour filters 241 - 243 are on top of the second electrode 163 in figure 2b. The image sensor 10 in this figure may also comprise an infrared filter (not illustrated), for example a short-wave infrared (SWIR) filter, on top of the second electrode 163 in the second pixel region 221. Alternatively, figure 2c illustrates an option where a colour filter 242 extends from a neighboring pixel
212 to the second pixel region 221 on top of the second photodiode 16. In other words, the colour filter also extends to the second pixel region.
As mentioned before, the second pixel region 221 is the area where the photoactive layer 162 lies between both the first electrode 161 and the second electrode 163. The second pixel region 221 is not necessarily coextensive with the readout region 199. Figure 2d illustrates an option where the first electrode
161 of the second photodiode extends beyond the electrical circuit stack 15 and beyond the readout region 199. The second pixel region 221 is larger than the readout region 199. The first electrode 161 could alternatively be smaller than the electrical circuit stack, so that the second pixel region 221 would cover only a part of the readout region 199.
The readout region 199 is not necessarily coextensive with the electrical circuit stack 15, either. Figure 2e illustrates an image sensor where the readout region 199 lies beneath the electrical circuit stack and the first electrode 161. The combined area of the electrical circuit stack and first electrode 161 is an optically inactive area in the CMOS substrate 19 if the first electrode 161 is not transparent to radiation. All of this area can therefore be utilized as the readout region, if necessary.
Figure 2f illustrates an image sensor 10 where the first electrode 161 extends over multiple pixels, or even across the entire optically active area of the image sensor. The first electrode 161 should in this case be at least partly transparent to the radiation wavelengths that are detected with the first photodiodes 231 - 233 located in the CMOS substrate 19. The second pixel region 221 may in this configuration extend across the entire optically active area of the image sensor, as figure 2f illustrates. The readout region is here restricted to the area of the electrical circuit stack 15. It should be noted that, in any embodiment presented in this disclosure, only a part of the optically inactive area in the CMOS substrate 19 may be utilized as the readout region 199. The electrical circuit stack 15 in figure 2f may, for example, have a much larger area than what is needed to accommodate the sets of readout components that lie beneath it.
Back-side illumination
As mentioned above, the image sensor 10 may be called a back-side illuminated image sensor if the electrical circuit stack 15 is built on the non-illuminated second side 192 of the CMOS image sensor. This alternative is illustrated in figures 3a and 3b. The CMOS substrate 19 may be supported on the non-illuminated side by a handle wafer, which may comprise an image-processing IC chip or a PWB interposer.
In figure 3a, the CMOS substrate 19 lies between the electrical circuit stack 15 and the second photodiode 16. In figure 3b, the electrical circuit stack 15 lies between the CMOS substrate 19 and the second photodiode 16.
The image sensor 10 in figure 3a corresponds in some respects to figure 2a. All options relating to reference numbers 199, 161 - 163, 211 - 213, 221, 241 - 243 and 281 - 283 that were described with reference to figures 2a - 2f apply in figure 3a also. For example, the photoactive layer 162 and/or one or both electrodes 161 and 163 in figure 3a may extend only across the second pixel region 221, or across multiple pixels. Any colour filter 241 - 243 may also in figures 3a - 3b extend into the second pixel region 221, and infrared filtering may be employed according to any of the schemes described above with reference to figures 2a - 2f.
The image sensor 10 in figure 3a differs from the ones in figures 2a - 2f in that the electrical circuit stack has been built on the non-illuminated second side 192 of the CMOS substrate 19. The electrical circuit stack 15 may be present only in the readout region 199 (this option is not illustrated). Alternatively, as figure 3a illustrates, the electrical circuit stack 15 may in this embodiment extend below multiple pixels since the first photodiodes 231 - 233 are illuminated from the opposite side of the CMOS substrate 19.
The second photodiode 16 can be built on the illuminated side of the CMOS substrate 19, as figure 3a illustrates. The electrical connection 169 through the CMOS substrate 19 may in this case be a through-silicon via.
However, the second photodiode 16 can alternatively be built on the non-illuminated side of the CMOS substrate 19, so that the electrical circuit stack 15 lies between the CMOS substrate 19 and the second photodiode 16. The CMOS substrate 19 and the electrical circuit stack 15 may be at least partly transparent to infrared radiation, so the second photodiode 16 can function even beneath the other layers illustrated in figure 3b.
The configuration in figure 3b has the advantage that neither the first electrode 161 nor the second electrode 163 needs to be transparent to visible light. One or both of these electrodes may also function as a reflector for infrared wavelengths, to increase the efficiency of the second photodiode 16. Furthermore, this configuration can be used to minimize filter deposition steps because the substrate 19 can act as a visible light filter to the second photodiode 16. Also, the configuration allows for the simultaneous uncoupled detection of SWIR and visible radiation (instead of VIS+SWIR and VIS), which is needed for deconvolution of incident radiation in the z-direction, since both photodiodes are stacked in order of increasing wavelength range from the illuminated side.
The first electrode 161 may for example be made of gold. Alternatively, the first electrode 161 may be configured to be selectively reflective for selected wavebands with an anti-reflective coating or a transmissive thin-film gold layer on an anti-reflective coated glass substrate.
A filling layer 17 may optionally lie between the second photodiode 16 and the electrical circuit stack 15, as figure 3b illustrates. The second pixel region 221 may extend beneath one or more of the first pixel 211 and the additional pixels 212 - 213.
Pixel geometries in the xy-plane
The following pixel geometries can be implemented in both the front-side illuminated and back-side illuminated embodiments presented above. Other pixel geometries are also possible.
Figure 4a illustrates the first pixel region 211 in the xy-plane. The readout region 199 may (but does not necessarily have to) cover the same area in the xy-plane as the second pixel region 221.
The second pixel region 221 may surround the first pixel region 211 in the xy- plane, as figure 4b illustrates. In other words, referring for example to figure 2a or 3a, the photoactive layer 162, and possibly also the first and/or second electrodes 161 and 163, may be patterned so that they contain openings. The first pixel region 211 may be located in an opening in the photoactive layer 162.
As described previously, the image sensor may comprise one or more additional first photodiodes in the CMOS substrate, wherein the additional first photodiodes define additional first pixel regions 212 - 213 in the xy-plane. The second pixel region 221 may surround both the first pixel region 211 and the additional first pixel regions 212 - 213 in the xy-plane, as figure 4b shows.
If the photoactive layer 162 is transparent to visible light, then it may extend across any of pixels 211 - 213, as in figures 2b - 2f. In other words, the second pixel region 221 may fully or partly overlap with the first pixel region 211. The second pixel region may also fully or partly overlap with the additional first pixel regions 212 - 213.
Figure 4c illustrates a pixel geometry where the image sensor also comprises a first photodiode in the CMOS substrate for sensing infrared radiation. This first photodiode defines pixel region 214 in the xy-plane. The first photodiodes 231 - 233 may for example form R, G and B pixel regions 211 - 213, while the infrared-sensing first photodiode forms an IR pixel region 214. The second pixel region may surround both the RGB and IR pixel regions, as figure 4c illustrates.
Additional second photodiode
The image sensor may comprise an additional second photodiode which is configured to sense infrared radiation. The photoactive layer may extend to both the second photodiode and to the additional second photodiode.
The additional second photodiode may be constructed in the same way as the second photodiode described in any preceding embodiment where the second
photodiode does not extend across the entire optically active area of the image sensor. The second photodiode and the additional second photodiode may share the same photoactive layer 162. They may also share either the same second electrode 163 or the same first electrode 161. Each additional second photodiode is formed in the same way as the second photodiode, in an area where the photoactive layer 162 lies between two electrodes. Either the first electrodes or the second electrodes (or both) of two adjacent second photodiodes could be separated (for example by patterning the corresponding electrode layer) to create the separate pixels 221 and 222 seen in figure 4d.
Figure 4d illustrates one possible geometry in the xy-plane. The additional second pixel region 222 formed by the additional second photodiode is adjacent to the second pixel region 221 formed by the second photodiode. The photoactive layer 162 extends across both of these pixel regions 221 and 222. The letters R, G, B indicate first photodiodes which may be built in the CMOS substrate in the same way as the first photodiodes which define first pixel regions 211 - 213 in the preceding figures.
Practical examples
Figures 5a - 5b illustrate a practical example which corresponds at least to figures 2a and 4c. Reference number 284 indicates a first set of readout components dedicated to the first photodiode CI which senses infrared radiation. Reference number 9 will be explained below.
The first electrode 161 in figure 5b has been patterned so as to allow light passage to the first pixels 211 - 214. This patterning determines the second pixel region 221 in figure 5a. The readout region is in this case coextensive with the second pixel region 221.
Figure 5b illustrates how the readout components of the first photodiodes (R, G, B and CI) can be placed in the readout region in figure 5a. Many other geometries are also possible in the xy-plane. Figure 5c illustrates how the electrical circuit stack 15 and the various conductors (such as 169) and the
second set of readout components 189 which are included in the stack 15 can also be arranged in the readout region without blocking the illumination of pixels 211 - 214.
Figure 5d illustrates how the readout components of the first photodiodes (R, G, B and CI) can be placed in the readout region in an image sensor which corresponds to figure 3b. Reference number 384 indicates the readout components connected to pixel 214. Figure 5e illustrates how the electrical conductors in the electrical circuit stack may be placed in this configuration.
In general, in any embodiment where the electrical circuit stack is on the nonilluminated side of the CMOS substrate 19 (for example figures 3a - 3b, 5d, 6c and 6e), the first photodiodes may be closer to the non-illuminated side 192 than to the illuminated side 191 in the CMOS substrate 19. The thickness of the CMOS substrate 19 may for example be less than 0.1 mm or less than 0.2 mm in any of these embodiments. This ensures that the incoming radiation is not attenuated too much in the CMOS substrate before it reaches the first photodiodes. The illuminated side 191 is in this case the back surface of the CMOS substrate 19, while the first photodiodes have been formed on the front surface. In any embodiment where the illuminated side 191 is the front surface where the first photodiodes have been formed, the thickness of the CMOS substrate 19 can be selected freely. It may for example be in the range 0.7 mm - 1 mm.
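The effect of substrate thickness on the radiation reaching the first photodiodes can be estimated with a simple Beer-Lambert calculation. The sketch below is illustrative only; the absorption coefficient of the substrate depends strongly on wavelength, and the value used here is an assumption rather than a figure from this disclosure.

```python
# Hedged Beer-Lambert estimate of why the CMOS substrate is kept thin in the
# back-illuminated configurations: the fraction of radiation that survives a
# pass through the substrate falls off exponentially with thickness.
import math

def transmitted_fraction(thickness_mm, alpha_per_cm):
    """Fraction of radiation remaining after traversing the substrate."""
    thickness_cm = thickness_mm / 10.0
    return math.exp(-alpha_per_cm * thickness_cm)

alpha = 100.0  # 1/cm, assumed near-infrared-range absorption coefficient
for t_mm in (0.1, 0.2, 0.7):
    print(f"{t_mm} mm substrate -> {transmitted_fraction(t_mm, alpha):.1%} transmitted")
```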
Figure 6a illustrates in more detail one possible implementation of the device shown in figure 1a. Reference number 12 indicates an electron transport layer and 64 a hole transport layer. Reference number 22 is a metal via through the dielectric layers 65 from the first electrode 161 to the CMOS substrate.
The image sensor in figure 6a comprises a third set of readout components 9 in the CMOS substrate 19. The second set of readout components 189 is in this case connected to the second photodiode via the third set of readout components 9. This third set of readout components 9 comprises in this case
a switch transistor with a doped well 21, a source well 18, a drain well 69 and a gate 20.
More generally, a third set of readout components such as the set 9, may be located in the CMOS substrate. This third set may, in any embodiment presented in this disclosure, connect the second set of readout components to the second photodiode. The third set of readout components may for example comprise floating diffusion capacitors, pn-junction capacitors, depletion capacitors, transistor capacitors, MOS capacitors, varactor diodes, trench capacitors or deep trench capacitors. Stacked and 3D transistor structures, including FinFET, Nanosheet and Forksheet structures, may also be included in the third set of readout components.
The second set of readout components 189 comprises a metal-insulator-metal capacitor 60 formed by a top electrode 66, a bottom electrode 67 and an insulating layer 65 between these two electrodes.
The first photodiode 13 in figure 6a comprises a P+ doped layer 2 and an n-type storage well 3. A p-doped well 5 in the substrate 19 extends from the first photodiode 13 to a floating diffusion well 4, which may be n+ doped. The set of first readout components 181 comprises in this case the floating diffusion well 4 and a preamplifier transistor 8 connected to the first photodiode 13. Reference number 6 indicates a transfer gate and 7 a gate control line.
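The order of magnitude of the capacitances and conversion gains involved can be illustrated with a simple parallel-plate estimate. The sketch below uses assumed plate area, dielectric thickness, permittivity and sense-node capacitance; none of these values are taken from the disclosure.

```python
# Hedged parallel-plate estimate of a metal-insulator-metal capacitor such as
# capacitor 60, and of the charge-to-voltage conversion gain of a small sense
# node such as the floating diffusion well 4. All dimensions are assumptions.

EPS0 = 8.854e-12   # vacuum permittivity, F/m
Q_E = 1.602e-19    # elementary charge, C

def parallel_plate_capacitance(area_um2, thickness_nm, eps_r):
    """C = eps0 * eps_r * A / d for a simple parallel-plate structure."""
    return EPS0 * eps_r * (area_um2 * 1e-12) / (thickness_nm * 1e-9)

def conversion_gain_uv_per_electron(capacitance_f):
    """Voltage step produced by a single electron on the sense capacitance."""
    return Q_E / capacitance_f * 1e6

c_mim = parallel_plate_capacitance(area_um2=4.0, thickness_nm=30.0, eps_r=7.0)
print(f"MIM capacitance: {c_mim*1e15:.1f} fF")
print(f"Conversion gain on a 1.6 fF node: {conversion_gain_uv_per_electron(1.6e-15):.0f} uV/e-")
```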
Figure 6b illustrates a circuit diagram for the image sensor shown in figure 6a.
Figure 6c illustrates in more detail one possible implementation of the device shown in figure 3b. Only one first photodiode 331 and its readout components 381 are illustrated for clarity reasons, but the additional photodiodes 332 - 333 in figure 3b and their readout could be implemented with the same configuration as 331. All other elements in figure 6c correspond to the ones illustrated in figure 6a, though not all elements are necessarily identical in these two figures. For example, as explained above with reference to figure
3b, the embodiment shown in figure 6c allows greater use of materials that are not transparent to visible wavelengths.
Figure 6d illustrates one possible pixel configuration in the xy-plane for the image sensor shown in figure 6c. The first electrode 161 and the entire second photodiode may in this case extend across multiple adjacent pixel regions.
Figure 6e illustrates one possible way to implement filtering in the image sensor shown in figure 6c. Figure 6e illustrates two adjacent pixels for sensing visible light, and a single infrared pixel which extends below them both. The image sensor may comprise colour filters 24 and 25 arranged above the first photodiodes. The image sensor may also comprise a light-shielding film 26 which covers the rest of the illuminated surface. The light-shielding film 26 may for example block UV wavelengths and visible wavelengths, and optionally also near-infrared wavelengths. The light-shielding film may be transparent to short-wave infrared wavelengths and longer wavelengths.
In any embodiment presented in this disclosure, the image sensor may comprise a main body 27. The previously described elements of the image sensor may be attached to this main body. This main body may for example be a support wafer or a protective enclosure.
Claims
1. An image sensor comprising: a CMOS substrate which defines an xy-plane, wherein the xy-plane comprises a readout region, a first photodiode in the CMOS substrate, wherein the first photodiode defines a first pixel region in the xy-plane, a first set of readout components in the CMOS substrate, wherein the first set of readout components is electrically connected to the first photodiode, an electrical circuit stack on the CMOS substrate, characterized in that the image sensor also comprises: a second photodiode which is configured to sense infrared radiation, wherein the second photodiode comprises a photoactive layer, the photoactive layer comprises colloidal quantum dots, and the second photodiode defines a second pixel region in the xy-plane, a second set of readout components in the electrical circuit stack, wherein the second set of readout components is electrically connected to the second photodiode, and the first and second sets of readout components are stacked on top of each other in the readout region.
2. An image sensor according to claim 1, wherein the electrical circuit stack lies between the CMOS substrate and the second photodiode.
3. An image sensor according to claim 2, wherein the image sensor has an illuminated side which is configured to receive illumination and a non-illuminated side which is opposite to the illuminated side, and the second photodiode is closer to the illuminated side of the image sensor than the CMOS substrate.
4. An image sensor according to claim 2, wherein the image sensor has an illuminated side which is configured to receive illumination and a non-illuminated side which is opposite to the illuminated side, and the CMOS
substrate is closer to the illuminated side of the image sensor than the second photodiode.
5. An image sensor according to claim 1, wherein the CMOS substrate lies between the electrical circuit stack and the second photodiode.
6. An image sensor according to any of claims 1-5, wherein the second pixel region surrounds the first pixel region in the xy-plane.
7. An image sensor according to any of claims 1-6, wherein the image sensor comprises one or more additional photodiodes in the CMOS substrate, wherein the additional photodiodes define additional first pixel regions in the xy-plane, and the second pixel region surrounds both the first pixel region and the additional first pixel regions in the xy-plane.
8. An image sensor according to any of claims 1-5, wherein the second pixel region partly overlaps with the first pixel region.
9. An image sensor according to any preceding claim, wherein the image sensor comprises an additional second photodiode which is configured to sense infrared radiation, and the photoactive layer extends to both the second photodiode and to the additional second photodiode.
10. An image sensor according to any preceding claim, wherein the first photodiode is configured to sense visible light.
11. An image sensor according to claim 10, wherein the image sensor comprises a colour filter in the first pixel region.
12. An image sensor according to claim 11, wherein the colour filter also extends to the second pixel region.
13. An image sensor according to any preceding claim, wherein the first photodiode is a pinned photodiode.
14. An image sensor according to any preceding claim, wherein the image sensor comprises a UV filter and a visible light filter in the readout region.
15. An image sensor according to any of claims 1-13, wherein the image sensor comprises a band-pass filter for a particular infrared wavelength band in the second pixel region.