
HK1191451B - Backside-illuminated photosensor array with white, yellow and red-sensitive elements

Info

Publication number
HK1191451B
Authority
HK
Hong Kong
Prior art keywords
pixel
light signal
sensor
red
depth
Prior art date
2012-09-24
Application number
HK14104585.7A
Other languages
Chinese (zh)
Other versions
HK1191451A (en)
Inventor
陈刚
毛杜立
戴幸志
Original Assignee
豪威科技股份有限公司
Priority date
2012-09-24
Filing date
2014-05-15
Publication date
2018-01-26
Application filed by 豪威科技股份有限公司
Publication of HK1191451A
Publication of HK1191451B

Abstract

A monolithic backside-illuminated (BSI) image sensor has a sensor array tiled with multiple-pixel cells, each having a first pixel sensor primarily sensitive to red light, a second pixel sensor primarily sensitive to red and green light, and a third pixel sensor with panchromatic sensitivity, the pixel sensors laterally adjacent to each other. The image sensor determines red, green, and blue signals by reading the red-sensitive pixel sensor of each multiple-pixel cell to determine the red signal, then reading the sensor primarily sensitive to red and green light to determine a yellow signal and subtracting the red signal to determine a green signal. The image sensor reads the panchromatic-sensitive pixel sensor to determine a white signal and subtracts the yellow signal to determine the blue signal.

Description

Backside illuminated optical sensor array with white, yellow and red light sensing elements
Technical Field
The present invention relates to the field of semiconductor array optical sensors, and in particular to the backside illumination of optical sensor arrays as used on image sensor integrated circuits.
Background
Optical sensor arrays are commonly used in electronic cameras (including still and video cameras), which typically combine an image sensor integrated circuit with circuitry that reads an image from the array. Such integrated circuit chips include a rectangular array of pixel sensors, where each pixel sensor includes at least one photodiode or phototransistor adapted to detect light, circuitry for sensing each pixel to generate an electronic signal representative of the detected light, and circuitry for outputting that signal off-chip. Most optical sensors are top-illuminated: light enters the pixel sensors through the same chip surface on which the control transistors (including the sensing transistors and signal output circuitry) are formed.
Optical sensor arrays of the "black and white" type are commonly used in security cameras, but as of 2012 most video and still camera applications require color.
Common color optical sensor arrays are typically provided with color filters disposed over the top-illuminated pixel sensors. These filters are typically arranged in a four-pixel, three-color pattern that is repeated (or tiled) throughout the array. In such an array, one filter may allow red light to enter a first pixel sensor, another may allow green light to enter a second pixel sensor, another may allow blue light to enter a third pixel sensor, and the fourth filter in each pattern may allow one of red, green, or blue light to enter the fourth pixel sensor.
In many camera systems, the outputs of the pixel sensors in such a pattern are processed to yield conventional red, green, and blue (RGB) color signals, which color display systems may use to reproduce full-color images. Red, green, and blue have become the standard colors for color electronic cameras and color computer video monitors.
In recent years, backside-illuminated (BSI) optical sensor arrays have been developed. These arrays are typically formed on a thinned chip, with the control transistors formed on a first surface of the chip, but they are designed to receive light through the opposite second surface (or backside) of the chip.
Some BSI optical sensors use a color filter pattern printed on the backside to selectively admit red, green, and blue light to the sensors of each tiling pattern. It has also been found that pixel sensors can be designed with a color response determined by the junction profile and depth of the sensor. In figure 1 of U.S. published patent application PCT/US01/29488, a color optical sensor array is described that includes a pixel sensor with three vertically stacked photodiode junctions, where the deepest junction (at a depth of about 2 microns) senses red light, an intermediate junction (at a depth of about 0.6 microns) senses green light, and the shallowest junction (at a depth of about 0.2 microns) senses blue light.
Disclosure of Invention
A monolithic backside illuminated image sensor has a sensor array tiled with a plurality of multi-pixel cells, where each multi-pixel cell includes a first pixel sensor that senses primarily red light, a second pixel sensor that senses primarily red and green light (also referred to herein as "yellow light"), and a third pixel sensor having panchromatic sensitivity, the pixel sensors being laterally adjacent to each other. In a specific embodiment, the dominant spectral sensitivity of each pixel sensor is determined by the junction depth of its photodiode. The image sensor determines the red, green, and blue light signals by reading the red-light-sensing pixel sensor of each multi-pixel cell to determine the red light signal, then reading the sensor that senses primarily red and green light to determine a yellow light signal and subtracting the red light signal to determine the green light signal. The image sensor reads the panchromatic pixel sensor to determine a white light signal and subtracts the yellow light signal to determine the blue light signal.
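For illustration, this recovery arithmetic can be summarized in a minimal sketch (assuming unscaled, linear signals; the function name and form are illustrative, not part of the disclosure):

def recover_rgb(red_signal, yellow_signal, white_signal):
    # The red-sensing pixel reads the red component directly.
    red = red_signal
    # Yellow = red + green, so green is the yellow reading minus red.
    green = yellow_signal - red_signal
    # White = red + green + blue, so blue is the white reading minus yellow.
    blue = white_signal - yellow_signal
    return red, green, blue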
Drawings
FIG. 1 shows a schematic cross-sectional view of a red-light sensing pixel sensor, a yellow-light sensing pixel sensor, and a white-light sensing pixel sensor of an optical sensor array;
FIG. 2 is a schematic plan view of a four-pixel cell including a red-light sensing pixel sensor, a yellow-light sensing pixel sensor, a white-light sensing pixel sensor, and a fourth pixel sensor selected from among these three types;
FIG. 3A illustrates a four-pixel tiling pattern with two red-light sensing pixel sensors, one yellow-light sensing pixel sensor, and one white-light sensing pixel sensor;
FIG. 3B illustrates a four-pixel tiling pattern with one red-light sensing pixel sensor, one yellow-light sensing pixel sensor, and two white-light sensing pixel sensors;
FIG. 3C illustrates a four-pixel tiling pattern with one red-light sensing pixel sensor, two yellow-light sensing pixel sensors, and one white-light sensing pixel sensor;
FIG. 3D illustrates a nine-pixel tiling pattern with two red-light sensing pixel sensors, one yellow-light sensing pixel sensor, and six white-light sensing pixel sensors;
FIG. 3E shows a sixteen-pixel tiling pattern;
FIG. 4 shows a block diagram of a backside-illuminated image sensor circuit comprising tiling units of the optical sensor array shown in FIGS. 1 and 2;
FIG. 5 shows a block diagram of a color recovery unit for providing red, green, and blue light intensities for each of the tiling cells shown in FIGS. 2 and 3B;
FIG. 6 shows a block diagram of another color recovery unit for providing separate red, green, and blue light intensities for each pixel of a four-pixel tiling pattern.
Detailed Description
Structure of optical sensor array
When light enters the surface of a silicon optical sensor, shorter-wavelength light is typically absorbed closer to the surface while longer-wavelength light penetrates deeper: blue light is absorbed closest to the surface, green and yellow light somewhat deeper, and red light deepest.
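As a toy illustration of this depth dependence (using the approximate junction depths quoted from the prior art above; the thresholds and function are assumptions for illustration only):

def primary_band(junction_depth_um):
    # Junctions deep below the illuminated surface see only light that
    # survived absorption in the silicon above them.
    if junction_depth_um >= 2.0:
        return "red"
    if junction_depth_um >= 0.6:
        return "green"
    return "blue"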
An optical sensor array 98 suitable for backside illumination is formed on a semiconductor wafer, which in a specific embodiment is a silicon wafer. FIG. 1 shows a cross-sectional view of the optical sensors of the array (without the associated electronics), and FIG. 2 shows a top view of a repeating (tiling) unit of a portion of the array. The optical sensor array has a first surface (or upper surface) 100 through which diffusion regions 102, 103 are formed; these diffusion regions form part of silicon-gate metal-oxide-semiconductor (MOS) transistors. A gate oxide is grown on the first surface 100, and a gate material 104, such as polysilicon, is disposed on the gate oxide; as is well known to those skilled in the art, ion implantation may be used to adjust the thresholds of the MOS transistors. In one embodiment, both N-channel and P-channel transistors are formed: the P-channel transistors have P-type source and drain implant regions 102 in an N-type well 106, and the N-channel transistors have N-type source and drain implant regions 103 in a P-type well 108. Also disposed over the upper surface 100 are one or more dielectric oxide layers 110 and one or more interconnect metal layers 112. Vias 114 providing patterned interconnections between metal layers 112, and contacts 116 providing patterned interconnections between the lower metal layer 112 and the diffusion regions 102, 103, are also provided in the dielectric oxide layers 110 on the upper surface 100, configured as known to those skilled in the art of multi-level-metal, silicon-gate CMOS integrated circuit fabrication. The transistors are used to decode, drive, precharge or reset, sense, and read the optical sensors in the array, although they may serve other purposes as well.
A portion of the optical sensor array 98 is devoted to the optical sensors 152, 154, 156. Each optical sensor is formed in a P-type epitaxial layer at the first surface 100 of the semiconductor wafer by forming N- regions 122, 124, 126 that serve as photodiodes. A surface P+ capping region 128 overlies the N- regions, and P+ isolation sidewalls 127 may be provided to isolate them from adjacent N- regions and from regions containing transistors. Each N- photodiode region is associated with an adjacent select transistor gate 129, a P-well 131 providing threshold adjustment, and a drain diffusion region 132. In some embodiments, the drain diffusion region 132 may be coupled to a column sense line 133.
The semiconductor wafer is thinned to allow at least some light 153, 155, 157 to be incident on the second surface (or backside) 135 of the wafer and reach the N-photodiode regions 122, 124, 126.
Portions of the optical sensor array integrated circuit containing decode, drive, sense-amplification, multiplexing, and other CMOS circuitry may be shielded from ambient light 153, 155, 157 by a patterned opaque coating 136. The coating has openings that allow light 153, 155, 157 to reach the portions of the surface where the optical sensors are located, and it defines areas that light should not reach, because light striking those areas would affect circuit performance, for example that of the decoders, sense amplifiers, and analog or digital signal processing circuitry located on the same chip as the optical sensor array. Microlenses 138 may be disposed on the second surface 135 to concentrate light onto the portions of the second surface 135 over the photodiode regions 122, 124, 126.
The depths of the N- photodiode regions 122, 124, 126 are selected from a shallow red-light-sensing absorber depth (e.g., region 122), a deeper red-and-green (yellow) light-sensing absorber depth (e.g., region 124), and a deepest white-light (or panchromatic) sensing absorber depth (e.g., region 126).
Additional P + regions (such as guard rings or diode contact regions 140) may be provided to isolate the transistor circuit region from the optical sensor region of the array, and further, a buried P + impedance dip region 142 may be provided.
The optical sensor array described herein does not require a color filter array printed on the back surface 135, that is, a filter array having different light transmission and absorption characteristics for some pixels of the array than for others. By contrast, in arrays where every optical sensor has the same optical characteristics, such a color filter array must be disposed on the side where light enters the array (here, the back side).
Fabrication with implantation of N-photodiode absorber region
In one embodiment, the depths of the N- photodiode absorber regions 122, 124, 126 are determined by ion implantation, the depth being controlled by selecting the ion beam energy of the ion implanter, which corresponds to the effective acceleration voltage. To form absorber regions at a specific depth, a photoresist layer is formed on the first surface of the wafer and is exposed and developed to yield a patterned photoresist layer with openings where diffusion regions of that depth are desired; the wafer surface is then exposed to an ion beam whose energy, as those skilled in the art of integrated circuit fabrication will appreciate, is sufficient to create absorber regions extending to that depth; finally the remaining photoresist is removed. The steps of forming, exposing, and developing a photoresist layer and implanting ions are then repeated at different ion beam energies to produce further diffusion regions at different depths.
In one embodiment, for implantation into a silicon substrate, the peak implant energy is 1 MV for the panchromatic (white-light-sensing) implant, 500 kV for the yellow-light-sensing implant, and 250 kV for the red-light-sensing implant, where 250 kV implants may form absorber regions less than 0.5 microns deep and 500 kV implants may form absorber regions less than 2 microns deep. Because a high-voltage implant can place most of its ions deep enough to leave the surface un-inverted while forming an inverted N- region below the surface, implants may be stacked. For example, the deep absorber region 126 may receive implants at 1 MV and 250 kV, at all of 1 MV, 500 kV, and 250 kV, or at any energies between 0 and 1 MV, to produce the desired doping profile and extend the absorber region from its full depth toward the first surface, up to the surface or to the bottom of the blanket diffusion layer 109. Similarly, the middle absorber region 124 may receive 250 kV and 500 kV implants, or implants at any energy between 0 and 500 kV. For purposes of this description, an absorber region extends to the depth below the first surface at which the N-type or P-type absorber region is bounded by complementary P-type or N-type surrounding material, although the absorber region may have an N-type or P-type portion extending from that depth to or near the first surface. The optical sensors described herein may also be fabricated in other semiconductor materials, such as silicon carbide, gallium arsenide, or germanium, which of course require different ion beam energies and junction depths than silicon.
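The silicon implant schedule described above can be summarized as a simple table (an illustrative data structure; stacked lower-energy implants are omitted for brevity):

IMPLANT_SCHEDULE_SILICON = {
    # sensed band: (peak implant energy in kV, resulting absorber depth)
    "red":    (250,  "less than 0.5 microns"),
    "yellow": (500,  "less than 2 microns"),
    "white":  (1000, "deepest; panchromatic response"),
}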
The term wavelength-determining implant, as used herein, means an implant that determines the depth of the lowermost portion of an absorber region 122, 124, 126, and thus the depth of the active photodiode region.
Fabrication with epitaxy and doping
In alternative embodiments, the depth of the absorber regions may be determined by other methods. In one embodiment, a lightly doped absorber layer is grown epitaxially on the substrate; growth is paused and dopants are introduced at the deepest level to form the deep diffusion regions 126; growth is resumed and paused again to introduce dopants at an intermediate depth, forming the mid-depth diffusion regions 124; and growth is resumed and paused once more to introduce dopants at a shallower depth, forming the shallow buried diffusion regions 122.
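This grow-pause-dope sequence may be written out as a recipe (a hypothetical outline for illustration, not a qualified process flow):

EPITAXIAL_SEQUENCE = [
    ("grow epitaxy",      "up to the deep-region level"),
    ("introduce dopants", "deep diffusion regions 126"),
    ("grow epitaxy",      "up to the mid-depth level"),
    ("introduce dopants", "mid-depth diffusion regions 124"),
    ("grow epitaxy",      "up to the shallow level"),
    ("introduce dopants", "shallow buried diffusion regions 122"),
    ("grow epitaxy",      "up to the first surface"),
]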
Thinning
After the optical sensors, their associated circuitry, and other circuitry are formed on the first surface of the wafer, and before the opaque mask regions 136 and microlenses 138 are formed, the opposite (second) surface 135 of the wafer may be thinned to allow light striking the second surface (or back surface) to reach the photodiode array, as is well known to those skilled in the art of backside-illuminated silicon array optical sensors. The pixel (or optical sensor) 152 having the shallow absorber region 122 responds primarily to red light 153, because shorter-wavelength light (e.g., blue and green light) is absorbed in the portion of the wafer between the shallow absorber region 122 and the second surface 135. Similarly, the pixel 154 with the intermediate-depth absorber region 124 responds primarily to red and green (yellow) light 155, because blue light is absorbed in the portion of the wafer between the absorber region 124 and the second surface 135. Finally, the pixels 156, 158 with the deep absorber regions 126 respond to substantially all visible wavelengths 157 (including blue light), so they can be considered white-light sensing. Optical sensors that sense all wavelengths of visible light are also referred to as panchromatic sensors.
In some embodiments, an infrared light absorbing filter (or other filter having uniform absorption characteristics for all pixels or optical sensors of the array) may be disposed on the second surface 135; in some embodiments, the filter may be disposed between the second surface 135 and the microlens 138; in other embodiments, the filter may be disposed on the microlens 138.
Tiling patterns
The optical sensor array is configured as a repeating (tiling) pattern of units, each unit having four or more optical sensors; FIG. 2 shows an embodiment of a four-sensor tiling unit. In this tiling unit, a red-light sensing pixel sensor 152, a yellow-light sensing pixel sensor 154, and a white-light sensing pixel sensor 156 are disposed laterally adjacent to one another. A fourth optical sensor 158 in the unit is selected from the group consisting of red-sensing, yellow-sensing, and white-sensing pixel sensors; for optimal low-light sensitivity it may be another white-sensing sensor. Those skilled in the art of optical sensor arrays will appreciate that column circuitry 160 and row circuitry 162, built from transistors formed on the first surface of the wafer, address each optical sensor via row lines 164 and read the optical sensors via column lines 166. In some embodiments, a separate column line is routed to each optical sensor of the unit so that all four optical sensors can be read simultaneously. In some embodiments, a row memory for color recovery may also be provided, and column lines may be shared by multiple rows of a tiling unit.
FIG. 2 shows only one of several possible tiling patterns; FIGS. 3A, 3B, and 3C show other four-pixel tiling patterns. The pattern of FIG. 3A has two red-light sensing sensors, one yellow-light sensing sensor, and one white-light sensing sensor; the pattern of FIG. 3B has one red-light sensing sensor, one yellow-light sensing sensor, and two white-light sensing sensors; and the pattern of FIG. 3C has one red-light sensing sensor, two yellow-light sensing sensors, and one white-light sensing sensor.
The image sensor may be employed in television applications, where the effective bandwidth or resolution of color information is typically much lower than that of luminance (black-and-white) information. Such applications may use tiling patterns of more than four pixels, provided at least one red-light, one yellow-light, and one white-light sensing sensor is present in each repeating pattern. For example, FIG. 3D shows a nine-pixel tiling pattern with two red-light sensing sensors, one yellow-light sensing sensor, and six white-light sensing sensors, while FIG. 3E shows a sixteen-pixel tiling pattern.
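For illustration, the four-pixel patterns can be encoded as small grids and repeated across an array (the placement of sensors within each 2x2 cell is an assumption, since the figures are not reproduced here):

# 'R' = red-, 'Y' = yellow-, 'W' = white-light sensing pixel sensor.
TILE_FIG_3A = [["R", "Y"], ["R", "W"]]  # two red, one yellow, one white
TILE_FIG_3B = [["R", "Y"], ["W", "W"]]  # one red, one yellow, two white
TILE_FIG_3C = [["R", "Y"], ["Y", "W"]]  # one red, two yellow, one white

def tile_array(pattern, rows, cols):
    # Repeat a small cell pattern across a rows-by-cols sensor array.
    pr, pc = len(pattern), len(pattern[0])
    return [[pattern[r % pr][c % pc] for c in range(cols)] for r in range(rows)]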
As shown in FIG. 4, the image sensor integrated circuit 200 has an optical sensor array 202 laid out with a pixel sensor pattern as shown in FIGS. 1 and 2. The image sensor integrated circuit 200 also includes a scan and exposure control circuit 204 containing a row counter and a column counter for addressing the pixel sensors of the array in a set order. Row logic 206 may decode the output of the row counter of the scan and exposure control circuit 204 to provide row selection for the optical sensor array 202. The optical sensors of selected columns may be coupled to column sense amplifiers and multiplexers 208 to provide signals representing the light received by a series of pixels or tiling patterns, including red, yellow, and white light information.
Sensing light
In operation, during the precharge phase, a reset or precharge device (which may be part of the row logic 206) may be used, via the select gates 129, to apply a bias voltage to each photodiode in each sensor. Light received through the second surface 135 and absorbed in the absorber regions 122, 124, 126 generates minority carriers in those regions, producing leakage current through the junctions of the photodiodes. After an exposure time has elapsed, the remaining charge on the photodiode of each sensor is measured; in one embodiment, the sensors are coupled through the column lines 166 to sense amplifiers (not shown), under control of the row logic 206, to generate a signal representative of the light received by each sensor.
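Very roughly, the precharge-expose-read cycle behaves as in the following toy model (not circuit-accurate; the parameter names and the linear-discharge assumption are illustrative):

def read_pixel(bias_v, photocurrent_a, exposure_s, capacitance_f):
    # The photodiode is precharged to bias_v; photo-generated leakage
    # discharges it during the exposure; the remaining voltage is read out.
    dv = photocurrent_a * exposure_s / capacitance_f
    return max(bias_v - dv, 0.0)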
In one embodiment, the signals representing the light received by a series of pixels or tiling patterns are digitized by an analog-to-digital converter 210 before color recovery. In an alternative embodiment, digitization is performed after color recovery by an analog-to-digital converter 212 to provide a digital image signal for subsequent processing. In either embodiment, a color recovery processor 214 is provided to convert the red, yellow, and white light information obtained from the optical sensors into the same red, green, and blue light information as provided by a conventional image sensor.
Color recovery
FIG. 5 shows a block diagram of a color recovery processor 214 that provides corresponding red, green, and blue information for each of the tiling patterns shown in FIG. 2 or FIG. 3B. If this unit precedes the analog-to-digital converter 212, it comprises analog multipliers and summing amplifiers; if it follows the analog-to-digital converter 210, it comprises digital array multipliers and binary adders. A binary adder or summing amplifier 252 adds the two white light signals W1 and W2, representing the light received by optical sensors 156 and 158 respectively, to provide a summed white light level (twice the average), which is then multiplied by the white light scale factor 254 in multiplier 256 to obtain a scaled white light level. Similarly, multiplier 260 multiplies the yellow sensor signal Y, representing optical sensor 154, by the yellow scale factor 258 to obtain a scaled yellow light level, which is then subtracted from the scaled white light level in a binary adder or summing amplifier 262 to obtain the blue light signal BLUE. The circuit thus implements the equation: BLUE = W·(W scale factor) - Y·(Y scale factor).
Similarly, the yellow light signal Y is multiplied by a second yellow light scale factor 264 in multiplier 266 to obtain a second scaled yellow light level, and multiplier 270 multiplies the red sensor signal R, representing optical sensor 152, by a red light scale factor 268 to obtain a scaled red light level. The scaled red light level is then subtracted from the scaled yellow light level in a binary adder or summing amplifier 272 to obtain the green light signal GREEN. The circuit thus implements the equation: GREEN = Y·(Y scale factor) - R·(R scale factor).
The red light signal R is also multiplied by a second red light scale factor 276 in multiplier 278 to obtain the red light signal RED. The circuit thus implements the equation: RED = R·(R scale factor).
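Taken together, the first recovery stage may be sketched as follows (the default scale-factor values are placeholders, not values from the disclosure):

def color_recovery_stage1(w1, w2, y, r, w_scale=1.0, y_scale_blue=1.0,
                          y_scale_green=1.0, r_scale_green=1.0, r_scale_red=1.0):
    w = w1 + w2                                    # summed white level (adder 252)
    blue = w * w_scale - y * y_scale_blue          # BLUE = W*(W scale) - Y*(Y scale)
    green = y * y_scale_green - r * r_scale_green  # GREEN = Y*(Y scale) - R*(R scale)
    red = r * r_scale_red                          # RED = R*(R scale)
    return red, green, blue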
In some embodiments, in a second stage of color processing, the signals RED, GREEN, and BLUE may be multiplied by the original white light signals W1 and W2 in multipliers 302, 304, 306, 308, 310, 312, as shown in FIG. 6, to obtain per-pixel red light signals RW1, RW2, per-pixel green light signals GW1, GW2, and per-pixel blue light signals BW1, BW2. Similarly, processed red, green, and blue values may be generated for the pixels associated with the yellow optical sensor 154 (RY, GY, BY) and the red optical sensor 152 (RR, GR, BR). The circuit may implement the following equations:
RW1=RED*W1
RW2=RED*W2
GW1=GREEN*W1
GW2=GREEN*W2
BW1=BLUE*W1
BW2=BLUE*W2
This embodiment trades color resolution for light-intensity resolution, as may be appropriate for television, where chromaticity has historically been allocated less bandwidth than luminance. The RED, GREEN, and BLUE signals may optionally be averaged with the RED, GREEN, and BLUE signals of adjacent tiling patterns before the per-pixel red, green, and blue signals of each tiling pattern are reconstructed.
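The second stage may be sketched likewise (it modulates the recovered cell colors by each pixel's own reading, following the signal names above):

def color_recovery_stage2(red, green, blue, w1, w2, y, r):
    # Per-pixel values for the two white sensors, the yellow sensor, and
    # the red sensor of one tiling cell.
    return {
        "RW1": red * w1, "GW1": green * w1, "BW1": blue * w1,
        "RW2": red * w2, "GW2": green * w2, "BW2": blue * w2,
        "RY":  red * y,  "GY":  green * y,  "BY":  blue * y,
        "RR":  red * r,  "GR":  green * r,  "BR":  blue * r,
    }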
Modifications may be made to the foregoing detailed description without departing from the spirit or scope of the invention. The terms used in the following claims should not be construed to limit the invention to the embodiments disclosed in the specification; the foregoing description and drawings illustrate the invention rather than limit it. The following claims may be interpreted to cover both the generic and the specific features of the foregoing description.

Claims (12)

1. A monolithic backside illuminated optical sensor array, comprising:
a plurality of multi-pixel cells, each of the multi-pixel cells comprising:
a first pixel sensor for sensing primarily red light,
a second pixel sensor for sensing mainly red light and green light, and
a third pixel sensor having panchromatic sensitivity, the plurality of pixel sensors of each of the multi-pixel cells being laterally adjacent to one another, and
Means for determining a green signal by subtracting the read value of the first pixel sensor from the read value of the second pixel sensor.
2. The optical sensor array of claim 1, wherein the read value of the first pixel sensor is scaled prior to performing the operation of subtracting the read value of the first pixel sensor from the read value of the second pixel sensor.
3. The optical sensor array of claim 1, further comprising means for determining a blue light signal by subtracting a read value of the second pixel sensor from a read value of the third pixel sensor.
4. The optical sensor array of claim 2, wherein each pixel sensor has a wavelength-dependent diffusion absorber region, the wavelength-dependent diffusion absorber region of the first pixel sensor extends to a first depth, the wavelength-dependent diffusion absorber region of the second pixel sensor extends to a second depth, the wavelength-dependent diffusion absorber region of the third pixel sensor extends to a third depth, and the first depth is not equal to the second depth.
5. The optical sensor array of claim 4, wherein the depth of the wavelength-dependent diffusion absorber region is determined by the energy of the wavelength-dependent implant.
6. The optical sensor array of claim 4, wherein the optical sensor array comprises silicon optical sensors.
7. A method for determining a red light signal, a green light signal, and a blue light signal, comprising:
reading a first pixel sensor of an optical sensor array to determine the red light signal;
reading a second pixel sensor of the optical sensor array to determine a yellow light signal;
subtracting the red light signal from the yellow light signal to determine the green light signal;
reading a third pixel sensor of the optical sensor array to determine a white light signal; and
subtracting the yellow light signal from the white light signal to determine the blue light signal.
8. The method of claim 7, wherein at least one of the red light signal and the yellow light signal is scaled prior to determining the green light signal.
9. The method of claim 8, wherein at least one of the yellow light signal and the white light signal is scaled prior to determining the blue light signal.
10. The method of claim 7, wherein each pixel sensor of the optical sensor array has a diffusion absorber region having a sensitivity determined by a wavelength-dependent implant, the absorber region of the first pixel sensor extending to a first depth, the absorber region of the second pixel sensor extending to a second depth, and the absorber region of the third pixel sensor extending to a third depth, wherein the first depth is not equal to the second depth, the first pixel sensor laterally adjacent to the second pixel sensor.
11. The method of claim 10, wherein the optical sensor array has a plurality of multi-pixel cells, the first, second, and third pixel sensors belonging to a first multi-pixel cell of the plurality of multi-pixel cells, the method further comprising: determining an image by determining the red light signal, the green light signal, and the blue light signal for each multi-pixel cell of the optical sensor array.
12. The method of claim 11, further comprising multiplying the white light signal with the red light signal to determine a red light signal at each pixel.
HK14104585.7A (filed 2014-05-15, priority 2012-09-24): Backside-illuminated photosensor array with white, yellow and red-sensitive elements, published as HK1191451B (en)

Applications Claiming Priority (1)

Application Number: US13/625,458
Priority Date: 2012-09-24

Publications (2)

Publication Number Publication Date
HK1191451A (en) 2014-07-25
HK1191451B (en) 2018-01-26
