
HK1190253B - Lens array for partitioned image sensor - Google Patents


Info

Publication number
HK1190253B
HK1190253B (application HK14103276.3A)
Authority
HK
Hong Kong
Prior art keywords
focal length
image sensor
lens
sensor regions
color
Prior art date
Application number
HK14103276.3A
Other languages
Chinese (zh)
Other versions
HK1190253A (en)
Inventor
王嘉伟
邓兆展
徐运强
Original Assignee
豪威科技股份有限公司
Priority date
Filing date
Publication date
Application filed by 豪威科技股份有限公司
Publication of HK1190253A
Publication of HK1190253B

Abstract

The subject application relates to a lens array for a partitioned image sensor. An apparatus includes an image sensor having N image sensor regions arranged thereon. A lens array including N lens structures is disposed proximate to the image sensor. Each of the N lens structures is arranged to focus a single image onto a respective one of the N image sensor regions. The N lens structures include a first lens structure having a first focal length and positioned the first focal length away from the respective one of the N image sensor regions, a second lens structure having a second focal length and positioned the second focal length away from the respective one of the N image sensor regions, and a third lens structure having a third focal length and positioned the third focal length away from the respective one of the N image sensor regions. The first, second, and third focal lengths are different.

Description

Lens array for partitioned image sensor
Technical Field
The present disclosure relates generally to image sensors, and more particularly, to lens arrays for partitioned image sensors.
Background
An image capture unit typically includes an image sensor and an imaging lens. The imaging lens focuses light onto the image sensor to form an image, and the image sensor converts the light into electrical signals. The electrical signals are output from the image capture unit to a host electronic system or another unit in a subsystem. The electronic system may be a mobile phone, a computer, a digital camera, or a medical device.
As the use of image capture units in electronic systems increases, so does the demand for image capture unit features, capabilities, and reduced device size. For example, there is an increasing need for image capture units to have a lower profile so that the overall size of an electronic system that includes the image capture unit can be reduced without sacrificing the quality of the captured optical images. The profile of an image capture unit may be defined as the distance from the bottom of the image sensor to the top of the imaging lens.
Disclosure of Invention
One aspect of the invention relates to an apparatus comprising: an image sensor including N image sensor regions arranged thereon; and a lens array comprising N lens structures disposed proximate to the image sensor, wherein each of the N lens structures is arranged to focus a single image onto a respective one of the N image sensor regions, wherein the N lens structures comprise: a first lens structure having a first focal length and positioned the first focal length from the respective one of the N image sensor regions; a second lens structure having a second focal length and positioned the second focal length from the respective one of the N image sensor regions; and a third lens structure having a third focal length and positioned the third focal length from the respective one of the N image sensor regions, wherein the first, second, and third focal lengths are different.
Another aspect of the invention relates to an apparatus comprising: an image sensor including N image sensor regions arranged thereon; and a lens array comprising N lens structures disposed proximate to the image sensor, wherein each of the N lens structures is arranged to focus a single image onto a respective one of the N image sensor regions, wherein the N lens structures comprise: a first lens structure having a first radius of curvature and positioned a certain focal length from the respective one of the N image sensor regions; a second lens structure having a second radius of curvature and positioned the focal length from the respective one of the N image sensor regions; and a third lens structure having a third radius of curvature and positioned the focal length from the respective one of the N image sensor regions, wherein the first, second, and third radii of curvature are different.
Yet another aspect of the invention relates to an imaging system comprising: a pixel array comprising an image sensor having N image sensor regions arranged therein, wherein each of the N image sensor regions has a plurality of pixels arranged therein; and a lens array comprising N lens structures disposed proximate to the image sensor, wherein each of the N lens structures is arranged to focus a single image onto a respective one of the N image sensor regions, wherein the N lens structures comprise: a first lens structure having a first focal length and positioned the first focal length from the respective one of the N image sensor regions; a second lens structure having a second focal length and positioned the second focal length from the respective one of the N image sensor regions; and a third lens structure having a third focal length and positioned the third focal length from the respective one of the N image sensor regions, wherein the first, second, and third focal lengths are different; control circuitry coupled to the pixel array to control operation of the pixel array; and readout circuitry coupled to the pixel array to readout image data from the plurality of pixels.
Drawings
Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise indicated.
FIG. 1A is a schematic diagram of an image capture unit including an imaging lens and an image sensor.
FIG. 1B is a schematic diagram of a low-profile image capture unit including a low-profile imaging lens and an image sensor.
FIG. 2 illustrates one example of an image sensor having four partitioned areas according to the teachings of this disclosure.
FIG. 3 is a cross-section illustrating two lenses and two partitioned regions of one example of a low-profile image capture unit, according to the teachings of this disclosure.
FIG. 4 illustrates one example of a 2x2 lens array for a partitioned image sensor according to the teachings of this disclosure.
FIG. 5 illustrates one example of a modulation transfer function in the balance of RGB for a lens at the designed focus according to the teachings of this disclosure.
FIG. 6A illustrates one example of a modulation transfer function in individual R for a lens at the designed focus according to the teachings of this disclosure.
FIG. 6B illustrates one example of a modulation transfer function in individual R for a lens after focus adjustment according to the teachings of this disclosure.
FIG. 7A illustrates one example of a modulation transfer function in individual B for a lens at the designed focus according to the teachings of this disclosure.
FIG. 7B illustrates one example of a modulation transfer function in individual B for a lens after focus adjustment according to the teachings of this disclosure.
FIG. 8 illustrates one example of a modulation transfer function in individual G for a lens at the designed focus according to the teachings of this disclosure.
FIG. 9 illustrates one example of a 2x2 lens array on a partitioned image sensor according to the teachings of this disclosure.
FIG. 10 illustrates a cross section of one example of a 2x2 lens array on a partitioned image sensor according to the teachings of this disclosure.
FIG. 11 illustrates a cross section of another example of a 2x2 lens array on a partitioned image sensor according to the teachings of this disclosure.
FIG. 12 illustrates a cross section of another example of a 2x2 lens array on a partitioned image sensor according to the teachings of this disclosure.
FIG. 13 is a block diagram illustrating one example of an image sensor according to the teachings of this disclosure.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the specific details need not be used to practice the invention. In other instances, well-known materials or methods have not been described in detail in order to avoid obscuring the present invention.
Reference throughout this specification to "one embodiment," "an embodiment," "one example," or "an example" means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment," "in an embodiment," "one example," or "an example" in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable combination and/or sub-combination in one or more embodiments or examples. The particular features, structures, or characteristics may be included in an integrated circuit, an electronic circuit, a combinational logic circuit, or other suitable components that provide the described functionality. Further, it should be understood that the drawings provided herein are for explanation purposes to persons skilled in the art and that the drawings are not necessarily drawn to scale.
Example methods and apparatus relating to low-profile image capture units are disclosed. As will be understood, a low-profile image capture unit according to the teachings of this disclosure may be provided without sacrificing the quality (e.g., resolution (i.e., number of pixels) and sharpness) of the captured optical image for the sake of a low profile.
To illustrate, FIG. 1A is a schematic diagram of an image capture unit 200 including an imaging lens 202 and an image sensor 204. The distance between lens 202 and image sensor 204 is approximately f, where f is the focal length of lens 202. Image sensor 204 has a width W covered by lens 202, and lens 202 has a diameter D. For comparison, FIG. 1B shows a schematic diagram of a low-profile image capture unit 210 including an imaging lens 212 and an image sensor 214. The distance between lens 212 and image sensor 214 is approximately f/2, where f/2 is the focal length of lens 212. Image sensor 214 has a width W/2 covered by lens 212, and lens 212 has a diameter D/2.
In a low-profile image capture unit, the imaging lens is replaced with a low-profile imaging lens while the image sensor is otherwise unchanged. Image sensors 204 and 214 have the same pixel array structure. Because the width of image sensor 214 is half the width of image sensor 204, image sensor 214 will have half the number of pixels in one dimension compared to image sensor 204. In two dimensions, image sensor 214 will have one quarter the number of pixels of image sensor 204. In other words, the number of pixels of the captured image is approximately proportional to the square of the ratio of the distance between the lens and the image sensor.
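The scaling described above can be sketched numerically. This is an illustrative sketch only (the pixel counts are hypothetical and not taken from the patent): halving the lens-to-sensor distance halves the covered sensor width, so the pixel count falls with the square of the distance ratio.

```python
def pixel_count(width_pixels: int, distance_ratio: float) -> int:
    """Approximate 2D pixel count of an image captured through a lens
    whose sensor distance (and hence covered width) is scaled by
    `distance_ratio`, for a square sensor region."""
    return round(width_pixels * distance_ratio) ** 2

full = pixel_count(2000, 1.0)   # original unit: 2000 x 2000 pixels
low = pixel_count(2000, 0.5)    # low-profile unit covers half the width
assert low * 4 == full          # one quarter the pixels in two dimensions
```

This is why the disclosure partitions the sensor into four regions: four quarter-size images restore roughly the original pixel count while keeping the half-height lens.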
FIG. 2 illustrates an image sensor 220 having four partitioned areas 222, 224, 226, and 228 closely arranged in proximity to one another according to the teachings of this disclosure. Each partitioned area 222, 224, 226, and 228 is covered by a respective imaging lens (e.g., lens 212 of FIG. 1B). In this way, the focal length of the imaging lens (e.g., lens 212 of FIG. 1B) may be half that of the imaging lens (e.g., lens 202 of FIG. 1A) when the image sensor is not divided into four regions. Thus, a low-profile image capture unit may be constructed using four lenses and four partitioned regions of an image sensor. Because four regions of the image sensor are used, the low-profile image capture unit will have about the same resolution (i.e., the same number of pixels) as the original image capture unit. An area of the image sensor may be similar to the image sensor 214 of FIG. 1B.
To illustrate, FIG. 3 shows a cross-section of a low-profile image capture unit 300 including four imaging lenses and four partitioned regions of an image sensor, according to the teachings of this disclosure. In one example, the cross-section illustrated in FIG. 3 may correspond to dashed line A-A' of FIG. 2. The four partitioned areas of the image sensor may be areas 222, 224, 226, and 228 of image sensor 220 of FIG. 2. Only two imaging lenses 302 and 304, having focal lengths f1 and f2, respectively, are shown in FIG. 3. Similarly, only two partitioned areas 222 and 224 of image sensor 220 are shown in FIG. 3. In this way, an image capture system having a low profile may be constructed while the resolution (i.e., number of pixels) of the captured image is maintained.
As shown in the illustrated example, imaging lens 302 is positioned a first focal length f1 from the respective image sensor region 222. Imaging lens 304 is positioned a second focal length f2 from the respective image sensor region 224. As shown in the depicted example, the second focal length f2 is approximately half the focal length of lens 202 shown in FIG. 1A. Thus, according to the teachings of this disclosure, the example image capture unit 300 of FIG. 3 is a low-profile image capture unit in which the width of image sensor regions 222 and 224 covered by lenses 302 and 304 is W/2, and the lens diameter of lenses 302 and 304 is D/2.
A typical image capture unit may include a Bayer-type color filter array on the image sensor. In contrast, the partitioned image sensor regions 222 and 224 of FIG. 3 may not include a Bayer-type color filter array. Referring back to FIG. 2, the partitioned areas 222, 224, 226, and 228 may be assigned as red (R), green (G), clear (C), and blue (B) areas, respectively. The red area may be covered by a single red filter, the green area may be covered by a single green filter, the blue area may be covered by a single blue filter, and the clear or C area may not be covered by any filter or may be covered by a single green filter.
As shown in the example, the first focal length f1 is different from the second focal length f2. In one embodiment, the first focal length f1 corresponds to light having a first color (e.g., without limitation, red (R)) and the second focal length f2 corresponds to light having a second color (e.g., without limitation, green (G)). Thus, according to the teachings of this disclosure, a single image having a first color is focused by lens 302 onto image sensor region 222, and the same single image having a second color is focused by lens 304 onto image sensor region 224.
Referring briefly back to the example depicted in FIG. 2, the red (R) region includes only red pixels, the green (G) region includes only green pixels, and the blue (B) region includes only blue pixels. The clear or C region may include white pixels when no filter is applied, and may include green pixels when a green filter is applied. A readout system and/or processor (not shown) may rearrange the red, green, and blue pixels into a Bayer pattern or any other pattern to further process the color signals and form a color image. According to the teachings of this disclosure, a C pixel may be used as a white pixel or may simply function as a green pixel in a particular process.
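The rearrangement step mentioned above can be sketched in a few lines. This is a hypothetical helper, not an implementation from the patent: it interleaves four single-color region planes into an RGGB Bayer mosaic, with the clear/C plane standing in for the second green sample.

```python
def to_bayer_rggb(r, g, b, c):
    """Interleave four equal-size single-color planes (lists of rows)
    into one RGGB Bayer mosaic of twice the height and width.
    The C plane supplies the second green sample of each 2x2 cell."""
    h, w = len(r), len(r[0])
    mosaic = [[0] * (2 * w) for _ in range(2 * h)]
    for y in range(h):
        for x in range(w):
            mosaic[2 * y][2 * x] = r[y][x]          # R
            mosaic[2 * y][2 * x + 1] = g[y][x]      # G
            mosaic[2 * y + 1][2 * x] = c[y][x]      # G (from C region)
            mosaic[2 * y + 1][2 * x + 1] = b[y][x]  # B
    return mosaic

m = to_bayer_rggb([[1]], [[2]], [[3]], [[4]])
assert m == [[1, 2], [4, 3]]
```

In practice the four planes would first need registration, since the four lenses view the scene from slightly offset positions; the sketch assumes perfectly aligned planes.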
FIG. 4 illustrates a lens array 400 for a partitioned image sensor according to the teachings of this disclosure. The partitioned image sensor may be image sensor 220 of FIG. 2. Lens array 400 may be a 2x2 array with low-profile lenses 402, 404, 406, and 408 assigned to the red (R), green (G), clear (C), and blue (B) regions, respectively. In other words, each of lenses 402, 404, 406, and 408 is arranged to focus a single image onto a respective one of the red (R), green (G), clear (C), and blue (B) image sensor regions. Thus, lens 402 forms only a red image, lens 404 forms only a green image, and lens 408 forms only a blue image. Further, in one example, each of lenses 402, 404, 406, and 408 has a different respective focal length corresponding to the particular color of light focused onto the corresponding image sensor region in accordance with the teachings of this disclosure. In another example, as discussed in further detail below, each of lenses 402, 404, 406, and 408 has a different respective radius of curvature corresponding to the particular color of light focused onto the corresponding image sensor region in accordance with the teachings of this disclosure.
In contrast, typical image capture units use a single imaging lens that simultaneously forms red, green, and blue images. However, because each of lenses 402, 404, 406, and 408 according to the teachings of this disclosure individually forms a single-color image, the optical quality (e.g., sharpness) of each individual image may be improved by individually adjusting the distance between each lens and the corresponding image sensor region. Thus, in one example, the distance between each of lenses 402, 404, 406, and 408 and the corresponding partitioned image sensor region may be individually adjusted according to the wavelength of the light to obtain high-quality images according to the teachings of this disclosure. Accordingly, in one example, the position of the red lens relative to the red region of the image sensor, the position of the green lens relative to the green region of the image sensor, and the position of the blue lens relative to the blue region of the image sensor are all different in accordance with the teachings of this disclosure. In one example, the position of the C lens relative to the C region of the image sensor may be the same as the position of the green lens relative to the green region of the image sensor.
To illustrate, consider a lens designed to balance the RGB colors with a specific weighting. The modulation transfer function (MTF) in the balance of RGB at the designed focus is illustrated in FIG. 5, according to the teachings of this disclosure. If the lens is used for red only (e.g., lens 402 of FIG. 4), the MTF in individual R at the designed focus is illustrated in FIG. 6A. After focus adjustment, the MTF in individual R (R is no longer at the designed focus) is illustrated in FIG. 6B, according to the teachings of this disclosure. As can be appreciated, the lens performance illustrated in FIG. 6B is superior to that illustrated in FIG. 6A. Similarly, if the lens is used for blue only (e.g., lens 408 of FIG. 4), the MTF in individual B at the designed focus is illustrated in FIG. 7A, and after focus adjustment, the MTF in individual B (B is no longer at the designed focus) is illustrated in FIG. 7B. As can be appreciated, the lens performance illustrated in FIG. 7B is superior to that illustrated in FIG. 7A. FIG. 8 illustrates the MTF in individual G at the designed focus according to the teachings of this disclosure. In the depicted example, green does not require focus adjustment because the balanced design in that example is optimized at green.
Comparing FIG. 6B with FIG. 5, and FIG. 7B with FIG. 5, it is apparent that the MTFs in FIGS. 6B and 7B are generally higher than the MTF in FIG. 5. For example, the peak of the MTF in FIG. 5 is less than 0.7, while the peaks of the MTFs in FIGS. 6B and 7B are greater than 0.7. Because each lens handles only an individual narrow band of wavelengths, focus adjustment provides an even better MTF than the original balanced design. In other words, according to the teachings of this disclosure, the image quality (i.e., image sharpness) provided by the focus-adjusted individual lenses is better than the image quality as designed (e.g., the MTF in FIG. 5) and better than the image quality before individual focus adjustment (e.g., the MTFs in FIGS. 6A and 7A).
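The per-color focus adjustment discussed above follows from ordinary dispersion: with a fixed lens shape, the refractive index n varies with wavelength, so each color focuses at a slightly different distance. A minimal thin-lens sketch (illustrative only, not from the patent; the plano-convex formula f = R/(n - 1) and the roughly BK7-like index values are assumptions):

```python
def focal_length_plano_convex(radius_mm: float, n: float) -> float:
    """Thin-lens focal length of a plano-convex lens: f = R / (n - 1)."""
    return radius_mm / (n - 1.0)

# Hypothetical refractive indices at red, green, and blue wavelengths
# (dispersion: n rises as wavelength falls).
n_red, n_green, n_blue = 1.514, 1.519, 1.526
R = 1.0  # mm; same radius of curvature for all three channels

f_r = focal_length_plano_convex(R, n_red)
f_g = focal_length_plano_convex(R, n_green)
f_b = focal_length_plano_convex(R, n_blue)
assert f_r > f_g > f_b  # red focuses farther from the lens, blue nearer
```

If the balanced design is focused for green, the red and blue channels sit slightly off their own best focus, which is why individually repositioning the R and B lenses recovers MTF.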
FIG. 9 illustrates a 2x2 lens array 500 disposed proximate to a partitioned image sensor 501 according to the teachings of this disclosure. In one example, lens array 500 may include individual wafer-level lens cubes 502, 504, 506, and 508 (which are identical lenses) to focus a single image onto a respective one of the sections of image sensor 501, according to the teachings of this disclosure. In the depicted example, lenses 502, 504, 506, and 508 are assigned to the R, G, C, and B regions, respectively. As previously described, the focal length positions of R lens 502 and B lens 508 may be adjusted from the designed focus according to their particular colors. In one example, the positions of G lens 504 and C lens 506 may not require adjustment from the designed focus. In other words, the R focal length and the B focal length are different from the G focal length according to the teachings of this disclosure. There are thus three focal lengths: an R focal length, a B focal length, and a G focal length. Accordingly, the position of the R lens relative to the R region of the image sensor, the position of the G lens relative to the G region of the image sensor, and the position of the B lens relative to the B region of the image sensor are all different according to the teachings of this disclosure. In one example, the position of the C lens relative to the C region of the image sensor is the same as the position of the G lens relative to the G region of the image sensor.
FIG. 10 illustrates a cross-section 550 of the 2x2 lens array 500 according to the teachings of this disclosure. Only lens cubes 502 and 504 are shown in FIG. 10. In one example, the cross-section illustrated in FIG. 10 may correspond to dashed line B-B' of FIG. 9. Note that lens cubes 502 and 504 in FIG. 10 are equivalent to lenses 302 and 304 in FIG. 3. Thus, as shown in the illustrated example, lens cube 502 is positioned a first focal length f1 from the respective image sensor region 512, and lens cube 504 is positioned a second focal length f2 from the respective image sensor region 514. In the example, the first focal length f1 corresponds to a first color and the second focal length f2 corresponds to a second color. In the example, the first focal length f1 is different from the second focal length f2, and the first color is different from the second color.
As shown in the depicted example, lens cubes 502 and 504 are disposed on backside spacers 516 and 518, respectively. Backside spacers 516 and 518 are disposed on cover glass 510. Partitioned areas 512 and 514 of the single image sensor are under cover glass 510, aligned with lens cubes 502 and 504, respectively. The thickness of backside spacer 516 is different from the thickness of backside spacer 518 for individual focus adjustment. Three backside spacers of different thicknesses may be required, one each for R, G, and B.
In one example, each wafer-level lens cube includes at least a glass wafer and a lens on the glass wafer. In general, each wafer-level lens cube may include lens 520 on glass wafer 522, lens 524 on the other side of glass wafer 522, lens 528 on glass wafer 530, lens 532 on the other side of glass wafer 530, and spacer 526 between glass wafers 522 and 530.
It should be understood that conventional lenses, including molded plastic lenses, may also be used. Wafer-level lens cubes, however, have some advantages over conventional lenses. For example, a plurality of plastic lenses may be inserted into a sleeve, and the sleeve may be screwed into a holder that is fixed with the image sensor on a printed circuit board. The corresponding sleeve and holder may increase the overall size of the lens, which may require a larger gap between the partitioned areas of the image sensor. This may result in an increased silicon die size, which in turn may increase the overall cost of the image sensor.
FIG. 11 shows a cross-section of another example of a 2x2 lens array 600 according to the teachings of this disclosure. Only lens cubes 602 and 604 are shown in FIG. 11. In one example, the cross-section illustrated in FIG. 11 may also correspond to dashed line B-B' of FIG. 9. Note that the 2x2 lens array 600 of FIG. 11 shares many similarities with the 2x2 lens array 500 of FIG. 10. However, backside spacers 616 and 618 of lens array 600 have the same thickness, in contrast to backside spacers 516 and 518. Thus, lens cubes 602 and 604 have at least one different radius of curvature (ROC) according to the teachings of this disclosure. For example, ROC 620 of R lens cube 602 is different from ROC 622 of G lens cube 604. The difference in ROC causes the focal length of R lens cube 602 at red to equal the focal length f1 of G lens cube 604 at green. Although not shown, the ROC of the B lens cube is also different from the ROCs of the R and G lens cubes, and the ROC of the C lens cube may be the same as the ROC of the G lens cube. Thus, according to the teachings of this disclosure, the ROC of the R lens cube, the ROC of the G lens cube, and the ROC of the B lens cube are not identical and correspond to the particular color of the single image focused onto the respective image sensor region. However, the focal length of the R lens at red, the focal length of the G lens at green, and the focal length of the B lens at blue are the same. Thus, according to the teachings of this disclosure, the position of each lens relative to the corresponding partitioned image sensor region is the same (e.g., f1 as illustrated in FIG. 11), even though the respective radii of curvature of the lenses differ depending on the color.
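The equal-spacer design of FIG. 11 inverts the earlier trade-off: instead of moving each lens to its color's focus, the ROC of each channel is chosen so that every color focuses at the same distance f1. A converse sketch of the same thin-lens relation (illustrative only; f = R/(n - 1) for a plano-convex element and the index values are assumptions, not patent data):

```python
def roc_for_focal_length(f_mm: float, n: float) -> float:
    """Radius of curvature giving a plano-convex thin lens the focal
    length f at a wavelength where the glass index is n: R = f(n - 1)."""
    return f_mm * (n - 1.0)

f1 = 2.0  # mm; common focal length targeted for all channels

# Hypothetical per-color refractive indices (dispersion).
n_red, n_green, n_blue = 1.514, 1.519, 1.526

roc_r = roc_for_focal_length(f1, n_red)
roc_g = roc_for_focal_length(f1, n_green)
roc_b = roc_for_focal_length(f1, n_blue)
assert roc_r < roc_g < roc_b  # different ROCs, one shared focal length
```

Because blue light sees a higher index, the B lens can use a gentler power correction (larger ROC) to land on the same f1, which is what lets all four cubes sit on spacers of identical thickness.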
FIG. 12 shows a cross-section of another example of a 2x2 lens array 700 according to the teachings of this disclosure. In one example, the cross-section illustrated in FIG. 12 may correspond to dashed line B-B' of FIG. 9. In lens array 700, an R wafer-level lens cube 702, a G wafer-level lens cube 704, a C wafer-level lens cube 706, and a B wafer-level lens cube 708 are formed on the same wafers, for example, glass wafers 720 and 722. Only lens cubes 702 and 704 are shown in FIG. 12. In the example depicted in FIG. 12, lens cubes 702, 704, and 708 have different ROCs corresponding to the particular colors of the single image focused onto the respective image sensor regions, in accordance with the teachings of this disclosure. For example, ROC 710 of R lens cube 702 is different from ROC 712 of G lens cube 704. C lens cube 706 has an ROC that may be the same as the ROC of G lens cube 704. Backside spacers 716 and 718 have the same thickness. Therefore, the ROC of the R lens cube, the ROC of the G lens cube, and the ROC of the B lens cube are different. However, the focal length of the R lens at red, the focal length of the G lens at green, and the focal length of the B lens at blue have the same value f1. Thus, according to the teachings of this disclosure, the position of each lens relative to the corresponding partitioned image sensor region is the same (e.g., f1 as illustrated in FIG. 12), even though the respective radii of curvature of the lenses differ depending on the color.
In one example, a single color filter may be disposed on each lens cube. Alternatively, a single color filter may be disposed between each lens cube and each partitioned area of a single image sensor. For example, referring back to FIG. 10, color filters 540 and 542 are disposed on lens cubes 502 and 504, respectively. Alternatively, color filters 544 and 546 may be disposed between lens cube 502 and partitioned area 512 of the image sensor, and between lens cube 504 and partitioned area 514 of the image sensor, respectively, as also shown in FIG. 10. Only a single color filter, for example, color filter 540 or color filter 544, is included in the image capture unit for each lens cube.
FIG. 13 is a block diagram illustrating an image sensor 800 according to the teachings of this disclosure. Image sensor 800 is one example implementation of image sensor 220 of FIG. 2 or image sensor 501 of FIG. 9. The illustrated embodiment of image sensor 800 includes pixel array 805, readout circuitry 810, functional logic 815, and control circuitry 820. Pixel array 805 may be partitioned into four partitioned areas (not shown in FIG. 13), such as shown in FIG. 2.
Pixel array 805 is a two-dimensional (2D) array of pixels (e.g., pixels P1, P2, …, Pn). Each pixel may be a CMOS pixel or a CCD pixel. As illustrated, the pixels are arranged in rows (e.g., rows R1 through Ry) and columns (e.g., columns C1 through Cx) to acquire image data of a person, place, object, etc., which can then be used to render a 2D image of the person, place, object, etc. In one example, pixel array 805 is a backside illuminated (BSI) image sensor. In one example, pixel array 805 is a frontside illuminated (FSI) image sensor. In one example, pixel array 805 is partitioned into a plurality of partitioned regions, each covered by a color filter.
After each pixel has acquired its image data or image charge, the image data is read out by readout circuitry 810 and transferred to functional logic 815. Readout circuitry 810 may include amplification circuitry, analog-to-digital conversion (ADC) circuitry, or otherwise. Functional logic 815 may simply store the image data or even manipulate the image data by applying post-image effects (e.g., crop, rotate, remove red-eye, adjust brightness, adjust contrast, or otherwise). In one example, readout circuitry 810 may read out a row of image data at a time along readout column lines (illustrated) or may read out the image data using a variety of other techniques (not illustrated), such as serial readout or full parallel readout of all pixels simultaneously.
Control circuitry 820 is coupled to pixel array 805 to control the operating characteristics of pixel array 805. For example, control circuitry 820 may generate a shutter signal for controlling image acquisition. In one embodiment, the shutter signal is a global shutter signal for simultaneously enabling all pixels within pixel array 805 to acquire their respective image data during a single acquisition window. In an alternative embodiment, the shutter signal is a rolling shutter signal whereby each row, column, or group of pixels is sequentially enabled during successive acquisition windows.
It should be understood that the low-profile image capture unit is not limited to a 2x2 lens array; a lens array of any size is possible. Likewise, the image sensor is not limited to four partitioned areas; any number of partitioned areas is possible. The partitioned areas of the image sensor may be square or rectangular. The cross-section of a lens cube may be circular, elliptical, square, or rectangular. The image sensor may be a CMOS image sensor or a CCD.
The above description of illustrated examples of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various equivalent modifications are possible without departing from the broader spirit and scope of the invention. Indeed, it should be understood that the specific example voltages, currents, frequencies, power range values, times, etc., are provided for purposes of explanation, and that other values may also be used in other embodiments and examples in accordance with the teachings of this disclosure.

Claims (15)

1. An imaging apparatus, comprising:
an image sensor including N image sensor regions arranged thereon;
a lens array comprising N identical lens structures disposed proximate to and above the image sensor, wherein each of the N identical lens structures is arranged to focus a single image onto a respective one of the N image sensor regions, wherein the N identical lens structures comprise: a first lens structure having a first focal length corresponding to light having a first color and positioned the first focal length from the respective one of the N image sensor regions; a second lens structure having a second focal length corresponding to light having a second color and positioned the second focal length from the respective one of the N image sensor regions; and a third lens structure having a third focal length corresponding to light having a third color and positioned the third focal length from the respective one of the N image sensor regions, wherein the first focal length, the second focal length, and the third focal length are different, and wherein at least one of the first focal length, the second focal length, and the third focal length is different from a design focal length of the respective lens structure to provide a better modulation transfer function (MTF); and
first, second, and third color filters respectively proximate to and disposed above the first, second, and third lens structures.
2. The imaging apparatus of claim 1, wherein the first lens structure focuses the single image having the first color onto the respective one of the N image sensor regions, wherein the second lens structure focuses the single image having the second color onto the respective one of the N image sensor regions, and wherein the third lens structure focuses the single image having the third color onto the respective one of the N image sensor regions.
3. The imaging apparatus of claim 1, wherein the N identical lens structures further comprise a fourth lens structure having a fourth focal length and positioned the fourth focal length from the respective one of the N image sensor regions.
4. The imaging apparatus of claim 3, wherein the fourth focal length corresponds to white light.
5. The imaging apparatus of claim 3, wherein the fourth focal length is substantially equal to one of the first focal length, the second focal length, and the third focal length.
6. An imaging apparatus, comprising:
an image sensor including N image sensor regions arranged thereon;
a lens array having N lens structures disposed proximate to and above the image sensor, wherein each of the N lens structures is arranged to focus a single image onto a respective one of the N image sensor regions, wherein the N lens structures comprise: a first lens structure having a first radius of curvature and positioned a certain focal length from the respective one of the N image sensor regions; a second lens structure having a second radius of curvature and positioned the focal length from the respective one of the N image sensor regions; and a third lens structure having a third radius of curvature and positioned the focal length from the respective one of the N image sensor regions, wherein the first, second, and third radii of curvature are different, wherein the first radius of curvature corresponds to light having a first color, wherein the second radius of curvature corresponds to light having a second color, and wherein the third radius of curvature corresponds to light having a third color; and
first, second, and third color filters respectively proximate to and above the first, second, and third lens structures.
7. The imaging apparatus of claim 6, wherein the first lens structure focuses the single image having the first color onto the respective one of the N image sensor regions, wherein the second lens structure focuses the single image having the second color onto the respective one of the N image sensor regions, and wherein the third lens structure focuses the single image having the third color onto the respective one of the N image sensor regions.
8. The imaging apparatus of claim 6, wherein the N lens structures further comprise a fourth lens structure having a fourth radius of curvature and positioned a fourth focal length from the respective one of the N image sensor regions.
9. The imaging apparatus of claim 8, wherein the fourth focal length corresponds to white light.
10. An imaging system, comprising:
a pixel array comprising an image sensor having N image sensor regions arranged therein, wherein each of the N image sensor regions has a plurality of pixels arranged therein; and
a lens array comprising N identical lens structures disposed proximate to the image sensor, wherein each of the N identical lens structures is arranged to focus a single image onto a respective one of the N image sensor regions, wherein the N identical lens structures comprise: a first lens structure having a first focal length corresponding to light having a first color and positioned the first focal length from the respective one of the N image sensor regions; a second lens structure having a second focal length corresponding to light having a second color and positioned the second focal length from the respective one of the N image sensor regions; and a third lens structure having a third focal length corresponding to light having a third color and positioned the third focal length from the respective one of the N image sensor regions, wherein the first focal length, the second focal length, and the third focal length are different, and wherein at least one of the first focal length, the second focal length, and the third focal length is different from a design focal length of the respective lens structure to provide a better modulation transfer function (MTF);
first, second, and third color filters proximate to and above the first, second, and third lens structures, respectively;
control circuitry coupled to the pixel array to control operation of the pixel array; and
readout circuitry coupled to the pixel array to readout image data from the plurality of pixels.
11. The imaging system of claim 10, further comprising functional logic coupled to the readout circuitry to store the single image data read out from each of the N image sensor regions.
12. The imaging system of claim 10, wherein the first lens structure focuses the single image having the first color onto the respective one of the N image sensor regions, wherein the second lens structure focuses the single image having the second color onto the respective one of the N image sensor regions, and wherein the third lens structure focuses the single image having the third color onto the respective one of the N image sensor regions.
13. The imaging system of claim 10, wherein the N identical lens structures further comprise a fourth lens structure having a fourth focal length and positioned the fourth focal length from the respective one of the N image sensor regions.
14. The imaging system of claim 13, wherein the fourth focal length corresponds to white light.
15. The imaging system of claim 13, wherein the fourth focal length is substantially equal to one of the first focal length, the second focal length, and the third focal length.
HK14103276.3A 2012-06-01 2014-04-04 Lens array for partitioned image sensor HK1190253B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/486,787 2012-06-01

Publications (2)

Publication Number Publication Date
HK1190253A HK1190253A (en) 2014-06-27
HK1190253B true HK1190253B (en) 2019-05-17


Similar Documents

Publication Publication Date Title
CN103579268B (en) There is the lens arra of the segmentation imageing sensor of color filter
US8791403B2 (en) Lens array for partitioned image sensor to focus a single image onto N image sensor regions
CN206947348U (en) Imaging sensor
US8478123B2 (en) Imaging devices having arrays of image sensors and lenses with multiple aperture sizes
US8405748B2 (en) CMOS image sensor with improved photodiode area allocation
US9270953B2 (en) Wafer level camera having movable color filter grouping
US20110317048A1 (en) Image sensor with dual layer photodiode structure
US10187600B2 (en) Four shared pixel with phase detection and full array readout modes
WO2008042137A2 (en) Imaging method, apparatus and system having extended depth of field
CN102118551A (en) Imaging device
CN102881699A (en) Solid-state imaging device, manufacturing method of solid-state imaging device and electronic apparatus
US9386203B2 (en) Compact spacer in multi-lens array module
US20230395626A1 (en) Hybrid image pixels for phase detection auto focus
CN105810702A (en) Optical isolation grid over color filter array
US11736821B1 (en) Four shared pixel with phase detection and horizontal and vertical binning readout modes
HK1190253A (en) Lens array for partitioned image sensor
HK1190253B (en) Lens array for partitioned image sensor
HK1192371A (en) Lens array for partitioned image sensor having color filters
HK1192371B (en) Lens array for partitioned image sensor having color filters
HK1209551B (en) Compact spacer in multi-lens array module
CN113905193A (en) Image sensor with full color function of dark scene and imaging method thereof
HK1212840B (en) Wafer level camera having movable color filter grouping
HK1204407A1 (en) Apparatus, system and method for correcting image sensor fixed pattern noise