
JP2016029674A - Solid-state image pickup device - Google Patents


Info

Publication number
JP2016029674A
Authority
JP
Japan
Prior art keywords
phase difference
difference pixel
pixel
peripheral portion
direction side
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2012275453A
Other languages
Japanese (ja)
Inventor
Shunsuke Tanaka (田中 俊介)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Corp
Priority application: JP2012275453A
PCT application: PCT/JP2013/082558 (published as WO2014097884A1)
Publication: JP2016029674A
Legal status: Pending

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith
    • H04N 25/703 SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N 25/704 Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F 39/00 Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F 39/80 Constructional details of image sensors
    • H10F 39/805 Coatings
    • H10F 39/8053 Colour filters
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F 39/00 Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F 39/80 Constructional details of image sensors
    • H10F 39/806 Optical elements or arrangements associated with the image sensors
    • H10F 39/8063 Microlenses
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F 39/00 Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F 39/10 Integrated devices
    • H10F 39/12 Image sensors
    • H10F 39/199 Back-illuminated image sensors
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F 39/00 Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F 39/80 Constructional details of image sensors
    • H10F 39/805 Coatings
    • H10F 39/8057 Optical shielding

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Automatic Focus Adjustment (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Focusing (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

[Problem] To reduce the sensitivity difference between phase-difference pixels in the central portion and the peripheral portion of a light-receiving region without degrading the image quality of the captured image. [Solution] The solid-state imaging device includes, in a light-receiving region on which a subject image is formed, a plurality each of normal pixels, right phase-difference pixels 42a, and left phase-difference pixels. The normal pixels receive incident light isotropically. The right and left phase-difference pixels have openings offset to the right and to the left, respectively, and selectively receive the portion of the incident light that arrives on the right or left side. A right phase-difference pixel 42a in the left peripheral portion 11bL of the light-receiving region has an opening 56R with a larger area than those in the central portion 11a and the right peripheral portion 11bR. Conversely, a left phase-difference pixel in the right peripheral portion 11bR has a larger opening area than those in the central portion 11a and the left peripheral portion 11bL. (Selected figure: FIG. 8)

Description

The present invention relates to a solid-state imaging device having asymmetric pixels used for focus detection and the like.

Camera modules in digital cameras and in mobile devices such as mobile phones and smartphones are now almost universally equipped with an autofocus function. Several methods for realizing autofocus are known. For example, recent digital cameras and camera modules commonly use contrast-detection autofocus (so-called contrast AF), which adjusts focus so that image contrast is maximized, and phase-difference-detection autofocus (hereinafter, phase-difference AF), which adjusts focus based on the phase difference caused by parallax.

Phase-difference AF uses a solid-state imaging device, such as the one described in Patent Document 1, that has focus-detection pixels (hereinafter, phase-difference pixels) formed in a horizontally asymmetric shape at predetermined positions within a light-receiving region in which a plurality of pixels are arranged, so that light incident from either the left or the right direction is selectively received and the phase difference can be detected (Patent Document 1).

Recent digital cameras and camera modules tend to have a short back focus because of demands for smaller and thinner bodies. In addition, solid-state imaging devices with large light-receiving areas are increasingly installed even in small, thin digital cameras. When the back focus is shortened, or when a solid-state imaging device whose light-receiving area is large relative to the back focus is adopted, the incident angle of light at the peripheral portion of the solid-state imaging device increases.

Because the pixels of a solid-state imaging device are designed to efficiently collect and photoelectrically convert light arriving roughly head-on, their sensitivity drops as the incident angle of the light increases. Consequently, when light strikes the pixels at the periphery of the light-receiving region at a large angle, luminance and color shading can appear in the captured image. Shading can be reduced by equalizing pixel sensitivity between the central and peripheral portions in view of the incident angle at the periphery. One known technique, for example, shifts (scales) the microlens provided on each pixel toward the center, increasingly so for pixels nearer the periphery, thereby raising the light-collection efficiency of the peripheral pixels, easing the sensitivity difference between central and peripheral pixels, and reducing shading.
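As a rough illustration of the microlens scaling described above, the sketch below (Python; not from the patent) shifts each microlens toward the array center by an amount that grows with the chief-ray angle. The linear chief-ray-angle model, the function name, and all numeric values are assumptions made for illustration only.

```python
import math

def microlens_offset(x_norm: float, cra_max_deg: float, stack_height_um: float) -> float:
    """Lateral shift of a microlens toward the array center, in micrometers.

    Illustrative model: the chief-ray angle (CRA) grows linearly with the
    normalized image height x_norm (0 at the center, 1 at the edge), and
    the lens is shifted by stack_height * tan(CRA) so the focused spot
    stays over the photodiode. Neither assumption comes from the patent.
    """
    cra_rad = math.radians(cra_max_deg * x_norm)
    return stack_height_um * math.tan(cra_rad)

# A pixel at 80% image height (30 degree edge CRA, 3 um optical stack)
# needs a larger shift than one at 40% image height:
edge_shift = microlens_offset(0.8, 30.0, 3.0)
mid_shift = microlens_offset(0.4, 30.0, 3.0)
```

In this toy model the shift is zero at the center and grows monotonically toward the edge, which is the qualitative behavior the scaling technique relies on.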

Another known technique equalizes the sensitivity of the central and peripheral portions and reduces shading by enlarging the pixel openings from the center toward the periphery of the solid-state imaging device, or equivalently by shrinking them from the periphery toward the center (Patent Document 2).

Furthermore, in a solid-state imaging device having phase-difference pixels, techniques are known that improve the accuracy of phase-difference AF either by making the pupil region of the central phase-difference pixels smaller than that of the peripheral ones so that the signal levels of the phase-difference pixels become uniform, or conversely by making the pupil region of the peripheral phase-difference pixels smaller than that of the central ones so that the vignetting of the phase-difference pixels becomes uniform (Patent Document 3).

JP 2012-003009 A (Patent Document 1), JP 2004-056407 A (Patent Document 2), JP 2010-039106 A (Patent Document 3)

Because a phase-difference pixel has, for example, a horizontally asymmetric structure, it is more sensitive to the incident angle of light than a normal pixel, and a phase-difference pixel at the periphery suffers a larger drop in sensitivity than a normal pixel does. The sensitivity of the normal pixels is equalized by, for example, microlens scaling, but since the peripheral sensitivity loss of the phase-difference pixels exceeds that of the normal pixels as noted above, microlens scaling designed for normal pixels cannot sufficiently compensate the sensitivity of the phase-difference pixels, and phase-difference AF using peripheral phase-difference pixels can therefore be inaccurate.

Therefore, to perform accurate phase-difference AF even when peripheral phase-difference pixels are used, it is desirable to adjust the size of the openings or pupil regions of the peripheral phase-difference pixels, following Patent Documents 2 and 3 for example, so as to raise their sensitivity.

However, because the phase-difference pixels have, for example, a left-right asymmetric structure, uniformly enlarging the openings of the peripheral phase-difference pixels as in Patent Documents 2 and 3 does not necessarily improve the accuracy of phase-difference AF. Specifically, a phase-difference pixel that selectively receives the portion of the incident light arriving on the right side (hereinafter, a right phase-difference pixel) has low sensitivity in the peripheral portion to the left of the center of the light-receiving region, but rather high sensitivity in the peripheral portion on the right. Conversely, a phase-difference pixel that selectively receives the portion arriving on the left side (hereinafter, a left phase-difference pixel) has low sensitivity in the right peripheral portion but high sensitivity in the left peripheral portion. Accordingly, even if the opening or pupil size is adjusted in the peripheral portions as in Patent Documents 2 and 3, adjusting it uniformly without distinguishing right from left phase-difference pixels degrades the S/N of the pixels that were originally on the high-sensitivity side, and thus does not necessarily improve the accuracy of phase-difference AF.

It is also possible to equalize the sensitivity of the phase-difference pixels by reading out their pixel values with a higher gain than is used for the normal pixels, thereby boosting the pixel values of the low-sensitivity peripheral phase-difference pixels. Raising the readout gain, however, also amplifies the noise component contained in the pixel value, so the S/N does not improve. In particular, because of the large incident angle, light intended for one peripheral pixel may enter an adjacent pixel and cause color mixing; raising the readout gain enlarges the signal component due to this color mixing as well. Simply raising the gain when reading out the low-sensitivity peripheral phase-difference pixels therefore cannot sufficiently improve the accuracy of phase-difference AF.
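The point that raising the readout gain leaves the S/N unchanged can be checked with a toy calculation (illustrative numbers only; the dB model and values are not from the patent):

```python
import math

def snr_db(signal: float, noise: float) -> float:
    """Signal-to-noise ratio in decibels (toy model, not from the patent)."""
    return 20.0 * math.log10(signal / noise)

# A readout gain g multiplies the signal and the noise already present
# in the pixel value by the same factor, so the ratio, and hence the
# S/N, is unchanged.
g = 4.0
snr_before = snr_db(100.0, 10.0)         # gain 1
snr_after = snr_db(100.0 * g, 10.0 * g)  # gain 4: identical ratio
```

The gain cancels in the ratio, which is why the document argues that gain alone cannot substitute for raising the optical sensitivity of the pixel.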

Alternatively, the sensitivity of the peripheral phase-difference pixels could be raised by using a different microlens scaling amount for the phase-difference pixels than for the normal pixels. However, when the scaling amounts differ between normal and phase-difference pixels, gaps form between the microlenses of the normal pixels and those of the phase-difference pixels, which can increase color mixing.

An object of the present invention is to improve, without degrading the image quality of the captured image, the sensitivity of those phase-difference pixels in the peripheral portion of the light-receiving region whose sensitivity is low, so that accurate phase-difference AF can be performed even when peripheral phase-difference pixels are used.

The solid-state imaging device of the present invention includes, in a light-receiving region on which a subject image is formed, a plurality each of normal pixels, first phase-difference pixels, and second phase-difference pixels. The normal pixels receive incident light isotropically. A first phase-difference pixel has an opening offset in a first direction (for example, to the right) and selectively receives the portion of the incident light that arrives on the first-direction side. A second phase-difference pixel has an opening offset in a second direction opposite to the first (for example, to the left) and selectively receives the portion of the incident light that arrives on the second-direction side. A first phase-difference pixel in the peripheral portion on the second-direction side has a larger first-opening area than the first phase-difference pixels in the central portion of the light-receiving region and in the peripheral portion on the first-direction side. Conversely, a second phase-difference pixel in the peripheral portion on the first-direction side has a larger second-opening area than the second phase-difference pixels in the central portion and in the peripheral portion on the second-direction side.
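The opening-area rule just stated can be summarized in a small sketch. The function name, region labels, and the 1.3 enlargement factor are illustrative placeholders: the patent specifies which openings are enlarged relative to which, not by how much.

```python
def aperture_scale(pixel: str, region: str) -> float:
    """Relative opening area of a phase-difference pixel by position.

    pixel:  'first'  (opening offset toward the first direction, e.g. right)
            'second' (opening offset toward the second direction, e.g. left)
    region: 'first_periphery', 'center', or 'second_periphery'

    Only pixels in the peripheral portion opposite their offset direction
    get an enlarged opening; 1.3 is a placeholder factor, since the patent
    states only the relative ordering of the areas.
    """
    enlarged = {("first", "second_periphery"), ("second", "first_periphery")}
    return 1.3 if (pixel, region) in enlarged else 1.0
```

Note the asymmetry: a first (right) phase-difference pixel is enlarged only on the second-direction (left) periphery, where its sensitivity is low, and left unchanged where its sensitivity is already high, which is precisely the distinction a uniform enlargement fails to make.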

In a first phase-difference pixel in the peripheral portion on the second-direction side, the first opening is preferably extended toward the second direction relative to that of the central portion in order to enlarge its area. Likewise, in a second phase-difference pixel in the peripheral portion on the first-direction side, the second opening is preferably extended toward the first direction relative to that of the central portion.

In a first phase-difference pixel in the peripheral portion on the second-direction side, the center position of the first opening is preferably farther toward the second direction than in the central portion. In a second phase-difference pixel in the peripheral portion on the first-direction side, the center position of the second opening is preferably farther toward the first direction than in the central portion.

In the first phase-difference pixels in the peripheral portion on the second-direction side and the second phase-difference pixels in the peripheral portion on the first-direction side, the photodiode used for photoelectric conversion is preferably smaller than in the central portion.

The area of the first opening may be enlarged stepwise from the central portion toward the peripheral portion on the second-direction side, and the area of the second opening may be enlarged stepwise from the central portion toward the peripheral portion on the first-direction side.
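A stepwise enlargement of this kind might be sketched as follows; the step count and the maximum scale (1.3) are illustrative placeholders, not values from the patent.

```python
import math

def stepped_area(x: float, steps: int = 4, max_scale: float = 1.3) -> float:
    """Stepwise opening-area scale for a first phase-difference pixel.

    x is the normalized position from the center (0) toward the
    second-direction edge (1); the area grows in `steps` discrete jumps
    rather than continuously. Step count and maximum scale are
    illustrative assumptions, not values from the patent.
    """
    level = min(steps, math.floor(x * steps))
    return 1.0 + (max_scale - 1.0) * level / steps
```

Quantizing the area into a few discrete sizes keeps the number of distinct opening-mask geometries small, while still approximating a continuous enlargement toward the edge.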

When the first and second phase-difference pixels are provided for a plurality of colors, the enlargement ratio of the first opening of the first phase-difference pixels in the peripheral portion on the second-direction side, and of the second opening of the second phase-difference pixels in the peripheral portion on the first-direction side, is preferably determined for each color.

When a zoom lens forms the subject image on the light-receiving region, the areas of the first opening of the first phase-difference pixels in the peripheral portion on the second-direction side and of the second opening of the second phase-difference pixels in the peripheral portion on the first-direction side are preferably determined according to the incident angle of the incident light at a focal length midway between the longest and shortest focal lengths of the zoom lens.

Furthermore, in the first phase-difference pixels in the peripheral portion on the first-direction side, the first opening may be made smaller than in the first phase-difference pixels in the central portion, and in the second phase-difference pixels in the peripheral portion on the second-direction side, the second opening may be made smaller than in the second phase-difference pixels in the central portion.

When each pixel is provided with a microlens that condenses the incident light, the microlenses in the peripheral portion are preferably decentered toward the central portion according to the incident angle of the light striking each pixel.

Another solid-state imaging device of the present invention has no normal pixels; every pixel is either a first phase-difference pixel or a second phase-difference pixel as described above.

According to the present invention, the opening of a right phase-difference pixel is enlarged in the left peripheral portion and kept the same as in the central portion (or reduced) in the right peripheral portion, while conversely the opening of a left phase-difference pixel is enlarged in the right peripheral portion and kept the same (or reduced) in the left peripheral portion. The sensitivity of the left and right phase-difference pixels is thus compensated appropriately, improving the accuracy of phase-difference AF. Moreover, because the opening size of the phase-difference pixels is adjusted without relying on microlens scaling or the like, the image quality of the captured image is not degraded.

FIG. 1 is an explanatory diagram showing the solid-state imaging device.
FIG. 2 is an explanatory diagram showing the color filter array.
FIG. 3 is a sectional view showing the structures of a normal pixel and a phase-difference pixel.
FIG. 4 is an explanatory diagram showing the incident angles of light at the central and peripheral portions of the light-receiving region.
FIG. 5 is a sectional view showing the light incident on a normal pixel according to its position.
FIG. 6 is a sectional view showing the light incident on a right phase-difference pixel when the opening area does not depend on position.
FIG. 7 is a graph showing the relative sensitivity of a right phase-difference pixel versus position when the opening area does not depend on position.
FIG. 8 is a sectional view showing the light incident on a right phase-difference pixel when the opening is enlarged in the left peripheral portion.
FIG. 9 is a graph showing the opening area of a right phase-difference pixel versus position when the opening is enlarged in the left peripheral portion.
FIG. 10 is a graph showing the relative sensitivity of a right phase-difference pixel versus position.
FIG. 11 is a graph for the case where the relative sensitivity of the right phase-difference pixels in the left peripheral portion is matched to that of the central portion.
FIG. 12 is a graph showing the opening area versus position when the opening area of the right phase-difference pixels is additionally reduced in the right peripheral portion.
FIG. 13 is a graph for the case where the relative sensitivity is made constant between the central and peripheral portions.
FIG. 14 is a graph showing the opening area of a left phase-difference pixel versus position.
FIG. 15 is a graph showing the sensitivity ratio of a left phase-difference pixel versus position.
FIG. 16 is an explanatory diagram showing a method of enlarging the opening area of a phase-difference pixel.
FIG. 17 is an explanatory diagram showing another method of enlarging the opening area of a phase-difference pixel.
FIG. 18 is an explanatory diagram for the case where the PD of a phase-difference pixel is made smaller in the peripheral portion.
FIG. 19 is an explanatory diagram showing an example of subdividing the light-receiving region.
FIG. 20 is a graph showing the change in opening area versus position when the opening area of a phase-difference pixel is enlarged stepwise.
FIG. 21 is an explanatory diagram showing phase-difference pixels other than G pixels.
FIG. 22 is an explanatory diagram showing a solid-state imaging device with a honeycomb arrangement.
FIG. 23 is an explanatory diagram showing another honeycomb-arrangement solid-state imaging device with a different color arrangement.
FIG. 24 is an explanatory diagram showing a solid-state imaging device whose color filters are in a Bayer arrangement.
FIG. 25 is an explanatory diagram showing a solid-state imaging device in which all pixels are phase-difference pixels.

As shown in FIG. 1, the solid-state imaging device 10 is a CMOS image sensor and includes a light-receiving region 11, a vertical scanning circuit 13, a horizontal scanning circuit 14, an output circuit 16, a control circuit 17, and the like.

The light-receiving region 11 is a region in which a plurality of pixels 21 are arranged in a square grid along the horizontal (X) and vertical (Y) directions. A subject image is formed on the light-receiving region 11 by an imaging lens (see FIG. 3), and each pixel 21 accumulates, through photoelectric conversion, a charge corresponding to the amount of incident light.

Each pixel 21 includes a photodiode (hereinafter, PD) 22, a reset transistor (hereinafter, Tr) 23, an amplifier transistor (hereinafter, Ta) 24, and a selection transistor (hereinafter, Ts) 25. Tr 23, Ta 24, and Ts 25 are, for example, n-type MOS transistors. Each pixel 21 is also provided with a color filter 41 (see FIG. 2) of one of the colors red (R), green (G), or blue (B), and photoelectrically converts light of the color of its corresponding color filter 41.

Each pixel 21 arranged in the light-receiving region 11 is one of two types: a normal pixel or a phase-difference pixel. Normal pixels and phase-difference pixels differ in their three-dimensional structure (specifically, in the size and position of the opening over the PD 22) but have the same electrical configuration. For this reason, FIG. 1 shows all pixels in the light-receiving region 11 as pixels 21, without distinguishing normal pixels from phase-difference pixels.

A normal pixel is a pixel 21 formed symmetrically in the horizontal and vertical directions, and imaging signals based on the charges accumulated in the normal pixels are used to form a captured image. Most of the pixels 21 are normal pixels.

A phase difference pixel is formed in a shape that is asymmetric in the horizontal direction so as to selectively receive light incident from either the left or the right. A phase difference pixel that selectively receives light incident from the right (hereinafter, phase difference A pixel) and a phase difference pixel that selectively receives light incident from the left (hereinafter, phase difference B pixel) form one pair, and a plurality of such pairs of phase difference pixels are provided evenly throughout the light receiving region 11. That is, phase difference pixels are provided both in the central portion and in the peripheral portions of the light receiving region 11. Imaging signals based on the charges accumulated in the phase difference pixels are used for phase difference AF.

The PD 22 is an element that generates a charge corresponding to the amount of incident light by photoelectric conversion; its anode is connected to ground, and its cathode is connected to the gate electrode of the Ta 24. The connection between the cathode of the PD 22 and the gate electrode of the Ta 24 is a floating diffusion (not shown; hereinafter, FD), in which the charge generated by the PD 22 is accumulated.

In the Tr 23, the source electrode is connected to the FD, and a power supply voltage VDD (not shown) is applied to the drain electrode. The gate electrode of the Tr 23 is connected to a reset line 26. The Tr 23 is turned on when a reset pulse is applied to its gate electrode from the vertical scanning circuit 13 via the reset line 26. When the Tr 23 turns on, the power supply voltage VDD is applied to the FD, and the charge accumulated in the FD is discarded.

In the Ta 24, the gate electrode is connected to the FD, and the power supply voltage VDD is applied to the drain electrode. The source electrode, from which a signal voltage is output, is connected to the drain electrode of the Ts 25. The Ta 24 outputs to its source electrode a signal voltage corresponding to the amount of charge accumulated in the FD.

In the Ts 25, the drain electrode is connected to the source electrode of the Ta 24, and the source electrode is connected to a signal line 27. The gate electrode of the Ts 25 is connected to a row selection line 28. The Ts 25 is turned on when a selection pulse is input from the vertical scanning circuit 13 via the row selection line 28, and outputs the signal voltage output from the Ta 24 to the signal line 27.

The vertical scanning circuit 13 is a drive circuit for the pixels 21, and the reset line 26 and the row selection line 28 of each row are connected to it. The vertical scanning circuit 13 inputs a reset pulse to the reset line 26 of a selected row, and inputs a selection pulse to the row selection line 28 to control the operation of the Tr 23 and Ts 25. The pixels 21 are thereby operated as described above.

The horizontal scanning circuit 14 selects the column from which the signal voltage is read by turning on one of the column selection transistors (Tc) 32 provided on the signal lines 27.

The signal line 27 is wiring for reading the signal voltage from each pixel 21 and is provided for each column of pixels 21. A correlated double sampling (CDS) circuit 31 and a column selection transistor (Tc) 32 are provided at the end of each signal line 27. The CDS circuit 31 operates based on a clock signal input from the control circuit 17, and samples and holds the voltage signal from the pixels 21 on the row selection line 28 selected by the vertical scanning circuit 13 such that noise associated with readout is removed. The voltage signal held by the CDS circuit 31 is output to the output circuit 16 via an output bus line 33 when the corresponding column selection transistor 32 is turned on by the horizontal scanning circuit 14.
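The readout order described above can be sketched as a toy model. This is an illustration only, not the patent's circuitry: the per-column reset offsets and the reduction of CDS to a simple subtraction of a reset sample from a signal sample are assumptions made for the sketch.

```python
# Toy model of the readout sequence: the vertical scanning circuit selects
# one row at a time, the CDS circuit of every column holds the difference
# between the signal sample and the reset sample (cancelling readout
# offsets), and the horizontal scanning circuit serializes the held values.
def read_frame(charges, reset_offsets):
    """charges: [row][col] accumulated charges; reset_offsets: per-column
    offsets that would contaminate a raw read (illustrative)."""
    frame = []
    for row in charges:                                    # vertical scan
        signal = [c + o for c, o in zip(row, reset_offsets)]  # signal sample
        reset = list(reset_offsets)                           # reset sample
        held = [s - r for s, r in zip(signal, reset)]         # CDS difference
        frame.append(held)                                 # horizontal scan
    return frame

print(read_frame([[10, 20], [30, 40]], [1, 2]))  # prints [[10, 20], [30, 40]]
```

Because the same offset appears in both samples, it cancels in the difference, which is the point of correlated double sampling.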

The output circuit 16 includes an amplifier 34 that amplifies the voltage signal input from the CDS circuit 31, and an A/D conversion circuit 35 that converts the voltage signal amplified by the amplifier 34 into digital data. The gain of the amplifier 34 is variable and is adjusted as appropriate according to settings such as the shooting mode.

The control circuit 17 is connected to each part of the solid-state imaging device 10, such as the vertical scanning circuit 13 and the horizontal scanning circuit 14, and controls each part of the solid-state imaging device 10 in an integrated manner. For example, the vertical scanning circuit 13 and the horizontal scanning circuit 14 operate based on control signals, such as clock signals, input from the control circuit 17. The operation of the CDS circuit 31, the gain of the amplifier 34, and the like are also controlled by the control circuit 17.

As shown in FIG. 2, the color filter 41 of the solid-state imaging device 10 has a long periodicity, with a 6 × 6 pixel color array as one unit. More specifically, the color filter 41 is an array in which two types of subunits 41a and 41b, each consisting of a 3 × 3 pixel color array, are arranged at diagonal positions. In the first subunit 41a, green (G) filters are arranged at the center and the diagonal positions of the 3 × 3 pixels, blue (B) filters at the left and right centers, and red (R) filters at the top and bottom centers. The second subunit 41b is the same as the first subunit 41a in that G filters are arranged at the center and the diagonal positions of the 3 × 3 pixels, but the B and R filters are reversed relative to the first subunit 41a: R filters are arranged at the left and right centers, and B filters at the top and bottom centers.

Compared with, for example, a Bayer arrangement in which a 2 × 2 pixel color array is one unit, the color filter 41 has a long periodicity, so the solid-state imaging device 10 can suppress the occurrence of moiré without using an optical low-pass filter. Furthermore, in the solid-state imaging device 10 employing the color filter 41, R, G, and B pixels are always present in each vertical and horizontal line, so false colors are suppressed and accurate color reproduction is possible.
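The 6 × 6 unit and the every-row/every-column property can be sketched as follows. The diagonal placement (41b at the upper left and lower right, 41a at the upper right and lower left) is assumed from the description of FIG. 2 above; treat it as illustrative rather than a definitive reading of the figure.

```python
# Subunit 41a: G at center and corners, B at left/right centers,
# R at top/bottom centers. Subunit 41b swaps R and B.
SUB_A = [list("GRG"), list("BGB"), list("GRG")]
SUB_B = [list("GBG"), list("RGR"), list("GBG")]

def build_unit():
    """Return the 6x6 color array with the two subunits at diagonal
    positions (assumed: 41b upper left, 41a upper right)."""
    layout = [[SUB_B, SUB_A], [SUB_A, SUB_B]]
    unit = []
    for block_row in layout:
        for r in range(3):
            unit.append([sub[r][c] for sub in block_row for c in range(3)])
    return unit

unit = build_unit()
for row in unit:
    print("".join(row))
```

Printing the rows shows that every horizontal and every vertical line of the 6 × 6 unit contains all of R, G, and B, which is the property credited above with suppressing false colors.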

The pair of left and right phase difference pixels 42a and 42b are pixels for obtaining signals containing parallax information for phase difference AF; for example, they are formed at the lower-left G pixel of each of the first subunit 41a at the upper right and the second subunit 41b at the upper left. The pixels other than the phase difference pixels 42a and 42b are normal pixels 43 for obtaining a captured image of the subject.

In the normal pixel 43, the opening 55 is formed symmetrically in the vertical and horizontal directions, and light incident from the top, bottom, left, and right is received equally (isotropically). In contrast, the phase difference pixel 42a is a right phase difference pixel that selectively receives, of the incident light, the light that reaches the right side of the PD 22; its opening 56R is smaller than the opening 55 of the normal pixel 43 and is arranged eccentrically to the right. The phase difference pixel 42b is a left phase difference pixel that selectively receives, of the incident light, the light that reaches the left side of the PD 22; its opening 56L is smaller than the opening 55 of the normal pixel 43 and is arranged eccentrically to the left.

The voltage signals output from the phase difference pixels 42a and 42b carry left and right parallax, respectively, and these parallax-carrying voltage signals (or pixel values based on them) are used for phase difference AF. When a captured image is generated from the voltage signals of the pixels output by the solid-state imaging device 10, the voltage signals of the phase difference pixels 42a and 42b are, for example, multiplied by a predetermined factor so that their sensitivity matches that of the normal pixels 43. In some cases, the pixel values at the positions of the phase difference pixels 42a and 42b are instead generated by interpolation from the voltage signals of the surrounding G pixels (normal pixels 43).

The phase difference pixels 42a and 42b are provided over the entire light receiving region 11, but not every 6 × 6 pixel unit of the color filter 41 contains them. When the pixels of the light receiving region 11 are viewed in 6 × 6 pixel units of the color filter 41, some units contain no phase difference pixels 42a and 42b; when a unit does contain them, they are provided at the positions described above within the 6 × 6 pixel unit. In a unit without phase difference pixels 42a and 42b, all pixels are normal pixels 43.

As shown in FIG. 3, the solid-state imaging device 10 is a back-side illuminated (BSI) CMOS image sensor and is composed of a support substrate 51, a wiring layer 52, a p-type semiconductor substrate 53, a light shielding layer 54, the color filter 41, microlenses 57, and the like. The side of the p-type semiconductor substrate 53 on which the wiring layer 52 is provided is the "front surface" of the solid-state imaging device 10, and the side on which the color filter 41, the light shielding layer 54, and the microlenses 57 are provided is the "back surface". Light enters the PD 22 of the solid-state imaging device 10 from the back side through the microlens 57, the color filter 41, and the light shielding layer 54. FIG. 3 schematically shows a cross section of the solid-state imaging device 10 at the central portion 11a (see FIG. 4) of the light receiving region 11.

The support substrate 51 is, for example, a silicon substrate; in the process of manufacturing the back-side illuminated solid-state imaging device 10, it is bonded to the surface of the wiring layer 52 in order to expose the back surface of the p-type semiconductor substrate 53 and thin the p-type semiconductor substrate 53.

The wiring layer 52 is formed on the front surface of the p-type semiconductor substrate 53. The wiring 52a provided in the wiring layer 52 forms the transistors (Tr 23, Ta 24, Ts 25) of each pixel 21 (the normal pixels 43 and the phase difference pixels 42a and 42b); the various circuits for driving the pixels 21, such as the vertical scanning circuit 13, the horizontal scanning circuit 14, the control circuit 17, the output circuit 16, and the CDS circuit 31; and the various wirings such as the reset lines 26, the signal lines 27, the row selection lines 28, and the output bus line 33.

The PD 22 is formed by the PN junction between the p-type semiconductor substrate 53 and an n-type semiconductor region formed in the p-type semiconductor substrate 53. The p-type semiconductor substrate 53 is thinned, and the photoelectric conversion region of the PD 22, indicated by the broken line, extends to near the back surface of the p-type semiconductor substrate 53. The charge generated by the photoelectric conversion of the PD 22 is accumulated near the interface with the wiring layer 52. The PDs 22 are separated from one another by a separation layer (for example, a p++ layer), not shown.

The light shielding layer 54 is provided on the back surface side of the p-type semiconductor substrate 53 and suppresses color mixing by blocking light incident between the PDs 22. The light shielding layer 54 is formed of, for example, a metal thin film and has openings 55 and 56 corresponding to the PDs 22. The openings 55 and 56 are filled with, for example, a transparent material, and the back surface of the light shielding layer 54 (the interface with the color filter 41) is flat.

The opening 55 is provided for the normal pixel 43 and is sized to expose almost the entire surface of the PD 22 of the normal pixel 43. For this reason, as indicated by the one-dot chain line, in the normal pixel 43 almost the entire luminous flux incident through the corresponding microlens 57 and color filter 41 enters the PD 22 through the opening 55. The opening 55 has the same size for all normal pixels 43, regardless of the position of the normal pixel 43; therefore, all normal pixels 43 have the same light receiving area regardless of their position in the light receiving region 11 (see FIG. 4).

On the other hand, the opening 56L is provided for the left phase difference pixel 42b; it is narrower than the opening 55 for the normal pixel 43 and has a smaller opening area. Furthermore, the center of the opening 56L is arranged eccentrically to the left (negative X direction). For this reason, the phase difference pixel 42b receives, of the light incident through the corresponding microlens 57 and color filter 41, only the portion of the luminous flux that passes through the opening 56L and reaches the left side of the PD 22.

Although FIG. 3 shows only the left phase difference pixel 42b of the phase difference pixels 42a and 42b, the configuration of the right phase difference pixel 42a is similar to that of the left phase difference pixel 42b, so its illustration is omitted. In the right phase difference pixel 42a, however, conversely to the left phase difference pixel 42b, the opening 56R is arranged eccentrically to the right (positive X direction), and of the light incident through the corresponding microlens 57 and color filter 41, only the portion of the luminous flux that passes through the opening 56R and reaches the right side of the PD 22 is received.

Each color segment of the color filter 41 is provided so as to correspond to a PD 22, and the center of each color segment is at a position corresponding to the center of the PD 22. The arrangement of the color filter 41 is as described above; the luminous flux condensed by the microlens 57 passes through the color filter 41, becomes light of one of the colors R, G, and B, and enters the PD 22.

The microlenses 57 are provided on the color filter 41 so as to correspond to the PDs 22 and are roughly hemispherical in shape. The microlens 57 condenses incident light onto the corresponding PD 22. At the center of the light receiving region 11 (the center of the central portion 11a), the center of the microlens 57 substantially coincides with the center of the PD 22. Even within the central portion 11a, however, the microlenses 57 are arranged (scaled) with an offset toward the center of the light receiving region 11 according to the principal ray angle 45.

As shown in FIG. 4, the solid-state imaging device 10 is used together with an imaging lens 44, and an image of a subject 59 is formed on the light receiving region 11 of the solid-state imaging device 10 by the imaging lens 44. Although FIG. 4 schematically shows the imaging lens 44 as a single lens, the imaging lens 44 may be composed of one lens or of a plurality of lenses. That is, the specific configuration of the imaging lens 44 is arbitrary as long as it can form an image of the subject 59 on the light receiving region 11. The imaging lens 44 may therefore include various optical filters, such as an infrared filter, and optical elements that have substantially no lens action.

When an image of the subject 59 is formed on the light receiving region 11 by the imaging lens 44, the angle of the principal ray (hereinafter, principal ray angle) 45, indicated by the one-dot chain line, gradually increases from the central portion 11a of the light receiving region 11 toward the left and right peripheral portions 11bR and 11bL. For this reason, the solid-state imaging device 10 has a structure in which each pixel is matched to the principal ray angle 45 at its location.

First, as shown in FIG. 5, in the solid-state imaging device 10 the microlenses 57 are scaled according to the principal ray angle 45. Specifically, the structure below the color filter 41 is the same in the central portion 11a and in the peripheral portions 11bR and 11bL, but in the peripheral portions 11bR and 11bL the center position of each microlens 57 is arranged eccentrically toward the central portion 11a. For example, if the luminous flux incident on each normal pixel 43 is divided into the right half (broken arrows; hereinafter, right luminous flux) and the left half (solid arrows; hereinafter, left luminous flux), then in a normal pixel 43 in the central portion 11a the microlens 57 condenses the left luminous flux onto the right side of the PD 22 and the right luminous flux onto the left side of the PD 22. That is, in a normal pixel 43 in the central portion 11a, almost the entire incident luminous flux is condensed onto the PD 22.

In a normal pixel 43 in the right peripheral portion 11bR, the principal ray is inclined as indicated by the one-dot chain line, but since the microlens 57 is scaled according to the angle of this principal ray, both the left and right luminous fluxes are entirely condensed onto the PD 22. The same applies to the normal pixels 43 in the left peripheral portion 11bL: since the microlens 57 is scaled according to the angle of the principal ray, both the left and right luminous fluxes are condensed onto the PD 22. Therefore, a normal pixel 43 can receive almost the entire incident luminous flux at the PD 22 in the central portion 11a, the right peripheral portion 11bR, and the left peripheral portion 11bL alike.
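The microlens "scaling" described above can be illustrated with a first-order geometric sketch. The text only states that the offset follows the principal ray angle 45; the linear growth of that angle toward the edges and the tan() relation between angle and offset are modelling assumptions for illustration, as is the stack height h.

```python
import math

# First-order sketch: each microlens is shifted toward the center of the
# light receiving region by an amount that grows with the principal ray
# angle at its position (assumed model, not taken from the text).
def microlens_offset(x, x_max, cra_max_deg, h):
    """Offset for a pixel at signed position x, assuming the principal ray
    angle grows linearly from 0 at the center to cra_max_deg at |x| = x_max,
    and h is the assumed distance from the microlens to the opening."""
    cra = math.radians(cra_max_deg) * (x / x_max)
    return h * math.tan(cra)

# Zero offset at the center, symmetric offsets toward the two edges.
print(microlens_offset(0.0, 1.0, 30.0, 2.0))
print(microlens_offset(1.0, 1.0, 30.0, 2.0))
```

Under this model the offset is zero at the center of the light receiving region and antisymmetric about it, matching the description that the peripheral microlenses are shifted toward the central portion 11a.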

Furthermore, in the solid-state imaging device 10, the areas (opening areas) of the openings 56R and 56L of the phase difference pixels 42a and 42b are adjusted according to their positions in the light receiving region 11.

First, for comparison, consider the case in which the openings 56R and 56L of the phase difference pixels 42a and 42b have the same size (half the area of the opening 55 of the normal pixel 43) in the central portion 11a, the right peripheral portion 11bR, and the left peripheral portion 11bL alike. In this case, as shown in FIG. 6, the right phase difference pixel 42a in the central portion 11a has its left half shielded from light, and therefore selectively receives only the light that passes through the opening 56R and reaches the right side of the PD 22. For example, if about 1/5 of the left luminous flux and about 4/5 of the right luminous flux reach the right side of the PD 22, the amount of light received by the right phase difference pixel 42a in the central portion 11a is about 1/2 of the incident light amount.

On the other hand, in the right phase difference pixel 42a in the right peripheral portion 11bR, of the incident luminous flux, the right luminous flux (broken arrows), which has a small incident angle, is mostly shielded, while the left luminous flux (solid arrows), which has a large incident angle, almost entirely reaches the PD 22. For example, if about 1/5 of the right luminous flux and almost all of the left luminous flux reach the right side of the PD 22, the amount of light received by the right phase difference pixel 42a in the right peripheral portion 11bR is about 3/5 of the incident light amount.

In the right phase difference pixel 42a in the left peripheral portion 11bL, conversely to that in the right peripheral portion 11bR, the incident angle of the right luminous flux (broken arrows) is large and that of the left luminous flux (solid arrows) is small. For this reason, in the right phase difference pixel 42a in the left peripheral portion 11bL, the right luminous flux is almost entirely shielded, and only part of the left luminous flux reaches the PD 22. For example, if about 4/5 of the left luminous flux reaches the PD 22, the amount of light received by the right phase difference pixel 42a in the left peripheral portion 11bL is about 2/5 of the incident light amount.
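The received-light fractions quoted in the three cases above follow from simple arithmetic if, as assumed here, the incident flux splits evenly into a left half and a right half; the per-half transmission fractions (1/5, 4/5, and so on) are the example values given in the text.

```python
# Received fraction of the total incident light for a right phase
# difference pixel 42a, given the fraction of each half luminous flux
# that passes the opening 56R and reaches the PD 22.
def received_fraction(left_frac, right_frac):
    return 0.5 * left_frac + 0.5 * right_frac

center = received_fraction(1/5, 4/5)   # central portion 11a
right = received_fraction(1.0, 1/5)    # right peripheral portion 11bR
left = received_fraction(4/5, 0.0)     # left peripheral portion 11bL
print(center, right, left)             # prints 0.5 0.6 0.4
```

The three results reproduce the approximate 1/2, 3/5, and 2/5 of the text, and make explicit why the left peripheral portion is the worst case before any aperture adjustment.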

Accordingly, as shown in FIG. 7, the specific sensitivity of the right phase difference pixel 42a (its sensitivity relative to that of the normal pixel 43) is roughly proportional to its position in the X direction: it is larger in the right peripheral portion 11bR than in the central portion 11a, but smaller in the left peripheral portion 11bL. For this reason, when the right phase difference pixels 42a of the right peripheral portion 11bR, which have a high specific sensitivity, are used, phase difference AF can be performed more accurately than with those of the central portion 11a; conversely, when the right phase difference pixels 42a of the left peripheral portion 11bL, whose specific sensitivity is lower than that of the central portion 11a, are used, the accuracy of phase difference AF is poor.

For this reason, as shown in FIGS. 8 and 9, in the solid-state imaging device 10 the openings 56R of the right phase difference pixels 42a in the peripheral portion 11bL to the left of the center of the light receiving region 11 (the center of the central portion 11a) are made larger in area than the openings 56R in the central portion 11a and the right peripheral portion 11bR. By making the opening area larger than in the central portion 11a and the right peripheral portion 11bR in this way, the sensitivity of the right phase difference pixels 42a in the left peripheral portion 11bL can be raised to at least the sensitivity of the right phase difference pixels 42a in the central portion 11a.

For example, the left luminous flux (solid arrows), which was partly shielded when the opening area was the same as in the central portion 11a (see FIG. 6), almost entirely reaches the PD 22 when the opening area is enlarged (see FIG. 8). Also, the right luminous flux (broken arrows), which was entirely shielded when the opening area was the same as in the central portion 11a, passes through the enlarged opening 56R and reaches the PD 22. Therefore, if about 1/5 of the right luminous flux reaches the PD 22, the amount of light received by the phase difference pixel 42a in the left peripheral portion 11bL with the enlarged opening 56R improves to about 3/5 of the incident light amount, which is almost the same as the amount of light received by the right phase difference pixel 42a in the right peripheral portion 11bR.

The enlargement ratio of the opening 56R in the left peripheral portion 11bL may be adjusted according to the position of each right phase difference pixel 42a so that, for example, as shown in FIG. 10, the specific sensitivity is symmetric between the peripheral portion 11bL to the left of the center of the light receiving region 11 and the peripheral portion 11bR to the right of it. It suffices for the enlargement ratio of the opening 56R in the left peripheral portion 11bL to be adjusted so that the specific sensitivity of the right phase difference pixels 42a there is at least equal to that of the right phase difference pixels 42a in the central portion 11a; in particular, as shown in FIG. 11, it is preferable for the enlargement ratio of the opening 56R to be adjusted so that the specific sensitivity is constant from the center of the light receiving region 11 to the left peripheral portion 11bL. When the enlargement ratio of the opening 56R is adjusted so that the specific sensitivity at the center and that in the left peripheral portion 11bL are equal in this way, the right phase difference pixel 42a in the left peripheral portion 11bL receives much of the left luminous flux (the signal) it should originally receive and little of the right luminous flux, which becomes noise; therefore, the S/N is particularly good even though the specific sensitivity is high.

As shown in FIG. 12, the opening 56R of the right phase difference pixel 42a in the right peripheral portion 11bR may be made smaller than that in the central portion 11a. The reduction ratio of the opening 56R in the right peripheral portion 11bR is preferably adjusted so that the specific sensitivity is constant from the center of the light receiving region 11 to the right peripheral portion 11bR, as shown in FIG. 13. When the area of the opening 56R is adjusted in this way so that the specific sensitivity in the right peripheral portion 11bR equals that at the center of the light receiving region 11, the specific sensitivity is lower than when the aperture area is kept the same as at the center, but the S/N can be improved. As is clear from the above description, it goes without saying that it is particularly preferable for the area of the opening 56R to be adjusted so that the specific sensitivity is substantially equal in the central portion 11a, the left peripheral portion 11bL, and the right peripheral portion 11bR, as in FIG. 13.

The left phase difference pixel 42b is a left-right mirror image of the right phase difference pixel 42a. Accordingly, as shown in FIG. 14, the aperture area of the left phase difference pixel 42b is adjusted in the opposite way to that of the right phase difference pixel 42a described above: the area of the opening 56L increases at least from the center of the light receiving region 11 toward the right peripheral portion 11bR.

As shown in FIG. 15, when the aperture area of the left phase difference pixel 42b is constant, the specific sensitivity of the left phase difference pixel 42b in the right peripheral portion 11bR is lower than in the central portion 11a (dash-dot line). However, by enlarging the aperture area of the left phase difference pixel 42b in the right peripheral portion 11bR as described above, its specific sensitivity is raised to roughly the level of the left phase difference pixel 42b in the left peripheral portion 11bL (solid line), or to roughly the level at the center of the central portion 11a (broken line).

As described above, in the solid-state imaging device 10, the opening 56R of the right phase difference pixel 42a is enlarged in the left peripheral portion 11bL and is the same as in the central portion 11a, or reduced, in the right peripheral portion 11bR. Likewise, the opening 56L of the left phase difference pixel 42b is enlarged in the right peripheral portion 11bR and is the same as in the central portion 11a, or reduced, in the left peripheral portion 11bL. Because the solid-state imaging device 10 adjusts the aperture areas of the left and right phase difference pixels 42a, 42b separately according to their positions, the sensitivities of the left and right phase difference pixels are compensated appropriately and the accuracy of phase difference AF can be improved.

For example, if the aperture area were enlarged uniformly according to the distance from the center of the light receiving region 11 without distinguishing between the left and right phase difference pixels 42a, 42b, then in the phase difference pixels that already have a high specific sensitivity (the right phase difference pixels 42a in the right peripheral portion 11bR and the left phase difference pixels 42b in the left peripheral portion 11bL), the amount of received light that becomes noise would increase, degrading the S/N; the accuracy of phase difference AF could then deteriorate despite the high specific sensitivity. Compared with this case, the solid-state imaging device 10 adjusts the sizes of the openings 56L, 56R separately for the left and right phase difference pixels 42a, 42b according to position, so the accuracy of phase difference AF is reliably improved without degrading the S/N.

Also, when the microlenses 75 of the phase difference pixels 42a, 42b are scaled differently from those of the normal pixels 43, color mixing or the like may occur and degrade the quality of the captured image. In the solid-state imaging device 10, however, the accuracy of phase difference AF is improved solely by adjusting the aperture areas of the phase difference pixels 42a, 42b, without scaling the microlenses 75, so the quality of the captured image is not impaired.

When the aperture area of the right phase difference pixel 42a is adjusted as described above, the area may be enlarged in the left peripheral portion 11bL by, for example, moving the left edge of the opening 56R to the left relative to that of the right phase difference pixel 42a in the central portion 11a, extending the opening sideways, as shown in FIG. 16. Enlarging the area by extending the opening sideways while keeping the position of its right edge aligned with that of the opening 56R in the central portion 11a makes it easy to improve the S/N and the specific sensitivity at the same time. In addition, in this case the base line length (the distance between the centers of the phase difference pixels 42a, 42b) is not easily shortened even when the aperture area is enlarged, so parallax information is also easy to obtain. FIG. 16 takes the right phase difference pixel 42a as an example, but the same applies to the left phase difference pixel 42b: in the right peripheral portion 11bR, the area may be enlarged by moving the right edge of the opening 56L to the right, extending the opening sideways, while keeping the position of its left edge aligned with that in the central portion 11a.

On the other hand, as shown in FIG. 17, in the right phase difference pixel 42a of the left peripheral portion 11bL, both the right and left edges of the opening 56R may be moved to the left relative to the opening of the right phase difference pixel 42a in the central portion 11a to enlarge the aperture area. In this case, the center of the opening 56R also moves to the left. Moving both the right and left edges of the opening 56R to the left in this way enlarges the aperture area so that nearly all of the left-side flux that becomes signal is received while the right-side flux that becomes noise is kept out, with the width of the opening 56R held as small as possible. Compared with the case where the right edge is not moved, the gain in specific sensitivity is smaller and the base line length tends to become shorter, but the S/N is particularly good. FIG. 17 takes the right phase difference pixel 42a as an example, but the same applies to the left phase difference pixel 42b.

After adjusting the aperture area of the right phase difference pixel 42a in the peripheral portion 11bL as described above, the PD 22 of that pixel may additionally be made smaller than the PD 22 in the central portion 11a, as shown in FIG. 18. Making the PD 22 of the right phase difference pixel 42a in the peripheral portion 11bL smaller increases its distance from the PDs 22 of the surrounding normal pixels 43. When the PD 22 of a normal pixel 43 and the PD 22 of the right phase difference pixel 42a are close together, light obliquely incident on the normal pixel 43 may leak into the PD 22 of the right phase difference pixel 42a and cause color mixing. If the PD 22 of the right phase difference pixel 42a is made small as described above, light that leaks toward the phase difference pixel 42a from a normal pixel 43 hardly reaches its PD 22, so color mixing is prevented more reliably and a signal with good S/N is easily obtained.

When the PD 22 of the right phase difference pixel 42a in the peripheral portion 11bL is made smaller in this way, the area of the opening 56R and the size and shape of the PD 22 may be determined appropriately so that the specific sensitivity and S/N are optimal. The size and shape of the PD 22 may also be a rectangle matched to the enlarged opening 56R of the right phase difference pixel 42a, for example a rectangle whose size and shape substantially coincide with the opening 56R. The same applies to the left phase difference pixel 42b.

In the above embodiment, the light receiving region is divided into the central portion 11a and the peripheral portions 11bL, 11bR, but the light receiving region 11 may be divided more finely. For example, as shown in FIG. 19, the light receiving region 11 may be divided into central portions 81aL, 81aR, intermediate portions 81bL, 81bR, and peripheral portions 81cL, 81cR. The methods used for various kinds of predetermined processing, such as shading correction of the captured image and defect correction (correction of the pixel values of the phase difference pixels 42a, 42b), may differ depending on the position in the light receiving region 11; when the light receiving region 11 is subdivided in this way, it is preferable to divide it along the boundaries used by these processing methods.

In this case, furthermore, as shown in FIG. 20, the aperture area of the right phase difference pixel 42a may be enlarged stepwise in the order of the left central portion 81aL, the left intermediate portion 81bL, and the left peripheral portion 81cL. Enlarging the aperture areas of the phase difference pixels 42a, 42b stepwise along the boundaries where the processing methods change makes it easier to obtain good processing results than enlarging the aperture area smoothly according to the distance from the center (see FIG. 9). That is, stepwise enlargement of the aperture areas of the phase difference pixels 42a, 42b makes it easy to achieve improved phase difference AF accuracy and good captured image quality at the same time. In FIG. 20, the light receiving region 11 is divided into the central portions 81aL, 81aR, the intermediate portions 81bL, 81bR, and the peripheral portions 81cL, 81cR, but the same applies when the light receiving region 11 is divided into the central portion 11a and the peripheral portions 11bL, 11bR as in the above embodiment. For the left phase difference pixel 42b, as in the above embodiment, the aperture area may be enlarged stepwise across the right central portion 81aR, intermediate portion 81bR, and peripheral portion 81cR.
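The stepwise scheme can be sketched as a simple per-region lookup. The region boundaries and the ratios below are illustrative assumptions; the patent only requires the area to grow region by region toward the left peripheral portion 81cL.

```python
# Sketch of stepwise aperture enlargement for the right phase difference
# pixels 42a: one fixed ratio per region instead of a smooth curve.
def stepwise_ratio(x: float) -> float:
    """Opening-area ratio relative to the left central portion 81aL.
    x is the normalized distance from the center of the light receiving
    region 11 toward the left edge (0.0 = center, 1.0 = left edge)."""
    if x < 1 / 3:   # left central portion 81aL
        return 1.0
    if x < 2 / 3:   # left intermediate portion 81bL
        return 1.15
    return 1.3      # left peripheral portion 81cL
```

Because every pixel within one region shares the same ratio, the per-region corrections (shading correction, defect correction) each see a single aperture size, which is the point of aligning the steps with the processing boundaries.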

When the imaging lens 44 is a zoom lens, the principal ray angle 45 at each position changes with the zoom magnification, so the aperture areas of the phase difference pixels 42a, 42b at each position may not be uniquely determinable. In this case, the aperture areas of the phase difference pixels 42a, 42b at each position are preferably determined in accordance with the principal ray angle 45 at the telephoto end (the longest focal length), at the wide end (the shortest focal length), or at an intermediate position between them. However, matching the principal ray angle 45 at the telephoto end produces a large error (a deviation in specific sensitivity and the like) at the wide end, and conversely, matching the angle at the wide end produces a large error at the telephoto end. Therefore, to secure phase difference AF accuracy at both the telephoto and wide ends, it is particularly preferable to determine the aperture areas of the phase difference pixels 42a, 42b at each position in accordance with the principal ray angle 45 at an intermediate position between the telephoto end and the wide end.

In the above embodiment, G pixels are used as the phase difference pixels 42a, 42b, but as shown in FIG. 21, R pixels and B pixels may also be used as phase difference pixels 42a, 42b. Of course, all of the phase difference pixels 42a, 42b may be formed of R pixels, or all of them may be formed of B pixels. Alternatively, phase difference pixels 42a, 42b may be provided in only two of R, G, and B.

However, when the phase difference pixels 42a, 42b are provided in two or more of R, G, and B, the amount by which the sensitivity drops in the peripheral portion 11b differs from color to color, so it is preferable to vary the enlargement ratio of the aperture areas of the phase difference pixels 42a, 42b for each color. Although it also depends on the focal point of the microlenses 75 and other factors, the enlargement ratios of the aperture areas of the phase difference pixels 42a, 42b of the respective colors may be set, for example, to B pixels > G pixels > R pixels. Of course, the enlargement ratio may be the same for two colors and different for only one (for example, B pixels = G pixels > R pixels). The reference for the enlargement ratio is the aperture area of the phase difference pixels 42a, 42b of each color in the central portion 11a. Adjusting the enlargement ratio of the aperture areas of the phase difference pixels 42a, 42b for each color in this way yields signals with good S/N from the phase difference pixels 42a, 42b of every color, making it easy to improve the accuracy of phase difference AF.
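The per-color adjustment can be sketched as a color-indexed table of ratios. The concrete numbers are assumptions for illustration; the text above specifies only orderings such as B > G > R (or B = G > R), relative to the central-portion aperture of the same color.

```python
# Illustrative per-color enlargement ratios for the aperture areas of
# peripheral phase difference pixels, relative to the aperture area of
# the same color in the central portion 11a. The values are assumptions.
ENLARGEMENT_RATIO = {"B": 1.3, "G": 1.2, "R": 1.1}

def peripheral_aperture_area(color: str, central_area: float) -> float:
    """Aperture area of a peripheral phase difference pixel of the given
    color, scaled from its central-portion area."""
    return central_area * ENLARGEMENT_RATIO[color]
```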

In the above embodiment, the pixels of the solid-state imaging device 10 are arranged in a square array, but the pixel arrangement is arbitrary. For example, the pixels may be arranged in a honeycomb array as in FIGS. 22 and 23. The color arrangement of the color filter for a honeycomb pixel array is also arbitrary; the color arrangements of FIG. 22 or FIG. 23, for example, may be used. The color arrangement of FIG. 22 is an example in which columns of G pixels and columns in which R pixels and B pixels alternate two at a time are arranged alternately along a direction oblique at 45 degrees.

Also, in the above embodiment, a color filter 41 with a 6 × 6 pixel unit is used when the pixels of the solid-state imaging device 10 are arranged in a square array, but the arrangement of the color filter 41 is arbitrary. For example, with a square pixel array, a so-called Bayer array may be used, in which the 2 × 2 pixel unit enclosed by the broken line in FIG. 24 is tiled vertically and horizontally.

In the above embodiment and modifications, within the 6 × 6 pixel unit of the color filter 41, the lower-left G pixel of the upper-right first subunit 41a and the lower-left G pixel of the upper-left second subunit 41b form the phase difference pixels 42a, 42b, but the positions of the G pixels that form the phase difference pixels 42a, 42b are arbitrary. For example, the central G pixel of the upper-right first subunit 41a and the central G pixel of the upper-left second subunit 41b may be used as the phase difference pixels 42a, 42b, respectively. Alternatively, the G pixel of the lower-right second subunit 41b and the G pixel of the lower-left first subunit 41a may be used as the phase difference pixels 42a, 42b.

Furthermore, in the above embodiment and modifications, the two paired phase difference pixels 42a, 42b are provided in the same row, but the two paired phase difference pixels 42a, 42b may instead be provided in different rows.

In the above embodiment, each pixel is formed of three transistors Tr23, Ta24, Ts25 (see FIG. 1), but the pixels 21 (the normal pixels 43 and the phase difference pixels 42a, 42b) need only photoelectrically convert incident light and output the signals required for forming a captured image and for phase difference AF; the number of transistors and the like is arbitrary. For example, the pixel 21 may be configured with four transistors by providing a transfer transistor between the PD 22 and the FD.

In the above embodiment and modifications, the light shielding layer 54 and the color filter 41 are provided as separate stacked layers, but the light shielding layer 54 and the color filter 41 may be formed integrally by embedding each color segment of the color filter 41 in the openings 55, 56 of the light shielding layer 54.

In the above embodiment and modifications, the light shielding layer 54 and the color filter 41 are stacked in this order on the back surface of the p-type semiconductor substrate 53, but the stacking order of the light shielding layer 54 and the color filter 41 may be reversed.

In the above embodiment and modifications, the openings 55, 56 of the light shielding layer 54 are filled with a transparent material to flatten the light shielding layer 54, but the openings 55, 56 may instead be filled with the material of the color filter 41.

In the above embodiment and modifications, the solid-state imaging device 10 is a CMOS image sensor, but the solid-state imaging device 10 may also be a CCD image sensor.

In the above embodiment and modifications, the solid-state imaging device 10 is a back-side illuminated image sensor, but the solid-state imaging device 10 may also be a front-side illuminated (FSI) image sensor. A front-side illuminated image sensor is a type of image sensor in which light is incident on the PD through the wiring layer.

In the above embodiment, the light shielding layer 54 also has light shielding material between the normal pixels 43, so that the opening 55 of each normal pixel 43 is separate; however, the light shielding material between the normal pixels 43 may be omitted so that the openings 55 of all the normal pixels 43 are connected. In this case, the light shielding layer 54 has light shielding material only in the portions that restrict the incident light at the phase difference pixels 42a, 42b.

In the above embodiment, a primary color filter 41 is used, but a complementary color filter may also be used. The color filter may also include colorless (transparent) pixels, and the phase difference pixels 42a, 42b may be formed at these colorless pixels.

In the above embodiment, the phase difference pixels 42a, 42b are formed asymmetrically in the left-right (horizontal) direction, and horizontal parallax is obtained from the phase difference pixels 42a, 42b, but the direction in which parallax information is obtained may also be the up-down (vertical) direction or an oblique direction. That is, vertical parallax may be obtained by making the structure of the phase difference pixels (the positions of the openings) asymmetric in the vertical direction and arranging the paired phase difference pixels one above the other. To obtain oblique parallax, pairs of phase difference pixels with structures asymmetric in an oblique direction may likewise be arranged along that oblique direction.

Also, in the above embodiment, only the phase difference pixels 42a, 42b that obtain horizontal parallax are provided in the light receiving region 11, but two or more of the following may be provided in the light receiving region 11: phase difference pixels 42a, 42b that obtain horizontal parallax, phase difference pixels that obtain vertical parallax, and phase difference pixels that obtain oblique parallax. When phase difference pixels for vertical or oblique parallax are provided, their openings are preferably also enlarged according to the direction of the light flux they receive and their position within the light receiving region 11, in the same way as the left and right phase difference pixels 42a, 42b of the above embodiment.

In the above embodiment, some of the pixels in the light receiving region 11 are the phase difference pixels 42a, 42b, but as shown in FIG. 25, all the pixels may be phase difference pixels 42a, 42b. When all the pixels are phase difference pixels 42a, 42b, a 3D image can be obtained with, for example, a single eye (one solid-state imaging device). FIG. 25 shows a case where the pixels are in a square array and the color filter 41 is used, but all the pixels may also be phase difference pixels 42a, 42b when the pixels are in a honeycomb array or the color filter is a Bayer array.

In the above embodiment, a plurality of phase difference pixels 42a, 42b are distributed evenly throughout the light receiving region 11, but the phase difference pixels 42a, 42b need only be present at least in the central portion 11a and the peripheral portions 11bL, 11bR; their number, arrangement, and distribution are arbitrary.

The solid-state imaging device 10 of the present invention is well suited to camera units mounted in thin digital cameras, mobile phones, PDAs, smartphones, and the like. It is particularly suitable for cameras and camera units in which the distance to the imaging lens 44 is short and the principal ray angle in the peripheral portion is large.

10 Solid-state imaging device
11 Light receiving region
11a Central portion
11bL, 11bR Peripheral portions
21 Pixel
42a, 42b Phase difference pixels
43 Normal pixel
55, 56L, 56R Openings

Claims (10)

1. A solid-state imaging device comprising:
a plurality of normal pixels that are arranged in a light receiving region where an image of a subject is formed and that receive incident light isotropically;
a plurality of first phase difference pixels provided in the light receiving region, each having a first opening eccentrically arranged toward a first direction and selectively receiving the light, of the incident light, that arrives from the first direction side; and
a plurality of second phase difference pixels provided in the light receiving region, each having a second opening eccentrically arranged toward a second direction opposite to the first direction and selectively receiving the light, of the incident light, that arrives from the second direction side,
wherein the first phase difference pixels in a peripheral portion of the light receiving region on the second direction side have a larger area of the first opening than the first phase difference pixels in a central portion of the light receiving region and in a peripheral portion on the first direction side, and
the second phase difference pixels in the peripheral portion on the first direction side have a larger area of the second opening than the second phase difference pixels in the central portion and in the peripheral portion on the second direction side.
2. The solid-state imaging device according to claim 1, wherein, in the first phase difference pixels in the peripheral portion on the second direction side, the first opening is extended toward the second direction side relative to that in the central portion, and, in the second phase difference pixels in the peripheral portion on the first direction side, the second opening is extended toward the first direction side relative to that in the central portion.
3. The solid-state imaging device according to claim 1 or 2, wherein, in the first phase difference pixels in the peripheral portion on the second direction side, the center position of the first opening is further toward the second direction side than in the central portion, and, in the second phase difference pixels in the peripheral portion on the first direction side, the center position of the second opening is further toward the first direction side than in the central portion.
4. The solid-state imaging device according to any one of claims 1 to 3, wherein the first phase difference pixel in the peripheral portion on the second direction side and the second phase difference pixel in the peripheral portion on the first direction side each have a photodiode for photoelectric conversion that is smaller than that of the central portion.
5. The solid-state imaging device according to any one of claims 1 to 4, wherein the area of the first opening is enlarged stepwise from the central portion toward the peripheral portion on the second direction side, and the area of the second opening is enlarged stepwise from the central portion toward the peripheral portion on the first direction side.
6. The solid-state imaging device according to any one of claims 1 to 5, wherein the first phase difference pixel and the second phase difference pixel are provided in a plurality of colors, and
the enlargement ratios of the first opening of the first phase difference pixel in the peripheral portion on the second direction side and
of the second opening of the second phase difference pixel in the peripheral portion on the first direction side are each determined for each of the colors.
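Illustrative note, not part of the patent disclosure: the stepwise opening enlargement described in claim 5 can be sketched as a mapping from a pixel's horizontal position to an opening-area scale factor. The zone count and maximum scale below are hypothetical values chosen for illustration; the patent does not specify them.

```python
# Illustrative sketch only -- not part of the patent disclosure.
# Claim 5 enlarges the opening area of a phase difference pixel in
# discrete steps from the center of the light receiving region toward
# the periphery. The number of zones and the maximum scale factor are
# hypothetical illustration values.

def opening_area_scale(x_norm: float, zones: int = 4, max_scale: float = 1.5) -> float:
    """Stepwise area scale for a pixel at normalized position x_norm,
    where 0.0 is the center of the light receiving region and 1.0 is
    the peripheral edge toward which the opening is enlarged."""
    if not 0.0 <= x_norm <= 1.0:
        raise ValueError("x_norm must lie in [0, 1]")
    zone = min(int(x_norm * zones), zones - 1)  # quantize position into discrete zones
    return 1.0 + (max_scale - 1.0) * zone / (zones - 1)
```

Because the scale changes only at zone boundaries, adjacent pixels within a zone share one opening size, which matches the "stepwise" (rather than continuous) enlargement of the claim.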
7. The solid-state imaging device according to any one of claims 1 to 6, wherein, when an image of the subject is formed on the light receiving region by a zoom lens,
the area of the first opening of the first phase difference pixel in the peripheral portion on the second direction side and the area of the second opening of the second phase difference pixel in the peripheral portion on the first direction side are determined according to the incident angle of the incident light at a focal length intermediate between the longest and shortest focal lengths of the zoom lens.
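Illustrative note, not part of the patent disclosure: claim 7 fixes the peripheral opening areas using the incident angle at a zoom position midway between the longest and shortest focal lengths, as a compromise over the zoom range. A crude thin-lens sketch of that design angle, with the exit pupil assumed at the focal length (an assumption for illustration, not from the patent):

```python
import math

# Illustrative sketch only -- a simplified model, not from the patent.
# The exit-pupil position and the example distances are assumptions.

def chief_ray_angle_deg(image_height_mm: float, focal_length_mm: float) -> float:
    """Approximate chief ray angle at a given image height, with the
    exit pupil crudely assumed to sit at one focal length."""
    return math.degrees(math.atan2(image_height_mm, focal_length_mm))

def design_angle_deg(image_height_mm: float, f_short_mm: float, f_long_mm: float) -> float:
    """Incident angle at the intermediate focal length, per claim 7."""
    f_mid = (f_short_mm + f_long_mm) / 2.0
    return chief_ray_angle_deg(image_height_mm, f_mid)
```

Under this model the design angle always lies between the wide-end and tele-end angles, which is why sizing the openings for the intermediate focal length bounds the mismatch at either zoom extreme.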
8. The solid-state imaging device according to any one of claims 1 to 7, wherein the first phase difference pixel in the peripheral portion on the first direction side has a smaller first opening than the first phase difference pixel in the central portion, and
the second phase difference pixel in the peripheral portion on the second direction side has a smaller second opening than the second phase difference pixel in the central portion.
9. The solid-state imaging device according to any one of claims 1 to 8, further comprising a microlens for each pixel that condenses the incident light,
wherein the microlenses in the peripheral portion are arranged decentered toward the central portion according to the incident angle of the incident light at each pixel.
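Illustrative note, not part of the patent disclosure: the microlens decentering of claim 9 is commonly approximated by shifting each peripheral microlens toward the sensor center by the lateral offset a chief ray accumulates while crossing the optical stack above the photodiode. The stack height below is a hypothetical value for illustration.

```python
import math

# Illustrative sketch only -- the stack height is an assumed value,
# not taken from the patent.

def microlens_shift_um(incident_angle_deg: float, stack_height_um: float = 3.0) -> float:
    """Shift of a peripheral microlens toward the sensor center,
    approximated as the lateral displacement of a ray crossing an
    optical stack of the given height at the given chief ray angle."""
    return stack_height_um * math.tan(math.radians(incident_angle_deg))
```

The shift grows with the incident angle, so pixels farther from the center (which see steeper chief rays) receive larger decentering, consistent with "according to the incident angle of the incident light at each pixel".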
10. A solid-state imaging device comprising:
a plurality of first phase difference pixels provided in a light receiving region on which an image of a subject is formed, each having an opening decentered in a first direction so as to selectively receive, of the incident light, light that reaches the first direction side; and
a plurality of second phase difference pixels provided in the light receiving region, each having an opening decentered in a second direction opposite to the first direction so as to selectively receive, of the incident light, light that reaches the second direction side,
wherein the first phase difference pixel in the peripheral portion of the light receiving region on the second direction side has a larger area of the first opening than the first phase difference pixels in the central portion of the light receiving region and in the peripheral portion on the first direction side,
the second phase difference pixel in the peripheral portion of the light receiving region on the first direction side has a larger area of the second opening than the second phase difference pixels in the central portion and in the peripheral portion on the second direction side, and
all of the pixels arrayed in the light receiving region are either the first phase difference pixels or the second phase difference pixels.
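Illustrative note, not part of the patent disclosure: in a sensor where every pixel belongs to one of the two phase difference groups, as in the independent claim above, autofocus typically estimates defocus from the lateral shift between the two groups' signals. The sum-of-absolute-differences correlation below is a generic phase-detection technique used here only for illustration; the patent does not prescribe a particular correlation method.

```python
# Illustrative sketch only -- a generic phase-detection correlation,
# not the patent's method. The signals are synthetic.

def phase_shift(left: list[float], right: list[float], max_shift: int = 8) -> int:
    """Integer shift (in pixels) that best aligns the right-group
    signal to the left-group signal, by minimizing the mean sum of
    absolute differences over the overlapping samples."""
    best_shift, best_cost = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        cost, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                cost += abs(left[i] - right[j])
                count += 1
        cost /= count  # normalize so different overlaps are comparable
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift
```

The sign and magnitude of the returned shift indicate the direction and amount of defocus, which the camera converts into a lens drive command.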
JP2012275453A 2012-12-18 2012-12-18 Solid-state image pickup device Pending JP2016029674A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2012275453A JP2016029674A (en) 2012-12-18 2012-12-18 Solid-state image pickup device
PCT/JP2013/082558 WO2014097884A1 (en) 2012-12-18 2013-12-04 Solid-state image pickup device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012275453A JP2016029674A (en) 2012-12-18 2012-12-18 Solid-state image pickup device

Publications (1)

Publication Number Publication Date
JP2016029674A true JP2016029674A (en) 2016-03-03

Family

ID=50978219

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012275453A Pending JP2016029674A (en) 2012-12-18 2012-12-18 Solid-state image pickup device

Country Status (2)

Country Link
JP (1) JP2016029674A (en)
WO (1) WO2014097884A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6476630B2 (en) * 2014-07-30 2019-03-06 株式会社ニコン Imaging device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004172836A (en) * 2002-11-19 2004-06-17 Canon Inc Image reading device
JP4322166B2 (en) * 2003-09-19 2009-08-26 富士フイルム株式会社 Solid-state image sensor
JP5402249B2 (en) * 2009-05-27 2014-01-29 ソニー株式会社 Solid-state imaging device and imaging apparatus
JP2012182332A (en) * 2011-03-02 2012-09-20 Sony Corp Imaging element and imaging device

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022016546A (en) * 2016-06-28 2022-01-21 ソニーグループ株式会社 Solid-state image sensor and electronic equipment
JP7310869B2 (en) 2016-06-28 2023-07-19 ソニーグループ株式会社 Solid-state imaging device and electronic equipment
WO2018061978A1 (en) * 2016-09-30 2018-04-05 株式会社ニコン Imaging element and focus adjustment device
JP2018056522A (en) * 2016-09-30 2018-04-05 株式会社ニコン Imaging element and focus adjustment device
US10341595B2 (en) 2017-08-10 2019-07-02 Samsung Electronics Co., Ltd. Image sensor for compensating for signal difference between pixels
US11843882B2 (en) 2017-08-10 2023-12-12 Samsung Electronics Co., Ltd. Image sensor for compensating for signal difference between pixels
US10819929B2 (en) 2017-08-10 2020-10-27 Samsung Electronics Co., Ltd. Image sensor for compensating for signal difference between pixels
KR20190129256A (en) * 2018-05-10 2019-11-20 에스케이하이닉스 주식회사 Image pickup device
CN110475084B (en) * 2018-05-10 2021-11-23 爱思开海力士有限公司 Image sensing device
KR102549481B1 (en) * 2018-05-10 2023-06-30 에스케이하이닉스 주식회사 Image pickup device
CN110475084A (en) * 2018-05-10 2019-11-19 爱思开海力士有限公司 Image sensering device
US11477403B2 (en) 2018-07-09 2022-10-18 Sony Semiconductor Solutions Corporation Imaging element and method for manufacturing imaging element
US11765476B2 (en) 2018-07-09 2023-09-19 Sony Semiconductor Solutions Corporation Imaging element and method for manufacturing imaging element
US12022217B2 (en) 2018-07-09 2024-06-25 Sony Semiconductor Solutions Corporation Imaging element and method for manufacturing imaging element
TWI866916B (en) * 2018-07-09 2024-12-21 日商索尼半導體解決方案公司 Imaging element and method for manufacturing the same
CN110086972A (en) * 2019-04-28 2019-08-02 深圳市超诺科技有限公司 A kind of camera Intelligent supplemental lighting system
JP2021082931A (en) * 2019-11-19 2021-05-27 株式会社シグマ Imaging device and imaging apparatus
JP7381067B2 (en) 2019-11-19 2023-11-15 株式会社シグマ Imaging device and imaging device
JP2024043295A (en) * 2022-09-16 2024-03-29 キヤノン株式会社 Imaging device and its control method and program
JP7699566B2 (en) 2022-09-16 2025-06-27 キヤノン株式会社 Imaging device, control method and program thereof

Also Published As

Publication number Publication date
WO2014097884A1 (en) 2014-06-26

Similar Documents

Publication Publication Date Title
US11882359B2 (en) Solid-state imaging device, method for driving the same, and electronic device for improved auto-focusing accuracy
US11444115B2 (en) Solid-state imaging device and electronic apparatus
JP5860168B2 (en) Solid-state imaging device
US10015426B2 (en) Solid-state imaging element and driving method therefor, and electronic apparatus
CN105308748B (en) Solid-state imaging devices and electronic equipment
JP2016029674A (en) Solid-state image pickup device
US10594961B2 (en) Generation of pixel signal with a high dynamic range and generation of phase difference information
JP5422889B2 (en) Solid-state imaging device and imaging apparatus using the same
EP2738812B1 (en) A pixel array
CN102693989B (en) Solid-state imaging device, method of manufacturing solid-state imaging device, and electronic device
JP5045012B2 (en) Solid-state imaging device and imaging apparatus using the same
US20150163464A1 (en) Solid-state imaging device
JP2008227253A (en) Back-illuminated solid-state image sensor
JP2015115345A (en) Solid-state imaging device
JP4839990B2 (en) Solid-state imaging device and imaging apparatus using the same
JP2016139988A (en) Solid-state imaging device
JP7504630B2 (en) Image pickup element, image pickup device, computer program, and storage medium
JP2012004264A (en) Solid-state imaging element and imaging device