
WO2025084019A1 - Light detection device - Google Patents

Light detection device

Info

Publication number
WO2025084019A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
inter-pixel wall
photodetector
pixels
planarization layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/031423
Other languages
English (en)
Japanese (ja)
Inventor
慎太郎 中食
良和 田中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Semiconductor Solutions Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp filed Critical Sony Semiconductor Solutions Corp
Publication of WO2025084019A1

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 - Simple or compound lenses
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 - Optical elements other than lenses
    • G02B5/20 - Filters

Definitions

  • This disclosure relates to a light detection device.
  • Japanese Patent Application Laid-Open No. 2003-233633 discloses a solid-state imaging element, and further discloses an imaging device and an electronic device that include this solid-state imaging element.
  • In a solid-state imaging device, a plurality of pixels are regularly arranged. Each of the plurality of pixels includes a sensor portion, a color filter, and a lens.
  • the sensor portion generates an electric signal according to incident light.
  • the color filter is formed to cover the sensor portion.
  • the lens is laminated via the color filter, and focuses the incident light onto the sensor portion.
  • a planarization layer is formed between the color filter and the lens.
  • The planarization layer is also formed between multiple pixels. For this reason, it is desirable to effectively suppress or prevent light incident on a pixel from spreading in the planarization layer and leaking into adjacent pixels.
  • the photodetection device comprises a plurality of pixels arranged two-dimensionally, an optical filter disposed at a position corresponding to each of the plurality of pixels, an optical lens laminated on the optical filter, a planarization layer disposed between the optical filter and the optical lens to reduce the step shape of the optical filter, and inter-pixel walls disposed in the thickness direction of the planarization layer at positions corresponding to the spaces between the plurality of pixels and having a lower refractive index than the planarization layer.
  • The refractive indexes of the optical lens, the planarization layer, and the inter-pixel wall satisfy the relational expression: optical lens > planarization layer > inter-pixel wall.
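  • As an illustration only (the numeric values here are assumptions consistent with the index ranges given later in this description, not figures stated for this relation), the relation can be written as $$n_{\text{optical lens}} > n_{\text{planarization layer}} > n_{\text{inter-pixel wall}},$$ for example $1.9 > 1.6 > 1.4$.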
  • In the photodetector according to the first embodiment, the thickness of the inter-pixel wall in the same direction is the same as the thickness of the planarization layer, thinner than it, or thicker than it so that the inter-pixel wall protrudes toward the optical lens side.
  • The inter-pixel wall in the photodetector according to the first embodiment has a sidewall surface that is inclined with respect to the thickness direction when viewed from the side.
  • the surface of the planarization layer facing the optical lens is flatter than the surface facing the optical filter in the optical detection device according to the first embodiment.
  • a glass plate is disposed on the side of the optical lens opposite the optical filter in the optical detection device according to the first embodiment, with a sealing resin layer interposed therebetween.
  • FIG. 1 is a plan view showing the overall configuration (chip layout) of a photodetector according to a first embodiment of the present disclosure.
  • FIG. 2 is a cross-sectional view of a main portion showing a pixel region (effective pixel region and optical black region) of the photodetector according to the first embodiment (a cross-sectional view including a portion cut along the line AA shown in FIG. 3).
  • FIG. 3 is a plan view of a main portion showing a pixel region (effective pixel region) of the photodetector according to the first embodiment.
  • FIG. 4 is a cross-sectional view illustrating a first step in the method for manufacturing the photodetector according to the first embodiment, and corresponds to FIG. 2.
  • FIG. 5 is a cross-sectional view of the second step.
  • FIG. 6 is a cross-sectional view of the third step.
  • FIG. 7 is a cross-sectional view of the fourth step.
  • FIG. 8 is a cross-sectional view of the fifth step.
  • FIG. 9 is a cross-sectional view of the sixth step.
  • FIG. 10 is a cross-sectional view of the seventh step.
  • FIG. 11 is a cross-sectional view of the eighth step.
  • FIG. 12 is a cross-sectional view of the ninth step.
  • FIG. 13 is a cross-sectional view of the tenth step.
  • FIG. 14 is a cross-sectional view corresponding to FIG. 2 and illustrating a first step in a manufacturing method for a photodetector according to the second embodiment of the present disclosure.
  • FIG. 15 is a cross-sectional view of the second step.
  • FIG. 16 is a cross-sectional view of the third step.
  • FIG. 17 is a cross-sectional view of the fourth step.
  • FIGS. 18A to 18C are cross-sectional views illustrating steps in a method for manufacturing a photodetector according to a modified example of the second embodiment, and correspond to FIG. 2.
  • FIG. 19 is a plan view of a main part corresponding to FIG. 3 and showing a pixel region of a photodetector according to a third embodiment of the present disclosure.
  • FIG. 20 is a plan view of a main part, corresponding to FIG. 3, showing a pixel region of a photodetector according to a first modified example of the third embodiment.
  • FIG. 21 is a plan view of a main portion, corresponding to FIG. 3, showing a pixel region of a photodetector according to a second modified example of the third embodiment.
  • FIG. 22 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a fourth embodiment of the present disclosure.
  • FIG. 23 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a fifth embodiment of the present disclosure.
  • FIG. 24 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a sixth embodiment of the present disclosure.
  • FIG. 25 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a seventh embodiment of the present disclosure.
  • FIG. 26 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to an eighth embodiment of the present disclosure.
  • FIG. 27 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a ninth embodiment of the present disclosure.
  • FIG. 28 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a tenth embodiment of the present disclosure.
  • FIG. 29 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to an eleventh embodiment of the present disclosure.
  • FIG. 30 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a twelfth embodiment of the present disclosure.
  • FIG. 31 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a thirteenth embodiment of the present disclosure.
  • FIG. 32 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a fourteenth embodiment of the present disclosure.
  • FIG. 33 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a fifteenth embodiment of the present disclosure.
  • FIG. 34 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a sixteenth embodiment of the present disclosure.
  • FIG. 35 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a seventeenth embodiment of the present disclosure.
  • FIG. 36 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to an eighteenth embodiment of the present disclosure.
  • FIG. 37 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a nineteenth embodiment of the present disclosure.
  • FIG. 38 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a twentieth embodiment of the present disclosure.
  • FIG. 39 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a twenty-first embodiment of the present disclosure.
  • FIG. 40 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a twenty-second embodiment of the present disclosure.
  • FIG. 41 is a cross-sectional view of a main part corresponding to FIG. 2 and showing a pixel region of a photodetector according to a twenty-third embodiment of the present disclosure.
  • FIG. 42 is a diagram showing an outline of a configuration example of a stacked solid-state imaging device (photodetection device) to which the technology according to the present disclosure can be applied.
  • FIG. 43 is a cross-sectional view showing a first configuration example of a stacked solid-state imaging device 23020.
  • FIG. 44 is a cross-sectional view showing a second configuration example of the stacked solid-state imaging device 23020.
  • FIG. 45 is a cross-sectional view showing another configuration example of a stacked solid-state imaging device (photodetector) to which the technology according to the present disclosure can be applied.
  • FIG. 46 is a plan view showing a first configuration example of a solid-state imaging device (photodetection device) in which a plurality of pixels are shared, to which the technology according to the present disclosure can be applied.
  • FIG. 47 is a plan view showing a second configuration example of a solid-state imaging device (photodetection device) that shares a plurality of pixels to which the technology according to the present disclosure can be applied.
  • FIG. 48 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 49 is an explanatory diagram showing an example of the installation positions of the outside-vehicle information detection unit and the imaging unit.
  • FIG. 50 is a block diagram showing an example of a schematic configuration of an in-vivo information acquiring system.
  • 2. Second Embodiment: The second embodiment describes a first example in which the method of manufacturing the inter-pixel walls in the photodetector according to the first embodiment is changed.
  • 3. Third Embodiment: The third embodiment describes a second example in which the planar structure of the inter-pixel walls in the photodetector according to the first embodiment is changed.
  • The third embodiment further describes a modified example in which the planar structure of the inter-pixel walls is changed.
  • 4. Fourth Embodiment: The fourth embodiment describes a third example in which the cross-sectional structure of the inter-pixel walls in the photodetector according to the first embodiment is changed.
  • 5. Fifth Embodiment: The fifth embodiment describes a fourth example in which the cross-sectional structure of the inter-pixel walls in the photodetector according to the first embodiment is changed.
  • 6. Sixth Embodiment: The sixth embodiment describes a fifth example in which the cross-sectional structure of the inter-pixel walls in the photodetector according to the first embodiment is changed.
  • 7. Seventh Embodiment: The seventh embodiment describes a sixth example in which the cross-sectional structure of the inter-pixel walls in the photodetector according to the first embodiment is changed.
  • 8. Eighth Embodiment: The eighth embodiment describes a seventh example in which the cross-sectional structure of the inter-pixel walls in the photodetector according to the first embodiment is changed.
  • 9. Ninth Embodiment: The ninth embodiment describes an eighth example in which the cross-sectional structure of the planarizing layer in the photodetector according to the first embodiment is changed.
  • 10. Tenth Embodiment: The tenth embodiment describes a ninth example in which a separation structure is provided in the optical filter in the photodetector according to the first embodiment.
  • 11. Eleventh Embodiment: The eleventh embodiment describes a tenth example in which a separation structure is provided in the optical filter in the photodetector according to the first embodiment.
  • 12. Twelfth Embodiment: The twelfth embodiment describes an eleventh example in which the isolation structure in the photodetector according to the eleventh embodiment is changed.
  • 16. Sixteenth Embodiment: The sixteenth embodiment describes a fifteenth example in which the cross-sectional structures of the optical filter, the planarizing layer, and the inter-pixel walls in the photodetector according to the first embodiment are changed.
  • 17. Seventeenth Embodiment: The seventeenth embodiment describes a sixteenth example in which the cross-sectional structures of the optical filter, the planarizing layer, and the inter-pixel walls in the photodetector according to the first embodiment are changed.
  • 18. Eighteenth Embodiment: The eighteenth embodiment describes a seventeenth example in which the photodetector according to the first embodiment is modified to have an optimum cross-sectional structure for processing the planarizing layer and the inter-pixel walls.
  • 19. Nineteenth Embodiment: The nineteenth embodiment describes an eighteenth example in which the photodetector according to the seventeenth embodiment is modified to have an optimum cross-sectional structure for processing the planarizing layer and the inter-pixel walls.
  • 20. Twentieth Embodiment: The twentieth embodiment describes a nineteenth example in which the photodetector according to the first embodiment is modified to have an optimum cross-sectional structure for processing the planarizing layer and the inter-pixel walls.
  • 24. Twenty-Fourth Embodiment: The twenty-fourth embodiment describes a first example in which a solid-state imaging device serving as the photodetector is applied to a stacked solid-state imaging device.
  • 25. Twenty-Fifth Embodiment: The twenty-fifth embodiment describes a second example in which a solid-state imaging device serving as the photodetector is applied to a stacked solid-state imaging device.
  • 26. Twenty-Sixth Embodiment: The twenty-sixth embodiment describes a third example in which a solid-state imaging device serving as the photodetector is applied to a stacked solid-state imaging device.
  • 27. Twenty-Seventh Embodiment: The twenty-seventh embodiment describes a first example in which the photodetector according to the first embodiment is applied to a shared structure in which a single pixel circuit is shared by a plurality of pixels.
  • 28. Twenty-Eighth Embodiment: The twenty-eighth embodiment describes a second example in which the photodetector according to the first embodiment is applied to a shared structure.
  • A photodetector 1 according to a first embodiment of the present disclosure will be described with reference to FIGS. 1 to 13.
  • the arrow X (or x) direction shown as appropriate in the drawings indicates one planar direction of the light detection device 1 placed on a flat surface for the sake of convenience.
  • the arrow Y (or y) direction indicates another planar direction perpendicular to the arrow X direction.
  • the arrow Z direction indicates the upward direction perpendicular to the arrow X and arrow Y directions.
  • the arrow X direction, arrow Y direction, and arrow Z direction exactly coincide with the X-axis direction, Y-axis direction, and Z-axis direction, respectively, of a three-dimensional coordinate system. Note that these directions are shown to facilitate understanding of the description, and are not intended to limit the directions of the present technology.
  • FIG. 1 shows an example of the overall planar configuration of the photodetection device 1.
  • the photodetector 1 according to the first embodiment is constructed using a substrate 100.
  • a semiconductor substrate is used as the substrate 100.
  • a single crystal silicon (Si) substrate is used as the semiconductor substrate.
  • the substrate 100 is formed in a rectangular shape when viewed from the direction of the arrow Z (hereinafter simply referred to as "in a plan view").
  • the photodetector 1 includes at least a pixel area (effective pixel area) PA, an optical black area OB, a vertical drive circuit VDC, a column signal processing circuit CSC, a horizontal drive circuit HDC, an output circuit OUT, a control circuit COC, and an input/output terminal IN.
  • the pixel area PA is disposed in the central portion of the substrate 100.
  • a plurality of pixels 10 are arranged in a matrix in each of the directions of the arrow X and the arrow Y.
  • the plurality of pixels 10 are arranged two-dimensionally in the planar direction.
  • the pixel 10 includes a photoelectric conversion element (not shown) that converts light into an electric charge, and a plurality of transistors (not shown) that process the converted electric charge as an electric signal.
  • the optical black area OB is disposed around the pixel area PA. Although not shown, the optical black area OB has pixels 10 similar to the pixels 10 arranged in the pixel area PA, and is light-shielded. In other words, in the optical black area OB, a dark current component used for offset calculation is generated.
  • the photoelectric conversion element of the pixel 10 is composed of, for example, a photodiode.
  • the plurality of transistors include at least a transfer transistor, a selection transistor, a reset transistor, an amplification transistor, etc.
  • the selection transistor, the reset transistor, and the amplification transistor configure a pixel circuit that performs signal processing on the electric charge converted from light in the photoelectric conversion element.
  • the transfer transistor transfers the electric charge converted in the photoelectric conversion element to the pixel circuit.
  • For the plurality of transistors, for example, IGFETs (insulated gate field effect transistors) are used.
  • IGFETs include at least a metal oxide semiconductor field effect transistor (MOSFET) and a metal insulator semiconductor field effect transistor (MISFET).
  • the shared pixel structure is a structure in which the photoelectric conversion elements and multiple transfer transistors of multiple pixels 10 are connected to one shared pixel circuit by a common floating diffusion (floating diffusion region).
  • the shared pixel structure is a structure in which one pixel circuit is shared by multiple pixels 10.
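  • As a minimal software sketch of this shared-pixel structure (illustrative only; the class and parameter names below are hypothetical and not taken from this publication), several photodiodes with their transfer transistors feed one common floating diffusion that is read out by a single shared pixel circuit:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Photodiode:
    """Photoelectric conversion element of one pixel 10."""
    charge: float = 0.0  # charge accumulated from incident light

    def expose(self, photons: float, quantum_efficiency: float = 0.6) -> None:
        self.charge += photons * quantum_efficiency

@dataclass
class SharedPixelUnit:
    """Multiple pixels sharing one floating diffusion and one pixel (readout) circuit."""
    photodiodes: List[Photodiode] = field(default_factory=lambda: [Photodiode() for _ in range(4)])
    floating_diffusion: float = 0.0  # common charge-to-voltage node

    def transfer(self, index: int) -> None:
        # The transfer transistor of the selected pixel moves its charge to the shared FD.
        self.floating_diffusion += self.photodiodes[index].charge
        self.photodiodes[index].charge = 0.0

    def read_out(self, conversion_gain: float = 1e-4) -> float:
        # The single shared pixel circuit converts the FD charge to a signal, then the FD is reset.
        signal = self.floating_diffusion * conversion_gain
        self.floating_diffusion = 0.0
        return signal

# Example: four pixels time-share one readout chain.
unit = SharedPixelUnit()
unit.photodiodes[0].expose(photons=1000)
unit.transfer(0)
print(unit.read_out())
```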
  • the vertical drive circuit VDC, the column signal processing circuit CSC, the horizontal drive circuit HDC, the output circuit OUT, and the control circuit COC are disposed in the peripheral portion of the substrate 100 and form the peripheral circuitry of the photodetector 1 .
  • the control circuit COC receives an input clock signal and receives information instructing an operation mode, etc. Also, the control circuit COC outputs information generated inside.
  • control circuit COC generates clock signals and control signals that are the basis for the operations of the vertical drive circuit VDC, the column signal processing circuit CSC, and the horizontal drive circuit HDC based on the vertical synchronizing signal, the horizontal synchronizing signal, and the master clock signal, and outputs the generated clock signals and control signals to the vertical drive circuit VDC, the column signal processing circuit CSC, the horizontal drive circuit, etc.
  • the vertical drive circuit VDC is constructed of, for example, a shift register.
  • a predetermined pixel drive line Ld is selected from among a plurality of pixel drive lines Ld, and a pulse for driving the pixels 10 is supplied to the selected pixel drive line Ld.
  • the pixels 10 are driven in row units. That is, in the vertical drive circuit VDC, each pixel 10 in the pixel area PA is selected and scanned in the vertical direction in a row-by-row manner. In each selected and scanned pixel 10, a pixel signal based on the charge generated in the photoelectric conversion element according to the amount of received light is transmitted to the vertical signal line Lv. The pixel signal is then supplied to the column signal processing circuit CSC.
  • Multiple column signal processing circuits CSC are arranged for each column of pixels 10.
  • signal processing such as noise removal is performed on the pixel signals output from one row of pixels 10 for each column of pixels 10.
  • the column signal processing circuit CSC performs signal processing such as correlated double sampling (CDS) processing that removes fixed pattern noise specific to the pixels 10 and analog-to-digital (AD) conversion processing.
  • the horizontal drive circuit HDC is constructed, for example, by a shift register.
  • horizontal scanning pulses are output sequentially, and each of the column signal processing circuits CSC is selected in turn.
  • When a column signal processing circuit CSC is selected, a pixel signal is output from the column signal processing circuit CSC to the horizontal signal line Lh.
  • the output circuit OUT performs signal processing on the image signals sequentially supplied from each of the column signal processing circuits CSC through the horizontal signal line Lh, and outputs the processed pixel signals to the outside of the photodetection device 1.
  • the output circuit OUT performs, for example, buffering.
  • the output circuit OUT may further perform various digital signal processing such as black level adjustment and column variation correction.
  • the black level adjustment is performed based on the dark current component generated in the optical black region OB.
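  • As a rough software sketch of such an offset correction (an illustrative model only; the array shapes, averaging scheme, and function name are assumptions, not details from this publication):

```python
import numpy as np

def black_level_adjust(effective: np.ndarray, optical_black: np.ndarray) -> np.ndarray:
    """Subtract the dark-current offset estimated from light-shielded optical black pixels."""
    # Average the optical-black pixels per column so column-wise offsets are removed as well.
    offset = optical_black.mean(axis=0)            # shape: (num_columns,)
    corrected = effective.astype(np.float64) - offset
    return np.clip(corrected, 0, None)             # counts cannot go below the black level

# Example with random data standing in for raw sensor counts.
rng = np.random.default_rng(0)
ob_rows = rng.normal(64, 2, size=(8, 640))         # light-shielded rows carry offset only
frame = rng.normal(64, 2, size=(480, 640)) + rng.integers(0, 200, size=(480, 640))
print(black_level_adjust(frame, ob_rows).mean())
```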
  • the input/output terminal IN transmits and receives signals between the outside and the inside of the photodetector 1 .
  • the photodetector 1 is a CMOS (Complementary Metal Oxide Semiconductor) image sensor known as a column AD type.
  • a column signal processing circuit CSC that performs CDS processing and AD conversion processing is arranged for each pixel column.
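  • To make the column-parallel readout flow concrete, the following is a minimal software model of row-by-row scanning with per-column CDS and AD conversion (an illustrative sketch only; the bit depth, signal ranges, and variable names are assumptions, not details from this publication):

```python
import numpy as np

def column_adc_readout(reset_levels: np.ndarray, signal_levels: np.ndarray,
                       full_scale: float = 1.0, bits: int = 10) -> np.ndarray:
    """Model one frame of column-parallel readout.

    reset_levels and signal_levels have shape (rows, cols) and hold the analog
    reset sample and signal sample of every pixel.
    """
    levels = 2 ** bits
    digital = np.empty(signal_levels.shape, dtype=np.int32)
    for row in range(signal_levels.shape[0]):           # vertical drive: select one row at a time
        # CDS: each column circuit subtracts its reset sample from its signal sample,
        # cancelling pixel-specific fixed-pattern (offset) noise.
        cds = signal_levels[row] - reset_levels[row]
        # AD conversion in every column circuit; the horizontal drive then scans the results out.
        digital[row] = np.clip(np.round(cds / full_scale * (levels - 1)), 0, levels - 1)
    return digital

# Example: a 4 x 6 pixel array with per-pixel offsets that CDS removes.
rng = np.random.default_rng(1)
offsets = rng.normal(0.1, 0.02, size=(4, 6))
signal = offsets + rng.uniform(0.0, 0.8, size=(4, 6))
print(column_adc_readout(offsets, signal))
```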
  • Fig. 2 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • Fig. 3 shows an example of a planar configuration of the pixel 10.
  • the pixel 10 has a photoelectric conversion element 101 as a light receiving element that converts incident light into an electric charge.
  • An optical filter 4 and an optical lens 7 are disposed at a position corresponding to the pixel 10.
  • the stacking direction is the thickness direction of the optical filter 4, which is the direction of the arrow Z.
  • the opposite side to the direction of the arrow Z is the direction in which light is incident on the pixel 10.
  • the photodetector 1 includes a planarization layer 5 between the optical filter 4 and the optical lens 7, and an inter-pixel wall 6 on the planarization layer 5 at a position corresponding to the space between adjacent pixels 10.
  • the photoelectric conversion element 101 is disposed on the substrate 100.
  • the photoelectric conversion element 101 is disposed for each pixel 10.
  • the photoelectric conversion element 101 is configured of a photodiode formed at a pn junction between a p-type semiconductor region and an n-type semiconductor region.
  • the circuits include, for example, a drive circuit that drives the photoelectric conversion element 101, a pixel circuit (readout circuit) that reads out a signal (charge) from the photoelectric conversion element 101, a signal processing circuit that processes the signal, and a control circuit COC that controls various circuits.
  • the optical filter 4 is disposed above the substrate 100 in the direction of the arrow Z at a position corresponding to the pixel 10.
  • the optical filter 4 is formed on the substrate 100 with the protective layer 2 interposed therebetween.
  • the protective layer 2 is used as a base film for the optical filter 4, and the surface of the protective layer 2 on the optical filter 4 side is flattened.
  • the protective layer 2 is formed, for example, from a resin material having optical transparency.
  • The thickness of the protective layer 2 in the film thickness direction is, for example, the same as or greater than the thickness of the pixel separation wall 3.
  • the light transmission wavelength of the first optical filter 4R is longer than the light transmission wavelength of the third optical filter 4G. Furthermore, the light transmission wavelength of the second optical filter 4B is shorter than the light transmission wavelength of the third optical filter 4G.
  • the optical filter 4 is formed, for example, from a resin material to which an organic pigment has been added.
  • the resin material may be an acrylic resin, a styrene resin, or the like.
  • the optical filter 4 is formed to have a thickness of, for example, 300 nm or more and 1000 nm or less.
  • the pixel isolation wall 3 is disposed in the protective layer 2 at a position corresponding to a gap between the pixels 10 in the pixel area PA.
  • the pixel isolation wall 3 is formed to surround the periphery of the pixel 10 in a plan view.
  • the pixel isolation wall 3 is configured to effectively suppress or prevent light leakage from a pixel 10 to another pixel 10 adjacent to the pixel 10.
  • one or more metal films selected from tungsten (W), aluminum (Al) and copper (Cu), or metal oxide films thereof can be used for the pixel separation wall 3.
  • the thickness of the pixel separation wall 3 in the direction of the arrow Z is set to be equal to or thinner than the thickness of the protective layer 2 in the same direction.
  • the pixel separation wall 3 is formed to a thickness of, for example, 100 nm or more.
  • the optical lens 7 is laminated above the optical filter 4 with the planarizing layer 5 therebetween.
  • the optical lens 7 includes a lens body 71 and an anti-reflection film 72 formed on the surface of the lens body 71.
  • the lens body 71 is formed in a curved shape protruding in the direction of the arrow Z for each pixel 10 in a side view.
  • the lens body 71 is formed of an inorganic material that is optically transparent and has a refractive index of, for example, 1.8 or more.
  • an inorganic material such as silicon nitride (SiN) can be used for the lens body 71.
  • SiN silicon nitride
  • the anti-reflection film 72 may be made of an inorganic material such as silicon oxynitride (SiON).
  • the optical lenses 7 arranged at positions corresponding to the pixels 10 are connected to other adjacent optical lenses 7 and formed integrally.
  • the optical lenses 7 are configured as on-chip lenses.
  • On the side of the optical lens 7 opposite the optical filter 4, a glass plate 9 is disposed with a low refractive index resin film 81, an anti-reflection film 82, and a sealing resin film 83 interposed in that order.
  • the low refractive index resin film 81 is laminated on the optical lens 7.
  • the low refractive index resin film 81 is also used as a planarizing layer that reduces the stepped shape of the optical lens 7 and planarizes the surface on the glass plate 9 side.
  • the low refractive index resin film 81 is formed of an organic resin material having a refractive index of, for example, 1.5 or less.
  • the low refractive index resin film 81 eliminates the air gap formed between the optical lens 7 and the glass plate 9, improving the light collecting efficiency of the incident light.
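  • As a hedged illustration of one contribution to this improvement (the index values are assumptions within the ranges given in this description, and interface reflection is only one factor in the light collecting efficiency): the Fresnel reflectance at normal incidence between media of indices $n_1$ and $n_2$ is $$R=\left(\frac{n_1-n_2}{n_1+n_2}\right)^2,$$ so a lens surface with $n_1\approx 1.8$ facing air ($n_2=1.0$) reflects about $(0.8/2.8)^2\approx 8\%$ of the light, whereas facing a low refractive index resin with $n_2\approx 1.5$ it reflects only about $(0.3/3.3)^2\approx 0.8\%$.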
  • the anti-reflection film 82 is laminated on the low refractive index resin film 81.
  • The anti-reflection film 82 is made of an inorganic material such as silicon oxide (SiO2).
  • the sealing resin film 83 is laminated on the anti-reflection film 82.
  • the sealing resin film 83 is made of, for example, an organic resin material.
  • the glass plate 9 is layered on the sealing resin film 83 and is bonded to this sealing resin film 83.
  • a light-transmitting quartz glass plate is used for the glass plate 9.
  • the glass plate 9 has a packaging function for the light detection device 1.
  • the planarization layer 5 is disposed between the optical filter 4 and the optical lens 7.
  • the first optical filter 4R, the second optical filter 4B, and the third optical filter 4G of the optical filter 4 may have different thicknesses.
  • a step shape may be generated on the surface of the optical filter 4 as a whole.
  • the planarization layer 5 reduces (absorbs) such a step shape on the surface of the optical filter 4, and flattens the surface.
  • the surface of the planarization layer 5 on the optical lens 7 side is flatter than the surface on the optical filter 4 side.
  • the planarization layer 5 is formed of an organic resin material that is optically transparent and has a refractive index lower than that of the lens body 71 of the optical lens 7, for example, 1.8 or less.
  • the thickness of the planarization layer 5 is formed to be, for example, 50 nm or more and 1000 nm or less.
  • the thickness of the planarization layer 5 is formed to be, for example, 100 nm or more and 500 nm or less.
  • The inter-pixel walls 6 are disposed at positions corresponding to spaces between a plurality of adjacent pixels 10. In a plan view, the inter-pixel walls 6 are disposed at positions overlapping the pixel separation walls 3. This is described in detail below.
  • the pixel 10 has a rectangular planar shape.
  • the inter-pixel wall 6 is formed to surround the periphery of the pixel 10.
  • the opening shape of the inter-pixel wall 6 is formed to be a rectangle similar to the planar shape of the pixel 10.
  • the inter-pixel walls 6 extend in the direction of the arrow X, are arranged at predetermined intervals in the direction of the arrow Y, and further extend in the direction of the arrow Y, and are arranged at predetermined intervals in the direction of the arrow X.
  • the planar shape of the inter-pixel walls 6 is formed into a lattice shape.
  • the opening size of the inter-pixel wall 6 disposed around one pixel 10 is the same as the opening size of the inter-pixel wall 6 disposed around another adjacent pixel 10.
  • the opening sizes of all the inter-pixel walls 6 in the pixel area PA are the same.
  • the inter-pixel walls 6 are disposed across the thickness of the planarization layer 5.
  • the thickness (height) of the inter-pixel walls 6 in the direction of the arrow Z is effectively the same as the thickness of the planarization layer in the same direction.
  • the inter-pixel walls 6 are formed of a colorless and transparent organic resin material having a refractive index lower than that of the planarization layer 5 and a light transmittance higher than that of the planarization layer 5.
  • For the inter-pixel walls 6, for example, one or more materials selected from styrene resin materials and acrylic resin materials can be practically used. Therefore, the refractive indices of the optical lens 7 (lens body 71), the planarization layer 5, and the inter-pixel wall 6 satisfy the relational expression: optical lens 7 > planarization layer 5 > inter-pixel wall 6.
  • the light transmittance of the inter-pixel walls 6 is set to 90% or more by using the above-mentioned material.
  • The inter-pixel wall 6 may be formed to include a filler for adjusting the refractive index. Here, adjusting the refractive index means lowering it.
  • For the filler, for example, one or more materials selected from porous silica, hollow silica, and silicon oxide (SiO2) particles can be practically used.
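  • As a rough, hedged estimate of how such a filler lowers the refractive index (a simple volume-fraction mixing approximation with assumed values, not figures from this publication): for a resin of index $n_{\text{resin}}\approx 1.55$ loaded with a volume fraction $f=0.3$ of hollow silica of effective index $n_{\text{filler}}\approx 1.25$, $$n_{\text{eff}}\approx(1-f)\,n_{\text{resin}}+f\,n_{\text{filler}}=0.7\times 1.55+0.3\times 1.25\approx 1.46.$$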
  • the interface between the planarization layer 5 and the inter-pixel walls 6 functions as a reflective surface.
  • Most of the incident light that has passed through the glass plate 9, the optical lens 7, and the planarization layer 5 travels toward the photoelectric conversion element 101, is reflected at the interface between the planarization layer 5 and the inter-pixel walls 6, and is focused onto the photoelectric conversion element 101. Since the inter-pixel walls 6 are disposed in the planarization layer 5, it is possible to effectively suppress or prevent the incident light from spreading in the planar direction, and to effectively suppress or prevent light leakage to adjacent pixels 10.
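  • As a hedged worked example of the reflection condition at this interface (the index values are assumptions within the ranges given in this description): by Snell's law, light travelling in the planarization layer 5 is totally internally reflected at the inter-pixel wall 6 when its angle of incidence exceeds the critical angle $$\theta_c=\arcsin\!\left(\frac{n_{\text{wall}}}{n_{\text{planarization}}}\right),$$ so for, e.g., $n_{\text{planarization}}\approx 1.6$ and $n_{\text{wall}}\approx 1.4$, $\theta_c\approx\arcsin(0.875)\approx 61^\circ$; even at smaller angles the index step still partially reflects light back toward the pixel 10.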
  • [Method of manufacturing the photodetector 1] FIGS. 4 to 13 are cross-sectional views illustrating steps in a method for manufacturing the photodetector 1 according to the first embodiment.
  • the method for manufacturing the photodetector 1 is as follows.
  • the photoelectric conversion element 101 is formed on the substrate 100 (see FIG. 4).
  • the substrate 100 is a semiconductor wafer from which a plurality of photodetectors 1 are manufactured.
  • the protective layer 2 and the pixel separation wall 3 are formed (see FIG. 4).
  • the protective layer 2 is formed over the entire area of the substrate 100.
  • the pixel separation wall 3 is formed at a position corresponding to the gap between the plurality of pixels 10.
  • an optical filter 4 is formed on the protective layer 2.
  • the optical filter 4 includes a first optical filter 4R, a second optical filter 4B, and a third optical filter 4G.
  • the first optical filter 4R, the second optical filter 4B, and the third optical filter 4G are each formed separately. Photolithography and etching techniques are used to form the optical filter 4.
  • planarization layer 5 is formed over the entire optical filter 4.
  • the planarization layer 5 is made of an organic resin material and is formed by spin coating technology.
  • a mask 501 is formed on the planarization layer 5.
  • the mask 501 is formed, for example, by using a photolithography technique.
  • An opening 501H is formed in the mask 501 at a position corresponding to the gap between the multiple pixels 10.
  • the mask 501 is used to partially remove the planarization layer 5 exposed from the opening 501H. As a result, an opening 5H is formed in the planarization layer 5.
  • a highly anisotropic dry etching technique is used to partially remove the planarization layer 5. Subsequently, the mask 501 is stripped off.
  • an organic resin material 61 is formed on the planarization layer 5.
  • the organic resin material 61 is a colorless and transparent organic resin material.
  • the organic resin material 61 is formed as a film by a spin coating technique.
  • the organic resin material 61 is filled in the opening 5H formed in the planarization layer 5.
  • the organic resin material 61 is uniformly removed from the surface.
  • a dry etching technique is used to remove the organic resin material 61.
  • the organic resin material 61 is removed by an etch-back process.
  • When the surface of the planarization layer 5 is exposed, the removal of the organic resin material 61 is stopped. As a result, the organic resin material 61 is left filling the openings 5H of the planarization layer 5, and the inter-pixel walls 6 are formed from this organic resin material 61.
  • a lens formation film 701 is formed on the planarization layer 5 (see FIG. 11).
  • the lens formation film 701 is formed using a resin material by, for example, a spin coating technique.
  • a mask 702 is formed on a lens formation film 701.
  • a photoresist film is first formed using, for example, photolithography technology. This is followed by patterning to leave the photoresist film in positions corresponding to the pixels 10. A reflow process is then performed on the photoresist film, and the photoresist film is formed into a lens shape in a side view. This photoresist film is used as the mask 702.
  • The lens forming film 701 is patterned using the mask 702, and a lens body 71 having a lens shape is formed from the lens forming film 701 (see FIG. 12). Next, an anti-reflection film 72 is formed on the lens body 71. When the anti-reflection film 72 is formed, an optical lens 7 having the lens body 71 and the anti-reflection film 72 is obtained.
  • a low refractive index resin film 81 and an anti-reflection film 82 are sequentially formed to cover the optical lens 7 .
  • a sealing resin film 83 and a glass plate 9 are formed to cover the anti-reflection film 82 .
  • the photodetection device 1 of the first embodiment has a plurality of pixels 10 arranged two-dimensionally, an optical filter 4, an optical lens 7, a planarization layer 5, and inter-pixel walls 6.
  • the optical filter 4 is disposed at a position corresponding to each of the plurality of pixels 10.
  • the optical lens 7 is laminated on the optical filter 4.
  • the planarizing layer 5 is disposed between the optical filter 4 and the optical lens 7, and reduces the step shape of the optical filter 4.
  • the inter-pixel walls 6 are disposed across the thickness of the planarization layer 5 at positions corresponding to the spaces between the pixels 10 , and have a refractive index lower than that of the planarization layer 5 .
  • Incident light can be reflected at the interface between the planarization layer 5 and the inter-pixel walls 6. That is, the incident light travels toward the photoelectric conversion element 101, is reflected at the interface between the planarization layer 5 and the inter-pixel walls 6, and is collected on the photoelectric conversion element 101.
  • Since the inter-pixel walls 6 are provided in the planarization layer 5, it is possible to effectively suppress or prevent the spread of incident light in the planar direction, and it is possible to effectively suppress or prevent light leakage to adjacent pixels 10. Therefore, in the photodetection device 1, it is possible to improve color reproducibility and resolution.
  • The optical lens 7 is made of an inorganic material having a refractive index of 1.8 or more.
  • The planarization layer 5 is made of an organic resin material having a refractive index of 1.8 or less. Therefore, as described above, the refractive indexes of the optical lens 7, the planarization layer 5, and the inter-pixel wall 6 satisfy the relational expression: optical lens 7 > planarization layer 5 > inter-pixel wall 6.
  • the inter-pixel walls 6 have a higher light transmittance than the planarization layer 5.
  • the inter-pixel walls 6 are formed of a colorless and transparent organic resin material, and the light transmittance of the inter-pixel walls 6 in the visible light region is 90% or more.
  • the incident light can be further reflected at the interface between the planarization layer 5 and the inter-pixel wall 6. This makes it possible to effectively suppress or prevent the incident light from spreading in the planarization layer 5 in the surface direction, and effectively suppress or prevent light leakage to the adjacent pixels 10.
  • the inter-pixel walls 6 surround each of the plurality of pixels 10 in a plan view and are formed in a lattice shape.
  • The inter-pixel walls 6 are arranged in the planarization layer 5 so as to completely surround the periphery of the pixel 10, thereby effectively suppressing or preventing the spread of incident light in the planarization layer 5 in the planar direction.
  • the thickness of the inter-pixel walls 6 in the same direction is the same as the thickness of the planarizing layer 5.
  • The inter-pixel walls 6 are disposed across the entire thickness direction of the planarization layer 5, so that the spread of incident light in the planarization layer 5 in the planar direction can be effectively suppressed or prevented.
  • a plurality of photoelectric conversion elements 101 that convert light into electric charges are disposed at positions corresponding to the plurality of pixels 10 on the opposite side of the optical filter 4 from the optical lens 7.
  • a pixel separation wall 3 that at least optically separates the plurality of pixels 10 is disposed between the optical filter 4 and the photoelectric conversion elements 101 at positions corresponding to the spaces between the plurality of pixels 10.
  • a pixel separation wall 3 is arranged in the path from the optical filter 4 to the photoelectric conversion element 101, so that the pixel separation wall 3 can effectively suppress or prevent light leakage to adjacent pixels 10.
  • the optical filter 4 is formed on the protective layer 2 in the same manner as in the process shown in FIG. 4 of the manufacturing method of the photodetector 1 according to the first embodiment described above (hereinafter simply referred to as the "first manufacturing method").
  • the optical filter 4 includes a first optical filter 4R, a second optical filter 4B, and a third optical filter 4G.
  • an organic resin material 61 is formed on the optical filter 4.
  • the organic resin material 61 is a colorless and transparent organic resin material.
  • the organic resin material 61 is formed into a film by a spin coating technique.
  • the organic resin material 61 is patterned using a mask 601 (see FIG. 16). After this, the mask 601 is peeled off.
  • When the organic resin material 61 is patterned, the organic resin material 61 remains at positions corresponding to the spaces between the multiple pixels 10, and this remaining organic resin material 61 forms the inter-pixel walls 6.
  • planarization layer 5 that covers the inter-pixel walls 6 is formed (see FIG. 17).
  • the planarization layer 5 is formed by using an organic resin material and a spin coating technique.
  • the planarization layer 5 is uniformly removed from the surface.
  • a dry etching technique is used to remove the planarization layer 5.
  • the planarization layer 5 is removed by an etch-back process.
  • the planarization layer 5 is removed until the surfaces of the inter-pixel walls 6 are exposed. This completes the formation of the inter-pixel walls 6 embedded in the planarization layer 5.
  • the order of forming the planarization layer 5 and the inter-pixel walls 6 in the first manufacturing method is reversed in the manufacturing method for the photodetector 1 according to the second embodiment (hereinafter simply referred to as the “second manufacturing method”).
  • the components other than those described above are the same or substantially the same as the components of the photodetector 1 according to the first embodiment. Furthermore, the steps other than those described above are the same or substantially the same as the steps of the first manufacturing method.
  • the photodetector 1 and manufacturing method according to the second embodiment can provide the same advantageous effects as those provided by the photodetector 1 and manufacturing method according to the first embodiment.
  • FIGS. 18A to 18C are cross-sectional views illustrating steps in a method for manufacturing the photodetector 1 according to the modified example of the second embodiment.
  • the method for manufacturing the photodetector 1 is as follows.
  • an organic resin material 61 is formed on the optical filter 4 in the same manner as in the process shown in FIG. 14 of the second manufacturing method described above.
  • a colorless and transparent organic resin material that can be patterned by photolithography technology is used as the organic resin material 61.
  • the organic resin material 61 is patterned using photolithography technology, and inter-pixel walls 6 are formed from the organic resin material 61.
  • the components other than those described above are the same or substantially the same as the components of the photodetector 1 according to the first embodiment. Furthermore, the steps other than those described above are the same or substantially the same as the steps of the second manufacturing method.
  • According to the manufacturing method of the photodetector 1 of the modified example of the second embodiment, it is possible to obtain the same advantageous effects as those obtained by the second manufacturing method. Furthermore, in the manufacturing method of the photodetector 1 according to the modified example, it is possible to eliminate the step of forming the mask 601, which corresponds to the step of the second manufacturing method shown in FIG. 15. Therefore, it is possible to reduce the number of steps in the manufacturing method of the photodetector 1.
  • FIG. 19 shows an example of a planar configuration of a plurality of pixels 10 arranged in the pixel area PA of the photodetector 1. As shown in FIG. 19, in the photodetector 1 according to the third embodiment, the opening sizes of the inter-pixel walls 6 are not uniform in plan view.
  • the opening size of the inter-pixel wall 6 surrounding the pixel 10 in which the second optical filter 4B of the optical filter 4 is disposed is formed to be large.
  • the opening size of the inter-pixel wall 6 surrounding the pixel 10 in which the first optical filter 4R is disposed is formed small.
  • the opening size of the inter-pixel wall 6 surrounding the pixel 10 in which the third optical filter 4G is disposed is formed to be intermediate between the two sizes.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • the opening sizes of the inter-pixel walls 6 are not uniform. Therefore, the light incident on the pixel 10 can be adjusted appropriately according to the optical filter 4, so that the pixel 10 can obtain the optimal light collection efficiency according to the optical filter 4.
  • the first modified example of the third embodiment describes an example in which the opening shape of the inter-pixel wall 6 in the photodetector 1 according to the first embodiment is changed.
  • FIG. 20 shows an example of a planar configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1.
  • the opening shape of the inter-pixel wall 6 is formed into a polygonal shape in a plan view.
  • The term "polygonal" is used here to mean a polygon having five or more sides.
  • the opening shape of the inter-pixel wall 6 is formed in an octagonal shape.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • the opening shape of the inter-pixel walls 6 is formed in a polygonal shape rather than a rectangular shape. This allows for greater freedom in designing and manufacturing the inter-pixel walls 6.
  • the second modification of the third embodiment describes an example in which the opening shape of the inter-pixel wall 6 in the photodetector 1 according to the first embodiment is changed.
  • FIG. 21 shows an example of a planar configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1.
  • the opening shape of the inter-pixel wall 6 is formed in a circular shape in a plan view.
  • the opening shape of the inter-pixel wall 6 may be formed in an elliptical shape.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • the opening shape of the inter-pixel walls 6 is formed to be circular or elliptical rather than rectangular. This improves the degree of freedom in designing and manufacturing the inter-pixel walls 6.
  • FIG. 22 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in the pixel area PA of the photodetector 1 and the optical black area OB.
  • The thickness of the inter-pixel walls 6 in the same direction is formed to be thinner than the thickness of the planarization layer 5. Even if the inter-pixel walls 6 are disposed in only part of the thickness of the planarization layer 5, it is possible to effectively suppress or prevent the spread of incident light in the surface direction of the planarization layer 5.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • FIG. 23 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • The thickness of the inter-pixel walls 6 in the same direction is greater than the thickness of the planarization layer 5, and the inter-pixel walls 6 protrude from the planarization layer 5 toward the optical lens 7.
  • the inter-pixel walls 6 are disposed on the planarization layer 5 and extend from the planarization layer 5 into a portion of the optical lens 7.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • the inter-pixel walls 6 are formed thick and protrude from the planarization layer 5 toward the optical lens 7. Therefore, even in a portion of the optical lens 7, light leakage between multiple pixels 10 can be effectively suppressed or prevented.
  • the sixth embodiment is an example in which the cross-sectional shape of the inter-pixel walls 6 in the photodetector 1 according to the first embodiment is changed, and is an application example of the photodetector 1 according to the fifth embodiment.
  • FIG. 24 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • the thickness of the inter-pixel walls 6 in the same direction is formed to be thicker than the thickness of the planarization layer 5, protruding from the planarization layer 5 toward the optical lens 7.
  • the inter-pixel walls 6 further penetrate the optical lens 7 and the low refractive index resin film 81, and extend to the anti-reflection film 82.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first or fifth embodiment.
  • the photodetector 1 according to the sixth embodiment can provide the same effects as those provided by the photodetector 1 according to the fifth embodiment.
  • the inter-pixel walls 6 are formed so as to penetrate from the planarization layer 5 through the optical lens 7 and the low-refractive resin film 81, and reach the anti-reflection film 82. Therefore, it is possible to effectively suppress or prevent light leakage between multiple pixels 10 in the path of incident light from the planarization layer 5 to the low-refractive resin film 81.
  • FIG. 25 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • an inter-pixel wall 6 has a sidewall surface (tapered surface) 6S that is inclined with respect to the thickness direction (arrow Z direction) in a side view.
  • the cross-sectional shape of the inter-pixel wall 6 is trapezoidal, with the width dimension of the top surface on the optical lens 7 side being smaller than the width dimension of the bottom surface on the optical filter 4 side.
  • the side wall surface 6S is formed at an inclination angle of, for example, 60 degrees or more and less than 90 degrees with respect to the bottom surface of the inter-pixel wall 6.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • the inter-pixel wall 6 has a side wall surface 6S.
  • the reflection angle of the incident light at the interface between the planarization layer 5 and the side wall surface 6S of the inter-pixel wall 6 is adjusted.
  • the reflected incident light is more easily collected on the photoelectric conversion element 101. Therefore, the efficiency of collecting incident light can be improved in the multiple pixels 10.
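  • As a hedged geometric note on this adjustment (elementary law-of-reflection reasoning, not an analysis taken from this publication): tilting a reflecting surface by an angle $\alpha$ rotates the reflected ray by $2\alpha$, so with the side wall surface 6S inclined between $60^\circ$ and $90^\circ$ to the bottom surface ($\alpha$ up to about $30^\circ$ from vertical), the reflected light can be steered by up to roughly $60^\circ$ relative to a vertical wall, which is the lever used to direct it onto the photoelectric conversion element 101.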
  • the eighth embodiment is an example in which the cross-sectional shape of the inter-pixel walls 6 in the photodetector 1 according to the first embodiment is changed, and is an application example of the photodetector 1 according to the seventh embodiment.
  • FIG. 26 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • an inter-pixel wall 6 has a sidewall surface (tapered surface) 6S that is inclined with respect to the thickness direction (arrow Z direction) in a side view.
  • the cross-sectional shape of the inter-pixel walls 6 has a structure opposite to that of the inter-pixel walls 6 of the photodetector 1 according to the seventh embodiment, and is formed into an inverted trapezoid shape in which the width dimension of the top surface on the optical lens 7 side is larger than the width dimension of the bottom surface on the optical filter 4 side.
  • the amount of etching in the lateral (horizontal) direction is adjusted to form the cross-sectional shape of the opening 5H in the planarization layer 5 into a trapezoidal shape.
  • the inter-pixel wall 6 is embedded in the opening 5H. This forms the inter-pixel wall 6 having sidewall surfaces 6S.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the seventh embodiment.
  • FIG. 27 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • the planarization layer 5 in the pixel area PA is formed into a lens shape that focuses light by curving the surface on the optical lens 7 side toward the optical lens 7 when viewed from the side.
  • The planarization layer 5 is formed in the same manner as in the steps of the first manufacturing method shown in FIGS. 11 and 12. That is, a mask having a lens shape is formed on the planarization layer 5, and the planarization layer 5 is patterned using this mask.
  • the inter-pixel walls 6 are disposed on the planarizing layer 5 at positions corresponding to the spaces between the pixels 10 .
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • the planarization layer 5 is formed in a lens shape. Therefore, when the incident light reaches the planarization layer 5, the incident light is focused at the center of the pixel 10, so that color mixing between multiple pixels 10 can be effectively suppressed or prevented.
  • FIG. 28 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • pixel separating walls 3 are provided on the optical filter 4 at positions corresponding to the spaces between the plurality of pixels 10 .
  • the pixel separation wall 3 is disposed so as to overlap with the inter-pixel wall 6 in a plan view.
  • the pixel separation wall 3 is disposed in a part of the optical filter 4 in the thickness direction, specifically, on the photoelectric conversion element 101 side.
  • the pixel separating wall 3 is formed of a metal film or a metal oxide film, similar to the pixel separating wall 3 of the photodetector 1 according to the first embodiment.
  • the protective layer 2 is not provided between the substrate 100 and the optical filter 4 .
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • a pixel separation wall 3 is disposed on the optical filter 4.
  • the protective layer 2 can be omitted, and the path of the incident light from the optical lens 7 to the photoelectric conversion element 101 is shortened. This makes it possible to more effectively suppress or prevent color mixing between multiple pixels 10.
  • FIG. 29 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • the pixel separation wall 3 of the photodetection device 1 of the 10th embodiment is configured with a laminated structure of two or more layers, including a first pixel separation wall 301 and a second pixel separation wall 302 laminated on the first pixel separation wall 301.
  • the pixel separation wall 3 is disposed on the optical filter 4 at a position corresponding to the gap between the pixels 10.
  • the pixel separation wall 3 is disposed so as to overlap the inter-pixel wall 6 in a planar view.
  • the first pixel separation wall 301 and the second pixel separation wall 302 are each formed to have the same planar shape and are disposed so as to overlap at the same position.
  • the pixel separation wall 3 is disposed over the entire area in the thickness direction of the optical filter 4.
  • the thickness (height) of the pixel separation wall 3 in the same direction is the same as the thickness of the optical filter 4.
  • the first pixel separation wall 301 is formed of a metal film or a metal oxide film, similar to the pixel separation wall 3 of the photodetector 1 according to the tenth embodiment.
  • the second pixel separation wall 302 is formed of one or more materials selected from an inorganic material and an organic resin material having a refractive index lower than that of the optical filter 4.
  • the second pixel separation wall 302 can be formed of a laminated structure of an inorganic material and an organic resin material.
  • the refractive index of the second pixel separation wall 302 is, for example, 1.6 or less.
  • inorganic materials that can be used include SiO2 , silicon nitride (SiN), etc.
  • the organic resin material may also include a filler for adjusting the refractive index.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first or tenth embodiment.
  • the pixel separation wall 3 is disposed across the entire thickness of the optical filter 4. This makes it possible to more effectively suppress or prevent color mixing between multiple pixels 10.
  • FIG. 30 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • a pixel separation wall 3 is arranged on the optical filter 4 at a position corresponding to the gap between multiple pixels 10, similar to the pixel separation wall 3 of the photodetection device 1 of the 11th embodiment.
  • the pixel separation wall 3 is configured with a laminated structure including a first pixel separation wall 301 and a second pixel separation wall 302. The first pixel separation wall 301 is made thinner than the second pixel separation wall 302.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the eleventh embodiment.
  • the thirteenth embodiment describes an example in which the cross-sectional structure of the inter-pixel walls 6 in the photodetector 1 according to the tenth embodiment is changed.
  • FIG. 31 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • the inter-pixel wall 6 on the optical filter 4 side extends in the thickness direction of the optical filter 4, up to the pixel separating wall 3.
  • the inter-pixel wall 6 is in contact with the pixel separating wall 3.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the tenth embodiment.
  • the inter-pixel walls 6 extend all the way to the pixel separation walls 3. This makes it possible to effectively suppress or prevent the incident light from spreading in the planar direction in the planarization layer 5 and the optical filter 4. In other words, it is possible to more effectively suppress or prevent color mixing between multiple pixels 10.
  • FIG. 32 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • the pixel separation wall 3 extends toward the planarization layer 5 side and is disposed across the thickness of the planarization layer 5 .
  • the pixel separation wall 3 is formed of a laminated structure of a first pixel separation wall 301 and a second pixel separation wall 302.
  • the second pixel separation wall 302 is formed of the same material as the inter-pixel wall 6 of the photodetector 1 according to the first embodiment. Therefore, the second pixel separation wall 302 protrudes up to the planarization layer 5, and the portion of the second pixel separation wall 302 disposed within the planarization layer 5 is used as the inter-pixel wall 6.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the twelfth embodiment.
  • the optical lens 7 side of the pixel separation wall 3 protrudes into the planarization layer 5, and a part of this pixel separation wall 3 is used as the inter-pixel wall 6. This allows the number of components to be reduced, and the photodetector 1 can be easily constructed.
  • the fifteenth embodiment describes an example in which the cross-sectional structure of the inter-pixel walls 6 associated with pupil correction is changed in the photodetector 1 according to the first embodiment.
  • FIG. 33 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB. As shown in FIG. 33, pupil correction is performed in the photodetector 1 according to the fifteenth embodiment.
  • the optical lens 7, the inter-pixel wall 6, the optical filter 4, and the pixel separation wall 3 are each continuously shifted toward the central portion within the angle of view.
  • the inter-pixel walls 6 are shifted toward the central part.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • pupil correction is performed as shown in FIG. 33. Therefore, in the photodetection device 1, it is possible to reduce the output variation from the multiple pixels 10 in the pixel area PA, and improve the shading characteristics.
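  • as a purely geometric sketch of the pupil-correction shift described above (the chief ray angle, stack heights, and function name below are assumptions for illustration; actual shift amounts depend on the lens design), the lateral shift of an on-chip element grows with its height above the photoelectric conversion element 101 and with the chief ray angle at that image height:

```python
import math

def pupil_correction_shift_um(chief_ray_angle_deg: float, height_um: float) -> float:
    """Approximate lateral shift needed so that a chief ray arriving at the
    given angle still lands on the pixel center, for an element located at
    the given height above the photoelectric conversion element."""
    return height_um * math.tan(math.radians(chief_ray_angle_deg))

# Toward the edge of the angle of view the chief ray angle increases, so the
# shift increases; elements higher in the stack (e.g. the optical lens) are
# shifted more than lower ones (e.g. the pixel separation wall).
for cra_deg in (0.0, 10.0, 20.0, 30.0):
    print(cra_deg, round(pupil_correction_shift_um(cra_deg, height_um=2.0), 3))
```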
  • the sixteenth embodiment describes an example in which the cross-sectional structures of the optical filter 4, the planarizing layer 5, and the inter-pixel wall 6 in the photodetector 1 according to the first embodiment are changed.
  • FIG. 34 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • the first optical filter 4R, the second optical filter 4B, and the third optical filter 4G of the optical filter 4 are actually rounded at the corners on the optical lens 7 side.
  • a stepped shape is generated on the surface of the optical filter 4, as described above.
  • Such a step shape is alleviated by the planarization layer 5 formed on the optical filter 4.
  • although the thickness of the inter-pixel wall 6 varies locally, the surface of the inter-pixel wall 6 on the optical lens 7 side is ultimately made to coincide with the surface of the planarization layer 5.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • the seventeenth embodiment describes an example in which the cross-sectional structures of the optical filter 4, the planarizing layer 5, and the inter-pixel wall 6 in the photodetector 1 according to the first embodiment are changed.
  • FIG. 35 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • the first optical filter 4R, the second optical filter 4B, and the third optical filter 4G of the optical filter 4 are actually formed to different thicknesses. That is, a stepped shape is generated on the surface of the optical filter 4, as described above. Such a step shape is alleviated by the planarization layer 5 formed on the optical filter 4.
  • although the thickness of the inter-pixel wall 6 varies locally, the surface of the inter-pixel wall 6 on the optical lens 7 side is ultimately made to coincide with the surface of the planarization layer 5.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • FIG. 36 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • a stopper layer 605 is disposed between the optical filter 4 on one side and the planarization layer 5 and the inter-pixel walls 6 on the other.
  • the stopper layer 605 has an etching selectivity different from that of the planarization layer 5 and the inter-pixel walls 6.
  • the stopper layer 605 is used as an etching stopper when patterning the planarization layer 5.
  • the stopper layer 605 is used as an etching stopper when patterning the inter-pixel walls 6.
  • the stopper layer 605 is formed of, for example, one or more materials selected from an inorganic material, a metal oxide material, and an organic resin material containing the metal oxide material as a filler.
  • as inorganic materials, for example, SiO2, SiN, silicon oxynitride (SiON), etc. can be used in practice.
  • as metal oxide materials, for example, titanium oxide (TiO2), tantalum oxide (TaO2), etc. can be used in practice.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • a stopper layer 605 is disposed between the optical filter 4 on one side and the planarization layer 5 and the inter-pixel walls 6 on the other. This makes it possible to effectively suppress or prevent variations in the thickness of the planarization layer 5 and the inter-pixel walls 6.
  • the 19th embodiment is an example in which the photodetector 1 according to the 17th embodiment and the photodetector 1 according to the 18th embodiment are combined, and an example in which a cross-sectional structure optimized for processing the planarization layer 5 and the inter-pixel wall 6 is described.
  • FIG. 37 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • a stopper layer 605 is disposed between the optical filter 4 on one side and the planarization layer 5 and the inter-pixel walls 6 on the other.
  • the optical filter 4 has a stepped shape similarly to the photodetector 1 according to the 17th embodiment. Even in the photodetector 1 configured in this manner, since the stopper layer 605 is provided, variations in the thickness of the planarizing layer 5 and the inter-pixel walls 6 can be effectively suppressed or prevented.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the 17th or 18th embodiment.
  • the twentieth embodiment is an application example of the photodetector 1 according to the eighteenth embodiment, and describes an example in which a cross-sectional structure is changed to an optimum one for processing the planarization layer 5 and the inter-pixel wall 6.
  • FIG. 38 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • a stopper layer 605 is disposed between the optical filter 4 on one side and the planarization layer 5 and the inter-pixel walls 6 on the other, and a stopper layer 606 is further disposed between the planarization layer 5 and the inter-pixel walls 6 on one side and the optical lens 7 on the other.
  • the stopper layer 606 is formed of, for example, the same material as the stopper layer 605.
  • the stopper layer 605 has the same function as the stopper layer 605 of the photodetector 1 according to the eighteenth embodiment.
  • the stopper layer 606 has an etching selectivity different from that of the planarization layer 5.
  • the stopper layer 606 effectively suppresses or prevents over-etching of the planarization layer 5 in the etch-back process of the organic resin material 61. That is, the stopper layer 606 effectively suppresses or prevents variations in the thickness of the planarization layer 5, and as a result, it is possible to effectively suppress or prevent variations in the thickness of the inter-pixel walls 6.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the 18th embodiment.
  • the twenty-first embodiment is an application example of the photodetector 1 according to the twentieth embodiment, and describes an example in which a cross-sectional structure is changed to an optimum one for processing the planarization layer 5 and the inter-pixel wall 6.
  • FIG. 39 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • a stopper layer 606 is provided between the planarization layer 5 and the inter-pixel walls 6 on one side and the optical lens 7 on the other. No stopper layer 605 is provided.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the 20th embodiment.
  • the twenty-second embodiment is an application example of the photodetector 1 according to the first embodiment, and describes an example in which the structure of the inter-pixel walls 6 in the optical black region OB is changed.
  • FIG. 40 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB.
  • the inter-pixel walls 6 are not disposed in the optical black area OB.
  • the inter-pixel walls 6 have a low refractive index and therefore a high reflectance, so limiting where they are disposed reduces reflected light. In other words, flare degradation can be mitigated.
  • inter-pixel walls 6 are arranged in the optical black area OB in the same arrangement layout as the inter-pixel walls 6 arranged in the pixel area (effective pixel area) PA.
  • the inter-pixel walls 6 are sparsely arranged in a part of the optical black area OB.
  • a dummy pixel area (not shown) is disposed, in which dummy pixels are arranged and have the same structure as the pixels 10 in the pixel area PA.
  • in the dummy pixel area as well, inter-pixel walls 6 are disposed in the same arrangement layout as the inter-pixel walls 6 disposed in the pixel area PA.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • inter-pixel walls 6 are arranged in the optical black area OB in the same arrangement layout as the inter-pixel walls 6 arranged in the pixel area PA. The inter-pixel walls 6 are therefore sparsely arranged in the optical black area OB, which effectively suppresses the amount of reflected incident light and mitigates flare degradation.
  • the twenty-third embodiment is an application example of the photodetector 1 according to the twenty-second embodiment, and describes an example in which the structure of the inter-pixel walls 6 in the optical black region OB is changed.
  • FIG. 41 shows an example of a vertical cross-sectional configuration of a plurality of pixels 10 arranged in a pixel area PA of the photodetector 1 and an optical black area OB. As shown in FIG. 41, in the optical black region OB, inter-pixel walls 6 are disposed over almost the entire area.
  • the components other than those described above are the same or substantially the same as the components of the light detection device 1 according to the first embodiment.
  • the twenty-fourth embodiment will describe a first example in which a solid-state imaging device serving as a photodetector 1 is applied to a stacked solid-state imaging device or a back-illuminated solid-state imaging device.
  • FIG. 42 is a diagram showing an overview of an example configuration of a stacked solid-state imaging device to which the technology disclosed herein can be applied.
  • A of FIG. 42 shows an example of the schematic configuration of a non-stacked solid-state imaging device.
  • the solid-state imaging device 23010 has one die (semiconductor substrate) 23011.
  • This die 23011 is equipped with a pixel region 23012 in which pixels are arranged in an array, a control circuit 23013 that drives the pixels and performs various other controls, and a logic circuit 23014 for signal processing.
  • a pixel region 23012 and a control circuit 23013 are mounted on the sensor die 23021, and a logic circuit 23014 including a signal processing circuit that performs signal processing is mounted on the logic die 23024.
  • the sensor die 23021 is equipped with a pixel region 23012, and the logic die 23024 is equipped with a control circuit 23013 and a logic circuit 23014.
  • FIG. 43 is a cross-sectional view showing a first configuration example of a stacked solid-state imaging device 23020.
  • in the sensor die 23021, PDs (photodiodes), FDs (floating diffusions), and Trs (MOSFETs) constituting the pixels that will become the pixel region 23012, as well as Trs that will become the control circuit 23013, are formed.
  • in the sensor die 23021, a wiring layer 23101 having multiple layers (three layers in this example) of wiring 23110 is also formed.
  • the control circuit 23013 (or the Tr that will become the control circuit) can be configured in the logic die 23024, not in the sensor die 23021.
  • Tr constituting the logic circuit 23014 is formed on the logic die 23024. Furthermore, a wiring layer 23161 having multiple layers, three layers in this example, of wiring 23170 is formed on the logic die 23024. Furthermore, a connection hole 23171 having an insulating film 23172 formed on the inner wall surface is formed on the logic die 23024, and a connection conductor 23173 connected to the wiring 23170 etc. is embedded in the connection hole 23171.
  • the sensor die 23021 and the logic die 23024 are bonded together so that their wiring layers 23101 and 23161 face each other, thereby forming a stacked solid-state imaging device 23020 in which the sensor die 23021 and the logic die 23024 are stacked.
  • a film 23191 such as a protective film is formed on the surface where the sensor die 23021 and the logic die 23024 are bonded together.
  • a connection hole 23111 is formed in the sensor die 23021, penetrating the sensor die 23021 from the back side (the side where light is incident on the PD) (upper side) of the sensor die 23021 to reach the top layer wiring 23170 of the logic die 23024. Furthermore, a connection hole 23121 is formed in the sensor die 23021, close to the connection hole 23111, from the back side of the sensor die 23021 to reach the first layer wiring 23110. An insulating film 23112 is formed on the inner wall surface of the connection hole 23111, and an insulating film 23122 is formed on the inner wall surface of the connection hole 23121. Then, connection conductors 23113 and 23123 are embedded in the connection holes 23111 and 23121, respectively.
  • connection conductor 23113 and the connection conductor 23123 are electrically connected on the back side of the sensor die 23021, so that the sensor die 23021 and the logic die 23024 are electrically connected via the wiring layer 23101, the connection hole 23121, the connection hole 23111, and the wiring layer 23161.
  • FIG. 44 is a cross-sectional view showing a second configuration example of a stacked solid-state imaging device 23020.
  • the sensor die 23021 (wiring layer 23101 (wiring 23110)) and the logic die 23024 (wiring layer 23161 (wiring 23170)) are electrically connected by one connection hole 23211 formed in the sensor die 23021.
  • connection hole 23211 is formed so as to penetrate the sensor die 23021 from the back side of the sensor die 23021, reach the wiring 23170 in the top layer of the logic die 23024, and reach the wiring 23110 in the top layer of the sensor die 23021.
  • An insulating film 23212 is formed on the inner wall surface of the connection hole 23211, and a connection conductor 23213 is embedded in the connection hole 23211.
  • in FIG. 43, the sensor die 23021 and the logic die 23024 are electrically connected by the two connection holes 23111 and 23121, whereas in FIG. 44 they are electrically connected by the single connection hole 23211.
  • FIG. 45 is a cross-sectional view showing another configuration example of a stacked solid-state imaging device to which the technology according to the present disclosure can be applied.
  • the solid-state imaging device 23401 has a three-layer stacked structure in which three dies are stacked: a sensor die 23411, a logic die 23412, and a memory die 23413.
  • the memory die 23413 has, for example, a memory circuit that stores data that is temporarily required for signal processing performed by the logic die 23412.
  • the logic die 23412 and memory die 23413 are stacked in that order below the sensor die 23411, but the logic die 23412 and memory die 23413 can be stacked below the sensor die 23411 in the reverse order, i.e., the memory die 23413 and the logic die 23412.
  • the sensor die 23411 is formed with the PD that serves as the photoelectric conversion unit of the pixel and the source/drain region of the pixel Tr.
  • a gate electrode is formed around the PD with a gate insulating film interposed between them, and pixel Tr23421 and pixel Tr23422 are formed by the gate electrode and the paired source/drain regions.
  • the pixel Tr23421 adjacent to the PD is a transfer Tr, and one of the pair of source/drain regions constituting the pixel Tr23421 is an FD.
  • a connection hole is formed in the interlayer insulating film.
  • a connection conductor 23431 that connects to pixel Tr 23421 and pixel Tr 23422 is formed in the connection hole.
  • a wiring layer 23433 having multiple layers of wiring 23432 connected to each connection conductor 23431 is formed on the sensor die 23411.
  • the solid-state imaging device 24400 has a pixel region 24401 in which pixels are arranged in a two-dimensional array.
  • the pixel region 24401 is configured with a total of eight pixels, 2 pixels horizontally and 4 pixels vertically, as shared pixel units 24410, which are arranged in a two-dimensional array.
  • the shared pixel unit 24410 which shares eight pixels (2 pixels horizontally and 4 pixels vertically), has a first light receiving portion 24421 and a second light receiving portion 24422.
  • the first light receiving portion 24421 and the second light receiving portion 24422 are arranged in the vertical direction (y direction) within the shared pixel unit 24410.
  • the first light receiving unit 24421 has PDs 24441-1, 24441-2, 24441-3, and 24441-4 arranged in a matrix of 2 pixels horizontally by 2 pixels vertically, four transfer Trs 24451 provided for the PDs 24441-1 to 24441-4, respectively, and an FD 24452 shared by the PDs 24441-1 to 24441-4.
  • the FD 24452 is disposed at the center of the PDs 24441-1 to 24441-4.
  • the second light receiving unit 24422 has PDs 24441-5, 24441-6, 24441-7, and 24441-8 arranged in a matrix of 2 pixels horizontally by 2 pixels vertically, four transfer Trs 24461 provided for the PDs 24441-5 to 24441-8, respectively, and an FD 24462 shared by the PDs 24441-5 to 24441-8.
  • the FD 24462 is disposed at the center of the PDs 24441-5 to 24441-8.
  • the transfer Tr 24451 has a gate 24451G arranged between the FD 24452 and the PD 24441-i corresponding to that transfer Tr 24451, and operates in response to a voltage applied to the gate 24451G.
  • the transfer Tr 24461 has a gate 24461G arranged between the FD 24462 and the PD 24441-i corresponding to that transfer Tr 24461, and operates in response to a voltage applied to the gate 24461G.
  • the shared pixel unit 24410 has a first Tr group 24423 and a second Tr group 24424.
  • a reset Tr 24452, an amplification Tr 24453, and a selection Tr 24454 are arranged separately as shared Tr shared by the eight pixels of the shared pixel unit 24410.
  • the amplification Tr 24453 and the selection Tr 24454 are arranged in the first Tr group 24423, and the reset Tr 24452 is arranged in the second Tr group 24424.
  • each of the reset Tr 25051, the amplification Tr 24052, and the selection Tr 24053 can be composed of, for example, a plurality of transistors. Also, for example, if the selection Tr24053 is configured with multiple transistors, each of the multiple transistors as the selection Tr24053 can be connected to a separate vertical signal line VSL.
  • the first Tr group 24423 is disposed between the first light receiving unit 24421 and the second light receiving unit 24422.
  • the second Tr group 24424 is disposed in the peripheral region of the second light receiving unit 24422, in the region opposite the side of the second light receiving unit 24422 where the first Tr group 24423 is disposed.
  • the reset Tr 24452, the amplification Tr 24453 and the selection Tr 24454 are each composed of a pair of source/drain regions S/D and a gate G.
  • One of the pair of source/drain regions S/D functions as a source and the other functions as a drain.
  • the pairs of source/drain regions S/D and gates G constituting the reset Tr 24452, the amplification Tr 24453, and the selection Tr 24454 are arranged in the horizontal direction (x direction).
  • the gate G constituting the reset Tr 24452 is arranged in a region that is substantially opposite the PD 24441-8 at the lower right of the second light receiving unit 24422 in the vertical direction (y direction).
  • a first well contact 24431 and a second well contact 24432 are arranged between two shared pixel units 24410 arranged side by side.
  • the first light receiving section 24421, the second light receiving section 24422, the first Tr group 24423, and the second Tr group 24424 are formed in a semiconductor region as a predetermined well region formed in a Si substrate, and the first well contact 24431 and the second well contact 24432 are contacts that electrically connect the predetermined well region to the internal wiring of the solid-state imaging device 24400.
  • the first well contact 24431 is provided between the first Tr groups 24423 of the two shared pixel units 24410 arranged side by side
  • the second well contact 24432 is provided between the second Tr groups 24424 of the two shared pixel units 24410 arranged side by side.
  • each part in the shared pixel unit 24410 is electrically connected so as to satisfy a connection relationship conforming to an equivalent circuit of a four-pixel sharing unit.
  • the twenty-eighth embodiment describes a second example in which the photodetector 1 according to the first embodiment is applied to a shared structure in which a single pixel circuit is shared by a plurality of pixels 10.
  • FIG. 47 is a plan view showing a second example configuration of a solid-state imaging device that shares multiple pixels to which the technology disclosed herein can be applied.
  • the solid-state imaging device 25400 has a pixel region 25401 in which pixels are arranged in a two-dimensional array.
  • the pixel region 25401 is configured such that a total of four pixels, one pixel horizontally and four pixels vertically, are defined as a shared pixel unit 24510, and the shared pixel units 24510 are arranged in a two-dimensional array.
  • the pixel region 25401 has a first well contact 24431 and a second well contact 24432 in addition to the shared pixel unit 24510.
  • the pixel region 25401 is common to the pixel region 24401 of FIG. 46 in that it has a first well contact 24431 and a second well contact 24432.
  • the pixel region 25401 differs from the pixel region 24401 of FIG. 46 in that it has a shared pixel unit 24510 of 1 pixel horizontal by 4 pixels vertical instead of the shared pixel unit 24410 of 2 pixels horizontal by 4 pixels vertical in FIG. 46.
  • the shared pixel unit 24510 has a first light receiving section 24521 and a second light receiving section 24522, and a first Tr group 24423 and a second Tr group 24424.
  • the shared pixel unit 24510 has the same structure as the shared pixel unit 24410 in FIG. 46 in that it has the first Tr group 24423 and the second Tr group 24424.
  • the shared pixel unit 24510 differs from the shared pixel unit 24410 in FIG. 46 in that it has the first light receiving section 24521 and the second light receiving section 24522 instead of the first light receiving section 24421 and the second light receiving section 24422, respectively.
  • the first light receiving unit 24521 has PDs 24441-1 and 24441-3 arranged in a matrix of 1 pixel horizontal by 2 pixels vertical, two transfer Tr's 24451 for each of the PDs 24441-1 and 24441-3 , and an FD 24452.
  • the first light receiving unit 24521 is common to the first light receiving unit 24421 in FIG. 46 in that it has PDs 24441-1 and 24441-3 , two transfer Tr's 24451 for each of the PDs 24441-1 and 24441-3 , and an FD 24452.
  • the first light receiving unit 24521 differs from the first light receiving unit 24421 in FIG. 46 in that it does not have the PDs 24441-2 and 24441-4 and the two transfer Tr's 24451 corresponding to the PDs 24441-2 and 24441-4 , respectively.
  • the second light receiving unit 24522 has PDs 24441-5 and 24441-7 arranged in a matrix of 1 pixel horizontal by 2 pixels vertical, two transfer Tr's 24461 for each of the PDs 24441-5 and 24441-7, and an FD 24462.
  • the second light receiving unit 24522 is common to the second light receiving unit 24422 in FIG. 46 in that it has PDs 24441-5 and 24441-7, two transfer Tr's 24461 for each of the PDs 24441-5 and 24441-7, and an FD 24462.
  • the second light receiving unit 24522 differs from the second light receiving unit 24422 in FIG. 46 in that it does not have the PDs 24441-6 and 24441-8 and the two transfer Tr's 24461 corresponding to the PDs 24441-6 and 24441-8 , respectively.
  • the gate G constituting the reset Tr 24452 is disposed in a region substantially facing the left side of the PD 24441-7 of the second light receiving portion 24522 in the vertical direction (y direction).
  • each part in the shared pixel unit 24510 is electrically connected so that the connection relationship conforms to the equivalent circuit of a four-pixel shared pixel.
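  • the sharing relationship can be summarized with a small behavioral sketch (a toy model only, not the actual circuit, timing, or signal chain of this disclosure): each PD accumulates charge, its transfer Tr moves that charge to the shared FD, and the shared readout transistors convert the FD charge into a signal.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SharedPixelUnit:
    """Toy model of N photodiodes (PDs) sharing one floating diffusion (FD)."""
    n_pixels: int = 4
    pd_charge: List[float] = field(default_factory=list)
    fd_charge: float = 0.0

    def __post_init__(self):
        self.pd_charge = [0.0] * self.n_pixels

    def expose(self, photons_per_pd: List[float]):
        for i, p in enumerate(photons_per_pd):
            self.pd_charge[i] += p            # each PD accumulates charge

    def transfer(self, index: int):
        self.fd_charge += self.pd_charge[index]   # transfer Tr moves charge to the shared FD
        self.pd_charge[index] = 0.0

    def read_and_reset(self) -> float:
        signal = self.fd_charge                   # amplification/selection Tr read out the FD
        self.fd_charge = 0.0                      # reset Tr clears the FD for the next pixel
        return signal

unit = SharedPixelUnit(n_pixels=4)
unit.expose([100.0, 120.0, 90.0, 110.0])
for i in range(4):                                # pixels are read out one at a time
    unit.transfer(i)
    print(f"pixel {i}: {unit.read_and_reset()}")
```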
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 48 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology disclosed herein can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
  • Also shown as functional components of the integrated control unit 12050 are a microcomputer 12051, an audio/video output unit 12052, and an in-vehicle network I/F (Interface) 12053.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • the drive system control unit 12010 functions as a control device for a drive force generating device for generating the drive force of the vehicle, such as an internal combustion engine or a drive motor, a drive force transmission mechanism for transmitting the drive force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating a braking force for the vehicle.
  • the body system control unit 12020 controls the operation of various devices installed in the vehicle body according to various programs.
  • the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, tail lamps, brake lamps, turn signals, and fog lamps.
  • for example, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 accepts the input of these radio waves or signals and controls the vehicle's door lock device, power window device, lamps, etc.
  • the outside-vehicle information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image capturing unit 12031 is connected to the outside-vehicle information detection unit 12030.
  • the outside-vehicle information detection unit 12030 causes the image capturing unit 12031 to capture images outside the vehicle and receives the captured images.
  • the outside-vehicle information detection unit 12030 may perform object detection processing or distance detection processing for people, cars, obstacles, signs, characters on the road surface, etc. based on the received images.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of light received.
  • the imaging unit 12031 can output the electrical signal as an image, or as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light, or may be invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects information inside the vehicle.
  • a driver state detection unit 12041 that detects the state of the driver is connected to the in-vehicle information detection unit 12040.
  • the driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and the in-vehicle information detection unit 12040 may calculate the driver's degree of fatigue or concentration based on the detection information input from the driver state detection unit 12041, or may determine whether the driver is dozing off.
  • the microcomputer 12051 can calculate control target values for the driving force generating device, steering mechanism, or braking device based on information inside and outside the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040, and output control commands to the drive system control unit 12010.
  • the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following driving based on the distance between vehicles, maintaining vehicle speed, vehicle collision warning, or vehicle lane departure warning.
  • the microcomputer 12051 can also perform cooperative control for the purpose of autonomous driving, which allows the vehicle to travel autonomously without relying on the driver's operation, by controlling the driving force generating device, steering mechanism, braking device, etc. based on information about the surroundings of the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040.
  • the microcomputer 12051 can also output control commands to the body system control unit 12020 based on information outside the vehicle acquired by the outside information detection unit 12030. For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the outside information detection unit 12030, and perform cooperative control aimed at preventing glare, such as switching from high beams to low beams.
  • the audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the occupants of the vehicle or the outside of the vehicle of information.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
  • FIG. 49 shows an example of the installation position of the imaging unit 12031.
  • the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at the front nose, side mirrors, rear bumper, back door, and upper part of the windshield inside the vehicle cabin of the vehicle 12100.
  • the imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the upper part of the windshield inside the vehicle cabin mainly acquire images of the front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the sides of the vehicle 12100.
  • the imaging unit 12104 provided at the rear bumper or back door mainly acquires images of the rear of the vehicle 12100.
  • the imaging unit 12105 provided at the upper part of the windshield inside the vehicle cabin is mainly used to detect leading vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, etc.
  • FIG. 49 shows an example of the imaging ranges of the imaging units 12101 to 12104.
  • Imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose
  • imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively
  • imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door.
  • an overhead image of the vehicle 12100 viewed from above is obtained by superimposing the image data captured by the imaging units 12101 to 12104.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera consisting of multiple imaging elements, or an imaging element having pixels for phase difference detection.
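  • for the stereo-camera option mentioned above, distance can be recovered from disparity with the standard pinhole-camera relation; the sketch below is a textbook illustration, and the focal length and baseline values are assumptions, not parameters of the system described here.

```python
def stereo_distance_m(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair (f in pixels, B in meters)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_length_px * baseline_m / disparity_px

# Assumed example: 1000 px focal length, 0.3 m baseline, 12 px disparity -> 25 m.
print(stereo_distance_m(disparity_px=12.0, focal_length_px=1000.0, baseline_m=0.3))
```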
  • the microcomputer 12051 can obtain the distance to each solid object within the imaging ranges 12111 to 12114 and the change in this distance over time (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can extract as a preceding vehicle, in particular, the closest solid object on the path of the vehicle 12100 that is traveling in approximately the same direction as the vehicle 12100 at a predetermined speed (e.g., 0 km/h or faster). Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be maintained from the preceding vehicle, and perform automatic braking control (including follow-up stop control) and automatic acceleration control (including follow-up start control). In this way, cooperative control can be performed for the purpose of automatic driving, in which the vehicle runs autonomously without relying on the driver's operation.
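  • a minimal sketch of the preceding-vehicle selection and follow-up decision described above is shown below; the data fields, thresholds, and control outputs are assumptions for illustration only and are not part of this disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TrackedObject:
    distance_m: float          # distance obtained from the imaging units
    speed_kmh: float           # estimated speed of the object itself
    on_own_path: bool          # lies on the travel path of the vehicle 12100
    heading_delta_deg: float   # deviation from the own vehicle's direction

def select_preceding_vehicle(objects: List[TrackedObject],
                             min_speed_kmh: float = 0.0) -> Optional[TrackedObject]:
    """Pick the closest object on the own path travelling in roughly the same
    direction at or above a predetermined speed (e.g. 0 km/h or faster)."""
    candidates = [o for o in objects
                  if o.on_own_path
                  and abs(o.heading_delta_deg) < 15.0   # "approximately the same direction"
                  and o.speed_kmh >= min_speed_kmh]
    return min(candidates, key=lambda o: o.distance_m, default=None)

def follow_command(preceding: Optional[TrackedObject], gap_target_m: float = 30.0) -> str:
    """Very rough follow-up decision against a preset inter-vehicle distance."""
    if preceding is None:
        return "keep_speed"
    return "brake" if preceding.distance_m < gap_target_m else "accelerate"

objs = [TrackedObject(45.0, 60.0, True, 3.0), TrackedObject(25.0, 0.0, False, 1.0)]
print(follow_command(select_preceding_vehicle(objs)))   # -> "accelerate"
```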
  • the microcomputer 12051 determines the collision risk, which indicates the risk of collision with each obstacle, and when the collision risk is equal to or exceeds a set value and there is a possibility of a collision, it can provide driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the image captured by the imaging units 12101 to 12104. The recognition of such a pedestrian is performed, for example, by a procedure of extracting feature points in the image captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points that indicate the contour of an object to determine whether or not it is a pedestrian.
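  • as a loose analogue of the feature-extraction and pattern-matching procedure described above, the sketch below uses OpenCV's stock HOG people detector; it is for illustration only and is not the detection algorithm referred to in this disclosure.

```python
import cv2

def detect_pedestrians(image_path: str):
    """Extract gradient-based features (HOG) from the image and match them
    against a pretrained person pattern, returning candidate bounding boxes."""
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    boxes, weights = hog.detectMultiScale(img, winStride=(8, 8), scale=1.05)
    return img, list(zip(boxes, weights))

# Example usage: draw a rectangular contour line on each recognized pedestrian,
# mirroring the emphasis overlay described for the display unit 12062.
# img, detections = detect_pedestrians("frame.png")
# for (x, y, w, h), score in detections:
#     cv2.rectangle(img, (x, y), (x + w, y + h), (0, 0, 255), 2)
```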
  • the audio/image output unit 12052 controls the display unit 12062 to superimpose a rectangular contour line for emphasis on the recognized pedestrian.
  • the audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
  • the technology disclosed herein can be applied to the imaging unit 12031.
  • this makes it possible to effectively suppress or prevent light leakage to adjacent pixels.
  • the technology according to the present disclosure may be applied to various products.
  • the technology according to the present disclosure may be applied to an endoscopic surgery system.
  • FIG. 50 is a block diagram showing an example of the general configuration of a system for acquiring internal patient information using a capsule endoscope to which the technology disclosed herein can be applied.
  • the internal body information acquisition system 10001 is composed of a capsule endoscope 10100 and an external control device 10200.
  • the capsule endoscope 10100 is swallowed by the patient during the examination.
  • the capsule endoscope 10100 has an imaging function and a wireless communication function, and while moving through organs such as the stomach and intestines by peristalsis or the like until it is naturally expelled from the patient, it sequentially captures images of the inside of the organs (hereinafter also referred to as in-vivo images) at predetermined intervals, and sequentially wirelessly transmits information about the in-vivo images to the external control device 10200 outside the body.
  • the external control device 10200 comprehensively controls the operation of the in-vivo information acquisition system 10001.
  • the external control device 10200 receives information about the in-vivo images transmitted from the capsule endoscope 10100, and generates image data for displaying the in-vivo images on a display device (not shown) based on the received information about the in-vivo images.
  • the in-vivo information acquisition system 10001 can obtain in-vivo images capturing the state inside the patient's body at any time from the time the capsule endoscope 10100 is swallowed to the time it is expelled.
  • the capsule endoscope 10100 has a capsule-shaped housing 10101, which contains a light source unit 10111, an imaging unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117.
  • the light source unit 10111 is composed of a light source such as an LED (light emitting diode) and irradiates light onto the imaging field of view of the imaging unit 10112.
  • the imaging unit 10112 is composed of an imaging element and an optical system consisting of multiple lenses provided in front of the imaging element. Reflected light (hereinafter referred to as observation light) of light irradiated onto the body tissue to be observed is collected by the optical system and enters the imaging element. In the imaging unit 10112, the imaging element photoelectrically converts the observation light that is incident thereon, and an image signal corresponding to the observation light is generated. The image signal generated by the imaging unit 10112 is provided to the image processing unit 10113.
  • the image processing unit 10113 is composed of processors such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit), and performs various signal processing on the image signal generated by the imaging unit 10112.
  • the image processing unit 10113 provides the image signal that has been subjected to signal processing to the wireless communication unit 10114 as RAW data.
  • the wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal that has been subjected to signal processing by the image processing unit 10113, and transmits the image signal to the external control device 10200 via the antenna 10114A.
  • the wireless communication unit 10114 also receives a control signal related to the drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A.
  • the wireless communication unit 10114 provides the control signal received from the external control device 10200 to the control unit 10117.
  • the power feeding unit 10115 is composed of an antenna coil for receiving power, a power regeneration circuit that regenerates power from the current generated in the antenna coil, and a boost circuit. In the power feeding unit 10115, power is generated using the principle of so-called non-contact charging.
  • the power supply unit 10116 is composed of a secondary battery, and stores the power generated by the power feeding unit 10115.
  • in FIG. 50, to avoid cluttering the drawing, arrows and other indications showing the destinations of the power supplied from the power supply unit 10116 are omitted; the power stored in the power supply unit 10116 is supplied to the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and can be used to drive these units.
  • the control unit 10117 is configured with a processor such as a CPU, and appropriately controls the driving of the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 in accordance with control signals transmitted from the external control device 10200.
  • the external control device 10200 is composed of a processor such as a CPU or a GPU, or a microcomputer or a control board in which a processor and a storage element such as a memory are mounted together.
  • the external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via the antenna 10200A.
  • for example, the light irradiation conditions for the observation target in the light source unit 10111 can be changed by a control signal from the external control device 10200.
  • the imaging conditions (for example, the frame rate and exposure value in the imaging unit 10112) can also be changed by a control signal from the external control device 10200.
  • the contents of the processing in the image processing unit 10113 and the conditions for the wireless communication unit 10114 to transmit an image signal may be changed by the control signal from the external control device 10200.
  • the external control device 10200 also performs various image processing on the image signal transmitted from the capsule endoscope 10100 to generate image data for displaying the captured in-vivo image on a display device.
  • the image processing can include various signal processing such as development processing (demosaic processing), high image quality processing (band enhancement processing, super-resolution processing, NR (Noise reduction) processing, and/or image stabilization processing, etc.), and/or enlargement processing (electronic zoom processing).
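  • a minimal sketch of such a processing chain (development/demosaic, noise reduction, and electronic zoom) is shown below using OpenCV; the Bayer layout, parameter values, and ordering are assumptions for illustration and do not represent the actual processing performed by the external control device 10200.

```python
import cv2
import numpy as np

def develop(raw_bayer: np.ndarray, zoom: float = 1.5) -> np.ndarray:
    """raw_bayer: single-channel 8-bit RAW frame with an assumed Bayer layout."""
    # Development processing (demosaic): reconstruct a color image from RAW data.
    bgr = cv2.cvtColor(raw_bayer, cv2.COLOR_BayerBG2BGR)
    # High-image-quality processing: a simple noise reduction (NR) step.
    denoised = cv2.fastNlMeansDenoisingColored(bgr, None, 5, 5, 7, 21)
    # Enlargement processing (electronic zoom): crop the center and resize back.
    h, w = denoised.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = denoised[y0:y0 + ch, x0:x0 + cw]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)

# Example with a synthetic RAW frame:
raw = (np.random.rand(480, 640) * 255).astype(np.uint8)
print(develop(raw).shape)  # (480, 640, 3)
```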
  • the external control device 10200 controls the driving of the display device to display the captured in-vivo image based on the generated image data.
  • the external control device 10200 may record the generated image data in a recording device (not shown) or print it out on a printing device (not shown).
  • the above describes an example of an in-vivo information acquisition system to which the technology disclosed herein can be applied.
  • the technology disclosed herein can be applied to the imaging unit 10112 of the configuration described above.
  • this makes it possible to effectively suppress or prevent light leakage to adjacent pixels.
  • the photodetection device includes a plurality of pixels arranged two-dimensionally, an optical filter, an optical lens, a planarization layer, and inter-pixel walls.
  • the optical filter is disposed at a position corresponding to each of the plurality of pixels.
  • the optical lens is laminated on the optical filter.
  • the planarization layer is disposed between the optical filter and the optical lens to reduce a step shape of the optical filter.
  • the inter-pixel walls are disposed across the planarization layer in the thickness direction at positions corresponding to the spaces between the plurality of pixels, and have a refractive index lower than that of the planarization layer.
  • the planarizing layer can effectively suppress or prevent the incident light from spreading in the planar direction, and can effectively suppress or prevent light leakage to adjacent pixels.
  • the refractive indices of the optical lens, the planarization layer, and the inter-pixel wall satisfy the relational expression optical lens ⁇ planarization layer > inter-pixel wall.
  • the planarization layer can more effectively suppress or prevent the spread of incident light in the planar direction, and can effectively suppress or prevent light leakage into adjacent pixels.
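  • a small sanity-check sketch of this refractive-index ordering is shown below; the example values are assumptions for illustration, and the disclosure only requires the inequality itself.

```python
def satisfies_index_relation(n_lens: float, n_planarization: float, n_wall: float) -> bool:
    """Check the relation: optical lens >= planarization layer > inter-pixel wall."""
    return n_lens >= n_planarization > n_wall

# Example values consistent with the ranges mentioned elsewhere in this document
# (inorganic lens >= 1.8, organic planarization layer <= 1.8); the inter-pixel
# wall value is assumed for illustration.
print(satisfies_index_relation(n_lens=1.9, n_planarization=1.6, n_wall=1.3))  # True
print(satisfies_index_relation(n_lens=1.5, n_planarization=1.6, n_wall=1.3))  # False
```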
  • in the photodetection device according to the first embodiment, the thickness of the inter-pixel wall in the thickness direction is the same as the thickness of the planarization layer, thinner than it, or thicker than it so that the inter-pixel wall protrudes toward the optical lens side.
  • the flattening layer can appropriately adjust the spread of incident light in the planar direction.
  • the inter-pixel wall has a sidewall surface that is inclined with respect to the thickness direction in a side view.
  • the reflection angle of the incident light at the interface between the planarization layer and the sidewall surface of the inter-pixel wall is adjusted at the position corresponding to the gap between the pixels, so that the reflected incident light is more easily collected, thereby improving the light collection efficiency.
  • the surface of the planarization layer on the optical lens side is flatter than the surface on the optical filter side.
  • the planarizing layer can reduce the step shape of the optical filter.
  • a glass plate is disposed on the opposite side of the optical lens to the optical filter in the photodetector according to the first embodiment, with a sealing resin layer interposed therebetween.
  • the present technology has the following configurations. According to the present technology having the following configurations, it is possible to provide a light detection device that can effectively suppress or prevent incident light from spreading in the surface direction within the planarization layer, and can effectively suppress or prevent light leakage to adjacent pixels.
  • (1) A plurality of pixels arranged two-dimensionally, an optical filter disposed at a position corresponding to each of the plurality of pixels; an optical lens laminated on the optical filter; a planarization layer disposed between the optical filter and the optical lens to reduce a step shape of the optical filter; and inter-pixel walls disposed in the planarization layer in a thickness direction at positions corresponding to spaces between the pixels, the inter-pixel walls having a refractive index lower than that of the planarization layer.
  • the light detection device according to (1), wherein the optical lens is formed of an inorganic material having a refractive index of 1.8 or more, and the planarization layer is formed of an organic resin material having a refractive index of 1.8 or less.
  • the photodetector according to (1) or (2), wherein the refractive indexes of the optical lens, the planarization layer, and the inter-pixel wall satisfy the relational expression: optical lens ≥ planarization layer > inter-pixel wall.
  • the inter-pixel walls are formed of a colorless and transparent organic resin material
  • the planarization layer is formed into a lens shape that focuses light by curving the surface of the planarization layer toward the optical lens when viewed from the side.
  • a plurality of photoelectric conversion elements that convert light into electric charges are disposed at positions corresponding to the plurality of pixels on an opposite side of the optical filter from the optical lens;
  • the plurality of pixels constitute a pixel area;
  • the photodetector according to any one of (1) to (14), wherein a surface of the planarization layer facing the optical lens is flatter than a surface of the planarization layer facing the optical filter.
  • the light detection device according to any one of (1) to (15), wherein the optical filter includes a plurality of types of color filters that transmit different wavelength ranges of light, and the color filters are formed to have different thicknesses for each type.
  • the plurality of pixels constitute an effective pixel area, a dummy pixel area, and an optical black area;
  • the photodetection device according to any one of (1) to (18), wherein among the plurality of pixels, two or more adjacent pixels share one floating diffusion and are electrically connected to a pixel circuit.
  • the photodetector according to any one of (1) to (19), wherein the plurality of pixels constitute a back-illuminated solid-state imaging device or a stacked solid-state imaging device.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

This light detection device is provided with a plurality of pixels arranged two-dimensionally, and comprises: optical filters disposed at positions respectively corresponding to the plurality of pixels; optical lenses stacked on the optical filters; a planarization layer which is disposed between the optical filters and the optical lenses and reduces the stepped shapes of the optical filters; and inter-pixel walls which are disposed in the thickness direction in the planarization layer at positions corresponding to the plurality of pixels, and which have a refractive index lower than that of the planarization layer.
PCT/JP2024/031423 2023-10-19 2024-09-02 Dispositif de détection de lumière Pending WO2025084019A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023-179968 2023-10-19
JP2023179968 2023-10-19

Publications (1)

Publication Number Publication Date
WO2025084019A1 (fr)

Family

ID=95448884

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/031423 Pending WO2025084019A1 (fr) 2023-10-19 2024-09-02 Dispositif de détection de lumière

Country Status (1)

Country Link
WO (1) WO2025084019A1 (fr)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005294647A (ja) * 2004-04-01 2005-10-20 Matsushita Electric Ind Co Ltd Solid-state imaging device and manufacturing method thereof
JP2006196553A (ja) * 2005-01-11 2006-07-27 Matsushita Electric Ind Co Ltd Solid-state imaging device
JP2006295125A (ja) * 2005-01-18 2006-10-26 Matsushita Electric Ind Co Ltd Solid-state imaging device, manufacturing method thereof, and camera
JP2011176715A (ja) * 2010-02-25 2011-09-08 Nikon Corp Back-illuminated imaging element and imaging device
JP2012084608A (ja) * 2010-10-07 2012-04-26 Sony Corp Solid-state imaging device, manufacturing method thereof, and electronic apparatus
WO2013179972A1 (fr) * 2012-05-30 2013-12-05 Sony Corporation Image sensor element, image sensor device, and manufacturing device and method
JP2015002340A (ja) * 2013-06-18 2015-01-05 Canon Inc. Solid-state imaging device and manufacturing method thereof
WO2018043654A1 (fr) * 2016-09-02 2018-03-08 Sony Semiconductor Solutions Corporation Solid-state imaging device, manufacturing method therefor, and electronic apparatus
WO2019093135A1 (fr) * 2017-11-08 2019-05-16 Sony Semiconductor Solutions Corporation Image capture element, manufacturing method therefor, and electronic apparatus
JP2019140252A (ja) * 2018-02-09 2019-08-22 Canon Inc. Photoelectric conversion device and apparatus
WO2022209327A1 (fr) * 2021-03-30 2022-10-06 Sony Semiconductor Solutions Corporation Imaging device
WO2023068172A1 (fr) * 2021-10-20 2023-04-27 Sony Semiconductor Solutions Corporation Imaging device
