
WO2025121422A1 - Light detection device - Google Patents


Info

Publication number
WO2025121422A1
WO2025121422A1 (PCT/JP2024/043277)
Authority
WO
WIPO (PCT)
Prior art keywords
light
layer
structures
pixel
section
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/043277
Other languages
English (en)
Japanese (ja)
Inventor
雄介 守屋
賢太 長谷川
結以 高田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Semiconductor Solutions Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/JP2024/036675 external-priority patent/WO2025121000A1/fr
Application filed by Sony Semiconductor Solutions Corp filed Critical Sony Semiconductor Solutions Corp
Publication of WO2025121422A1 publication Critical patent/WO2025121422A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F39/00 Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F39/10 Integrated devices
    • H10F39/12 Image sensors

Definitions

  • This disclosure relates to a photodetector having a wavelength separation structure.
  • Patent Document 1 discloses an image sensor that has a light-shielding film above the pixels included in an optical black area located outside the pixel array.
  • the photodetector includes: a semiconductor substrate having a first surface and a second surface facing each other, and having a pixel portion in which a plurality of pixels are arranged in an array and a peripheral portion provided around the pixel portion; a light guide portion provided on the first surface side of the semiconductor substrate across the pixel portion and the peripheral portion, the light guide portion including a plurality of first structures each having a size equal to or smaller than the wavelength of the incident light, and a medium having a refractive index different from that of the first structures and filling the spaces between the first structures; a photoelectric conversion portion formed in the semiconductor substrate for each of the pixels and performing photoelectric conversion of the light incident through the light guide portion; a light absorption layer provided in the peripheral portion on the surface of the light guide portion opposite the semiconductor substrate; and a protective film covering the surface of the light absorption layer.
  • a semiconductor substrate has a pixel section in which a plurality of pixels are arranged in an array, and a peripheral section provided around the pixel section.
  • on the first surface side, which is the light incident surface, a light guide section is provided across the pixel section and the peripheral section; the light guide section includes a plurality of first structures each having a size equal to or smaller than the wavelength of the incident light, and a medium having a refractive index different from that of the first structures, the medium filling the spaces between adjacent first structures.
  • a light absorbing layer whose surface is covered with a protective film is provided on the light guide section in the peripheral section. This suppresses surface reflection in the peripheral section.
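  • The antireflection behavior of such a subwavelength (structure size ≤ wavelength) arrangement can be illustrated with a zeroth-order effective-medium estimate. The sketch below is not part of the disclosure; the refractive indices and fill factor are hypothetical values chosen only to show the principle.

```python
# Effective-medium sketch: an array of subwavelength structures behaves like a
# layer whose refractive index is a fill-factor-weighted mix of the structure
# material and the surrounding medium. All values are illustrative.

def effective_index(n_structure: float, n_medium: float, fill: float) -> float:
    """Zeroth-order effective index of a subwavelength mixture (volume average)."""
    return fill * n_structure + (1.0 - fill) * n_medium

def reflectance(n1: float, n2: float) -> float:
    """Fresnel power reflectance at normal incidence between two media."""
    r = (n1 - n2) / (n1 + n2)
    return r * r

n_air, n_si = 1.0, 3.5                          # hypothetical: air over silicon
n_eff = effective_index(n_si, n_air, fill=0.5)  # intermediate-index step
# Two gentler index steps (air -> n_eff -> Si) reflect less than one abrupt step.
r_abrupt = reflectance(n_air, n_si)
r_graded = reflectance(n_air, n_eff) + reflectance(n_eff, n_si)
print(r_graded < r_abrupt)  # True: the graded transition suppresses reflection
```

This volume-average model is only the crudest approximation of a wavelength separation structure, but it captures why filling the gaps between the first structures with a lower-index medium reduces surface reflection.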
  • FIG. 1 is a schematic cross-sectional view illustrating an example of a configuration of a photodetector according to a first embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing an example of a schematic configuration of the photodetector shown in FIG. 1.
  • FIG. 3 is a schematic diagram showing an example of a pixel portion of the photodetector shown in FIG. 1 and its surroundings.
  • FIG. 4 is a diagram illustrating an example of a circuit configuration of a unit pixel of the photodetector shown in FIG. 1.
  • FIG. 5A is a schematic cross-sectional view illustrating an example of a manufacturing process for the photodetector shown in FIG. 1.
  • FIG. 5B is a schematic cross-sectional view showing a step subsequent to FIG. 5A.
  • FIG. 5C is a schematic cross-sectional view showing a step subsequent to FIG. 5B.
  • FIG. 5D is a schematic cross-sectional view showing a step subsequent to FIG. 5C.
  • FIG. 5E is a schematic cross-sectional view showing a step subsequent to FIG. 5D.
  • FIG. 5F is a schematic cross-sectional view showing a step subsequent to FIG. 5E.
  • FIG. 5G is a schematic cross-sectional view showing a step subsequent to FIG. 5F.
  • FIG. 5H is a schematic cross-sectional view showing a step subsequent to FIG. 5G.
  • FIG. 5I is a schematic cross-sectional view showing a step subsequent to FIG. 5H.
  • FIG. 5J is a schematic cross-sectional view showing a step subsequent to FIG. 5I.
  • FIG. 5K is a schematic cross-sectional view showing a step subsequent to FIG. 5J.
  • FIG. 5L is a schematic cross-sectional view showing a step subsequent to FIG. 5K.
  • FIG. 6 is a schematic cross-sectional view illustrating an example of a configuration of a light detection device according to the first modification of the present disclosure.
  • FIG. 7 is a schematic cross-sectional view illustrating an example of a configuration of a light detection device according to Modification 2 of the present disclosure.
  • FIG. 8 is a schematic cross-sectional view illustrating an example of a configuration of a light detection device according to Modification 3 of the present disclosure.
  • FIG. 9 is a schematic cross-sectional view illustrating an example of a configuration of a light detection device according to Modification 4 of the present disclosure.
  • FIG. 10 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to the fourth modification of the present disclosure.
  • FIG. 11 is a schematic cross-sectional view illustrating an example of a configuration of a light detection device according to Modification 5 of the present disclosure.
  • FIG. 12A is a schematic cross-sectional view illustrating an example of a manufacturing process for the photodetector shown in FIG. 11.
  • FIG. 12B is a schematic cross-sectional view showing a step subsequent to FIG. 12A.
  • FIG. 12C is a schematic cross-sectional view showing a step subsequent to FIG. 12B.
  • FIG. 13 is a schematic cross-sectional view illustrating an example of a configuration of a light detection device according to the sixth modification of the present disclosure.
  • FIG. 14A is a schematic cross-sectional view illustrating an example of a manufacturing process for the photodetector shown in FIG. 13.
  • FIG. 14B is a schematic cross-sectional view showing a step subsequent to FIG. 14A.
  • FIG. 14C is a schematic cross-sectional view showing a step subsequent to FIG. 14B.
  • FIG. 15 is a schematic cross-sectional view illustrating an example of a configuration of a light detection device according to Modification 7 of the present disclosure.
  • FIG. 16 is a schematic cross-sectional view illustrating an example of the configuration of a light detection device according to Modification 8 of this disclosure.
  • FIG. 17 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to Modification 8 of this disclosure.
  • FIG. 18 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to the eighth modification of this disclosure.
  • FIG. 19 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to the eighth modification of this disclosure.
  • FIG. 20 is a functional block diagram showing an example of an electronic device (camera) using the light detection device shown in FIG. 1.
  • FIG. 21A is a schematic diagram showing an example of the overall configuration of a light detection system using the light detection device shown in FIG. 1.
  • FIG. 21B is a diagram illustrating an example of a circuit configuration of the light detection system illustrated in FIG. 21A.
  • FIG. 22 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.
  • FIG. 23 is a block diagram showing an example of the functional configuration of the camera head and the CCU.
  • FIG. 24 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 25 is an explanatory diagram showing an example of the installation positions of the outside-of-vehicle information detection unit and the imaging unit.
  • FIG. 26 is a schematic cross-sectional view illustrating an example of a configuration of a photodetector according to the second embodiment of the present disclosure.
  • FIG. 27 is a schematic plan view showing an example of the layout of a plurality of structures in the pixel portion and the peripheral portion of the photodetector shown in FIG. 26.
  • FIG. 28A is a schematic cross-sectional view illustrating an example of a manufacturing process for the photodetector shown in FIG. 26.
  • FIG. 28B is a schematic cross-sectional view showing a step subsequent to FIG. 28A.
  • FIG. 28C is a schematic cross-sectional view showing a step subsequent to FIG. 28B.
  • FIG. 28D is a schematic cross-sectional view showing a step subsequent to FIG. 28C.
  • FIG. 28E is a schematic cross-sectional view showing a step subsequent to FIG. 28D.
  • FIG. 28F is a schematic cross-sectional view showing a step subsequent to FIG. 28E.
  • FIG. 28G is a schematic cross-sectional view showing a step subsequent to FIG. 28F.
  • FIG. 29 is a schematic cross-sectional view illustrating an example of a configuration of a light detection device according to Modification 9 of the present disclosure.
  • FIG. 30 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to Modification 9 of the present disclosure.
  • FIG. 31 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to Modification 9 of the present disclosure.
  • FIG. 32 is a schematic cross-sectional view illustrating an example of the configuration of a light detection device according to Modification 10 of this disclosure.
  • FIG. 33 is a schematic plan view showing an example of the layout of a plurality of structures in the pixel portion and the peripheral portion of the photodetector shown in FIG. 32.
  • FIG. 34 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to the tenth modification of this disclosure.
  • FIG. 35 is a schematic plan view showing an example of the layout of a plurality of structures in the pixel portion and the peripheral portion of the photodetector shown in FIG. 34.
  • FIG. 36 is a schematic cross-sectional view illustrating an example of the configuration of a light detection device according to Modification 11 of the present disclosure.
  • FIG. 37 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to the eleventh modification of the present disclosure.
  • FIG. 38 is a schematic cross-sectional view illustrating an example of the configuration of a light detection device according to Modification 12 of the present disclosure.
  • FIG. 39 is a schematic cross-sectional view illustrating another example of the configuration of a photodetector according to the twelfth modification of the present disclosure.
  • 2-4. Modification 4 (another example of the configuration of the light detection device)
  • 2-5. Modification 5 (another example of the configuration of the light detection device)
  • 2-6. Modification 6 (another example of the configuration of the light detection device)
  • 2-7. Modification 7 (another example of the configuration of the light detection device)
  • 2-8. Modification 8 (another example of the configuration of the light detection device)
  • 3. Second embodiment (an example of a photodetector having a plurality of structures having light absorption properties in a peripheral optical layer)
  • 4. Modifications
  • 4-1. Modification 9 (another example of the configuration of the light detection device)
  • 4-2. Modification 10 (another example of the configuration of the photodetector)
  • 4-3. Modification 11 (another example of the configuration of the light detection device)
  • 4-4. Modification 12 (another example of the configuration of the light detection device)
  • 5. Application examples
  • 6. Application examples
  • FIG. 1 is a schematic diagram showing an example of a cross-sectional configuration of a photodetector (photodetector 1) according to a first embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing an example of a schematic configuration of the photodetector 1 shown in FIG. 1.
  • FIG. 3 is a schematic diagram showing an example of a pixel unit and its surrounding configuration of the photodetector 1 shown in FIG. 1.
  • the photodetector 1 is applicable to, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor used in electronic devices such as digital still cameras and video cameras, and has a pixel unit (pixel unit 100A) in which a plurality of pixels are two-dimensionally arranged in a matrix as an imaging area.
  • the photodetector 1 is, for example, a so-called back-illuminated photodetector in this CMOS image sensor.
  • the photodetector 1 captures incident light (image light) from a subject via an optical lens system (e.g., optical system 1001, see FIG. 20), converts the amount of incident light imaged on an imaging surface into an electrical signal on a pixel-by-pixel basis, and outputs the electrical signal as a pixel signal.
  • the photodetector 1 has a pixel section 100A as an imaging area on a semiconductor substrate 11, and a peripheral section 100B around the pixel section 100A.
  • the peripheral section 100B has, for example, a pixel control section 111, a signal processing section 112, a control section 113, and a processing section 114.
  • the photodetector 1 is also provided with, for example, a plurality of control lines Lread and a plurality of signal lines VSL.
  • a plurality of unit pixels P are arranged two-dimensionally in a matrix.
  • a control line Lread (specifically, a row selection line and a reset control line) is wired for each pixel row
  • a signal line VSL is wired for each pixel column.
  • the control line Lread is a signal line capable of transmitting a signal that controls the unit pixel P, and is connected to the pixel control unit 111 and the unit pixel P of the pixel section 100A.
  • the control line Lread is configured to transmit a control signal for reading out a signal from the unit pixel P.
  • the control line Lread can also be considered a drive line (pixel drive line) that transmits a signal that drives the unit pixel P.
  • the signal line VSL is a signal line capable of transmitting a signal from a unit pixel P, and is connected to the unit pixel P of the pixel section 100A and the signal processing section 112.
  • in the pixel section 100A, for example, one or more signal lines VSL are wired for each pixel column composed of multiple unit pixels P aligned in the vertical direction (column direction).
  • the signal line VSL is configured to be capable of transmitting a signal output from the unit pixel P.
  • multiple signal lines VSL may be provided for one pixel column.
  • the pixel control unit 111 is configured to be able to control each unit pixel P of the pixel unit 100A.
  • the pixel control unit 111 is a control circuit and is configured with multiple circuits including, for example, a buffer, a shift register, and an address decoder.
  • the pixel control unit 111 generates a signal for controlling the unit pixel P and outputs it to each unit pixel P of the pixel unit 100A via the control line Lread.
  • the pixel control unit 111 is controlled by the control unit 113 and controls the unit pixels P of the pixel unit 100A.
  • the pixel control unit 111 generates signals for controlling the unit pixels P, such as a signal for controlling the transfer transistor of the unit pixel P, a signal for controlling the selection transistor, and a signal for controlling the reset transistor, and supplies these to each unit pixel P via a control line Lread.
  • the pixel control unit 111 can control the reading of pixel signals from each unit pixel P.
  • the pixel control unit 111 can also be referred to as a pixel driving unit configured to be able to drive each unit pixel P.
  • the pixel control unit 111 and the control unit 113 can also be referred to collectively as a pixel control unit.
  • the signal processing unit 112 is configured to be able to perform signal processing of the input pixel signal.
  • the signal processing unit 112 is a signal processing circuit, and has, for example, a load circuit, an analog-to-digital (AD) conversion circuit, and a horizontal selection switch.
  • the load circuit is configured by a current source capable of supplying current to the amplification transistor of the unit pixel P.
  • the signal processing unit 112 may have an amplifier circuit configured to amplify a signal read from the unit pixel P via the signal line VSL.
  • a load circuit, an amplifier circuit, an AD conversion circuit, etc. are provided for each of the multiple signal lines VSL, for example.
  • a load circuit, an amplifier circuit, an AD conversion circuit, etc. may be provided for each pixel column of the pixel unit 100A.
  • the signal output from each unit pixel P selected and scanned by the pixel control unit 111 is input to the signal processing unit 112 via the signal line VSL.
  • the signal processing unit 112 can perform signal processing such as AD conversion and correlated double sampling (CDS) of the signal of the unit pixel P.
  • the signal of each unit pixel P transmitted through each of the signal lines VSL is subjected to signal processing by the signal processing unit 112 and output to the processing unit 114.
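  • The correlated double sampling mentioned above can be illustrated numerically. The sketch below reflects only the general CDS principle, not the disclosed circuit, and all voltage values are made up for illustration.

```python
# Correlated double sampling (CDS) sketch: the reset level of a pixel is
# sampled first, then the signal level; subtracting one from the other cancels
# the fixed offset common to both samples. Voltages are illustrative.

def cds(reset_level_v: float, signal_level_v: float) -> float:
    """Return the offset-cancelled pixel value (reset minus signal)."""
    return reset_level_v - signal_level_v

offset = 0.12                     # hypothetical per-column offset
v_reset = 1.80 + offset           # sampled just after the FD is reset
v_signal = 1.35 + offset          # sampled after charge transfer lowers the FD
print(round(cds(v_reset, v_signal), 2))  # 0.45: the offset has cancelled
```

Because the same offset (and the reset noise frozen on the floating diffusion) appears in both samples, it drops out of the difference, leaving only the light-dependent swing.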
  • the processing unit 114 is configured to be able to perform signal processing on the input signal.
  • the processing unit 114 is a processing circuit, and is configured, for example, by a circuit that performs various types of signal processing on pixel signals.
  • the processing unit 114 may include a processor and a memory.
  • the processing unit 114 performs signal processing on pixel signals input from the signal processing unit 112, and outputs the processed pixel signals.
  • the processing unit 114 can perform various types of signal processing, for example, noise reduction processing and gradation correction processing.
  • the control unit 113 is configured to be able to control each part of the photodetection device 1.
  • the control unit 113 receives an externally provided clock, data instructing the operation mode, etc., and can also output data such as internal information of the photodetection device 1.
  • the control unit 113 is a control circuit, and has, for example, a timing generator configured to be able to generate various timing signals.
  • the control unit 113 performs drive control of the pixel control unit 111, the signal processing unit 112, etc., based on the various timing signals (pulse signals, clock signals, etc.) generated by the timing generator.
  • the pixel section 100A, the pixel control section 111, the signal processing section 112, etc. may be provided on a single substrate.
  • the pixel control section 111, the signal processing section 112, the control section 113, the processing section 114, etc. may be provided on a single semiconductor substrate, or may be provided separately on multiple semiconductor substrates.
  • the photodetector 1 may have a layered structure formed by stacking multiple substrates. Some or all of the signal processing section 112, the control section 113, and the processing section 114 may be configured as an integrated unit.
  • the pixel unit 100A has an effective pixel area 100a1 in which a subject image formed by an imaging lens is photoelectrically converted in a photodiode (PD) to generate a signal for image generation, and a dummy pixel area 100a2 disposed outside the effective pixel area 100a1 to generate a pixel signal to assist in image generation.
  • the pixel unit 100A may further include a dummy pixel area outside the dummy pixel area 100a2 that does not generate a pixel signal.
  • an optical black (OPB) region 100b that outputs a background signal is provided on the periphery of the pixel portion 100A.
  • in the OPB region 100b, for example, a photodiode (PD) is provided, as in the pixel portion 100A.
  • [Circuit configuration of unit pixel] FIG. 4 shows an example of a circuit configuration of a unit pixel P of the photodetector 1 shown in FIG. 1.
  • the unit pixel P has, for example, one photoelectric conversion unit 12 and a readout circuit 41.
  • the photoelectric conversion unit 12 is configured to receive light and generate a signal.
  • the readout circuit 41 is configured to be capable of outputting a signal based on the charge photoelectrically converted.
  • the readout circuit 41 can read out a pixel signal based on the charge photoelectrically converted by the photoelectric conversion unit 12.
  • the photoelectric conversion unit 12 is a so-called light receiving element, and is configured to be able to generate an electric charge by photoelectric conversion.
  • the photoelectric conversion unit 12 is, for example, a photodiode (PD), and converts incident light into an electric charge.
  • the photoelectric conversion unit 12 can perform photoelectric conversion to generate an electric charge according to the amount of light received.
  • the readout circuit 41 has a transfer transistor TRG, a floating diffusion FD, an amplification transistor AMP, a selection transistor SEL, and a reset transistor RST.
  • the transfer transistor TRG, the amplification transistor AMP, the selection transistor SEL, and the reset transistor RST are each a MOS transistor (MOSFET) having a gate, a source, and a drain terminal.
  • the transfer transistor TRG, the amplification transistor AMP, the selection transistor SEL, and the reset transistor RST are each composed of an NMOS transistor.
  • each transistor that composes the readout circuit 41 may be composed of a PMOS transistor.
  • the transfer transistor TRG is configured to be able to transfer the charge photoelectrically converted in the photoelectric conversion unit 12 to the floating diffusion FD.
  • the transfer transistor TRG is controlled by a signal STRG, and electrically connects or disconnects the photoelectric conversion unit 12 and the floating diffusion FD.
  • the transfer transistor TRG can transfer the charge photoelectrically converted and accumulated in the photoelectric conversion unit 12 to the floating diffusion FD.
  • the floating diffusion FD is an accumulation section and is configured to be able to accumulate the transferred charge.
  • the floating diffusion FD can accumulate the charge photoelectrically converted by the photoelectric conversion section 12.
  • the floating diffusion FD can also be said to be a retention section capable of retaining the transferred charge.
  • the floating diffusion FD accumulates the transferred charge and converts it into a voltage according to the capacity of the floating diffusion FD.
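  • The charge-to-voltage conversion at the floating diffusion described above follows V = Q / C_FD. The numeric sketch below uses a hypothetical FD capacitance, not a value from the disclosure.

```python
# Charge-to-voltage conversion sketch: the floating diffusion converts an
# accumulated charge Q into a voltage V = Q / C_FD, so the per-electron step
# (conversion gain) is q / C_FD. The capacitance value is illustrative.

ELEMENTARY_CHARGE = 1.602e-19     # coulombs

def fd_voltage(n_electrons: int, c_fd_f: float) -> float:
    """Voltage swing produced by n_electrons on a floating diffusion of c_fd_f farads."""
    return n_electrons * ELEMENTARY_CHARGE / c_fd_f

c_fd = 1.6e-15                    # hypothetical FD capacitance: 1.6 fF
gain_uv_per_e = fd_voltage(1, c_fd) * 1e6
print(round(gain_uv_per_e, 1))    # roughly 100 microvolts per electron
```

This is why the text says the voltage is "according to the capacity of the floating diffusion FD": a smaller C_FD yields a larger voltage step per transferred electron.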
  • the amplification transistor AMP is configured to generate and output a signal based on the charge accumulated in the floating diffusion FD.
  • the amplification transistor AMP can generate and output a signal based on the charge converted by the photoelectric conversion unit 12.
  • the gate of the amplifier transistor AMP is electrically connected to the floating diffusion FD, and the voltage converted by the floating diffusion FD is input.
  • the drain of the amplifier transistor AMP is connected to, for example, a power supply line through which the power supply voltage VDD is supplied.
  • the source of the amplification transistor AMP is connected to the signal line VSL via the selection transistor SEL.
  • the amplification transistor AMP is configured to generate a signal based on the charge stored in the floating diffusion FD, i.e., a signal based on the voltage of the floating diffusion FD, and output it to the signal line VSL.
  • the selection transistor SEL is configured to be able to control the output of a pixel signal.
  • the selection transistor SEL is, for example, electrically connected in series with the amplification transistor AMP.
  • the selection transistor SEL is configured to be controlled by a signal SSEL and to be able to output a signal from the amplification transistor AMP to a signal line VSL.
  • the selection transistor SEL can control the output timing of the pixel signal.
  • the selection transistor SEL is configured to be capable of outputting a signal based on the charge converted by the photoelectric conversion unit 12.
  • the selection transistor SEL can output a pixel signal of the unit pixel P to a signal line VSL.
  • the selection transistor SEL may be electrically connected in series between a power supply line to which a power supply voltage VDD is applied and the amplification transistor AMP.
  • the selection transistor SEL may also be omitted as appropriate.
  • the reset transistor RST is configured to be able to reset the voltage of the floating diffusion FD.
  • the reset transistor RST is, for example, electrically connected to a power supply line to which a power supply voltage VDD is applied, and is configured to reset the charge of the unit pixel P.
  • the reset transistor RST is controlled by a signal SRST, and can reset the charge accumulated in the floating diffusion FD and reset the voltage of the floating diffusion FD.
  • the reset transistor RST can, for example, electrically connect the power supply line and the floating diffusion FD, and discharge the charge accumulated in the floating diffusion FD.
  • the reset transistor RST can also discharge the charge accumulated in the photoelectric conversion unit 12 via the transfer transistor TRG.
  • the pixel control unit 111 of the photodetector 1 supplies control signals to the gates of the transfer transistor TRG, selection transistor SEL, reset transistor RST, etc. of each unit pixel P via the control line Lread, turning the transistors on (conducting state) or off (non-conducting state).
  • the multiple control lines Lread for each pixel row of the photodetector 1 include, for example, a wiring line that transmits a signal STRG that controls the transfer transistor TRG, a wiring line that transmits a signal SSEL that controls the selection transistor SEL, and a wiring line that transmits a signal SRST that controls the reset transistor RST.
  • the readout circuit 41 may be configured to change the conversion efficiency (gain) when converting charge into voltage.
  • the readout circuit 41 may have a switching transistor used to set the conversion efficiency.
  • the switching transistor is electrically connected between the floating diffusion FD and the reset transistor RST.
  • when the switching transistor is turned on, the capacitance added to the floating diffusion FD of the unit pixel P increases, switching the conversion efficiency.
  • the switching transistor can change the capacitance connected to the gate of the amplification transistor AMP, thereby changing the conversion efficiency.
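  • The conversion-efficiency switching described above can be sketched with the same V = Q/C relationship. The capacitance values below are hypothetical and only illustrate the principle, not the disclosed device.

```python
# Conversion-gain switching sketch: turning the switching transistor on adds
# capacitance to the floating diffusion node, lowering the per-electron voltage
# step (low conversion gain) at the amplification-transistor gate.
# Capacitance values are illustrative.

ELEMENTARY_CHARGE = 1.602e-19      # coulombs

def conversion_gain(c_fd_f: float, c_added_f: float = 0.0) -> float:
    """Per-electron voltage step [V/e-] for the combined FD capacitance."""
    return ELEMENTARY_CHARGE / (c_fd_f + c_added_f)

c_fd, c_sw = 1.6e-15, 4.8e-15      # hypothetical FD and switchable capacitances
hcg = conversion_gain(c_fd)        # switch off: high conversion gain
lcg = conversion_gain(c_fd, c_sw)  # switch on: low conversion gain
print(round(hcg / lcg, 1))         # 4.0: ratio equals (c_fd + c_sw) / c_fd
```

The low-gain setting trades sensitivity for a larger charge range before the FD voltage saturates, which is the usual motivation for this kind of switchable capacitance.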
  • the transfer transistor TRG, selection transistor SEL, reset transistor RST, switching transistor, etc. are controlled to be turned on and off by the pixel control unit 111.
  • the pixel control unit 111 controls the readout circuit 41 of each unit pixel P to output a pixel signal from each unit pixel P to the signal line VSL.
  • the pixel control unit 111 can control the reading out of the pixel signal of each unit pixel P to the signal line VSL.
  • the light detection device 1 is a back-illuminated imaging device, and each of the unit pixels P arranged two-dimensionally in a matrix in the pixel section 100A has a configuration in which, for example, a light receiving section 10, an optical layer 20 provided on the light incident side S1 of the light receiving section 10, and a multi-layer wiring layer 40 provided on the opposite side of the light incident side S1 of the light receiving section 10 are stacked.
  • the light detection device 1 has a pixel section 100A in which the unit pixels P are arranged two-dimensionally in a matrix, and a peripheral section 100B surrounding the pixel section 100A.
  • the light receiving section 10, the optical layer 20, and the multi-layer wiring layer 40 are provided, for example, across the pixel section 100A and the peripheral section 100B.
  • the optical layer 20 includes a light guide section 27 configured to include, for example, a plurality of structures 27A that are nanostructures and a medium 27B that fills the spaces between the adjacent structures 27A.
  • a light shielding portion 32 having a surface covered with a protective layer 33 is provided on the light guiding portion 27 in the peripheral portion 100B.
  • the light guiding section 27 corresponds to a specific example of a "light guiding section" as one embodiment of the present disclosure.
  • the multiple structures 27A correspond to a specific example of "multiple first structures" as one embodiment of the present disclosure.
  • the medium 27B corresponds to a specific example of a "medium" as one embodiment of the present disclosure.
  • the light shielding section 32 corresponds to a specific example of a "light absorbing layer" as one embodiment of the present disclosure.
  • the protective layer 33 corresponds to a specific example of a "protective film" as one embodiment of the present disclosure.
  • the light receiving section 10 has a semiconductor substrate 11 having a first surface 11S1 and a second surface 11S2 facing each other, and a plurality of photoelectric conversion units 12 embedded in the semiconductor substrate 11.
  • the light receiving section 10 further has a separation portion 13.
  • the semiconductor substrate 11 is made of, for example, a silicon substrate (Si).
  • the semiconductor substrate 11 may be an SOI (Silicon On Insulator) substrate, a SiGe (Silicon Germanium) substrate, a SiC (Silicon Carbide) substrate, or the like.
  • the semiconductor substrate 11 may be made of a III-V compound semiconductor material, or may be formed using other semiconductor materials.
  • the first surface 11S1 of the semiconductor substrate 11 is a light receiving surface (light incident surface).
  • the second surface 11S2 of the semiconductor substrate 11 is an element formation surface on which elements such as transistors are formed.
  • a gate electrode, a gate insulating film, and the like are provided on the second surface 11S2 of the semiconductor substrate 11.
  • the photoelectric conversion unit 12 is, for example, a positive intrinsic negative (PIN) type photodiode (PD) and has a pn junction in a predetermined region of the semiconductor substrate 11. For example, one photoelectric conversion unit 12 is embedded in each unit pixel P.
  • the separation portion 13 is provided between adjacent unit pixels P.
  • the separation portion 13 is provided so as to surround the unit pixels P, and is provided in a lattice pattern across the pixel portion 100A and the OPB region 100b on its periphery.
  • the separation portion 13 electrically and optically separates adjacent unit pixels P, and extends, for example, from the first surface 11S1 side of the semiconductor substrate 11 toward the second surface 11S2 side.
  • the separation portion 13 can be formed, for example, by diffusing p-type impurities.
  • the separation portion 13 may have, for example, a Shallow Trench Isolation (STI) structure or a Full Trench Isolation (FTI) structure, in which an opening is formed in the semiconductor substrate 11 from the first surface 11S1 side and an insulating film is embedded therein.
  • An air gap may also be formed in the STI structure and the FTI structure.
  • the first surface 11S1 of the semiconductor substrate 11 is further provided with a dielectric layer 14 that also serves to prevent reflection on the first surface 11S1 of the semiconductor substrate 11.
  • the dielectric layer 14 may be, for example, a film having a positive fixed charge or a film having a negative fixed charge.
  • the dielectric layer 14 may be made of a semiconductor material or a conductive material having a bandgap wider than that of the semiconductor substrate 11 .
  • the optical layer 20 includes, for example, a partition 21, a color filter 22, a light-shielding film 23, a sealing film 24, an insulating layer 25, a planarizing layer 26, and a light-guiding section 27, and is configured to guide the light incident from the light incident side S1 to the light receiving section 10.
  • the partition 21 is provided at the boundary between adjacent unit pixels P and is a frame having an opening 21H for each unit pixel P.
  • the partition 21, like the separation section 13, is provided to surround the unit pixel P and is provided in a lattice pattern across the pixel section 100A and the OPB region 100b around its periphery.
  • the partition 21 is intended to prevent light incident obliquely from the light incident side S1 from leaking into the adjacent unit pixel P.
  • the partition 21 is made of a material that has a lower refractive index than the color filter 22, for example.
  • the partition 21 may also serve as a light shield for the unit pixel P that determines the optical black level.
  • the partition 21 may also serve as a light shield to suppress the generation of noise in the peripheral circuit provided in the peripheral portion 100B.
  • the partition 21 may be formed, for example, using a material having light shielding properties. Examples of such materials include tungsten (W), silver (Ag), copper (Cu), titanium (Ti), aluminum (Al), or alloys thereof. Other examples include metal compounds such as TiN.
  • the partition 21 may be configured, for example, as a single layer film or a laminated film.
  • When the partition 21 is configured as a laminated film, for example, a layer made of Ti, tantalum (Ta), W, cobalt (Co), or molybdenum (Mo), or an alloy, nitride, oxide, or carbide thereof, may be provided as an underlayer.
  • the color filters 22 selectively transmit light of a specific wavelength, and include, for example, a red filter 22R that selectively transmits red light (R), a green filter 22G that selectively transmits green light (G), and a blue filter 22B that selectively transmits blue light (B).
  • Each of the color filters 22R, 22G, and 22B is formed by filling the opening 21H of the partition 21 with a resin material in which the desired pigment or dye is dispersed.
  • For example, for four unit pixels P arranged in 2 rows and 2 columns, two green filters 22G are arranged on one diagonal, and one red filter 22R and one blue filter 22B are arranged on the orthogonal diagonal.
  • With this arrangement, light of the corresponding color is selectively photoelectrically converted in each photoelectric conversion unit 12.
  • The unit pixels P thus include red pixels Pr that selectively receive and photoelectrically convert red light (R), green pixels Pg that selectively receive and photoelectrically convert green light (G), and blue pixels Pb that selectively receive and photoelectrically convert blue light (B).
  • the red pixels Pr, green pixels Pg, and blue pixels Pb generate pixel signals of the red light (R) component, the green light (G) component, and the blue light (B) component, respectively. This allows the photodetection device 1 to obtain RGB pixel signals.
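The 2 × 2 filter arrangement described above (two green filters on one diagonal, red and blue on the orthogonal diagonal) can be sketched as a periodic color map. This is a minimal illustration only; the corner convention (which diagonal begins with R) is an assumption for the sketch, since the description does not fix an origin.

```python
def filter_color(row: int, col: int) -> str:
    """Color filter at a pixel position in a 2x2-periodic mosaic:
    green on one diagonal, red and blue on the orthogonal diagonal.
    The choice of putting R in even rows is an assumed convention."""
    if row % 2 == col % 2:
        return "G"  # the two green filters share one diagonal of each 2x2 tile
    return "R" if row % 2 == 0 else "B"

# Each 2x2 tile contains two G, one R, and one B filter.
tile = [[filter_color(r, c) for c in range(2)] for r in range(2)]
```

Tiling this 2 × 2 map over the pixel section 100A reproduces the arrangement in which each group of four unit pixels P yields one red, two green, and one blue pixel signal.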
  • the color filters 22 may be provided with complementary color filters that selectively transmit cyan (C), magenta (M) and yellow (Y) in addition to the red filter 22R, green filter 22G and blue filter 22B. Furthermore, the color filters 22 may be provided with a filter corresponding to white (W), i.e., a filter that transmits light of all wavelengths incident on the light detection device 1. In addition, the color filters 22 may be provided with a filter that selectively transmits infrared light.
  • the thickness of the color filter 22 may be different for each color, taking into account the color reproducibility and sensor sensitivity of the optical spectrum.
  • the light-shielding film 23 is intended to block light incident on the photoelectric conversion section 12 provided in the OPB region 100b.
  • the light-shielding film 23 is provided in the OPB region 100b between the light-receiving section 10 (specifically, the dielectric layer 14) and the partition wall 21 and color filter 22.
  • the light-shielding film 23 can be formed using, for example, tungsten (W), silver (Ag), copper (Cu), titanium (Ti), aluminum (Al), or an alloy thereof.
  • the OPB region 100b is further provided with a sealing film 24 that covers the partition 21, the color filter 22, and the light-shielding film 23.
  • the sealing film 24 can be formed using, for example, silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), etc.
  • the insulating layer 25 is provided between the light receiving section 10 and the light guiding section 27.
  • the insulating layer 25 is formed across the pixel section 100A and the peripheral section 100B so as to be laminated, for example, on the partition 21 and color filter 22 provided in the pixel section 100A, and on the sealing film 24 that covers the partition 21, color filter 22, and light-shielding film 23 provided in the OPB region 100b.
  • the insulating layer 25 can be formed using, for example, silicon oxide (SiO), silicon nitride (SiN), aluminum oxide (AlO), etc.
  • The insulating layer 25 may be formed using a low-refractive-index material other than the insulating materials mentioned above, or may be formed from other materials that transmit light in the wavelength range to be measured. The insulating layer 25 can also be called a transparent layer or spacer layer that transmits light.
  • the planarization layer 26 is intended to fill in the steps in the insulating layer 25 that are generated when the insulating layer 25 is provided to cover the partition 21, the color filter 22, and the sealing film 24 covering the partition 21, the color filter 22, and the light-shielding film 23, as shown in FIG. 5D, for example, and to planarize the surface.
  • the planarization layer 26 can be formed using, for example, silicon oxide (SiO), silicon nitride (SiN), aluminum oxide (AlO), etc.
  • the light guide section 27 is configured as a light guide element capable of guiding light by, for example, imparting a phase delay to incident light.
  • the light guide section 27 is a light guide element that utilizes metamaterial (metasurface) technology.
  • the light guide section 27 can also be called a metasurface layer (or metamaterial layer).
  • the light guide section 27 is provided across the pixel section 100A and the peripheral section 100B.
  • the light guiding section 27 has a plurality of structures 27A and a medium 27B arranged around the plurality of structures 27A.
  • the light guiding section 27 uses the plurality of structures 27A, which are nanostructures, to propagate light toward the photoelectric conversion section 12.
  • Light from a subject, which is the object to be measured, is incident on the light guiding section 27.
  • the plurality of structures 27A have a size equal to or smaller than a predetermined wavelength of the incident light, for example, a size equal to or smaller than the wavelength range of visible light.
  • the plurality of structures 27A may also have a size equal to or smaller than the wavelength range of infrared light.
  • the multiple structures 27A are each, for example, a columnar (pillar-shaped) structure, and can be considered as a nanopillar.
  • the multiple structures 27A can be considered as a metasurface element.
  • the multiple structures 27A have a cylindrical shape.
  • the multiple structures 27A are arranged so as to be aligned with each other in the X-axis direction or the Y-axis direction, sandwiching the medium 27B therebetween.
  • the shape of the multiple structures 27A can be changed as appropriate, and each may be circular or rectangular in plan view.
  • the shape of the multiple structures 27A may also be elliptical, polygonal, cross-shaped, or other shapes.
  • the multiple structures 27A are also called metaatoms, nanoatoms, nanoposts, metasurface structures, microstructures, etc.
  • the medium 27B is arranged so as to fill the surroundings of the multiple structures 27A.
  • the multiple structures 27A are arranged within the medium 27B, and can also be said to be arranged by replacing part of the medium 27B.
  • the medium 27B can also be said to be a medium layer or a protective layer (protective member).
  • a plurality of structures 27A are arranged at intervals equal to or less than a predetermined wavelength of incident light.
  • a plurality of structures 27A are provided at intervals equal to or less than the wavelength range of visible light in the X-axis and Y-axis directions. Note that in the unit pixel P, a plurality of structures 27A may be arranged at intervals equal to or less than the wavelength range of infrared light.
  • the multiple structures 27A have a refractive index that is different from the refractive index of the surrounding medium 27B.
  • the multiple structures 27A have a refractive index that is higher than the refractive index of the medium 27B.
  • Examples of materials that may be used to form the multiple structures 27A include titanium oxide (TiO), silicon, polysilicon (Poly-Si), amorphous silicon (a-Si), germanium (Ge), etc.
  • the multiple structures 27A may be formed using titanium (Ti), hafnium (Hf), zirconium (Zr), aluminum (Al), tantalum (Ta), indium (In), niobium (Nb), or the like, or an oxide, nitride, oxynitride, or a compound thereof.
  • the multiple structures 27A may be formed by including other metal compounds (metal oxides, metal nitrides, etc.).
  • the multiple structures 27A may be formed using GaP, GaN, GaAs, SiC, or the like.
  • the multiple structures 27A may be formed using silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), silicon carbide (SiC), oxygen-doped silicon carbide (SiOC), or other silicon compounds.
  • the multiple structures 27A may be constructed using materials different from each other.
  • the medium 27B is, for example, made of an inorganic material such as an oxide, a nitride, or an oxynitride.
  • the medium 27B may be made of, for example, silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), silicon carbide (SiC), oxygen-doped silicon carbide (SiOC), or other silicon compounds.
  • the medium 27B may be made of silicon oxide formed using TEOS (tetraethyl orthosilicate).
  • the medium 27B may be formed using a siloxane resin, a styrene resin, an acrylic resin, or the like.
  • the medium 27B may be made of any of these resins containing fluorine.
  • the medium 27B may be formed using any of these resins filled with beads (filler) that have a refractive index higher (or lower) than that of the resin.
  • the materials of the multiple structures 27A and the medium 27B can be selected according to the refractive index difference with the surrounding medium, the wavelength range of the incident light to be measured, etc. Note that a portion of the multiple structures 27A and the medium 27B may be made using air. For example, the multiple structures 27A may be made to include air (voids).
  • the light-guiding section 27 can control the wavefront of the light by, for example, causing a phase delay in the incident light due to the refractive index difference between the multiple structures 27A and the medium surrounding them.
  • the light-guiding section 27 can adjust the propagation direction of the light by, for example, imparting a phase delay to the incident light using the multiple structures 27A and medium 27B.
  • the materials (optical constants of each material) of the multiple structures 27A and medium 27B, the size (width (diameter), height, etc.) of the multiple structures 27A, the pitch (arrangement interval), etc. are determined so that light of a desired wavelength range from the incident light from the measurement target travels in the desired direction.
  • For example, the material (refractive index), dimensions, and pitch of the multiple structures 27A, the material (refractive index) of the medium 27B, and the like can be set.
  • the material, size, arrangement number, etc. of the multiple structures 27A of each unit pixel P are determined so that light of a specific wavelength band to be detected travels to the photoelectric conversion section 12 of the desired unit pixel P.
  • the multiple structures 27A provided in the red pixel Pr, green pixel Pg, and blue pixel Pb may be formed so that their sizes (e.g., width, height), arrangement positions, etc. are different.
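The wavefront control described above rests on the optical path difference between a structure 27A and the surrounding medium 27B. As a rough first-order sketch (all refractive indices and dimensions below are assumed illustrative values, not taken from this description, and near-field effects are ignored), the phase delay a pillar imparts relative to the medium can be estimated as:

```python
import math

def pillar_phase_delay(n_structure: float, n_medium: float,
                       height_nm: float, wavelength_nm: float) -> float:
    """Extra phase (radians) accumulated by light traversing a pillar of
    the given height, relative to the same path through the medium alone.
    Simple optical-path-difference model; only a first-order estimate."""
    return 2.0 * math.pi * (n_structure - n_medium) * height_nm / wavelength_nm

# Assumed example: a TiO-like pillar (n ~ 2.4) in a SiO-like medium
# (n ~ 1.46), 600 nm tall, for green light at 530 nm.
delta_phi = pillar_phase_delay(2.4, 1.46, 600.0, 530.0)
```

Because the delay scales with the index contrast and the pillar height, and the effective index in turn varies with pillar width, choosing the size and arrangement of the structures 27A per pixel builds the phase profile that steers light of a target wavelength band toward the intended photoelectric conversion section 12.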
  • the light-guiding section 27 may be configured, for example, as a spectroscopic section (spectroscopic element) capable of splitting incident light.
  • the optical layer 20 (or the light-guiding section 27) may also be referred to as a splitter (color splitter).
  • the optical layer 20 may also be referred to as a color splitter layer or a wavelength separation layer.
  • the optical layer 20 (or the light-guiding section 27) may also be referred to as an optical element configured to redirect light.
  • On the light incident side S1 of the optical layer 20, a protective layer 31, a light shielding portion 32, and a protective layer 33 are provided in this order.
  • the protective layer 31 is for protecting the surface of the light guide section 27, and is provided across the pixel section 100A and the peripheral section 100B.
  • the protective layer 31 is composed of, for example, a single layer film made of any of silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), etc., or a laminated film made of two or more of these materials.
  • the light-shielding portion 32 is intended to prevent surface reflection in the peripheral portion 100B.
  • the light-shielding portion 32 is provided in the peripheral portion 100B so as to surround the pixel portion 100A, and its surface is covered with a protective layer 33.
  • the light-shielding portion 32 is composed of, for example, a black color filter or a metal film having light-shielding properties such as tungsten (W).
  • materials that can be used for the black color filter include titanium oxide filler dispersion resin, carbon black pigment dispersion resin, and organic pigment dispersion resin.
  • the thickness of the light-shielding portion 32 is, for example, 0.1 μm or more and 3 μm or less.
  • the protective layer 33 is intended to prevent surface reflection in the light-shielding portion 32.
  • the protective layer 33 covers the surface of the light-shielding portion 32, specifically, the top and side surfaces of the light-shielding portion 32, and extends onto the surrounding protective layer 31.
  • the protective layer 33 is composed of, for example, any one of silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), silicon carbide (SiC), oxygen-doped silicon carbide (SiOC), nitrogen-doped silicon carbide (SiNC), aluminum oxide (AlO), hafnium oxide (HfO), tantalum oxide (TaO), and indium oxide (InO).
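One way a thin protective film can suppress surface reflection is quarter-wave interference. The sketch below is a textbook normal-incidence, lossless model with assumed indices (not parameters from this description); a real light-shielding film is absorptive and has a complex refractive index, so this only conveys the interference idea.

```python
def bare_reflectance(n0: float, ns: float) -> float:
    """Normal-incidence Fresnel reflectance of a single interface."""
    r = (n0 - ns) / (n0 + ns)
    return r * r

def quarter_wave_reflectance(n0: float, nf: float, ns: float) -> float:
    """Reflectance at the design wavelength when a quarter-wave film of
    index nf coats a surface of index ns; the film then acts like an
    interface of effective index nf**2 / ns."""
    n_eff = nf * nf / ns
    r = (n0 - n_eff) / (n0 + n_eff)
    return r * r

# Assumed example: air (n0 = 1.0) over a surface modeled with ns = 2.25;
# a film with nf = sqrt(n0 * ns) = 1.5 nulls the reflection entirely.
```

The reflection vanishes when the film index equals the geometric mean of the two surrounding indices, which is why the film material for the protective layer can be chosen against the index of the layer it covers.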
  • the multi-layer wiring layer 40 is laminated on the semiconductor substrate 11.
  • the multi-layer wiring layer 40 includes, for example, a conductor film and an insulating film, and has multiple wirings and vias.
  • the multi-layer wiring layer 40 has a configuration in which multiple wirings are laminated via an insulating film serving as an interlayer insulating film.
  • the multi-layer wiring layer 40 includes, for example, two or more layers of wirings.
  • the wiring of the multi-layer wiring layer 40 is formed using a metal material such as aluminum (Al), copper (Cu), or tungsten (W).
  • the wiring of the multi-layer wiring layer 40 may be formed using polysilicon (Poly-Si) or other conductive materials.
  • the interlayer insulating film may be formed using, for example, silicon oxide (SiO), silicon nitride (SiN), or silicon oxynitride (SiON).
  • the semiconductor substrate 11 and the multi-layer wiring layer 40 are provided with the above-mentioned readout circuit 41, for example, for each unit pixel P or for each set of multiple unit pixels P.
  • the multi-layer wiring layer 40 may also be formed with a pixel control unit 111, a signal processing unit 112, a control unit 113, a processing unit 114, and the like.
  • the partition wall 21, the color filter 22, the light-shielding film 23, and the sealing film 24 are formed on the first surface 11S1 of the semiconductor substrate 11.
  • the insulating layer 25 is formed, for example, by chemical vapor deposition (CVD), and then, as shown in Fig. 5C, the planarization layer 26 is formed on the insulating layer 25, for example, by CVD.
  • the surface of the planarization layer 26 is planarized using, for example, a chemical mechanical polishing (CMP) method.
  • Next, a medium 27B made of a low refractive index material (for example, SiN/TEOS) is formed.
  • a hard mask 51 is patterned on the medium 27B using a photolithography technique, and then the medium 27B is processed using, for example, dry etching to form an opening 27H.
  • Next, a high refractive index material (e.g., TiO2) is embedded in the opening 27H by, for example, atomic layer deposition (ALD).
  • the high refractive index film formed on the medium 27B is removed by, for example, a CMP method. This forms the light guide section 27 in which a plurality of structures 27A are embedded in the medium 27B.
  • a protective layer 31 is formed on the light-guiding section 27 using, for example, a sputtering method.
  • a light-shielding section 32 is formed in the peripheral section 100B using, for example, photolithography.
  • a continuous protective layer 33 is formed on the protective layer 31 and the light-shielding section 32 using, for example, a sputtering method.
  • a light-shielding section 32 having a surface covered with a protective layer 33 is provided on a light-guiding section 27 including a plurality of structures 27A and a medium 27B filling spaces between adjacent structures 27A, the light-guiding section 27 being provided on the first surface 11S1 side of the semiconductor substrate 11 in which a photoelectric conversion section 12 is embedded for each unit pixel P.
  • the light-shielding section 32 has a surface covered with a protective layer 33 and is provided so as to surround the pixel section 100A. This will be described below.
  • the nanopost structure has a high refractive index compared to the surrounding medium, and a phase difference according to the wavelength occurs between the nanopost structure and the medium. Therefore, by optimizing the radius, length, and arrangement of the nanoposts, it is possible to distribute the visible wavelength band corresponding to each color pixel that makes up the pixel unit. This makes it possible to achieve higher sensitivity than full-color image sensors that obtain RGB color information using general color filters.
  • a light-shielding section 32 having a surface covered with a protective layer 33 is provided on the light-guiding section 27 that is configured to include a plurality of structures 27A and a medium 27B that fills the spaces between the adjacent plurality of structures 27A and extends to the peripheral section 100B.
  • the photodetector 1 can reduce the reflectance by, for example, 90% compared to a case in which a light-shielding film is provided on the top of the pixels included in the OPB region as described above.
  • the reflectance can be reduced by, for example, 50% compared to a case in which a bare light-shielding section 32 that is not covered with a protective layer 33 is provided on the light-guiding section 27 that extends to the peripheral section 100B.
  • Modifications (2-1. Modification 1) FIG. 6 is a schematic diagram illustrating an example of a cross-sectional configuration of a photodetector (photodetector 1A) according to Modification 1 of the present disclosure.
  • the photodetector 1A is, for example, a CMOS image sensor used in electronic devices such as digital still cameras and video cameras, and is, for example, a so-called back-illuminated photodetector similar to the first embodiment.
  • In the above embodiment, a protective layer 33 was provided that covered the upper and side surfaces of the light-shielding portion 32 with a substantially uniform thickness, but the configuration is not limited to this.
  • a protective layer 63 having a plurality of microlenses 63L is provided on the light-shielding portion 32.
  • microlenses 63L are provided above the light-shielding portion 32, and the scattering effect of the microlenses 63L can further suppress surface reflections in the light-shielding portion 32. This makes it possible to improve reliability.
  • (2-2. Modification 2) FIG. 7 is a schematic diagram illustrating an example of a cross-sectional configuration of a photodetector (photodetector 1B) according to Modification 2 of the present disclosure.
  • the photodetector 1B is, for example, a CMOS image sensor used in electronic devices such as digital still cameras and video cameras, and is, for example, a so-called back-illuminated photodetector, similar to the first embodiment.
  • In the above embodiment, a protective layer 33 was provided that covered the upper and side surfaces of the light-shielding portion 32 with a substantially uniform thickness, but the configuration is not limited to this.
  • a protective layer 73 having a concave-convex structure 73X is provided on the light-shielding portion 32.
  • the concave-convex structure 73X is provided above the light-shielding portion 32, which further suppresses surface reflections in the light-shielding portion 32. This makes it possible to improve reliability.
  • (2-3. Modification 3) FIG. 8 is a schematic diagram illustrating an example of a cross-sectional configuration of a photodetector (photodetector 1C) according to Modification 3 of the present disclosure.
  • the photodetector 1C is, for example, a CMOS image sensor used in electronic devices such as digital still cameras and video cameras, and is, for example, a so-called back-illuminated photodetector similar to the first embodiment.
  • In the above embodiment, a protective layer 33 was provided that covered the upper and side surfaces of the light-shielding portion 32 with a substantially uniform thickness, but the configuration is not limited to this.
  • a multilayer interference film 83 is further laminated on the protective layer 33 that covers the light-shielding portion 32.
  • the multilayer interference film 83 has a structure in which layers with different refractive indices are alternately stacked.
  • the multilayer interference film 83 may be a multilayer film in which silicon oxide films and oxynitride films are alternately stacked, or a multilayer film in which silicon oxide films and tantalum oxide films are alternately stacked.
  • the multilayer interference film 83 is further laminated on the protective layer 33, so that surface reflection in the light-shielding portion 32 can be further suppressed. This makes it possible to improve reliability.
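Stacks of alternating refractive indices, such as the multilayer interference film 83, are commonly analyzed with the characteristic (transfer) matrix method; depending on the layer thicknesses chosen, such a stack can be tuned to suppress or enhance reflection in a wavelength band. The sketch below is a standard normal-incidence, lossless model; all indices and thicknesses are assumed illustrative values, not taken from this description. As a sanity check it uses quarter-wave layers, for which the reflectance is known to grow with the number of layer pairs.

```python
import cmath
import math

def layer_matrix(n: float, d: float, wavelength: float):
    """Characteristic matrix of one lossless layer at normal incidence."""
    delta = 2.0 * math.pi * n * d / wavelength
    return ((cmath.cos(delta), 1j * cmath.sin(delta) / n),
            (1j * n * cmath.sin(delta), cmath.cos(delta)))

def stack_reflectance(n0: float, layers, ns: float, wavelength: float) -> float:
    """Reflectance of a stack of (index, thickness) layers, listed from the
    incident-medium side, between incident index n0 and substrate index ns."""
    m = ((1.0, 0.0), (0.0, 1.0))
    for n, d in layers:
        a = layer_matrix(n, d, wavelength)
        m = ((m[0][0] * a[0][0] + m[0][1] * a[1][0],
              m[0][0] * a[0][1] + m[0][1] * a[1][1]),
             (m[1][0] * a[0][0] + m[1][1] * a[1][0],
              m[1][0] * a[0][1] + m[1][1] * a[1][1]))
    b = m[0][0] + m[0][1] * ns  # tangential field amplitude B
    c = m[1][0] + m[1][1] * ns  # tangential field amplitude C
    r = (n0 * b - c) / (n0 * b + c)
    return abs(r) ** 2

# Assumed example: alternating TaO-like (n = 2.1) and SiO-like (n = 1.46)
# quarter-wave layers at a 550 nm design wavelength over a silicon-like
# substrate (ns = 3.9, taken as lossless here for simplicity).
wl = 550.0
pair = [(2.1, wl / (4 * 2.1)), (1.46, wl / (4 * 1.46))]
r4 = stack_reflectance(1.0, pair * 4, 3.9, wl)
```

The same routine evaluates any silicon oxide/oxynitride or silicon oxide/tantalum oxide stack; only the (index, thickness) list changes.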
  • FIG. 9 is a schematic diagram showing an example (photodetection device 1D) of a cross-sectional configuration of a photodetection device according to Modification 4 of the present disclosure.
  • Fig. 10 is a schematic diagram showing another example (photodetection device 1E) of a cross-sectional configuration of a photodetection device according to Modification 4 of the present disclosure.
  • the photodetection devices 1D and 1E are, for example, CMOS image sensors used in electronic devices such as digital still cameras and video cameras, and are, for example, so-called back-illuminated photodetection devices similar to the first embodiment.
  • In the above embodiment, a light guide section 27 consisting of one layer was provided, but the configuration is not limited to this.
  • the light detection devices 1D and 1E of this modified example differ from the first embodiment described above in that the light guide section is stacked in two or three stages (light guide sections 27, 28, and 29).
  • the light guide sections 28 and 29 are configured as light guide elements that can guide light by, for example, imparting a phase delay to incident light.
  • the light guide sections 28 and 29 are light guide elements that utilize metamaterial (metasurface) technology.
  • the light guide sections 28 and 29 can also be called metasurface layers (or metamaterial layers).
  • In the photodetection devices 1D and 1E, like the light guide section 27 of the first embodiment, the light guide sections 28 and 29 are provided across the pixel section 100A and the peripheral section 100B.
  • the light guide sections 28 and 29 have, respectively, a plurality of structures 28A and 29A and media 28B and 29B arranged around the plurality of structures 28A and 29A.
  • the light guide sections 28 and 29 use the plurality of structures 28A and 29A, which are nanostructures, to propagate light toward the photoelectric conversion unit 12.
  • Light from a subject, which is the object to be measured, is incident on the light guide sections 28 and 29.
  • light that has passed through an optical system such as an imaging lens is incident on the plurality of structures 28A and 29A.
  • the plurality of structures 28A and 29A have a size equal to or smaller than a predetermined wavelength of the incident light, for example, a size equal to or smaller than the wavelength range of visible light.
  • the plurality of structures 28A and 29A may also have a size equal to or smaller than the wavelength range of infrared light.
  • the multiple structures 28A, 29A are each, for example, columnar (pillar-shaped) structures, and can be considered as nanopillars.
  • the multiple structures 28A, 29A can be considered as metasurface elements.
  • the multiple structures 28A, 29A have a cylindrical shape.
  • the multiple structures 28A, 29A are arranged so as to be aligned with each other in the X-axis direction or the Y-axis direction, sandwiching the medium 28B, 29B therebetween.
  • the shape of the multiple structures 28A, 29A can be changed as appropriate, and may be circular or rectangular in plan view.
  • the shape of the multiple structures 28A, 29A may be elliptical, polygonal, cross-shaped, or other shapes.
  • the multiple structures 28A, 29A are also called metaatoms, nanoatoms, nanoposts, metasurface structures, microstructures, etc.
  • the media 28B and 29B are provided so as to fill the periphery of the plurality of structures 28A and 29A, respectively.
  • the plurality of structures 28A and 29A are provided within the media 28B and 29B, respectively, and can be said to be disposed by replacing a portion of the media 28B and 29B.
  • the media 28B and 29B can also be said to be a medium layer or a protective layer (protective member).
  • a plurality of structures 28A, 29A are arranged at intervals equal to or less than a predetermined wavelength of incident light.
  • a plurality of structures 28A, 29A are provided at intervals equal to or less than the wavelength range of visible light in the X-axis and Y-axis directions.
  • a plurality of structures 28A, 29A may be arranged at intervals equal to or less than the wavelength range of infrared light.
  • the multiple structures 28A, 29A have a refractive index that is different from the refractive index of the surrounding medium 28B, 29B.
  • the multiple structures 28A, 29A have a refractive index higher than the refractive index of the medium 28B, 29B.
  • Examples of materials that may be used to form the multiple structures 28A and 29A include titanium oxide (TiO), silicon, polysilicon (Poly-Si), amorphous silicon (a-Si), germanium (Ge), etc.
  • the multiple structures 28A, 29A may be formed from titanium (Ti), hafnium (Hf), zirconium (Zr), aluminum (Al), tantalum (Ta), indium (In), niobium (Nb), or the like, or from their oxides, nitrides, oxynitrides, or compounds thereof.
  • the multiple structures 28A, 29A may be formed from other metal compounds (metal oxides, metal nitrides, etc.).
  • the multiple structures 28A, 29A may be formed using GaP, GaN, GaAs, SiC, etc.
  • the multiple structures 28A, 29A may be formed using silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), silicon carbide (SiC), oxygen-doped silicon carbide (SiOC), or other silicon compounds.
  • the multiple structures 28A, 29A may be constructed using materials different from each other.
  • the media 28B and 29B are, for example, made of inorganic materials such as oxides, nitrides, and oxynitrides.
  • the media 28B and 29B may be formed of, for example, silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), silicon carbide (SiC), oxygen-doped silicon carbide (SiOC), or other silicon compounds.
  • the media 28B and 29B may be made of TEOS.
  • the media 28B, 29B may be made of a siloxane resin, a styrene resin, an acrylic resin, or the like.
  • the media 28B, 29B may be made of any of these resins that contain fluorine.
  • the media 28B, 29B may be made of any of these resins that are filled with beads (filler) that have a refractive index higher (or lower) than that of the resin.
  • the materials of the multiple structures 28A, 29A and the media 28B, 29B can be selected according to the refractive index difference with the surrounding medium, the wavelength range of the incident light to be measured, etc.
  • a portion of the multiple structures 28A, 29A and the media 28B, 29B may be composed of air.
  • the multiple structures 28A, 29A may be composed to include air (voids).
  • the light guide sections 27, 28, and 29 are configured as light guide elements capable of, for example, imparting a phase delay to incident light and guiding the light.
  • the multiple structures 27A, 28A, and 29A and the media 27B, 28B, and 29B are arranged so as to impart a desired phase profile to the incident light.
  • the materials (optical constants of each material) of the multiple structures 27A, 28A, and 29A and the media 27B, 28B, and 29B, the sizes (width (diameter), height, etc.) and pitch (arrangement interval) of the multiple structures 27A, 28A, and 29A and the media 27B, 28B, and 29B are determined so that the light in the wavelength band to be detected is focused on the photoelectric conversion section 12.
  • the materials (refractive index), dimensions, and pitch of the multiple structures 27A, 28A, and 29A, the materials (refractive index) of the media 27B, 28B, and 29B, and the like can be set.
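The design parameters listed above can be illustrated with a first-order numerical estimate. The sketch below is not part of the patent: it assumes representative values (a TiO2-like structure with n ≈ 2.4, a SiO2-like medium with n ≈ 1.45, a 600 nm structure height, 550 nm green light) and estimates the relative phase delay φ = 2πnh/λ accumulated in a structure region compared with the surrounding medium.

```python
import math

# Illustrative sketch (assumed values, not from the patent): first-order
# estimate of the phase delay a nanostructure imparts to normally incident
# light, phi = 2*pi * n * h / lambda.

def phase_delay(n: float, height_nm: float, wavelength_nm: float) -> float:
    """Phase (radians) accumulated through a layer of index n and given height."""
    return 2 * math.pi * n * height_nm / wavelength_nm

h = 600.0    # structure height in nm (assumed)
lam = 550.0  # green light, nm

# Relative phase between a TiO2-like structure (n ~ 2.4, assumed) and a
# SiO2-like medium (n ~ 1.45, assumed):
delta_phi = phase_delay(2.4, h, lam) - phase_delay(1.45, h, lam)
print(f"relative phase delay: {delta_phi:.2f} rad")
```

Varying the structure width changes the effective index seen by the light, which is how an array of such structures can impose a spatially varying phase profile.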
  • Protective layers 34, 35 are provided between light guide section 27 and light guide section 28, and between light guide section 28 and light guide section 29, respectively.
  • Protective layers 34, 35 are formed, for example, of a single layer film made of silicon oxide, silicon nitride, silicon oxynitride, etc., or a laminate film made of two or more of these.
  • Protective layers 34, 35 may be omitted as appropriate.
  • a light guiding section having a multi-stage structure is provided on the first surface 11S1 side of the semiconductor substrate 11.
  • by providing a light shielding section 32 whose surface is covered with a protective layer 33 on the topmost light guiding section (here, light guiding section 28 or light guiding section 29), it is possible to obtain the same effect as in the first embodiment.
  • (2-5. Modification 5) FIG. 11 is a schematic diagram illustrating an example of a cross-sectional configuration of a photodetector (photodetector 1F) according to Modification 5 of the present disclosure.
  • the photodetector 1F is, for example, a CMOS image sensor used in electronic devices such as digital still cameras and video cameras, and is, for example, a so-called back-illuminated photodetector similar to the first embodiment.
  • the light guides are stacked in two or three stages (light guides 27, 28, 29), and a light shielding section 32 with a surface covered by a protective layer 33 is provided only on the topmost light guide (here, light guide 28 or light guide 29), but this is not limiting.
  • a light shielding section 36 is also provided on light guide 27, which is a layer below the topmost light guide 28.
  • FIGS. 12A to 12C show the manufacturing method of the photodetector 1F in order of steps.
  • the light guide section 27 is formed in the same manner as in the first embodiment.
  • a recessed section 27X of a predetermined depth is formed in the medium 27B extending to the peripheral section 100B by using, for example, photolithography and etching.
  • the light shielding portion 36 is formed, for example, by sputtering to fill the recessed portion 27X, and then the light shielding portion 36 formed on the light guide portion 27 is removed, for example, by CMP, and the surface is flattened.
  • the protective layer 34 and the medium 28B are sequentially formed, and then the multiple structures 28A, protective layer 31, light shielding portion 32, and protective layer 33 are sequentially formed in the same manner as in the first embodiment. With the above steps, the light detection device 1F shown in FIG. 11 is completed.
  • a light-shielding section 36 is also provided between light-guiding section 27 and light-guiding section 28, which have a multi-stage structure. Even with this configuration, it is possible to obtain the same effect as in the first embodiment.
  • (2-6. Modification 6) FIG. 13 is a schematic diagram illustrating an example of a cross-sectional configuration of a photodetector (photodetector 1G) according to Modification 6 of the present disclosure.
  • the photodetector 1G is, for example, a CMOS image sensor used in electronic devices such as digital still cameras and video cameras, and is, for example, a so-called back-illuminated photodetector similar to the first embodiment.
  • the light-shielding portion 36 was embedded in the medium 27B of the light-guiding portion 27 below the uppermost light-guiding portion 28, but this is not limited to this.
  • the light-shielding portion 36 is provided on the medium 27B.
  • FIGS. 14A to 14C show the manufacturing method of the photodetector 1G in order of steps.
  • the light guide section 27 is formed in the same manner as in the first embodiment.
  • the light shielding section 32 is formed in the peripheral section 100B using, for example, photolithography.
  • the protective layer 34 and the medium 28B are sequentially formed.
  • the surface of the medium 28B is planarized using, for example, a CMP method. Thereafter, similar to the first embodiment, a plurality of structures 28A, a protective layer 31, a light shielding portion 32, and a protective layer 33 are sequentially formed. With the above steps, the photodetector 1G shown in FIG. 13 is completed.
  • the light-shielding section 36 between the light-guiding section 27 and the light-guiding section 28, which have a multi-stage structure, is provided on the light-guiding section 27. Even with this configuration, it is possible to obtain the same effect as in the first embodiment.
  • (2-7. Modification 7) FIG. 15 is a schematic diagram illustrating an example of a cross-sectional configuration of a photodetector (photodetector 1H) according to Modification 7 of the present disclosure.
  • the photodetector 1H is, for example, a CMOS image sensor used in electronic devices such as digital still cameras and video cameras, and is, for example, a so-called back-illuminated photodetector similar to the first embodiment.
  • the light-shielding portion 36 is formed in the medium 27B of the light-guiding portion 27, which is a layer below the uppermost light-guiding portion 28, but the light-shielding portion 37 may also be formed on the first surface 11S1 of the semiconductor substrate 11.
  • the light-shielding portion 37 is also formed on the first surface 11S1 of the semiconductor substrate 11. Even with this configuration, it is possible to obtain the same effect as in the first embodiment described above.
  • FIG. 16 is a schematic diagram showing an example of a cross-sectional configuration of a photodetector according to the eighth modified example of the present disclosure (photodetector 1I).
  • FIG. 17 is a schematic diagram showing another example of a cross-sectional configuration of a photodetector according to the eighth modified example of the present disclosure (photodetector 1J).
  • FIG. 18 is a schematic diagram showing another example of a cross-sectional configuration of a photodetector according to the eighth modified example of the present disclosure (photodetector 1K).
  • FIG. 19 is a schematic diagram showing another example of a cross-sectional configuration of a photodetector according to the eighth modified example of the present disclosure (photodetector 1L).
  • the photodetectors 1I, 1J, 1K, and 1L are, for example, CMOS image sensors used in electronic devices such as digital still cameras and video cameras, and are, for example, so-called back-illuminated photodetectors, as in the first embodiment.
  • the light-shielding portions 32, 36, and 37 shown in the first embodiment and modifications 1 to 7 above may be electrically connected to the semiconductor substrate 11, for example, by through-hole wiring 38 that reaches the semiconductor substrate 11.
  • FIG. 26 is a schematic diagram showing an example of a cross-sectional configuration of a photodetector (photodetector 2) according to a second embodiment of the present disclosure.
  • FIG. 27 is a schematic diagram showing an example of a planar layout of a plurality of structures 61, 67A in the pixel section 100A and the peripheral section 100B of the photodetector 2 shown in FIG. 26.
  • the photodetector 2 is applicable to, for example, a CMOS image sensor used in electronic devices such as digital still cameras and video cameras, as with the photodetector 1 of the first embodiment, and has a pixel section (pixel section 100A) in which a plurality of pixels are two-dimensionally arranged in a matrix as an imaging area.
  • the photodetector 2 is, for example, a so-called back-illuminated photodetector in this CMOS image sensor.
  • the photodetection device 2 is a back-illuminated imaging device.
  • the photodetection device 2 has a pixel section 100A in which a plurality of unit pixels P are arranged two-dimensionally in a matrix, and a peripheral section 100B surrounding the pixel section 100A.
  • the plurality of unit pixels P arranged two-dimensionally in a matrix in the pixel section 100A each have a configuration in which, for example, a light receiving section 10, an optical layer 60 provided on the light incident side S1 of the light receiving section 10, and a multi-layer wiring layer 40 provided on the opposite side of the light incident side S1 of the light receiving section 10 are stacked.
  • the light receiving section 10, the optical layer 60, and the multi-layer wiring layer 40 are provided, for example, across the pixel section 100A and the peripheral section 100B.
  • the optical layer 60 includes, for example, a light guide section 67 including a plurality of structures 67A that are nanostructures and a medium 67B that fills the spaces between the adjacent structures 67A.
  • a light-shielding section 32 having a surface covered with a protective layer 33 is provided on a light-guiding section 67 of the peripheral portion 100B, and a plurality of structures 61 having light absorbing properties are provided on the light-guiding section 67 below the light-shielding section 32.
  • the light guide portion 67 corresponds to a specific example of a "light guide portion” as one embodiment of the present disclosure.
  • the multiple structures 67A correspond to a specific example of a "multiple first structures" as one embodiment of the present disclosure
  • the medium 67B corresponds to a specific example of a “medium” as one embodiment of the present disclosure.
  • the light shielding portion 32 corresponds to a specific example of a "light absorbing layer” as one embodiment of the present disclosure
  • the protective layer 33 corresponds to a specific example of a "protective film” as one embodiment of the present disclosure.
  • the multiple structures 61 correspond to a specific example of a "multiple second structures" as one embodiment of the present disclosure.
  • the optical layer 60 includes, for example, a partition 21, a color filter 22, a light-shielding film 23, a sealing film 24, an insulating layer 25, a planarization layer 26, and a light-guiding section 67, and is configured to guide the light incident from the light incident side S1 to the light receiving section 10.
  • the partition 21 is provided at the boundary between adjacent unit pixels P and is a frame having an opening 21H for each unit pixel P.
  • the partition 21, like the separation section 13, is provided to surround the unit pixel P and is provided in a lattice pattern across the pixel section 100A and the OPB region 100b around its periphery.
  • the partition 21 is intended to prevent light incident obliquely from the light incident side S1 from leaking into the adjacent unit pixel P.
  • the partition 21 is made of a material that has a lower refractive index than the color filter 22, for example.
  • the partition 21 may also serve as a light shield for the unit pixel P that determines the optical black level.
  • the partition 21 may also serve as a light shield to suppress the generation of noise in the peripheral circuitry provided in the peripheral portion 100B.
  • the partition 21 may be formed, for example, using a material that has light shielding properties. Examples of such materials include W, Ag, Cu, Ti, Al, or alloys thereof. Other examples include metal compounds such as TiN.
  • the partition 21 may be configured, for example, as a single layer film or a laminated film. When the partition 21 is configured as a laminated film, a layer made of, for example, Ti, Ta, W, Co, or Mo, or alloys, nitrides, oxides, or carbides thereof may be provided as an underlayer.
  • the color filters 22 selectively transmit light of a specific wavelength, and include, for example, a red filter 22R that selectively transmits red light (R), a green filter 22G that selectively transmits green light (G), and a blue filter 22B that selectively transmits blue light (B).
  • a red filter 22R that selectively transmits red light (R)
  • a green filter 22G that selectively transmits green light (G)
  • a blue filter 22B that selectively transmits blue light (B).
  • Each of the color filters 22R, 22G, and 22B is formed by filling the opening 21H of the partition 21 with a resin material in which the desired pigment or dye is dispersed.
  • each of the color filters 22R, 22G, and 22B is arranged, for example, for each unit pixel P: for four unit pixels P arranged in 2 rows and 2 columns, two green filters 22G are arranged on one diagonal, and one red filter 22R and one blue filter 22B are arranged on the orthogonal diagonal.
  • the corresponding color light is selectively photoelectrically converted in each photoelectric conversion unit 12.
  • unit pixels P (red pixels Pr) that selectively receive red light (R) and perform photoelectric conversion
  • unit pixels P (green pixels Pg) that selectively receive green light (G) and perform photoelectric conversion
  • unit pixels P (blue pixels Pb) that selectively receive blue light (B) and perform photoelectric conversion
  • the red pixels Pr, green pixels Pg, and blue pixels Pb generate pixel signals of the red light (R) component, the green light (G) component, and the blue light (B) component, respectively. This allows the photodetection device 2 to obtain RGB pixel signals.
  • the arrangement of the color filters 22R, 22G, and 22B is not limited to the above and can be set arbitrarily.
  • the color filters 22R, 22G, and 22B may be arranged such that a red filter 22R, a green filter 22G, and a blue filter 22B are arranged for each unit, with four unit pixels P arranged in two rows and two columns, or nine unit pixels P arranged in three rows and three columns.
  • the red filters 22R, green filters 22G, and blue filters 22B arranged for each unit are arranged in the same manner as when they are arranged for each unit pixel P: for example, for four units arranged in two rows and two columns, two green filters 22G are arranged on one diagonal, and one red filter 22R and one blue filter 22B are arranged on the orthogonal diagonal (see, for example, FIG. 27).
  • the color filter 22 may be provided with complementary color filters that selectively transmit cyan (C), magenta (M), and yellow (Y). Furthermore, the color filter 22 may be provided with a filter corresponding to white (W), that is, a filter that transmits light of all wavelengths incident on the light detection device 2. In addition, the color filter 22 may be provided with a filter that selectively transmits infrared light.
  • the thickness of the color filter 22 may be different for each color, taking into account the color reproducibility and sensor sensitivity of the optical spectrum.
  • the light-shielding film 23 is intended to block light incident on the photoelectric conversion section 12 provided in the OPB region 100b.
  • the light-shielding film 23 is provided in the OPB region 100b between the light-receiving section 10 (specifically, the dielectric layer 14) and the partition wall 21 and color filter 22.
  • the light-shielding film 23 can be formed using, for example, W, Ag, Cu, Ti, Al, or an alloy thereof.
  • the OPB region 100b is further provided with a sealing film 24 that covers the partitions 21, the color filters 22, and the light-shielding film 23.
  • the sealing film 24 can be formed using, for example, SiO, SiN, SiON, or the like.
  • the sealing film 24 may be formed of a color filter of a different color from the color filters 22 arranged in the peripheral portion 100B. For example, if the color filters 22 arranged in the peripheral portion 100B are red filters 22R and green filters 22G, the sealing film 24 may be formed of blue filters 22B.
  • the insulating layer 25 is provided between the light receiving section 10 and the light guiding section 67.
  • the insulating layer 25 is formed across the pixel section 100A and the peripheral section 100B so as to be laminated, for example, on the partition 21 and color filter 22 provided in the pixel section 100A, and on the sealing film 24 that covers the partition 21, color filter 22, and light shielding film 23 provided in the OPB region 100b.
  • the insulating layer 25 can be formed using, for example, SiO, SiN, AlO, etc.
  • Insulating layer 25 may be formed using a material with a low refractive index other than the insulating materials mentioned above, or may be formed from other materials that transmit light in the wavelength range to be measured. Insulating layer 25 can also be called a transparent layer or spacer layer that transmits light.
  • the planarization layer 26 is intended to planarize the surface by filling in the steps of the insulating layer 25 that are generated by providing the insulating layer 25 so as to cover the partition 21, the color filter 22, and the sealing film 24 that covers the partition 21, the color filter 22, and the light-shielding film 23.
  • the planarization layer 26 can be formed using, for example, SiO, SiN, AlO, etc.
  • the light guide section 67 is configured as a light guide element capable of guiding light by, for example, imparting a phase delay to incident light.
  • the light guide section 67 is a light guide element that utilizes metamaterial (metasurface) technology.
  • the light guide section 67 can also be called a metasurface layer (or metamaterial layer).
  • the light guide section 67 is provided across the pixel section 100A and the peripheral section 100B.
  • the light guiding section 67 has a plurality of structures 67A and a medium 67B arranged around the plurality of structures 67A.
  • the light guiding section 67 uses the plurality of structures 67A, which are nanostructures, to propagate light toward the photoelectric conversion section 12.
  • Light from a subject, which is the object to be measured, is incident on the light guiding section 67.
  • Light that has passed through an optical system such as an imaging lens is incident on the plurality of structures 67A.
  • the plurality of structures 67A have a size equal to or smaller than a predetermined wavelength of the incident light, for example, a size equal to or smaller than the wavelength range of visible light.
  • the plurality of structures 67A may also have a size equal to or smaller than the wavelength range of infrared light.
  • the multiple structures 67A are each, for example, a columnar (pillar-shaped) structure, and can be considered as a nanopillar.
  • the multiple structures 67A can be considered as a metasurface element.
  • the multiple structures 67A have a cylindrical shape.
  • the multiple structures 67A are arranged so as to be aligned with each other in the X-axis direction or the Y-axis direction, sandwiching the medium 67B therebetween.
  • the shape of the multiple structures 67A can be changed as appropriate, and each may be circular or rectangular in plan view.
  • the shape of the multiple structures 67A may also be elliptical, polygonal, cross-shaped, or other shapes.
  • the multiple structures 67A are also called metaatoms, nanoatoms, nanoposts, metasurface structures, microstructures, etc.
  • the medium 67B is arranged so as to fill the surroundings of the multiple structures 67A.
  • the multiple structures 67A are arranged within the medium 67B, and can be said to be arranged by replacing part of the medium 67B.
  • the medium 67B can also be said to be a medium layer or a protective layer (protective member).
  • multiple structures 67A are arranged at intervals equal to or less than a predetermined wavelength of incident light.
  • multiple structures 67A are provided in the X-axis and Y-axis directions at intervals equal to or less than the wavelength range of visible light. Note that in the unit pixel P, multiple structures 67A may be arranged at intervals equal to or less than the wavelength range of infrared light.
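The subwavelength-interval condition described above can be expressed as a simple criterion. The sketch below is illustrative and not from the patent: at normal incidence, a periodic array of pitch p in a medium of index n supports no propagating diffraction orders when p < λ/n, which is what lets the array behave as a metasurface rather than a diffraction grating. The numbers used (450 nm blue light, an n ≈ 1.45 SiO2-like medium) are assumptions.

```python
# Illustrative sketch (assumed values, not from the patent): check that a
# structure pitch is subwavelength in the surrounding medium, i.e. that no
# diffraction orders propagate at normal incidence (requires p < lambda / n).

def is_subwavelength(pitch_nm: float, wavelength_nm: float, n_medium: float) -> bool:
    """True if the array pitch is below the diffraction cutoff in the medium."""
    return pitch_nm < wavelength_nm / n_medium

# Blue end of visible light (~450 nm) in a SiO2-like medium (n ~ 1.45 assumed):
print(is_subwavelength(250.0, 450.0, 1.45))  # 250 nm pitch: subwavelength
print(is_subwavelength(400.0, 450.0, 1.45))  # 400 nm pitch: diffracts
```

The same check applied at infrared wavelengths gives a correspondingly looser pitch bound, consistent with the infrared arrangement mentioned above.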
  • the plurality of structures 67A has a refractive index that is different from the refractive index of the surrounding medium 67B.
  • the plurality of structures 67A has a refractive index that is higher than the refractive index of the medium 67B.
  • Examples of materials that may be used to form the multiple structures 67A include titanium oxide (TiO), silicon, polysilicon (Poly-Si), amorphous silicon (a-Si), germanium (Ge), etc.
  • the multiple structures 67A may be formed using titanium (Ti), hafnium (Hf), zirconium (Zr), aluminum (Al), tantalum (Ta), indium (In), niobium (Nb), or the like, or an oxide, nitride, oxynitride, or a compound of these.
  • the multiple structures 67A may be formed by including other metal compounds (metal oxides, metal nitrides, etc.).
  • the multiple structures 67A may be formed using GaP, GaN, GaAs, SiC, or the like.
  • the multiple structures 67A may be formed using silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), silicon carbide (SiC), oxygen-doped silicon carbide (SiOC), or other silicon compounds.
  • the multiple structures 67A may be constructed using materials different from each other.
  • the medium 67B is, for example, made of an inorganic material such as an oxide, a nitride, or an oxynitride.
  • the medium 67B may be made of, for example, silicon oxide (SiO), silicon nitride (SiN), silicon oxynitride (SiON), silicon carbide (SiC), oxygen-doped silicon carbide (SiOC), or other silicon compounds.
  • the medium 67B may be made of TEOS.
  • the medium 67B may be formed using a siloxane resin, a styrene resin, an acrylic resin, or the like.
  • the medium 67B may be made of any of these resins containing fluorine.
  • the medium 67B may be formed using any of these resins filled with beads (filler) that have a refractive index higher (or lower) than that of the resin.
  • the materials of the multiple structures 67A and the medium 67B can be selected according to the refractive index difference with the surrounding medium, the wavelength range of the incident light to be measured, etc. Note that a portion of the multiple structures 67A and the medium 67B may be made using air. For example, the multiple structures 67A may be made to include air (voids).
  • the light guide 67 can control the wavefront of the light by, for example, causing a phase delay in the incident light due to the refractive index difference between the multiple structures 67A and the medium surrounding them.
  • the light guide 67 can adjust the propagation direction of the light by, for example, giving a phase delay to the incident light by the multiple structures 67A and medium 67B.
  • the materials (optical constants of each material) of the multiple structures 67A and medium 67B, the size (width (diameter), height, etc.) of the multiple structures 67A, the pitch (arrangement interval), etc. are determined so that light of a desired wavelength range from the incident light from the measurement target travels in the desired direction.
  • the materials (refractive indices), dimensions, and pitch of the multiple structures 67A, and the material (refractive index) of the medium 67B, can be set.
  • the material, size, arrangement number, etc. of the multiple structures 67A of each unit pixel P are determined so that light of a specific wavelength band to be detected travels to the photoelectric conversion section 12 of the desired unit pixel P.
  • the multiple structures 67A provided in the red pixel Pr, green pixel Pg, and blue pixel Pb may be formed so that their sizes (e.g., width, height), arrangement positions, etc. are different.
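As a hypothetical illustration of how per-pixel structure sizing could be derived (this procedure and all of its numbers are assumptions, not taken from the patent), the sketch below samples a standard hyperbolic focusing phase profile, φ(r) = (2π/λ)(f − √(r² + f²)), at structure positions and wraps it to [0, 2π); in a real design each wrapped phase value would then be mapped to a structure width from a precomputed library.

```python
import math

# Illustrative sketch (not from the patent): sample a hyperbolic focusing
# phase profile at structure positions, wrapped to [0, 2*pi).

def target_phase(r_um: float, focal_um: float, wavelength_um: float) -> float:
    """Wrapped target phase at radius r_um for a lens of the given focal length."""
    phi = (2 * math.pi / wavelength_um) * (focal_um - math.sqrt(r_um**2 + focal_um**2))
    return phi % (2 * math.pi)

pitch = 0.25  # structure pitch in micrometers (assumed)
positions = [i * pitch for i in range(5)]
profile = [target_phase(r, focal_um=2.0, wavelength_um=0.55) for r in positions]
print([round(p, 3) for p in profile])
```

Evaluating the same profile at 0.45 um (blue) or 0.65 um (red) yields different wrapped phases at each position, which is why structures in the red, green, and blue pixels would end up with different widths and arrangements.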
  • the light guide section 67 may be configured, for example, as a spectroscopic section (spectroscopic element) capable of splitting incident light.
  • the optical layer 60 (or the light guide section 67) may also be referred to as a splitter (color splitter).
  • the optical layer 60 may also be referred to as a color splitter layer or a wavelength separation layer.
  • the optical layer 60 (or the light guide section 67) may also be referred to as an optical element configured to redirect light.
  • the light guide section 67 further has a plurality of structures 61 arranged in the peripheral section 100B.
  • the plurality of structures 61 are intended to absorb stray light that is obliquely incident on the light detection device 2 and propagates in the XY plane direction, and to reduce flare caused by surface reflection in the peripheral section 100B.
  • the plurality of structures 61 have the same configuration as the plurality of structures 67A described above, except for their constituent materials.
  • the multiple structures 61 have a size equal to or smaller than a predetermined wavelength of the incident light, for example, a size equal to or smaller than the wavelength range of visible light. Note that the multiple structures 61 may also have a size equal to or smaller than the wavelength range of infrared light.
  • Each of the multiple structures 61 is, for example, a columnar (pillar-shaped) structure, and can be called a nanopillar.
  • the multiple structures 61 can be called a metasurface element.
  • the multiple structures 61 have a cylindrical shape.
  • the multiple structures 61 are arranged so as to be aligned with each other in the X-axis direction or the Y-axis direction, sandwiching the medium 67B therebetween.
  • the shape of the multiple structures 61 can be changed as appropriate, and each may be circular or rectangular in plan view.
  • the shape of the multiple structures 61 may also be elliptical, polygonal, cross-shaped, or other shapes.
  • the plurality of structures 61 are formed, for example, using the same material as the light-shielding portion 32. Specifically, the plurality of structures 61 are formed of a black color filter or a metal film having light-shielding properties such as W. Examples of materials that can be used for the black color filter include titanium oxide filler dispersion resin, carbon black pigment dispersion resin, and organic pigment dispersion resin.
  • the multiple structures 61 are arranged in a predetermined pattern below the light-shielding portion 32 provided in the peripheral portion 100B so as to surround the pixel portion 100A. Specifically, the multiple structures 61 are arranged in the same array pattern as the multiple structures 67A arranged in the pixel portion 100A, as shown in FIG. 27, for example.
  • a light-shielding portion 32 and a protective layer 33 are provided in this order on the light incident side S1 of the optical layer 60.
  • the light-shielding portion 32 is intended to prevent surface reflection in the peripheral portion 100B.
  • the light-shielding portion 32 is provided in the peripheral portion 100B so as to surround the pixel portion 100A, and its surface is covered with a protective layer 33.
  • the light-shielding portion 32 is provided directly on the light-guiding portion 67, and the light-shielding portion 32 and the multiple structures 61 provided in the light-guiding portion 67 are in contact with each other.
  • the light-shielding portion 32 is composed of, for example, a black color filter or a metal film having light-shielding properties such as W.
  • Examples of materials that can be used for the black color filter include titanium oxide filler dispersion resin, carbon black pigment dispersion resin, and organic pigment dispersion resin.
  • the thickness of the light-shielding portion 32 is, for example, 0.1 ⁇ m or more and 3 ⁇ m or less.
  • the protective layer 33 is intended to prevent surface reflection at the light-shielding portion 32.
  • the protective layer 33 covers the surface of the light-shielding portion 32, specifically, the top and side surfaces of the light-shielding portion 32, and extends onto the surrounding light-guiding portion 67.
  • the protective layer 33 is composed of, for example, any one of SiO, SiN, SiON, SiC, SiOC, SiNC, AlO, HfO, TaO, and InO.
  • a medium 67B made of a low refractive index material (e.g., SiN/TEOS) is formed on the planarization layer 26.
  • a hard mask 51 is patterned on the medium 67B using photolithography technology, and then the medium 67B is processed using, for example, dry etching to form an opening 67H across the pixel portion 100A and the peripheral portion 100B.
  • a black color filter is embedded in the opening 67H provided in the peripheral portion 100B by using, for example, a photolithography technique to form a plurality of structures 61.
  • a high refractive index material (for example, TiO2) is formed so as to fill the openings 67H provided in the pixel portion 100A.
  • the high refractive index film formed on the medium 67B is removed by using, for example, a CMP method. This forms a light guide portion 67 in which a plurality of structures 67A and a plurality of structures 61 are embedded in the medium 67B.
  • the light-shielding portion 32 is formed on the light-guiding portion 67 of the peripheral portion 100B using, for example, photolithography.
  • a continuous protective layer 33 is formed on the light-guiding portion 67 and the light-shielding portion 32 using, for example, a sputtering method.
  • a plurality of light-absorbing structures 61 are provided in the light guide section 67 below the light shielding section 32 provided to surround the pixel section 100A in the above-described first embodiment. This allows stray light that is obliquely incident on the photodetector 2 and propagates in the XY in-plane direction to be absorbed by the plurality of structures 61. Furthermore, compared to the photodetector 1 of the above-described first embodiment, flare due to surface reflection in the peripheral section 100B is further reduced.
  • the reliability of the photodetector 2 of this embodiment can be further improved.
  • a plurality of structures 61 having a configuration similar to that of the plurality of structures 67A are arranged around the pixel section 100A, so it is possible to reduce the shape deviation of the plurality of structures 67A near the outermost periphery of the pixel section 100A.
  • the structures 61 provided below the light-shielding portion 32 are formed using the same material as the light-shielding portion 32, improving the adhesion of the light-shielding portion 32 to the light-guiding portion 67. Therefore, peeling of the light-shielding portion 32 is reduced compared to the photodetector 1 of the first embodiment.
  • Modifications (4-1. Modification 9) FIG. 29 is a schematic diagram showing an example of a cross-sectional configuration of a photodetector according to the ninth modified example of the present disclosure (photodetector 2A).
  • FIG. 30 is a schematic diagram showing another example of a cross-sectional configuration of a photodetector according to the ninth modified example of the present disclosure (photodetector 2B).
  • FIG. 31 is a schematic diagram showing another example of a cross-sectional configuration of a photodetector according to the ninth modified example of the present disclosure (photodetector 2C).
  • the photodetectors 2A to 2C are, for example, CMOS image sensors used in electronic devices such as digital still cameras and video cameras, and are, for example, so-called back-illuminated photodetectors, similar to the first embodiment.
  • the plurality of structures 61 provided in the peripheral portion 100B of the light guide portion 67 extend from the light shielding portion 32 to the planarization layer 26, but the present invention is not limited to this.
  • In the photodetector 2A of this modification, the lower ends of the plurality of structures 61 are embedded in the insulating layer 25.
  • In the photodetector 2B of this modification, the lower ends of the plurality of structures 61 embedded in the insulating layer 25 extend further, so that a portion of the structures 61 is in contact with the sealing film 24.
  • Except for this, the photodetectors 2A to 2C have substantially the same configuration as the photodetector 2 of the second embodiment.
  • the multiple structures 61 provided in the peripheral portion 100B of the light-guiding portion 67 are extended further toward the light-receiving portion 10, so that stray light that is obliquely incident on the photodetector 2 and propagates in the XY plane direction is further absorbed compared to the photodetector 2 of the second embodiment described above. Therefore, the photodetectors 2A to 2C of this modified example can further improve their reliability.
  • FIG. 32 is a schematic representation of an example of a cross-sectional configuration of a photodetector according to Modification 10 of the present disclosure (photodetector 2D).
  • FIG. 33 is a schematic representation of an example of a planar layout of a plurality of structures 61A, 67A in the pixel section 100A and the peripheral section 100B of the photodetector 2D shown in FIG. 32.
  • FIG. 34 is a schematic representation of another example of a cross-sectional configuration of a photodetector according to Modification 10 of the present disclosure (photodetector 2E).
  • FIG. 35 is a schematic representation of an example of a planar layout of a plurality of structures 61, 61A, 67A in the pixel section 100A and the peripheral section 100B of the photodetector 2E shown in FIG. 34.
  • the photodetectors 2D and 2E are, for example, CMOS image sensors used in electronic devices such as digital still cameras and video cameras, and are, for example, so-called back-illuminated photodetectors, as in the first embodiment.
  • In the second embodiment described above, the plurality of structures 61 provided in the peripheral portion 100B of the light guide portion 67 are arranged in the same arrangement pattern as the plurality of structures 67A arranged in the pixel portion 100A, but the present disclosure is not limited to this.
  • As shown in FIG. 33, for example, the light detection device 2D of this modification has, in the peripheral portion 100B of the light guide portion 67, a plurality of structures 61A arranged in a pattern different from that of the plurality of structures 67A arranged in the pixel portion 100A.
  • As shown in FIG. 35, for example, the light detection device 2E of this modification has, in the peripheral portion 100B of the light guide portion 67, a plurality of structures 61 arranged in the same pattern as the plurality of structures 67A arranged in the pixel portion 100A, and a plurality of structures 61A arranged in a pattern different from that of the plurality of structures 67A.
  • the plurality of structures 61 arranged in the same pattern as the plurality of structures 67A are arranged in the OPB region 100b on the periphery of the pixel portion 100A.
  • the multiple structures 61A, which are arranged in a pattern different from that of the multiple structures 67A, are arranged on the outer periphery of the OPB region 100b. Except for this, the photodetectors 2D and 2E have substantially the same configuration as the photodetector 2 of the second embodiment.
  • the photodetectors 2D and 2E of this modification can achieve the same effects as those of the second embodiment described above.
  • Fig. 36 is a schematic diagram showing an example of a cross-sectional configuration of a photodetector according to Modification 11 of the present disclosure (photodetector 2F).
  • Fig. 37 is a schematic diagram showing another example of a cross-sectional configuration of a photodetector according to Modification 11 of the present disclosure (photodetector 2G).
  • the photodetectors 2F and 2G are, for example, CMOS image sensors used in electronic devices such as digital still cameras and video cameras, and are, for example, so-called back-illuminated photodetectors, similar to the first embodiment.
  • In the second embodiment described above, the light shielding portion 32 and the multiple structures 61 are provided in the OPB region 100b adjacent to the pixel portion 100A, but the present disclosure is not limited to this.
  • In the photodetector 2F of this modified example, the light shielding portion 32 and the multiple structures 61 are provided in the peripheral portion 100B at a predetermined distance from the pixel portion 100A, as shown in FIG. 36, for example.
  • In the photodetector 2G of this modified example, the light shielding portion 32 and the multiple structures 61 are provided from the dummy pixel region 100a2 to the peripheral portion 100B, as shown in FIG. 37, for example. Except for this point, the photodetectors 2F and 2G have substantially the same configuration as the photodetector 2 of the second embodiment described above.
  • the photodetectors 2F and 2G of this modified example can achieve the same effects as those of the second embodiment described above.
  • Fig. 38 is a schematic diagram showing an example of a cross-sectional configuration of a photodetector according to Modification 12 of the present disclosure (photodetector 2H).
  • Fig. 39 is a schematic diagram showing another example of a cross-sectional configuration of a photodetector according to Modification 12 of the present disclosure (photodetector 2I).
  • the photodetectors 2H and 2I are, for example, CMOS image sensors used in electronic devices such as digital still cameras and video cameras, and are, for example, so-called back-illuminated photodetectors, similar to the first embodiment.
  • In the second embodiment described above, a light guide section 67 consisting of one layer was provided, but the present disclosure is not limited to this.
  • the light detection devices 2H and 2I of this modification differ from the second embodiment described above in that, like the fourth modification described above, the light guide section is stacked in two or three stages (light guide sections 67, 68, and 69).
  • the light guide sections 68 and 69 are configured as light guide elements that can guide light by, for example, imparting a phase delay to incident light.
  • the light guide sections 68 and 69 are light guide elements that utilize metamaterial (metasurface) technology.
  • the light guide sections 68 and 69 can also be called metasurface layers (or metamaterial layers).
  • In the photodetectors 2H and 2I, like the light guide section 67 of the first embodiment, the light guide sections 68 and 69 are provided across the pixel section 100A and the peripheral section 100B.
  • the light guides 68, 69 each have a plurality of structures 68A, 69A and a medium 68B, 69B arranged around the plurality of structures 68A, 69A.
  • the light guides 68, 69 use the plurality of structures 68A, 69A, which are nanostructures, to propagate light toward the photoelectric conversion unit 12.
  • Light from a subject, which is the object to be measured, is incident on the light guides 68, 69 via an optical system such as an imaging lens.
  • the plurality of structures 68A, 69A have a size equal to or smaller than a predetermined wavelength of the incident light, for example, a size equal to or smaller than the wavelength range of visible light. Note that the plurality of structures 68A, 69A may have a size equal to or smaller than the wavelength range of infrared light.
  • the multiple structures 68A, 69A are each, for example, columnar (pillar-shaped) structures and can be called nanopillars.
  • the multiple structures 68A, 69A can be called metasurface elements.
  • the multiple structures 68A, 69A have a cylindrical shape.
  • the multiple structures 68A, 69A are arranged side by side in the X-axis direction or the Y-axis direction, sandwiching the medium 68B, 69B.
  • the shape of the multiple structures 68A, 69A can be changed as appropriate, and may be circular or rectangular in plan view.
  • the shape of the multiple structures 68A, 69A may be elliptical, polygonal, cross-shaped, or other shapes.
  • the multiple structures 68A, 69A are also called metaatoms, nanoatoms, nanoposts, metasurface structures, microstructures, etc.
  • the media 68B, 69B are provided so as to fill the periphery of the multiple structures 68A, 69A, respectively.
  • the multiple structures 68A, 69A are provided within the media 68B, 69B, respectively, and can be said to be arranged by replacing a part of the media 68B, 69B.
  • the media 68B, 69B can also be said to be a medium layer or a protective layer (protective member).
  • multiple structures 68A, 69A are arranged at intervals equal to or less than a predetermined wavelength of incident light.
  • multiple structures 68A, 69A are provided in the X-axis and Y-axis directions at intervals equal to or less than the wavelength range of visible light.
  • multiple structures 68A, 69A may be arranged at intervals equal to or less than the wavelength range of infrared light.
  • the multiple structures 68A, 69A have a refractive index that is different from the refractive index of the surrounding medium 68B, 69B.
  • the multiple structures 68A, 69A have a refractive index higher than the refractive index of the medium 68B, 69B.
  • the constituent materials of the multiple structures 68A, 69A include, for example, titanium oxide (TiO2), silicon, polysilicon (Poly-Si), amorphous silicon (a-Si), germanium (Ge), etc.
  • the multiple structures 68A, 69A may be formed from Ti, Hf, Zr, Al, Ta, In, Nb, etc., or from their oxides, nitrides, oxynitrides, or compounds thereof.
  • the multiple structures 68A, 69A may be formed from other metal compounds (metal oxides, metal nitrides, etc.).
  • the multiple structures 68A, 69A may be formed using GaP, GaN, GaAs, SiC, etc.
  • the multiple structures 68A, 69A may be formed using silicon oxide, silicon nitride, silicon nitride oxide, silicon carbide, silicon oxide carbide, and other silicon compounds.
  • the multiple structures 68A, 69A may be constructed using materials different from each other.
  • the media 68B, 69B are, for example, made of inorganic materials such as oxides, nitrides, and oxynitrides.
  • the media 68B, 69B may be made of, for example, silicon oxide, silicon nitride, silicon oxynitride, silicon carbide, silicon oxide carbide, and other silicon compounds.
  • the media 68B, 69B may be made of TEOS.
  • the media 68B, 69B may be made of a siloxane resin, a styrene resin, an acrylic resin, or the like.
  • the media 68B, 69B may be made of any of these resins containing fluorine.
  • the media 68B, 69B may be made of any of these resins filled with beads (filler) that have a refractive index higher (or lower) than that of the resin.
  • the materials of the multiple structures 68A, 69A and the media 68B, 69B can be selected depending on the refractive index difference with the surrounding medium, the wavelength range of the incident light to be measured, etc. Note that a portion of the multiple structures 68A, 69A and the media 68B, 69B may be composed of air. For example, the multiple structures 68A, 69A may be composed to include air (voids).
  • the light guide units 67, 68, 69 are configured as light guide elements capable of, for example, imparting a phase delay to incident light and guiding the light.
  • the multiple structures 67A, 68A, 69A and the media 67B, 68B, 69B are arranged so as to impart a desired phase profile to the incident light.
  • the materials (optical constants of each material) of the multiple structures 67A, 68A, 69A and the media 67B, 68B, 69B, as well as the sizes (width (diameter), height, etc.) and pitch (arrangement interval) of the multiple structures 67A, 68A, 69A, are determined so that light in the wavelength band to be detected is focused on the photoelectric conversion unit 12.
  • for example, the materials (refractive indices), dimensions, and pitch of the multiple structures 67A, 68A, 69A and the media 67B, 68B, 69B can be set according to the wavelength band to be detected.
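The sizing described above amounts to mapping a target phase profile onto a library of pillar geometries. The sketch below is purely illustrative and not part of the disclosure: the hyperbolic phase function, the width/phase lookup table, and all names are assumptions; an actual design would derive the table from electromagnetic simulation of the chosen materials.

```python
import math

def lens_phase(x, y, f, wavelength):
    """Hyperbolic phase profile (radians) that focuses normally incident
    light of the given wavelength at focal length f; x, y, f and
    wavelength share the same length unit (here micrometres)."""
    return (-2.0 * math.pi / wavelength) * (math.sqrt(x * x + y * y + f * f) - f)

def pillar_width(phase, widths, phases):
    """Pick the pillar width whose phase delay is closest to the target
    phase, both wrapped to [0, 2*pi); wrap-around distance is ignored
    for brevity. The width/phase table stands in for simulation data."""
    target = phase % (2.0 * math.pi)
    return min(zip(widths, phases), key=lambda wp: abs(wp[1] - target))[0]

# Hypothetical pillar library: widths in nm, simulated phase delays in rad.
widths = [80, 120, 160, 200, 240]
phases = [0.0, 1.3, 2.6, 3.9, 5.2]

# Target phase 1 um off-axis for f = 3 um at a 0.55 um design wavelength.
phi = lens_phase(1.0, 0.0, 3.0, 0.55)
w = pillar_width(phi, widths, phases)
```

In practice the same lookup is repeated at every lattice site of the light guide layer to produce the full arrangement pattern.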
  • Protective layers 34, 35 are provided between light guide section 67 and light guide section 68, and between light guide section 68 and light guide section 69, respectively.
  • Protective layers 34, 35 are formed, for example, of a single layer film made of silicon oxide, silicon nitride, silicon oxynitride, etc., or a laminate film made of two or more of these materials.
  • Protective layers 34, 35 may be omitted as appropriate.
  • light-guiding sections 68 and 69 each have a plurality of structures 61 arranged in peripheral section 100B.
  • the plurality of structures 61 provided in light-guiding sections 67, 68, and 69 may be arranged in the same pattern as each other, or may be arranged in different patterns.
  • the photodetectors 2H and 2I of this modified example can achieve the same effects as those of the second embodiment described above.
  • The above-described photodetection device (e.g., the photodetection device 1) can be applied to various electronic devices, such as imaging systems including digital still cameras and digital video cameras, mobile phones with imaging functions, or other devices with imaging functions.
  • FIG. 20 is a block diagram showing an example of the configuration of electronic device 1000.
  • the electronic device 1000 includes an optical system 1001, a photodetector 1, and a DSP (Digital Signal Processor) 1002.
  • the DSP 1002, memory 1003, display device 1004, recording device 1005, operation system 1006, and power supply system 1007 are connected via a bus 1008, and the electronic device 1000 is capable of capturing still and moving images.
  • the optical system 1001 is composed of one or more lenses, and captures incident light (image light) from a subject and forms an image on the imaging surface of the light detection device 1.
  • the light detection device 1 converts the amount of incident light focused on the imaging surface by the optical system 1001 into an electrical signal on a pixel-by-pixel basis and supplies the signal to the DSP 1002 as a pixel signal.
  • the DSP 1002 performs various signal processing on the signal from the light detection device 1 to obtain an image, and temporarily stores the image data in the memory 1003.
  • the image data stored in the memory 1003 is recorded in the recording device 1005 or supplied to the display device 1004 to display the image.
  • the operation system 1006 accepts various operations by the user and supplies operation signals to each block of the electronic device 1000, and the power supply system 1007 supplies the power required to drive each block of the electronic device 1000.
  • Fig. 21A is a schematic diagram showing an example of the overall configuration of a light detection system 2000 including a light detection device 1.
  • Fig. 21B is a diagram showing an example of the circuit configuration of the light detection system 2000.
  • the light detection system 2000 includes a light emitting device 2001 as a light source unit that emits infrared light L2, and a light detection device 2002 as a light receiving unit having a photoelectric conversion element.
  • the light detection device 1 described above can be used as the light detection device 2002.
  • the light detection system 2000 may further include a system control unit 2003, a light source driving unit 2004, a sensor control unit 2005, a light source side optical system 2006, and a camera side optical system 2007.
  • the light detection device 2002 can detect light L1 and light L2.
  • Light L1 is external ambient light reflected by the subject (measurement target) 2100 (FIG. 21A).
  • Light L2 is light emitted by the light emitting device 2001 and then reflected by the subject 2100.
  • Light L1 is, for example, visible light, and light L2 is, for example, infrared light.
  • Light L1 can be detected by the photoelectric conversion unit in the light detection device 2002, and light L2 can be detected by the photoelectric conversion region in the light detection device 2002.
  • Image information of the subject 2100 can be obtained from the light L1, and distance information between the subject 2100 and the light detection system 2000 can be obtained from the light L2.
  • the light detection system 2000 can be mounted on, for example, an electronic device such as a smartphone or a moving object such as a car.
  • the light emitting device 2001 can be configured, for example, by a semiconductor laser, a surface-emitting semiconductor laser, or a vertical-cavity surface-emitting laser (VCSEL).
  • the detection method of the light L2 emitted from the light emitting device 2001 by the light detection device 2002 may be, for example, an iTOF method, but is not limited thereto.
  • the photoelectric conversion unit can measure the distance to the subject 2100 by, for example, the time-of-flight (TOF) method.
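As an illustrative aside (not part of the disclosure), the range calculation underlying an indirect-TOF (iTOF) measurement can be sketched as follows; the function names and the 20 MHz example are assumptions.

```python
import math

C = 299_792_458.0  # speed of light in a vacuum, m/s

def itof_distance(phase_shift_rad, mod_freq_hz):
    """Distance to the subject from the measured phase shift between the
    emitted and received modulated light; the round trip contributes the
    factor 2, giving d = c * dphi / (4 * pi * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz):
    """Maximum distance before the measured phase wraps past 2*pi."""
    return C / (2.0 * mod_freq_hz)
```

For example, at a 20 MHz modulation frequency a phase shift of pi radians corresponds to roughly 3.75 m, and the unambiguous range is about 7.5 m; lowering the modulation frequency extends the range at the cost of depth resolution.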
  • the detection method of the light L2 emitted from the light emitting device 2001 by the light detection device 2002 may be, for example, a structured light method or a stereo vision method.
  • In the structured light method, a predetermined pattern of light is projected onto the subject 2100, and the distance between the light detection system 2000 and the subject 2100 can be measured by analyzing the degree of distortion of the pattern.
  • In the stereo vision method, for example, two or more cameras are used to obtain two or more images of the subject 2100 viewed from two or more different viewpoints, thereby measuring the distance between the light detection system 2000 and the subject.
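The stereo vision relation just described reduces, for a rectified two-camera setup, to depth from disparity. This is an illustrative sketch with assumed names and values, not part of the disclosure:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth (m) of a point seen by two rectified cameras: Z = f * B / d,
    where focal_px is the focal length in pixels, baseline_m the camera
    separation, and disparity_px the horizontal shift of the point
    between the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Nearer objects produce larger disparities, so depth falls off as 1/d; e.g. with a 700 px focal length and a 12 cm baseline, a 35 px disparity corresponds to 2.4 m.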
  • the light emitting device 2001 and the light detecting device 2002 can be synchronously controlled by the system control unit 2003.
  • FIG. 22 is a diagram showing an example of the general configuration of an endoscopic surgery system to which the technology disclosed herein (the present technology) can be applied.
  • an operator (doctor) 11131 is shown using an endoscopic surgery system 11000 to perform surgery on a patient 11132 on a patient bed 11133.
  • the endoscopic surgery system 11000 is composed of an endoscope 11100, other surgical tools 11110 such as an insufflation tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 is composed of a lens barrel 11101, the tip of which is inserted into the body cavity of the patient 11132 at a predetermined length, and a camera head 11102 connected to the base end of the lens barrel 11101.
  • the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may also be configured as a so-called flexible scope having a flexible lens barrel.
  • the tip of the lens barrel 11101 has an opening into which an objective lens is fitted.
  • a light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated via the objective lens toward an object to be observed inside the body cavity of the patient 11132.
  • the endoscope 11100 may be a direct-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the object being observed is focused onto the image sensor by the optical system.
  • the image sensor converts the observation light into an electric signal corresponding to the observation light, i.e., an image signal corresponding to the observed image.
  • the image signal is sent to the camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
  • the CCU 11201 is configured with a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), etc., and performs overall control of the operations of the endoscope 11100 and the display device 11202. Furthermore, the CCU 11201 receives an image signal from the camera head 11102, and performs various types of image processing on the image signal, such as development processing (demosaic processing), for displaying an image based on the image signal.
  • the display device 11202, under the control of the CCU 11201, displays an image based on the image signal that has been subjected to image processing by the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (light emitting diode) and supplies illumination light to the endoscope 11100 when photographing the surgical site, etc.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
  • the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 11100.
  • the treatment tool control device 11205 controls the operation of the energy treatment tool 11112 for cauterizing tissue, incising, sealing blood vessels, etc.
  • the insufflation device 11206 sends gas into the body cavity of the patient 11132 via the insufflation tube 11111 to inflate the body cavity in order to ensure a clear field of view for the endoscope 11100 and to ensure a working space for the surgeon.
  • the recorder 11207 is a device capable of recording various types of information related to the surgery.
  • the printer 11208 is a device capable of printing various types of information related to the surgery in various formats such as text, images, or graphs.
  • the light source device 11203 that supplies irradiation light to the endoscope 11100 when photographing the surgical site can be composed of a white light source composed of, for example, an LED, a laser light source, or a combination of these.
  • When the white light source is composed of a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 11203.
  • the light source device 11203 may be controlled to change the intensity of the light it outputs at predetermined time intervals.
  • the image sensor of the camera head 11102 may be controlled to acquire images in a time-division manner in synchronization with the timing of the changes in light intensity, and the images may be synthesized to generate a high-dynamic-range image free of so-called blocked-up shadows and blown-out highlights.
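The time-division synthesis described above can be illustrated with a minimal per-pixel merge of a long and a short exposure; the names, the exposure ratio of 8, and the 8-bit saturation threshold are assumptions for illustration, not from the disclosure.

```python
def merge_hdr(long_px, short_px, ratio, sat=255):
    """Merge one pixel from a long and a short exposure captured in time
    division: use the long exposure unless it is saturated, otherwise
    scale the short exposure up by the long/short exposure-time ratio so
    both values share a common radiometric scale."""
    if long_px < sat:
        return float(long_px)
    return float(short_px) * ratio

# Three example pixels: dark, mid (long exposure clipped), fully clipped.
frame = [merge_hdr(l, s, 8.0) for l, s in [(100, 13), (255, 40), (255, 255)]]
```

The merged values can exceed the 8-bit range of either source frame, which is what extends the dynamic range; a real pipeline would also blend near the saturation knee to avoid banding.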
  • the light source device 11203 may be configured to supply light in a predetermined wavelength range corresponding to the special light observation.
  • For example, by irradiating light in a narrower band than the irradiation light (i.e., white light) used during normal observation, so-called narrow band imaging is performed, in which predetermined tissue such as blood vessels on the mucosal surface is photographed with high contrast.
  • Alternatively, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiating excitation light.
  • In fluorescence observation, for example, excitation light is irradiated onto body tissue and the fluorescence from the body tissue is observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) is locally injected into the body tissue and excitation light corresponding to the fluorescence wavelength of the reagent is irradiated to obtain a fluorescence image.
  • the light source device 11203 may be configured to supply narrow band light and/or excitation light corresponding to such special light observation.
  • FIG. 23 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG. 22.
  • the camera head 11102 has a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • the CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and the CCU 11201 are connected to each other via a transmission cable 11400 so that they can communicate with each other.
  • the lens unit 11401 is an optical system provided at the connection with the lens barrel 11101. Observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is composed of a combination of multiple lenses including a zoom lens and a focus lens.
  • the imaging unit 11402 may include one imaging element (a so-called single-plate type) or multiple imaging elements (a so-called multi-plate type).
  • each imaging element may generate an image signal corresponding to each of RGB, and a color image may be obtained by combining these.
  • the imaging unit 11402 may be configured to have a pair of imaging elements for acquiring image signals for the right eye and the left eye corresponding to 3D (three-dimensional) display. By performing a 3D display, the surgeon 11131 can more accurately grasp the depth of the biological tissue in the surgical site.
  • multiple lens units 11401 may be provided corresponding to each imaging element.
  • the imaging unit 11402 does not necessarily have to be provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101, immediately after the objective lens.
  • the driving unit 11403 is composed of an actuator, and moves the zoom lens and focus lens of the lens unit 11401 a predetermined distance along the optical axis under the control of the camera head control unit 11405. This allows the magnification and focus of the image captured by the imaging unit 11402 to be adjusted appropriately.
  • the communication unit 11404 is configured with a communication device for transmitting and receiving various information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 also receives control signals for controlling the operation of the camera head 11102 from the CCU 11201, and supplies them to the camera head control unit 11405.
  • the control signals include information on the imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value during imaging, and/or information specifying the magnification and focus of the captured image.
  • the above-mentioned frame rate, exposure value, magnification, focus, and other imaging conditions may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • In other words, the endoscope 11100 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
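As a rough illustration of what AE and AWB compute from image statistics, the sketch below uses assumed names and target values; it is not the actual implementation in the CCU 11201.

```python
def gray_world_gains(mean_r, mean_g, mean_b):
    """Gray-world auto white balance: assume the scene averages to gray
    and scale the R and B channels so their means match the green mean."""
    return mean_g / mean_r, 1.0, mean_g / mean_b

def next_exposure(current_exposure, frame_mean, target_mean=118.0):
    """Simple auto-exposure step: scale the exposure so the next frame's
    mean brightness approaches the target (118 is an illustrative
    mid-gray on an 8-bit scale)."""
    return current_exposure * target_mean / frame_mean
```

Real AE/AWB loops damp these updates over several frames and weight the statistics by region of interest, but the underlying proportional corrections are of this form.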
  • the camera head control unit 11405 controls the operation of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured with a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 also transmits to the camera head 11102 a control signal for controlling the operation of the camera head 11102.
  • the image signal and the control signal can be transmitted by electrical communication, optical communication, etc.
  • the image processing unit 11412 performs various image processing operations on the image signal, which is the RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various controls related to the imaging of the surgical site, etc. by the endoscope 11100, and the display of the captured images obtained by imaging the surgical site, etc. For example, the control unit 11413 generates a control signal for controlling the driving of the camera head 11102.
  • the control unit 11413 also causes the display device 11202 to display the captured image showing the surgical site, etc., based on the image signal that has been image-processed by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist generated when the energy treatment tool 11112 is used, etc., by detecting the shape and color of the edges of objects included in the captured image. When the control unit 11413 causes the display device 11202 to display the captured image, it may use the recognition result to superimpose various types of surgical support information on the image of the surgical site. By superimposing the surgical support information and presenting it to the surgeon 11131, the burden on the surgeon 11131 can be reduced and the surgeon 11131 can proceed with the surgery reliably.
  • the transmission cable 11400 that connects the camera head 11102 and the CCU 11201 is an electrical signal cable that supports electrical signal communication, an optical fiber that supports optical communication, or a composite cable of these.
  • Here, communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may also be performed wirelessly.
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 24 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology disclosed herein can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
  • Also shown as functional components of the integrated control unit 12050 are a microcomputer 12051, an audio/video output unit 12052, and an in-vehicle network I/F (interface) 12053.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • the drive system control unit 12010 functions as a control device for a drive force generating device for generating the drive force of the vehicle, such as an internal combustion engine or a drive motor, a drive force transmission mechanism for transmitting the drive force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating a braking force for the vehicle.
  • the body system control unit 12020 controls the operation of various devices installed in the vehicle body according to various programs.
  • the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, tail lamps, brake lamps, turn signals, and fog lamps.
  • radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 accepts the input of these radio waves or signals and controls the vehicle's door lock device, power window device, lamps, etc.
  • the outside-vehicle information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image capturing unit 12031 is connected to the outside-vehicle information detection unit 12030.
  • the outside-vehicle information detection unit 12030 causes the image capturing unit 12031 to capture images outside the vehicle and receives the captured images.
  • the outside-vehicle information detection unit 12030 may perform object detection processing or distance detection processing for people, cars, obstacles, signs, characters on the road surface, etc. based on the received images.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of light received.
  • the imaging unit 12031 can output the electrical signal as an image, or as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light, or may be invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects information inside the vehicle.
  • a driver state detection unit 12041 that detects the state of the driver is connected to the in-vehicle information detection unit 12040.
  • the driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and the in-vehicle information detection unit 12040 may calculate the driver's degree of fatigue or concentration based on the detection information input from the driver state detection unit 12041, or may determine whether the driver is dozing off.
  • the microcomputer 12051 can calculate control target values for the driving force generating device, steering mechanism, or braking device based on information inside and outside the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040, and output control commands to the drive system control unit 12010.
  • the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following driving based on the distance between vehicles, maintaining vehicle speed, vehicle collision warning, or vehicle lane departure warning.
  • the microcomputer 12051 can also perform cooperative control for the purpose of autonomous driving, which allows the vehicle to travel autonomously without relying on the driver's operation, by controlling the driving force generating device, steering mechanism, braking device, etc. based on information about the surroundings of the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040.
  • the microcomputer 12051 can also output control commands to the body system control unit 12020 based on information outside the vehicle acquired by the outside information detection unit 12030. For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the outside information detection unit 12030, and perform cooperative control aimed at preventing glare, such as switching from high beams to low beams.
  • the audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the occupants of the vehicle or the outside of the vehicle of information.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
  • FIG. 25 shows an example of the installation position of the imaging unit 12031.
  • the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at the front nose, side mirrors, rear bumper, back door, and upper part of the windshield inside the vehicle cabin of the vehicle 12100.
  • the imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the upper part of the windshield inside the vehicle cabin mainly acquire images of the front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the sides of the vehicle 12100.
  • the imaging unit 12104 provided at the rear bumper or back door mainly acquires images of the rear of the vehicle 12100.
  • the imaging unit 12105 provided at the upper part of the windshield inside the vehicle cabin is mainly used to detect leading vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, etc.
  • FIG. 25 shows an example of the imaging ranges of the imaging units 12101 to 12104.
  • Imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door.
  • an overhead image of the vehicle 12100 viewed from above is obtained by superimposing the image data captured by the imaging units 12101 to 12104.
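The overhead-image composition described above can be sketched as follows. This is a toy illustration under stated assumptions, not the method of the patent: the four camera images are taken to be already warped onto a common ground plane, and the quadrant layout, tile sizes, and function names are invented for the example.

```python
def compose_overhead(front, rear, left, right, h, w):
    """Paste four pre-warped camera tiles into an h x w top-down canvas laid out
    around the own vehicle (front on top, rear at bottom, sides in the middle)."""
    canvas = [[0] * w for _ in range(h)]

    def paste(tile, y_off, x_off):
        for y, row in enumerate(tile):
            for x, v in enumerate(row):
                if v:  # non-zero pixels overwrite the canvas
                    canvas[y_off + y][x_off + x] = v

    tile_h = len(front)
    paste(front, 0, w // 4)
    paste(left, h // 4, 0)
    paste(right, h // 4, w - len(right[0]))
    paste(rear, h - tile_h, w // 4)
    return canvas
```

In a real system each tile would first be warped with a per-camera homography; here that step is assumed to have already happened.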
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera consisting of multiple imaging elements, or an imaging element having pixels for phase difference detection.
  • the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change in this distance over time (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can extract, as a preceding vehicle, the closest three-dimensional object on the path of the vehicle 12100 that is traveling in approximately the same direction as the vehicle 12100 at a predetermined speed (e.g., 0 km/h or faster). Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be maintained from the preceding vehicle, and perform automatic braking control (including follow-up stop control) and automatic acceleration control (including follow-up start control). In this way, cooperative control can be performed for the purpose of autonomous driving in which the vehicle travels autonomously without relying on the driver's operation.
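The preceding-vehicle extraction described above can be sketched in code. This is an illustrative sketch only, not the patent's implementation: the data layout (`TrackedObject`), the sampling period, and the heading tolerance are all assumptions made for the example; only the selection criterion (closest on-path object moving in roughly the same direction at or above a predetermined speed) comes from the text.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    object_id: int
    distance_m: float        # current distance to the object
    prev_distance_m: float   # distance one sample period earlier
    on_path: bool            # True if the object lies on the own vehicle's course
    heading_deg: float       # travel direction relative to the own vehicle (0 = same)

def relative_speed_kmh(obj: TrackedObject, dt_s: float) -> float:
    """Relative speed from the change in distance over time (negative = closing in)."""
    return (obj.distance_m - obj.prev_distance_m) / dt_s * 3.6

def extract_preceding_vehicle(objects, own_speed_kmh, dt_s=0.1,
                              heading_tol_deg=10.0, min_speed_kmh=0.0):
    """Pick the closest on-path object travelling in roughly the same direction
    at or above a predetermined speed (e.g. 0 km/h)."""
    candidates = []
    for obj in objects:
        obj_speed_kmh = own_speed_kmh + relative_speed_kmh(obj, dt_s)
        if (obj.on_path and abs(obj.heading_deg) <= heading_tol_deg
                and obj_speed_kmh >= min_speed_kmh):
            candidates.append(obj)
    return min(candidates, key=lambda o: o.distance_m) if candidates else None
```

The follow-up braking and acceleration control would then regulate speed so that the distance to the returned object stays at the preset inter-vehicle distance.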
  • the microcomputer 12051 classifies three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles based on the distance information obtained from the imaging units 12101 to 12104, extracts the data, and can use it for automatic avoidance of obstacles.
  • the microcomputer 12051 classifies obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see.
  • the microcomputer 12051 determines the collision risk, which indicates the risk of collision with each obstacle, and when the collision risk is equal to or exceeds a set value and there is a possibility of a collision, it can provide driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
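The threshold logic described above (warn the driver when the collision risk meets or exceeds a set value, and escalate to forced deceleration) can be illustrated as follows. The risk score based on time-to-collision and the two threshold values are assumptions for illustration; the patent text does not specify a formula.

```python
def collision_risk(distance_m: float, closing_speed_ms: float) -> float:
    """Crude risk score: the inverse of time-to-collision (higher = more dangerous)."""
    if closing_speed_ms <= 0.0:   # object is not approaching
        return 0.0
    return closing_speed_ms / distance_m

def driver_assistance(distance_m, closing_speed_ms,
                      warn_threshold=0.2, brake_threshold=0.5):
    """When the risk meets or exceeds a set value, warn the driver; at a higher
    level, additionally request forced deceleration or avoidance steering."""
    risk = collision_risk(distance_m, closing_speed_ms)
    actions = []
    if risk >= warn_threshold:
        actions.append("alarm")   # e.g. audio speaker 12061 / display unit 12062
    if risk >= brake_threshold:
        actions.append("brake")   # e.g. via drive system control unit 12010
    return actions
```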
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points from the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on the series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
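The two-step recognition procedure described above (feature-point extraction followed by pattern matching on contour points) can be sketched as follows. This is a deliberately simplified, hypothetical illustration: real systems use far richer features and matchers than binary edge pixels and set overlap, and every function and threshold here is an assumption.

```python
def extract_feature_points(image):
    """Feature points of a binary image: foreground pixels whose 4-neighbourhood
    touches the background (i.e. the object contour)."""
    h, w = len(image), len(image[0])
    points = []
    for y in range(h):
        for x in range(w):
            if not image[y][x]:
                continue
            neighbours = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            if any(ny < 0 or ny >= h or nx < 0 or nx >= w or not image[ny][nx]
                   for ny, nx in neighbours):
                points.append((y, x))
    return points

def match_score(points, template_points):
    """Overlap between two contours after shifting each to its bounding-box origin."""
    def normalize(ps):
        y0 = min(y for y, _ in ps)
        x0 = min(x for _, x in ps)
        return {(y - y0, x - x0) for y, x in ps}
    a, b = normalize(points), normalize(template_points)
    return len(a & b) / len(b)

def is_pedestrian(image, template_points, threshold=0.8):
    """Decide pedestrian / not-pedestrian by matching the extracted contour
    against a pedestrian template contour."""
    return match_score(extract_feature_points(image), template_points) >= threshold
```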
  • the audio/image output unit 12052 controls the display unit 12062 to superimpose a rectangular contour line for emphasis on the recognized pedestrian.
  • the audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
  • the present disclosure may also be configured as follows. According to the present technology having the following configuration, it is possible to improve reliability. (1) A photodetector including: a semiconductor substrate having a first surface and a second surface facing each other, the semiconductor substrate having a pixel section in which a plurality of pixels are arranged in an array and a peripheral section provided around the pixel section; a light guide section provided on the first surface side of the semiconductor substrate across the pixel section and the peripheral section, the light guide section including a plurality of structures each having a size equal to or smaller than the wavelength of incident light and a medium that fills spaces between adjacent ones of the plurality of structures and has a refractive index different from that of the plurality of structures; a photoelectric conversion unit that is embedded in the semiconductor substrate for each of the plurality of pixels and converts light incident through the light guide section into an electric signal; a light absorbing layer provided, in the peripheral section, on a surface of the light guide section opposite to the semiconductor substrate; and a protective film covering a surface of the light absorbing layer.
  • the photodetector according to (1) wherein the light absorbing layer is made of a black color filter or a tungsten film.
  • the black color filter contains a titanium oxide filler-dispersed resin, a carbon black pigment-dispersed resin, or an organic pigment-dispersed resin.
  • the protective film includes any one of silicon oxide, silicon nitride, silicon oxynitride, silicon carbide, oxygen-doped silicon carbide, nitrogen-doped silicon carbide, aluminum oxide, hafnium oxide, tantalum oxide, and indium oxide.
  • the protective film is a multilayer interference film in which layers having refractive indices different from one another are stacked.
  • the light guide portion has a multilayer structure in which a plurality of layers each including the plurality of structures and the medium are stacked,
  • the light detection device according to any one of (1) to (8), wherein the light absorption layer is provided on at least an outermost surface of the light guiding section having the multilayer structure.
  • the light guiding section includes a first layer and a second layer stacked in order from the semiconductor substrate side as the plurality of layers,
  • the photodetector according to (9), wherein the light absorbing layer is further provided between the first layer and the second layer.
  • the photodetector according to (10), wherein the light absorbing layer provided between the first layer and the second layer is embedded in the first layer.
  • The photodetector according to any one of (14) to (16), wherein a spacer layer is further provided between the semiconductor substrate and the light guiding section, and the second structures penetrate the light guiding section and are at least partially embedded in the spacer layer.

Landscapes

  • Solid State Image Pick-Up Elements (AREA)

Abstract

A light detection device according to an embodiment of the present disclosure includes: a semiconductor substrate having a first surface and a second surface facing each other, the semiconductor substrate also having a pixel section in which a plurality of pixels are arranged in an array, and a peripheral section provided around the pixel section; a light guide section arranged so as to span the pixel section and the peripheral section, the light guide section including a plurality of first structures that are arranged on the first surface side of the semiconductor substrate and each have a size equal to or smaller than the wavelength of incident light, and a medium that is arranged so as to fill the spaces between adjacent ones of the plurality of first structures and has a refractive index different from that of the plurality of first structures; photoelectric conversion sections for photoelectrically converting incident light passing through the light guide section, the photoelectric conversion sections being formed so as to be embedded in the semiconductor substrate in each of the plurality of pixels; a light absorbing layer that is arranged, in the peripheral section, on the surface of the light guide section on the side opposite to the semiconductor substrate side; and a protective film covering the surface of the light absorbing layer.
PCT/JP2024/043277 2023-12-07 2024-12-06 Light detection device Pending WO2025121422A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2023206958 2023-12-07
JP2023-206958 2023-12-07
PCT/JP2024/036675 WO2025121000A1 (fr) 2023-12-07 2024-10-15 Dispositif de détection de lumière
JPPCT/JP2024/036675 2024-10-15

Publications (1)

Publication Number Publication Date
WO2025121422A1 true WO2025121422A1 (fr) 2025-06-12

Family

ID=95980102

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/043277 2023-12-07 2024-12-06 Light detection device

Country Status (1)

Country Link
WO (1) WO2025121422A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018131509A1 (fr) * 2017-01-13 2018-07-19 ソニーセミコンダクタソリューションズ株式会社 Élément de capture d'image, procédé de fabrication, et dispositif électronique
WO2018139278A1 (fr) * 2017-01-30 2018-08-02 ソニーセミコンダクタソリューションズ株式会社 Élément de capture d'image, procédé de fabrication, et dispositif électronique
JP2021150473A (ja) * 2020-03-19 2021-09-27 ソニーセミコンダクタソリューションズ株式会社 固体撮像装置及び電子機器
WO2023013521A1 (fr) * 2021-08-06 2023-02-09 ソニーセミコンダクタソリューションズ株式会社 Photodétecteur, son procédé de fabrication et dispositif électronique


Similar Documents

Publication Publication Date Title
KR102723593B1 Imaging element and method for manufacturing imaging element
US20240429254A1 Imaging device
US20240204014A1 Imaging device
JP7503399B2 Solid-state imaging device, manufacturing method thereof, and electronic apparatus
US20240413179A1 Imaging device
WO2024202674A1 Semiconductor element and electronic device
WO2023167027A1 Light detection device and electronic apparatus
WO2023106308A1 Light receiving device
WO2025121422A1 Light detection device
WO2025121000A1 Light detection device
WO2024085005A1 Photodetector
WO2025053221A1 Photodetector, optical element, and electronic apparatus
WO2024142627A1 Photodetector and electronic apparatus
WO2025053220A1 Photodetector, optical element, and electronic apparatus
WO2025041370A1 Light detection device and electronic apparatus
WO2025018080A1 Photodetection apparatus, optical element, and electronic device
WO2025164103A1 Photodetection device, optical element, and electronic device
JP2024094104A Optical element, light detection device, and electronic apparatus
JP2025121724A Light detection device
JP2024164446A Light detection device and electronic apparatus
WO2024162113A1 Optical detector, optical element, and electronic device
JP2024110674A Light detection device, optical element, and electronic apparatus
WO2025192164A1 Photodetector and electronic equipment
WO2025253793A1 Light detection apparatus and electronic device
WO2025204244A1 Light detection device, method for producing light detection device, and electronic apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24900714

Country of ref document: EP

Kind code of ref document: A1