WO2020179290A1 - Sensor and distance measuring device
- Publication number
- WO2020179290A1 (PCT/JP2020/003325)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- light
- sensor according
- semiconductor substrate
- chip lens
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4816—Constructional features, e.g. arrangements of optical elements of receivers alone
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4861—Circuits for detection, sampling, integration or read-out
- G01S7/4863—Detector arrays, e.g. charge-transfer gates
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F30/00—Individual radiation-sensitive semiconductor devices in which radiation controls the flow of current through the devices, e.g. photodetectors
- H10F30/20—Individual radiation-sensitive semiconductor devices in which radiation controls the flow of current through the devices, e.g. photodetectors the devices having potential barriers, e.g. phototransistors
- H10F30/21—Individual radiation-sensitive semiconductor devices in which radiation controls the flow of current through the devices, e.g. photodetectors the devices having potential barriers, e.g. phototransistors the devices being sensitive to infrared, visible or ultraviolet radiation
- H10F30/22—Individual radiation-sensitive semiconductor devices in which radiation controls the flow of current through the devices, e.g. photodetectors the devices having potential barriers, e.g. phototransistors the devices being sensitive to infrared, visible or ultraviolet radiation the devices having only one potential barrier, e.g. photodiodes
- H10F30/225—Individual radiation-sensitive semiconductor devices in which radiation controls the flow of current through the devices, e.g. photodetectors the devices having potential barriers, e.g. phototransistors the devices being sensitive to infrared, visible or ultraviolet radiation the devices having only one potential barrier, e.g. photodiodes the potential barrier working in avalanche mode, e.g. avalanche photodiodes
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/10—Integrated devices
- H10F39/12—Image sensors
- H10F39/18—Complementary metal-oxide-semiconductor [CMOS] image sensors; Photodiode array image sensors
- H10F39/182—Colour image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/10—Integrated devices
- H10F39/12—Image sensors
- H10F39/18—Complementary metal-oxide-semiconductor [CMOS] image sensors; Photodiode array image sensors
- H10F39/184—Infrared image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/805—Coatings
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/806—Optical elements or arrangements associated with the image sensors
- H10F39/8063—Microlenses
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/806—Optical elements or arrangements associated with the image sensors
- H10F39/8067—Reflectors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/807—Pixel isolation structures
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/811—Interconnections
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F77/00—Constructional details of devices covered by this subclass
- H10F77/40—Optical elements or arrangements
- H10F77/413—Optical elements or arrangements directly associated or integrated with the devices, e.g. back reflectors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F77/00—Constructional details of devices covered by this subclass
- H10F77/93—Interconnections
- H10F77/933—Interconnections for devices having potential barriers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J1/00—Photometry, e.g. photographic exposure meter
- G01J1/02—Details
- G01J1/04—Optical or mechanical part supplementary adjustable parts
- G01J1/0407—Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings
- G01J1/0414—Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings using plane or convex mirrors, parallel phase plates, or plane beam-splitters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J1/00—Photometry, e.g. photographic exposure meter
- G01J1/42—Photometry, e.g. photographic exposure meter using electric radiation detectors
- G01J1/44—Electric circuits
- G01J2001/4446—Type of detector
- G01J2001/446—Photodiode
- G01J2001/4466—Avalanche
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/10—Integrated devices
- H10F39/12—Image sensors
- H10F39/199—Back-illuminated image sensors
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/802—Geometry or disposition of elements in pixels, e.g. address-lines or gate electrodes
- H10F39/8027—Geometry of the photosensitive area
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/80—Constructional details of image sensors
- H10F39/809—Constructional details of image sensors of hybrid image sensors
Definitions
- The present technology relates to a sensor using an avalanche photodiode (APD) and a distance measuring device equipped with the sensor.
- A sensor according to an embodiment of the present technology includes: a semiconductor substrate that has a first surface and a second surface facing each other and is provided with an avalanche photodiode; an on-chip lens provided on the first surface side of the semiconductor substrate; a first reflection member provided on the on-chip lens; and a wiring layer provided on the second surface side of the semiconductor substrate and including a second reflection member.
- A distance measuring device according to an embodiment of the present technology includes the sensor according to the embodiment of the present technology.
- In the sensor and the distance measuring device according to the embodiments of the present technology, the first reflecting member reflects the light reflected by the second reflecting member, so that the light reflected by the second reflecting member can be efficiently incident on the avalanche photodiode.
- FIG. 2 is a schematic sectional view showing an example of a configuration of a main part of the pixel array section shown in FIG. 1.
- FIG. 3 is a schematic plan view showing an example of the configuration of the pixel shown in FIG. 2.
- FIG. 4A is a schematic diagram illustrating an example of a planar configuration of the reflecting member illustrated in FIG. 2. FIG. 4B is a schematic diagram showing another example (1) of the planar configuration of the reflecting member shown in FIG. 4A. FIG. 4C is a schematic diagram showing another example (2) of the planar configuration of the reflecting member shown in FIG. 4A. FIG. 5 is a schematic diagram showing an example of the relationship between the size of the reflecting member shown in FIG. 2 and the quantum efficiency.
- FIG. 9 is a schematic cross-sectional view showing a configuration of a main part of a sensor chip according to Modification 1.
- FIG. 10A is a schematic cross-sectional view showing the configuration of a main part of a sensor chip according to Modification 2. FIG. 10B is a schematic plan view for explaining the position of each pixel shown in FIG. 10A.
- FIG. 11 is a schematic diagram illustrating another example of the cross-sectional configuration of the sensor chip illustrated in FIG. 10A.
- FIG. 12A is a schematic cross-sectional view showing a configuration of a main part of a sensor chip according to Modification 3.
- FIG. 12B is a schematic diagram illustrating an example of a planar configuration of the reflecting member illustrated in FIG. 12A.
- FIG. 13 is a schematic cross-sectional view showing a configuration of a main part of a sensor chip according to Modification 4.
- FIG. 14 is a schematic cross-sectional view showing a configuration of a main part of a sensor chip according to Modification 5.
- FIG. 15 is a schematic diagram illustrating another example of the cross-sectional configuration of the sensor chip illustrated in FIG. 14.
- Also included are a schematic cross-sectional view for explaining another example of the position of the reflecting member shown in FIG. 2 and the like, and a schematic cross-sectional view for explaining a sensor chip provided with another light condensing structure in place of the on-chip lens shown in FIG. 2.
- 1. Embodiment (example of a sensor chip having a reflecting member on an on-chip lens)
- 2. Modification 1 (example in which an antireflection member is laminated on the reflecting member)
- 3. Modification 2 (example in which the position of the reflecting member differs between pixels)
- 4. Modification 3 (example in which a plurality of separate reflecting members are provided)
- 5. Modification 4 (example having an inverted pyramid structure on the surface of the semiconductor substrate)
- 6. Modification 5 (example having a pixel that receives light having a wavelength in the visible region)
- 7. Application example (distance measuring device)
- 8. Application examples
- FIG. 1 is a block diagram showing a configuration example of a sensor chip 11 according to an embodiment of the present technology.
- the sensor chip 11 corresponds to a specific but not limitative example of “sensor” of the present technology.
- the sensor chip 11 has, for example, a pixel array section 12 provided with a plurality of pixels 21 and a bias voltage application section 13 electrically connected to the pixels 21.
- The sensor chip 11 is applied to, for example, a distance measuring device (a distance measuring device 200 in FIG. 16 described later), and receives light of wavelengths in the near infrared region and the infrared region to generate a light reception signal.
- The wavelengths in the near infrared region and the infrared region are, for example, wavelengths of 600 nm or more, such as 850 nm, 905 nm, and 940 nm.
- each pixel 21 has, for example, an APD 31, a FET (Field Effect Transistor) 32, and an inverter 33.
- A large negative voltage VBD is applied to the cathode of the APD 31.
- an avalanche multiplication region is formed in the APD 31, and electrons generated by incidence of one photon are avalanche multiplied.
- the FET 32 is composed of, for example, a P-type MOSFET (Metal Oxide Semiconductor Field Effect Transistor).
- the FET 32 is electrically connected to the APD 31.
- the inverter 33 is composed of, for example, a CMOS (Complementary Metal Oxide Semiconductor) inverter.
- the inverter 33 is electrically connected to the APD 31.
- the inverter 33 shapes the voltage generated by the electrons multiplied by the APD 31 and outputs a light receiving signal (APD OUT).
- The light reception signal is a pulse waveform whose starting point is the arrival time of one photon.
- the bias voltage application unit 13 is electrically connected to each of the plurality of pixels 21.
- the bias voltage applying unit 13 applies a bias voltage to each of the plurality of pixels 21.
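- As a rough behavioral sketch only (the circuit model and all component values below are illustrative assumptions, not values from this publication), the pixel described above can be modeled in a few lines: a photon collapses the sensing-node voltage, the quenching FET 32 recharges the node with an RC-like time constant, and the inverter 33 turns the node voltage into the digital pulse APD OUT.

```python
import numpy as np

DT_NS = 0.1      # simulation time step (ns); assumed value
TAU_NS = 5.0     # assumed recharge time constant set by the quenching FET 32
V_EXCESS = 1.0   # assumed excess bias on the sensing node (arbitrary units)
V_TH = 0.5       # assumed switching threshold of the inverter 33

def apd_out(photon_times_ns, t_end_ns=100.0):
    """Return (time axis, digital APD OUT trace) for given photon arrivals."""
    t = np.arange(0.0, t_end_ns, DT_NS)
    v = np.full_like(t, V_EXCESS)  # sensing-node voltage vs. time
    for t0 in photon_times_ns:
        after = t >= t0
        # avalanche collapses the node to ~0, then it recharges exponentially
        recharge = V_EXCESS * (1.0 - np.exp(-(t[after] - t0) / TAU_NS))
        v[after] = np.minimum(v[after], recharge)
    return t, (v < V_TH).astype(int)  # inverter: low node voltage -> high out

t, out = apd_out([20.0, 60.0])
rising = t[np.flatnonzero(np.diff(out) == 1) + 1]
print("pulse start times (ns):", rising)  # each pulse starts at photon arrival
```

- In this model the output pulse begins exactly at the photon arrival time, matching the description above; the recovery time (about TAU_NS x ln 2 here) is what limits how soon the same pixel can register the next photon.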
- FIG. 2 shows an example of the cross-sectional configuration of the pixel array unit 12 of the sensor chip 11.
- FIG. 2 shows three pixels 21.
- the sensor chip 11 is composed of, for example, a laminated body of a semiconductor substrate 41 (sensor substrate), a sensor-side wiring layer 42, a logic-side wiring layer 43, and a logic-side semiconductor substrate (logic circuit board, not shown).
- the semiconductor substrate 41 corresponds to a specific example of the semiconductor substrate of the present disclosure
- the wiring layer 42 corresponds to a specific example of the wiring layer of the present disclosure.
- the semiconductor substrate 41 has a first surface S1 and a second surface S2 facing each other, and a wiring layer 42 and a wiring layer 43 are laminated in this order on the second surface S2 side of the semiconductor substrate 41.
- the first surface S1 of the semiconductor substrate 41 constitutes a light receiving surface.
- the sensor chip 11 further includes an antireflection film 64, an oxide film 65, and an on-chip lens 71 on the first surface S1 side of the semiconductor substrate 41.
- the sensor chip 11 has a so-called backside illumination type structure.
- the semiconductor substrate 41 is made of, for example, single crystal silicon (Si).
- the semiconductor substrate 41 is provided with the APD 31 for each pixel 21.
- the APD 31 includes, for example, an N well region 51, a P type diffusion layer 52, an N type diffusion layer 53, a hole accumulation layer 54, a pinning layer 55 and a P type region 56.
- the semiconductor substrate 41 is provided with a pixel separation unit 63 that separates adjacent APDs 31.
- the pixel separation portion 63 is configured by, for example, a groove that penetrates the semiconductor substrate 41 from the first surface S1 to the second surface S2.
- the insulating film 62 is embedded in the groove of the pixel separating portion 63.
- the N-well region 51 is widely provided in the thickness direction of the semiconductor substrate 41 (Z-axis direction in FIG. 2).
- the N-well region 51 forms an electric field that transfers electrons generated by photoelectric conversion to an avalanche multiplication region (avalanche multiplication region 57 described later).
- a P-well region may be provided instead of the N-well region 51 (not shown).
- the P-type diffusion layer 52 and the N-type diffusion layer 53 are provided in the vicinity of the second surface S2 of the semiconductor substrate 41.
- the P-type diffusion layer 52 and the N-type diffusion layer 53 are provided in a stacked manner.
- the N-type diffusion layer 53 is arranged closer to the second surface S2 than the P-type diffusion layer 52.
- the P-type diffusion layer 52 and the N-type diffusion layer 53 are widely provided in the pixel 21 in a plan view (XY plane in FIG. 2).
- the P-type diffusion layer 52 is a layer in which P-type impurities are diffused at a high concentration
- the N-type diffusion layer 53 is a layer in which N-type impurities are diffused at a high concentration.
- a depletion layer is formed in the region where the P-type diffusion layer 52 and the N-type diffusion layer 53 are connected to each other.
- An avalanche multiplier region 57 is formed in this depletion layer.
- The N-type diffusion layer 53 is electrically connected to the wiring of the wiring layer 42 (wiring 104 described later), and the negative voltage VBD is applied to the N-type diffusion layer 53 from this wiring.
- the N-type diffusion layer 53 may have a convex shape so that a part thereof extends to the second surface S2 of the semiconductor substrate 41.
- the avalanche multiplication region 57 is formed on the boundary surface between the P-type diffusion layer 52 and the N-type diffusion layer 53 as described above.
- the avalanche multiplication region 57 is a high electric field region formed by the negative voltage VBD applied to the N type diffusion layer 53.
- the avalanche multiplication region 57 multiplies the electrons (e ⁇ ) generated when one photon is incident on the APD 31.
- the hole storage layer 54 is provided, for example, between the N well region 51 and the first surface S1 and between the N well region 51 and the pixel isolation portion 63.
- the hole accumulation layer 54 is a layer in which P-type impurities are diffused, and is adapted to accumulate holes.
- the hole accumulation layer 54 is electrically connected to the wiring (wiring 105 described later) of the wiring layer 42 via the P-type region 56, and the bias is adjusted.
- the pinning layer 55 is provided between the hole storage layer 54 and the first surface S1 and between the hole storage layer 54 and the pixel separation portion 63.
- the pinning layer 55 is a layer in which P-type impurities are diffused at a high concentration.
- When the hole concentration in the hole accumulation layer 54 is increased by adjusting the bias of the hole accumulation layer 54, the pinning by the pinning layer 55 becomes stronger. Thereby, for example, generation of dark current can be suppressed.
- the P-type region 56 is a layer in which P-type impurities are diffused at a high concentration, and is provided near the second surface S2 of the semiconductor substrate 41.
- FIG. 3 schematically shows an example of the planar configuration of the pixel 21.
- The P-type region 56 is provided so as to surround the N-well region 51 in a plan view, for example.
- the P-type region 56 electrically connects the wiring 105 of the wiring layer 42 and the hole storage layer 54.
- the pixel separation unit 63 is provided in a grid pattern in plan view so as to partition the APD 31, for example (FIG. 3).
- the insulating film 62 embedded in the groove of the pixel isolation portion 63 is provided, for example, over the entire thickness direction of the semiconductor substrate 41.
- the insulating film 62 is made of, for example, silicon oxide (SiO) or the like.
- the light shielding film 61 is provided on the pixel separating portion 63.
- the light-shielding film 61 is provided in a grid pattern so as to overlap the pixel separation portion 63 in a plan view, for example. By providing such a light-shielding film 61, it is possible to suppress the occurrence of crosstalk caused by obliquely incident light.
- the light shielding film 61 may be embedded in the groove of the pixel separation portion 63.
- the light shielding film 61 embedded in the groove of the pixel separating portion 63 may be integrally formed with the light shielding film 61 on the pixel separating portion 63.
- the light-shielding film 61 can be made of a material having a light-shielding property with respect to light having wavelengths in the near infrared region and infrared region, and for example, tungsten (W) or the like can be used.
- the wiring layer 42 is provided with, for example, a metal pad 102, a contact electrode 103, a plurality of wirings (wirings 104, 105, 106) and an interlayer insulating film separating these.
- the metal pad 102 is exposed at the joint surface between the wiring layer 42 and the wiring layer 43, and is joined to the metal pad of the wiring layer 43 (metal pad 101 described later). Thereby, the wiring layer 42 and the wiring layer 43 are mechanically and electrically joined.
- the metal pad 102 is made of, for example, copper (Cu), and the wiring layer 42 and the wiring layer 43 are bonded by Cu—Cu bonding.
- The contact electrode 103 is used, for example, for connection between the semiconductor substrate 41 (specifically, the N-type diffusion layer 53 and the P-type region 56) and the wirings of the wiring layer 42 (for example, the wirings 104 and 105), or for connection between the wirings of the wiring layer 42 and the metal pad 102.
- the wiring 104 of the wiring layer 42 is electrically connected to the N-type diffusion layer 53 of the semiconductor substrate 41 via the contact electrode 103. That is, the wiring 104 functions as the cathode of the APD 31, and is provided for each pixel 21.
- the wiring 104 faces the on-chip lens 71 with the APD 31 in between.
- The wiring 104 has reflectivity to light having wavelengths in the near infrared region and the infrared region; for the wiring 104, copper (Cu), aluminum (Al), or the like can be used.
- the wiring 104 is preferably provided in a position overlapping the avalanche multiplication region 57 in a plan view and wider than the avalanche multiplication region 57.
- The wiring 104 is provided so as to cover substantially the entire pixel 21 inside the P-type region 56 in plan view (FIG. 3).
- the wiring 105 of the wiring layer 42 is electrically connected to the P-type region 56 of the semiconductor substrate 41 via the contact electrode 103. That is, the wiring 105 functions as the anode of the APD 31.
- the wiring 105 is arranged, for example, at a position overlapping the P-type region 56 in a plan view and surrounds the wiring 104.
- A wiring (for example, the wiring 106) may be provided at a position overlapping the pixel separation unit 63 in a plan view.
- the wiring 106 is arranged, for example, at a corner of the pixel 21 (FIG. 3).
- the light shielding film 61 may be embedded in the groove of the pixel separating portion 63, and the light shielding film 61 may be electrically connected to the wiring 106.
- the wirings 104, 105, 106 are electrically connected to the metal pad 102 via the contact electrode 103.
- the wiring layer 43 is provided with, for example, an electrode pad 91, a contact electrode 95, a metal pad 101, and an interlayer insulating film separating these.
- the electrode pad 91 is provided at a position farther from the second surface S2 of the semiconductor substrate 41 than the metal pad 101, and is connected to the circuit board of the wiring layer 43.
- the contact electrode 95 connects the electrode pad 91 and the metal pad 101.
- the metal pad 101 is joined to the metal pad 102 of the wiring layer 42 at the joint surface between the wiring layer 43 and the wiring layer 42. That is, the APD 31 is electrically connected to the wiring layer 43 via the wiring layer 42.
- the on-chip lens 71 is provided on the first surface S1 of the semiconductor substrate 41 via the antireflection film 64 and the oxide film 65.
- the antireflection film 64 is provided, for example, between the semiconductor substrate 41 (first surface S1) and the oxide film 65 and covers substantially the entire first surface S1 of the semiconductor substrate 41. By providing the antireflection film 64, it is possible to prevent the light passing through the on-chip lens 71 from being reflected by the first surface S1 of the semiconductor substrate 41.
- For the antireflection film 64, for example, titanium nitride (TiN), silicon nitride (SiN), or silicon oxynitride (SiON) can be used.
- the oxide film 65 covers the entire first surface S1 of the semiconductor substrate 41 with the antireflection film 64 in between, for example.
- For the oxide film 65, silicon oxide (SiO) or the like can be used.
- the on-chip lens 71 is provided for each pixel 21.
- the on-chip lens 71 covers each APD 31 from the first surface S1 side.
- the on-chip lens 71 collects the incident light on the APD 31 of the semiconductor substrate 41 for each pixel 21.
- the on-chip lens 71 may be made of an organic material or may be made of an inorganic material. Examples of the organic material include siloxane-based resin, styrene-based resin, acrylic-based resin, and the like. Examples of the inorganic material include silicon nitride (SiN) and silicon oxynitride (SiON).
- the surface of the on-chip lens 71 is preferably covered with an antireflection film (antireflection film 72). Thereby, reflection of light on the surface of the on-chip lens 71 can be suppressed.
- For the antireflection film 72, the same material as the antireflection film 64 can be used.
- a reflection member 73 is provided at a position facing the APD 31 with the on-chip lens 71 and the antireflection film 72 in between.
- the reflecting member 73 has reflectivity for light having wavelengths in the near-infrared region and the infrared region, and reflects the light reflected by the wiring 104 of the wiring layer 42 toward the APD 31 again. Although the details will be described later, this makes it possible to prevent the light reflected by the wiring 104 from going outside the APD 31, that is, from the first surface S1 of the semiconductor substrate 41 to the outside.
- the reflecting member 73 corresponds to a specific example of the first reflecting member of the present disclosure.
- the reflecting member 73 is provided, for example, on each on-chip lens 71 in a film shape so as to cover a part of the on-chip lens 71.
- the reflecting member 73 is arranged at the center of the on-chip lens 71.
- For the reflecting member 73, a film made of tungsten (W), silver (Ag), aluminum (Al), gold (Au), copper (Cu), or the like can be used.
- the reflecting member 73 has a planar shape such as a quadrangle (FIG. 4A), a circle (FIG. 4B) or a hexagon (FIG. 4C).
- the reflecting member 73 may have any planar shape, and for example, the planar shape of the reflecting member 73 may be a polygon other than a hexagon, an ellipse, or the like.
- the area occupied by the reflection member 73 in plan view is preferably 25% or less of the area of the pixel 21.
- the size of the reflecting member 73 will be described below.
- FIG. 5 shows the relationship, obtained by simulation, between the size of the reflecting member 73 (the area of the reflecting member 73 in plan view) and the quantum efficiency.
- It was found that, when the reflecting member 73 has an area sufficiently smaller than the area of the pixel 21, the quantum efficiency of the APD 31 increases by about 1% to 2% compared with the case where the reflecting member 73 is not provided.
- When the area of the reflecting member 73 exceeds X, however, the quantum efficiency of the APD 31 decreases significantly.
- This X is the value at which the area of the reflecting member 73 is 25% of the area of the pixel 21.
- For example, the reflecting member 73 preferably has an area of 25 μm² or less.
- The area occupied by the reflecting member 73 is more preferably 1% to 4% of the area of the pixel 21.
- the thickness of the reflecting member 73 is, for example, 300 nm.
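- The sizing guidelines above reduce to simple arithmetic. The sketch below computes the resulting area bounds; the 10 μm pixel pitch is an assumed example value, not a figure from this publication, chosen because 25% of the resulting 100 μm² pixel reproduces the 25 μm² limit quoted above.

```python
def reflector_area_bounds(pixel_pitch_um: float) -> dict:
    """Reflector-area guidelines (um^2) for a square pixel of given pitch."""
    pixel_area = pixel_pitch_um ** 2
    return {
        "pixel_area_um2": pixel_area,
        "max_area_um2": 0.25 * pixel_area,  # QE drops sharply above 25%
        "preferred_um2": (0.01 * pixel_area, 0.04 * pixel_area),  # 1%..4%
    }

print(reflector_area_bounds(10.0))  # assumed 10 um pitch
# -> {'pixel_area_um2': 100.0, 'max_area_um2': 25.0,
#     'preferred_um2': (1.0, 4.0)}
```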
- the light condensed by the on-chip lens 71 for each pixel 21 (light having a wavelength in the near infrared region and a wavelength in the infrared region) is incident on the APD 31.
- Thereby, pairs of holes and electrons are generated (photoelectric conversion) in the APD 31.
- When the negative voltage VBD is supplied from the wiring layer 43 to the N-type diffusion layer 53 via the metal pads 101 and 102, the contact electrode 103, and the wiring 104, the avalanche multiplication region 57 is formed in the APD 31, the electrons are avalanche-multiplied, and a light reception signal is generated.
- a predetermined voltage is supplied to the P-type region 56 from the wiring layer 43 via the metal pads 101 and 102, the contact electrode 103 and the wiring 105.
- The sensor chip 11 of the present embodiment is provided with the reflecting member 73 that reflects the light reflected by the wiring 104. This makes it easier for the light reflected by the wiring 104 to efficiently enter the APD 31.
- FIG. 6 shows a schematic cross-sectional configuration of a main part of the sensor chip (sensor chip 111) according to the comparative example.
- FIG. 6 corresponds to FIG. 2 showing the sensor chip 11.
- the sensor chip 111 has a laminated structure of the semiconductor substrate 41, the wiring layer 42, and the wiring layer 43, and the wiring layer 42 is provided with the wiring 104.
- An on-chip lens 71 is provided for each pixel 21 on the first surface S1 of the semiconductor substrate 41.
- the sensor chip 111 is not provided with the reflecting member (the reflecting member 73 in FIG. 2) on the on-chip lens 71. In this respect, the sensor chip 111 is different from the sensor chip 11.
- In the sensor chip 111, a part of the light condensed by the on-chip lens 71 for each pixel 21 passes through the semiconductor substrate 41 and is incident on the wiring layer 42.
- The light LIR incident on the wiring layer 42 is reflected by the wiring 104 and travels toward the semiconductor substrate 41 again (FIG. 6).
- By providing the wiring 104, a part of the light LIR transmitted through the semiconductor substrate 41 is likely to enter the semiconductor substrate 41 again. Thereby, the sensitivity can be improved as compared with the case where the wiring 104 is not provided.
- However, the remaining light LIR exits from the first surface S1 of the semiconductor substrate 41 to the outside of the semiconductor substrate 41.
- For example, about 45% of the light LIR reflected by the wiring 104 escapes to the outside of the semiconductor substrate 41.
- Flare may occur due to the light LIR emitted to the outside of the semiconductor substrate 41.
- In contrast, in the sensor chip 11, since the reflecting member 73 facing the APD 31 is provided with the on-chip lens 71 in between, the light reflected by the wiring 104 can be made to enter the APD 31 more efficiently.
- FIG. 7 shows an example of the path of the light LIR in the sensor chip 11.
- Of the light LIR reflected by the wiring 104, the light LIR that has not been photoelectrically converted by the APD 31 is reflected by the reflecting member 73 again.
- the light reflected by the reflecting member 73 goes to the APD 31 and is photoelectrically converted by the APD 31.
- the light reflected by the reflecting member 73 enters the wiring layer 42 from the semiconductor substrate 41 and is reflected by the wiring 104.
- FIG. 8 shows the relationship between the depth of the semiconductor substrate (for example, the semiconductor substrate 41) and the absorption amount of light of each wavelength (540 nm, 550 nm, 560 nm, 850 nm, 900 nm, 940 nm).
- About 100% of light having a wavelength in the visible region (540 nm, 550 nm, 560 nm) is absorbed at a shallow position of the semiconductor substrate.
- For light of wavelengths in the near infrared region and the infrared region (850 nm, 900 nm, 940 nm), the amount absorbed into the semiconductor substrate increases with depth. That is, for light of wavelengths in the near infrared region and the infrared region, the sensitivity can be effectively improved by increasing the optical path length.
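- The depth dependence in FIG. 8 follows the Beer-Lambert law: the fraction absorbed over a path d is 1 − exp(−αd), and α for silicon is far smaller in the near infrared than in the visible. The sketch below uses rough literature values for the absorption coefficient of crystalline silicon (assumptions for illustration, not data from this publication) to show why multiplying the optical path helps mainly at NIR wavelengths.

```python
import numpy as np

# Approximate absorption coefficients of crystalline silicon (1/um);
# rough literature values used here purely for illustration.
ALPHA_PER_UM = {550: 0.70, 850: 0.054, 940: 0.012}

def absorbed_fraction(wavelength_nm: int, path_um: float) -> float:
    """Beer-Lambert absorbed fraction over a given optical path length."""
    return 1.0 - np.exp(-ALPHA_PER_UM[wavelength_nm] * path_um)

for wl in (550, 850, 940):
    one_pass = absorbed_fraction(wl, 10.0)    # single pass, ~10 um substrate
    three_pass = absorbed_fraction(wl, 30.0)  # ~3 passes between reflectors
    print(f"{wl} nm: 1 pass {one_pass:.0%}, 3 passes {three_pass:.0%}")
# 550 nm is essentially fully absorbed in one pass; the NIR wavelengths gain
# the most from the extra path provided by the two reflecting surfaces.
```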
- Also, in the sensor chip 11, the light LIR can be repeatedly reflected between the wiring 104 and the reflecting member 73.
- Therefore, the light LIR reflected by the wiring 104 easily and efficiently enters the APD 31, and the sensitivity can be improved as compared with the sensor chip 111.
- the amount of light that goes out from the first surface S1 of the semiconductor substrate 41 to the outside of the semiconductor substrate 41 is reduced, so that flare can be suppressed.
- As described above, in the present embodiment, since the reflecting member 73 that reflects the light LIR reflected by the wiring 104 is provided, the light LIR reflected by the wiring 104 can be prevented from going outside the APD 31. Therefore, the sensitivity can be improved. In addition, the occurrence of flare can be suppressed.
- FIG. 9 shows a schematic cross-sectional configuration of a main part of a sensor chip (sensor chip 11A) according to the first modification of the above-described embodiment.
- FIG. 9 shows the cross-sectional structure of one pixel 21.
- the sensor chip 11A has an antireflection film 74 on the reflection member 73.
- the antireflection film 74 corresponds to a specific example of the antireflection member of the present disclosure.
- Except for this point, the sensor chip 11A according to Modification 1 has the same configuration as the sensor chip 11 of the above-described embodiment, and the operation and effects are also the same.
- In FIG. 9, the configuration of the APD 31 is shown in simplified form; the illustration of the APD 31 is also simplified in the cross-sectional views described in the second and subsequent modifications.
- the antireflection film 74 is laminated on the side of the reflection member 73 opposite to the on-chip lens 71.
- the antireflection film 74 is provided, for example, at a position overlapping the reflecting member 73 in a plan view, and has the same planar shape as the reflecting member 73.
- the end surface of the antireflection film 74 and the end surface of the reflection member 73 are provided at the same position.
- the antireflection film 74 is provided for each on-chip lens 71 (for each pixel 21).
- Although the insulating film 62 (pixel separation portion 63) has an inverse tapered shape in FIG. 9 and subsequent figures, the insulating film 62 may have another shape, for example, a tapered shape.
- the antireflection film 74 is made of a material that prevents reflection of light having wavelengths in the near infrared region and the infrared region.
- a carbon black film, a silicon oxide (SiO) film, or the like can be used for the antireflection film 74.
- the antireflection film 74 may be formed of a laminated film of a carbon black film and a silicon oxide film.
- In the sensor chip 11A, since the reflecting member 73 is provided as in the above-described embodiment, the light LIR reflected by the wiring 104 easily and efficiently enters the APD 31. Further, since the reflecting member 73 is covered with the antireflection film 74, light reflection on the surface of the reflecting member 73 (the surface opposite to the on-chip lens 71) can be suppressed. Thereby, flare can be suppressed more effectively.
- FIG. 10A shows a schematic configuration of a main part of a sensor chip (sensor chip 11B) according to Modification 2 of the above-described embodiment.
- In the sensor chip 11B, the position of the reflecting member 73 on the on-chip lens 71 differs depending on the position of the pixel 21 in the pixel array section 12 (FIG. 1).
- Except for this point, the sensor chip 11B according to Modification 2 has the same configuration as the sensor chip 11 of the above-described embodiment, and the operation and effects are also the same.
- FIG. 10B shows an example of the position of the pixel 21 shown in FIG. 10A.
- In the pixel 21C at the center of the pixel array section 12, the reflecting member 73 is provided at a position overlapping the center line CL of the on-chip lens 71.
- In the pixels 21 (for example, pixels 21R and 21L described later) arranged outside the central pixel 21C, the reflecting member 73 is provided at a position shifted outward from the center line CL of the on-chip lens 71.
- In the pixel 21R at the right end, the reflecting member 73 is provided on the right side of the center line CL of the on-chip lens 71.
- the position of the reflection member 73 may be gradually shifted from the pixel 21C at the center of the pixel array section 12 to the pixel 21R at the right end.
- In the pixel 21L at the left end, the reflecting member 73 is provided on the left side of the center line CL of the on-chip lens 71.
- the position of the reflection member 73 may be gradually shifted from the pixel 21C at the center of the pixel array section 12 to the pixel 21L at the left end.
- In this way, the position of the reflecting member 73 with respect to the on-chip lens 71 differs depending on the image height in the pixel array section 12.
- FIG. 11 shows the configuration of the sensor chip 11B having the antireflection film 74 described in the first modification.
- the antireflection film 74 may be laminated on the reflection member 73 of the sensor chip 11B.
- In the sensor chip 11B, since the reflecting member 73 is provided as in the above-described embodiment, the light LIR reflected by the wiring 104 easily and efficiently enters the APD 31. Further, since the position of the reflecting member 73 with respect to the on-chip lens 71 is varied for each pixel 21 according to the image height in the pixel array section 12, light incident on the on-chip lens 71 from an oblique direction efficiently enters the APD 31. That is, the sensitivity of the sensor chip 11B can be improved by an effect similar to pupil correction.
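- A minimal sketch of such an image-height-dependent placement is given below; the linear shift law, the column count, and the 0.5 μm maximum offset are all assumptions for illustration, not values from this publication.

```python
def reflector_offset_um(col: int, n_cols: int,
                        max_shift_um: float = 0.5) -> float:
    """Signed x-offset of the reflecting member 73 from the on-chip-lens
    center line CL, growing linearly with image height (positive = right)."""
    center = (n_cols - 1) / 2.0
    image_height = (col - center) / center  # -1.0 at left edge .. +1.0 right
    return max_shift_um * image_height

# Example: a 9-column array; pixel 21L (col 0) gets -0.5 um, the central
# pixel 21C (col 4) gets 0, and pixel 21R (col 8) gets +0.5 um.
print([round(reflector_offset_um(c, 9), 3) for c in range(9)])
```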
- FIGS. 12A and 12B show a schematic configuration of a main part of a sensor chip (sensor chip 11C) according to Modification 3 of the above embodiment.
- FIG. 12A shows an example of the cross-sectional configuration of the sensor chip 11C
- FIG. 12B shows an example of the planar configuration of the sensor chip 11C.
- In the sensor chip 11C, the reflecting member 73 is provided separately in a plurality of portions (a main portion 73m and small portions 73s). Except for this point, the sensor chip 11C according to Modification 3 has the same configuration as the sensor chip 11 of the above-described embodiment, and the operation and effects are also the same.
- Each pixel 21 is provided with a reflective member 73 including a main portion 73m and a small portion 73s separated from each other.
- the main portion 73m has, for example, a quadrangular planar shape, and is arranged in the center of the on-chip lens 71.
- a plurality of small portions 73s are provided in each pixel 21, for example.
- Each small portion 73s has, for example, a quadrangular planar shape, and has an area smaller than the area of the main portion 73m in a plan view.
- a plurality of small portions 73s are arranged so as to surround one main portion 73m.
- the reflective member 73 separated into a plurality of portions may be provided in a fence shape.
- In the sensor chip 11C, since the reflecting member 73 is provided as in the above-described embodiment, the light LIR reflected by the wiring 104 easily and efficiently enters the APD 31. Further, since the reflecting member 73 is divided into a plurality of portions in each pixel 21, the light LIR entering the on-chip lens 71 is easily and efficiently confined within each pixel 21. Further, the loss of the light LIR incident on the on-chip lens 71 is suppressed.
- FIG. 13 shows a schematic cross-sectional configuration of a main part of a sensor chip (sensor chip 11D) according to Modification 4 of the above embodiment.
- the sensor chip 11D has an inverted pyramid array structure (inverted pyramid array structure 41P) on the first surface S1 of the semiconductor substrate 41. Except for this point, the sensor chip 11D according to Modification 4 has the same configuration as the sensor chip 11 of the above-described embodiment, and the operation and effect thereof are also the same.
- the inverted pyramid array structure 41P is provided on substantially the entire surface of each pixel 21.
- the inverted pyramid array structure 41P is a so-called IPA (Inverted Pyramid Array) structure, and is a pyramid-shaped (quadrangular pyramid-shaped) minute uneven structure provided on the first surface S1 of the semiconductor substrate 41.
- In the sensor chip 11D, since the reflecting member 73 is provided as in the above-described embodiment, the light LIR reflected by the wiring 104 easily and efficiently enters the APD 31. Further, since the inverted pyramid array structure 41P is provided on the first surface S1 of the semiconductor substrate 41, the sensitivity of the APD 31 can be further increased by light diffraction.
- FIG. 14 shows a schematic cross-sectional configuration of a main part of a sensor chip (sensor chip 11E) according to Modification 5 of the above-described embodiment.
- In the sensor chip 11E, the pixel array section 12 (FIG. 1) is provided with the pixels 21 having the APD 31 and pixels 21V having a PD (Photo Diode) 31V that receives light having a wavelength in the visible region.
- Except for this point, the sensor chip 11E according to Modification 5 has the same configuration as the sensor chip 11 of the above-described embodiment, and the operation and effects are also the same.
- the pixel 21V corresponds to a specific example of the second pixel of the present technology
- the PD 31V corresponds to a specific example of the photodiode of the present technology.
- The pixel 21V receives light in the red wavelength band, the green wavelength band, or the blue wavelength band to generate a light reception signal, and has a color filter 75 between the first surface S1 of the semiconductor substrate 41 and the on-chip lens 71.
- the color filter 75 selectively transmits light in any of the red wavelength region, the green wavelength region, and the blue wavelength region.
- the PD 31V provided in the pixel 21V may not be provided with the avalanche multiplication region (the avalanche multiplication region 57 in FIG. 2).
- an APD that receives light having a wavelength in the visible region may be used for the pixel 21V.
- the reflective member 73 is selectively provided in, for example, the pixel 21 that receives light of wavelengths in the near infrared region and the infrared region among the pixels 21 and 21V. Thereby, in the pixel 21V that receives light having a wavelength in the visible region, it is possible to suppress the loss of light caused by the reflecting member 73.
- FIG. 15 represents another example of the sensor chip 11E.
- the pixels 21 and 21V are provided with a laminated structure of the reflection member 73 and the optical functional film 76.
- the laminated structure of the reflection member 73 and the optical functional film 76 has a property of reflecting light of wavelengths in the near infrared region and infrared region and transmitting light of wavelengths in the visible region.
- the optical function film 76 can be configured by, for example, a laminated film in which a high refractive index material film and a low refractive index material film are alternately laminated.
- For example, a silicon oxide (SiO2) film can be used for the low refractive index material film, and a titanium oxide (TiO2) film can be used for the high refractive index material film.
- In the sensor chip 11E, since the reflecting member 73 is provided as in the above-described embodiment, the light LIR reflected by the wiring 104 easily and efficiently enters the APD 31. Moreover, since the pixel 21V that receives light having a wavelength in the visible region is provided together with the pixel 21 that receives light of wavelengths in the near infrared region and the infrared region, more information can be acquired.
- FIG. 16 is a block diagram showing a configuration example of a distance measuring device 200 which is an electronic device using the sensor chips 11, 11A, 11B, 11C, 11D and 11E.
- the distance measuring device 200 includes a distance measuring image sensor 201 and a light source device 211.
- The distance measuring image sensor 201 includes an optical system 202, a sensor chip 203, an image processing circuit 204, a monitor 205, and a memory 206. The distance measuring image sensor 201 receives light (modulated light or pulsed light) that is emitted from the light source device 211 toward a subject and reflected by the surface of the subject, and can thereby acquire a distance image according to the distance to the subject.
- the optical system 202 is configured to have one or more lenses, guides image light (incident light) from a subject to the sensor chip 203, and forms an image on the light receiving surface (sensor unit) of the sensor chip 203.
- Any of the above-described sensor chips 11, 11A, 11B, 11C, 11D, and 11E is applied as the sensor chip 203, and a distance signal indicating the distance obtained from the light reception signal (APD OUT) output from the sensor chip 203 is supplied to the image processing circuit 204.
- The image processing circuit 204 performs image processing for constructing a distance image based on the distance signal supplied from the sensor chip 203, and the distance image (image data) obtained by the image processing is supplied to the monitor 205 and displayed, or is supplied to the memory 206 and stored (recorded).
- In the distance measuring image sensor 201 configured as described above, applying the above-described sensor chips 11, 11A, 11B, 11C, 11D, and 11E improves the sensitivity of the pixels 21, so that, for example, a more accurate distance image can be acquired.
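- As a simple illustration of the direct time-of-flight principle behind such a distance image (a sketch under assumed values: the bin width, timestamps, and noise model are made up, and this is not presented as the actual algorithm of the image processing circuit 204), the distance for one pixel can be estimated by histogramming the photon arrival times carried by APD OUT and converting the peak bin via d = c·t/2.

```python
import numpy as np

C_M_PER_S = 299_792_458.0  # speed of light

def distance_from_timestamps(t_ns: np.ndarray, bin_ns: float = 0.5) -> float:
    """Estimate target distance (m) from SPAD photon arrival times (ns)."""
    bins = np.arange(0.0, t_ns.max() + bin_ns, bin_ns)
    hist, edges = np.histogram(t_ns, bins=bins)
    peak_ns = edges[np.argmax(hist)] + bin_ns / 2  # round-trip time of flight
    return C_M_PER_S * (peak_ns * 1e-9) / 2.0      # halve: out and back

rng = np.random.default_rng(0)
signal = rng.normal(20.0, 0.2, 500)  # echo of a target ~3 m away (~20 ns)
noise = rng.uniform(0.0, 60.0, 200)  # uncorrelated background counts
print(f"{distance_from_timestamps(np.concatenate([signal, noise])):.2f} m")
# -> roughly 3 m; background counts spread thinly across bins, so the
#    histogram peak still sits at the true round-trip time.
```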
- FIG. 17 is a diagram showing a usage example using the above-mentioned image sensor (distance measuring image sensor 201).
- the image sensor described above can be used in various cases for sensing light such as visible light, infrared light, ultraviolet light, and X-rays as described below.
- Devices that capture images used for viewing, such as digital cameras and portable devices with camera functions
- Devices used for traffic, such as in-vehicle sensors that capture images of the rear, surroundings, and interior of a vehicle for safe driving (for example, automatic stop) and for recognizing the driver's condition, surveillance cameras for monitoring traveling vehicles and roads, and ranging sensors for measuring the distance between vehicles
- Devices used for home appliances such as TVs, refrigerators, and air conditioners, to capture images of user gestures and operate the devices according to those gestures
- Devices used for medical and health care, such as endoscopes and devices that image blood vessels by receiving infrared light
- Devices used for security, such as surveillance cameras for crime prevention and cameras for person authentication
- Devices used for beauty care, such as skin measuring instruments that capture images of the skin and microscopes that capture images of the scalp
- Devices used for sports, such as action cameras and wearable cameras for sports applications
- Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
- the technology according to the present disclosure (this technology) can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- FIG. 18 is a block diagram showing an example of a schematic configuration of an in-vivo information acquisition system for a patient using a capsule endoscope, to which the technology according to the present disclosure (the present technology) can be applied.
- the in-vivo information acquisition system 10001 includes a capsule-type endoscope 10100 and an external control device 10200.
- the capsule endoscope 10100 is swallowed by a patient at the time of inspection.
- The capsule endoscope 10100 has an imaging function and a wireless communication function. Until it is naturally discharged from the patient, it moves inside organs such as the stomach and the intestines by peristaltic movement or the like, sequentially captures images of the inside of the organs (hereinafter also referred to as in-vivo images) at predetermined intervals, and sequentially wirelessly transmits information about the in-vivo images to the external control device 10200 outside the body.
- The external control device 10200 comprehensively controls the operation of the in-vivo information acquisition system 10001. The external control device 10200 also receives the information about the in-vivo images transmitted from the capsule endoscope 10100, and generates image data for displaying the in-vivo images on a display device (not shown) based on the received information.
- the in-vivo information acquisition system 10001 can obtain an in-vivo image of the inside of the patient's body at any time during the period from when the capsule endoscope 10100 is swallowed until it is discharged.
- The capsule endoscope 10100 has a capsule-shaped casing 10101, in which a light source unit 10111, an imaging unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117 are housed.
- The light source unit 10111 includes, for example, a light source such as an LED (light emitting diode), and irradiates the imaging field of view of the imaging unit 10112 with light.
- the image pickup unit 10112 is composed of an image pickup element and an optical system including a plurality of lenses provided in front of the image pickup element.
- the reflected light (hereinafter referred to as observation light) of the light applied to the body tissue to be observed is collected by the optical system and incident on the image pickup element.
- the observation light incident on the image sensor is photoelectrically converted, and an image signal corresponding to the observation light is generated.
- the image signal generated by the image capturing unit 10112 is provided to the image processing unit 10113.
- the image processing unit 10113 is configured by a processor such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit), and performs various signal processing on the image signal generated by the imaging unit 10112.
- the image processing unit 10113 provides the image signal subjected to the signal processing to the wireless communication unit 10114 as RAW data.
- The wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal that has been subjected to the signal processing by the image processing unit 10113, and transmits the image signal to the external control device 10200 via the antenna 10114A. The wireless communication unit 10114 also receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A, and provides the received control signal to the control unit 10117.
- the power feeding unit 10115 includes an antenna coil for receiving power, a power regeneration circuit that regenerates power from the current generated in the antenna coil, a booster circuit, and the like. In the power feeding unit 10115, electric power is generated using the so-called non-contact charging principle.
- the power supply unit 10116 is composed of a secondary battery and stores the electric power generated by the power feeding unit 10115.
- arrows and the like indicating the destinations of the power supplied from the power supply unit 10116 are omitted from the figure, but the power stored in the power supply unit 10116 is supplied to the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and can be used to drive them.
- the control unit 10117 is configured by a processor such as a CPU, and appropriately controls the driving of the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 in accordance with control signals transmitted from the external control device 10200.
- the external control device 10200 is composed of a processor such as a CPU or GPU, or a microcomputer or control board on which a processor and a storage element such as a memory are mounted together.
- the external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via the antenna 10200A.
- a control signal from the external control device 10200 can change the light irradiation conditions for the observation target in the light source unit 10111.
- the imaging conditions (for example, the frame rate, the exposure value, and the like in the imaging unit 10112) can also be changed by a control signal from the external control device 10200.
- the control signal from the external control device 10200 may also change the content of the processing in the image processing unit 10113 and the conditions (for example, the transmission interval, the number of images to transmit, and the like) under which the wireless communication unit 10114 transmits the image signal.
- the external control device 10200 also performs various types of image processing on the image signal transmitted from the capsule endoscope 10100, and generates image data for displaying the captured in-vivo image on the display device.
- as the image processing, various types of signal processing can be performed, such as development processing (demosaic processing), image quality enhancement processing (band emphasis processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing), and/or enlargement processing (electronic zoom processing).
- the external control device 10200 controls the driving of the display device so that the captured in-vivo image is displayed based on the generated image data.
- the external control device 10200 may have the generated image data recorded in a recording device (not shown) or printed out in a printing device (not shown).
- the above is an example of an in-vivo information acquisition system to which the technology according to the present disclosure can be applied.
- the technology according to the present disclosure can be applied to, for example, the imaging unit 10112 among the configurations described above. By applying the technology according to the present disclosure to the imaging unit 10112, the detection accuracy is improved.
- the technology according to the present disclosure (this technology) can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- FIG. 19 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
- FIG. 19 illustrates a situation in which an operator (doctor) 11131 is operating on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000.
- the endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 equipped with various devices for endoscopic surgery.
- the endoscope 11100 is composed of a lens barrel 11101, whose region of a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
- in the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
- An opening in which an objective lens is fitted is provided at the tip of the lens barrel 11101.
- a light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens.
- the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image pickup device are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is condensed on the image pickup device by the optical system.
- the observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
- the image signal is transmitted as RAW data to a camera control unit (CCU: Camera Control Unit) 11201.
- the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and integrally controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives the image signal from the camera head 11102, and performs various image processing such as development processing (demosaic processing) on the image signal for displaying an image based on the image signal.
- the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
- the light source device 11203 is composed of, for example, a light source such as an LED (light emitting diode), and supplies irradiation light to the endoscope 11100 when photographing the surgical site or the like.
- the input device 11204 is an input interface for the endoscopic surgery system 11000.
- the user can input various information and input instructions to the endoscopic surgery system 11000 via the input device 11204.
- for example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, and the like).
- the treatment tool control device 11205 controls the driving of the energy treatment tool 11112 for cauterizing or incising tissue, sealing a blood vessel, or the like.
- the pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing the field of view of the endoscope 11100 and the working space of the operator.
- the recorder 11207 is a device capable of recording various information related to surgery.
- the printer 11208 is a device that can print various types of information regarding surgery in various formats such as text, images, or graphs.
- the light source device 11203 that supplies the irradiation light to the endoscope 11100 when photographing the surgical site can be composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof.
- when a white light source is configured by combining RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
- in this case, the laser light from each of the RGB laser light sources is irradiated onto the observation target in a time-division manner, and the driving of the image sensor of the camera head 11102 is controlled in synchronization with the irradiation timing, whereby images corresponding to each of R, G, and B can be captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image sensor.
- the drive of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals.
- by controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the change in light intensity, acquiring images in a time-division manner, and synthesizing the images, a so-called high-dynamic-range image without blocked-up shadows and blown-out highlights can be generated.
- the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
- in the special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in the mucosal surface layer is photographed with high contrast by utilizing the wavelength dependence of light absorption in body tissue and irradiating light in a narrower band than the irradiation light (that is, white light) used in normal observation.
- alternatively, in the special light observation, fluorescence observation may be performed, in which an image is obtained from the fluorescence generated by irradiating excitation light.
- in the fluorescence observation, the body tissue can be irradiated with excitation light and the fluorescence from the body tissue can be observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue and the body tissue can be irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
- the light source device 11203 may be configured to be capable of supplying narrow-band light and/or excitation light corresponding to such special light observation.
- FIG. 20 is a block diagram showing an example of the functional configuration of the camera head 11102 and the CCU 11201 shown in FIG.
- the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405.
- the CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
- the camera head 11102 and the CCU 11201 are communicatively connected to each other by a transmission cable 11400.
- the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101.
- the observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
- the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
- the number of image sensors constituting the imaging unit 11402 may be one (so-called single-plate type) or more than one (so-called multi-plate type).
- in the case of the multi-plate type, for example, image signals corresponding to each of R, G, and B may be generated by the respective image sensors, and a color image may be obtained by synthesizing them.
- the imaging unit 11402 may be configured to have a pair of image sensors for acquiring image signals for the right eye and the left eye corresponding to 3D (three-dimensional) display.
- the 3D display enables the operator 11131 to more accurately grasp the depth of the biological tissue in the surgical site.
- a plurality of lens units 11401 may be provided corresponding to each image pickup element.
- the image pickup unit 11402 does not necessarily have to be provided in the camera head 11102.
- the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
- the drive unit 11403 is composed of an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Accordingly, the magnification and focus of the image captured by the image capturing unit 11402 can be adjusted as appropriate.
- the communication unit 11404 is composed of a communication device for transmitting/receiving various information to/from the CCU 11201.
- the communication unit 11404 transmits the image signal obtained from the image pickup unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
- the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
- the control signal includes, for example, information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
- the imaging conditions such as the frame rate, the exposure value, the magnification, and the focus may be appropriately designated by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, the so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are installed in the endoscope 11100.
- the camera head control unit 11405 controls the driving of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
- the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102.
- the communication unit 11411 receives the image signal transmitted from the camera head 11102 via the transmission cable 11400.
- the communication unit 11411 transmits a control signal for controlling the driving of the camera head 11102 to the camera head 11102.
- image signals and control signals can be transmitted by electrical communication, optical communication, or the like.
- the image processing unit 11412 performs various types of image processing on the image signal that is the RAW data transmitted from the camera head 11102.
- the control unit 11413 performs various controls regarding imaging of a surgical site or the like by the endoscope 11100 and display of a captured image obtained by imaging the surgical site or the like. For example, the control unit 11413 generates a control signal for controlling the driving of the camera head 11102.
- the control unit 11413 also causes the display device 11202 to display a captured image of the surgical site or the like based on the image signal subjected to the image processing by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image by using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, a specific body site, bleeding, mist during use of the energy treatment tool 11112, and the like. When displaying the captured image on the display device 11202, the control unit 11413 may use the recognition result to superimpose various types of surgical support information on the image of the surgical site. By superimposing the surgical support information and presenting it to the operator 11131, the burden on the operator 11131 can be reduced, and the operator 11131 can proceed with the surgery reliably.
- the transmission cable 11400 that connects the camera head 11102 and the CCU 11201 is an electric signal cable that supports electric signal communication, an optical fiber that supports optical communication, or a composite cable of these.
- wired communication is performed using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
- the above is an example of an endoscopic surgery system to which the technology according to the present disclosure can be applied.
- the technique according to the present disclosure can be applied to the imaging unit 11402 among the configurations described above. By applying the technique according to the present disclosure to the imaging unit 11402, the detection accuracy is improved.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
- FIG. 21 is a block diagram showing a schematic configuration example of a vehicle control system which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
- a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated as the functional configuration of the integrated control unit 12050.
- the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
- for example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, blinkers or fog lamps.
- radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
- the body system control unit 12020 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
- the vehicle exterior information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
- the imaging unit 12031 is connected to the vehicle outside information detection unit 12030.
- the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
- the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, or the like based on the received image.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
- the imaging unit 12031 can output the electric signal as an image or as distance measurement information.
- the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
- the in-vehicle information detection unit 12040 detects in-vehicle information.
- a driver state detection unit 12041 that detects the state of the driver is connected to the in-vehicle information detection unit 12040.
- the driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
- the microcomputer 12051 can calculate a control target value of the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output a control command to the drive system control unit 12010.
- for example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, follow-up driving based on the inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, vehicle lane departure warning, and the like.
- in addition, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
- the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the outside information detection unit 12030.
- for example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
- the audio image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the occupants of the vehicle or the outside of the vehicle of information.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
- the display unit 12062 may include, for example, at least one of an onboard display and a head-up display.
- FIG. 22 is a diagram showing an example of the installation position of the imaging unit 12031.
- the image capturing unit 12031 includes image capturing units 12101, 12102, 12103, 12104, and 12105.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100, for example.
- the image capturing unit 12101 provided on the front nose and the image capturing unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
- the imaging units 12102 and 12103 included in the side mirrors mainly acquire images of the side of the vehicle 12100.
- the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
- the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic signal, a traffic sign, a lane, or the like.
- FIG. 22 shows an example of the photographing range of the imaging units 12101 to 12104.
- the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door.
- for example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
- At least one of the image capturing units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the image capturing units 12101 to 12104 may be a stereo camera including a plurality of image capturing elements, or may be an image capturing element having pixels for phase difference detection.
- for example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object that is on the traveling path of the vehicle 12100 and that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
- further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving or the like in which the vehicle travels autonomously without depending on the driver's operation.
- for example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines the collision risk indicating the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, driving support for collision avoidance can be provided by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104.
- such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
- when the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose a rectangular contour line for emphasis on the recognized pedestrian. The audio image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
- the technology according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above.
- by applying the technology according to the present disclosure to the imaging unit 12031, a captured image that is easier to view can be obtained, and thus the fatigue of the driver can be reduced.
- the present disclosure is not limited to the above embodiments and the like, and various modifications can be made.
- the configuration of the sensor chip described in the above embodiments and the like is an example, and another layer may be further provided.
- the material and thickness of each layer are also examples, and are not limited to those described above.
- the reflecting member 73 may be provided on the first surface S1 side of the semiconductor substrate 41.
- the on-chip lens 71 may be provided so as to cover the reflection member 73 provided on the first surface S1 of the semiconductor substrate 41.
- in the above embodiments and the like, the on-chip lens 71 is used as the light condensing structure, but as shown in FIG. 24, a digital lens (digital lens 71D) may be used instead of the on-chip lens 71.
- the digital lens 71D focuses the incident light on the APD 31 using diffraction.
- the reflection member 73 is arranged on the first surface S1 of the semiconductor substrate 41 together with the digital lens 71D, for example.
- the configuration of the APD 31 may further include other elements, or may not include all the elements. Further, the arrangement of the elements constituting the APD 31 may be another arrangement.
- the case where the wiring 104 functions as the cathode of the APD 31 and the wiring 105 functions as the anode of the APD 31 has been described, but the wiring 104 may function as the anode and the wiring 105 may function as the cathode.
- the conductivity types (P type and N type) described in the above embodiments may be reversed.
- in the above embodiments and the like, the case where the second reflective member of the present technology is the wiring 104 of the wiring layer 42 has been described, but the second reflective member may be configured by a reflective member other than the wiring 104.
- the wiring layer 42 or the wiring layer 43 may be provided with another reflective member in addition to the wiring 104.
- the present disclosure may have the following configurations.
- since the first reflective member that reflects the light reflected by the second reflective member is provided, the light reflected by the second reflective member can be prevented from going out of the avalanche photodiode. Therefore, the sensitivity can be improved.
- (1) A sensor including: a semiconductor substrate having a first surface and a second surface facing each other and provided with an avalanche photodiode; an on-chip lens provided on the first surface side of the semiconductor substrate; a first reflective member provided on the on-chip lens; and a wiring layer provided on the second surface side of the semiconductor substrate and including a second reflective member.
- (2) The sensor according to (1), in which the light reflected by the second reflective member is further reflected by the first reflective member.
- the antireflection member includes a carbon black film or a silicon oxide film.
- (10) The sensor according to any one of (1) to (9), in which the avalanche photodiode and the on-chip lens are provided for each pixel, and which includes the pixels in which the position of the first reflective member with respect to the on-chip lens differs.
- (11) The sensor according to (10), in which, in the pixels in the central portion of the pixel array unit, the first reflective member is provided at the central portion of the on-chip lens, and the pixel array unit outside the central portion has the pixels in which the first reflective member is provided at a position deviated from the central portion of the on-chip lens.
- (12) The sensor according to any one of (1) to (11), in which an inverted pyramid array structure is provided on the surface of the semiconductor substrate.
- (13) The sensor in which the avalanche photodiode and the on-chip lens are provided for each pixel, and the area occupied by the first reflective member is 25% or less of the area of the pixel.
- (14) The sensor according to any one of (1) to (13), in which the first reflective member includes tungsten, silver, aluminum, gold, or copper.
- (15) The sensor in which the on-chip lens is provided for each pixel, the plurality of pixels includes a first pixel and a second pixel, in the first pixel the avalanche photodiode receives light of wavelengths in the near-infrared and infrared regions, and in the second pixel a photodiode receives light of wavelengths in the visible region.
- (19) A distance measuring device provided with a sensor including: a semiconductor substrate having a first surface and a second surface facing each other and provided with an avalanche photodiode; an on-chip lens provided on the first surface side of the semiconductor substrate; a first reflective member provided on the on-chip lens; and a wiring layer provided on the second surface side of the semiconductor substrate and including a second reflective member.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Solid State Image Pick-Up Elements (AREA)
- Photometry And Measurement Of Optical Pulse Characteristics (AREA)
- Light Receiving Elements (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
1. Embodiment (example of a sensor chip having a reflective member on an on-chip lens)
2. Modification 1 (example having an antireflection member laminated on the reflective member)
3. Modification 2 (example having pixels in which the positions of the reflective members differ from one another)
4. Modification 3 (example in which the reflective member is divided into a plurality of parts)
5. Modification 4 (example having an inverted pyramid structure on the surface of the semiconductor substrate)
6. Modification 5 (example having pixels that receive light of wavelengths in the visible region)
7. Application example (distance measuring device)
8. Further application examples
[Configuration of the sensor chip 11]
FIG. 1 is a block diagram showing a configuration example of a sensor chip 11 according to an embodiment of the present technology. The sensor chip 11 corresponds to a specific example of the "sensor" of the present technology.
In the sensor chip 11, the light condensed by the on-chip lens 71 for each pixel 21 (light of wavelengths in the near-infrared and infrared regions) enters the APD 31. As a result, pairs of holes and electrons are generated (photoelectric conversion takes place) in the APD 31. For example, when the negative voltage VBD is supplied from the wiring layer 43 to the N-type diffusion layer 53 via the metal pads 101 and 102, the contact electrode 103, and the wiring 104, an avalanche multiplication region 57 is formed in the APD 31. As a result, for example, electrons are avalanche-multiplied and a light reception signal is generated. A predetermined voltage is also supplied to the P-type region 56 from the wiring layer 43 via the metal pads 101 and 102, the contact electrode 103, and the wiring 105.
In the sensor chip 11 of the present embodiment, the reflective member 73 that reflects the light reflected by the wiring 104 is provided. This makes it easier for the light reflected by the wiring 104 to enter the APD 31 efficiently. This action and effect will be described below using a comparative example.
FIG. 9 shows a schematic cross-sectional configuration of a main part of a sensor chip (sensor chip 11A) according to Modification 1 of the above embodiment. FIG. 9 shows the cross-sectional configuration of one pixel 21. The sensor chip 11A has an antireflection film 74 on the reflective member 73. Here, the antireflection film 74 corresponds to a specific example of the antireflection member of the present disclosure. Except for this point, the sensor chip 11A according to Modification 1 has the same configuration as the sensor chip 11 of the above embodiment, and its action and effect are also the same. In FIG. 9, the configuration of the APD 31 is shown in simplified form. The illustration of the APD 31 is likewise simplified in the cross-sectional views described for Modification 2 onward.
FIG. 10A shows a schematic configuration of a main part of a sensor chip (sensor chip 11B) according to Modification 2 of the above embodiment. In the sensor chip 11B, the position of the reflective member 73 on the on-chip lens 71 differs depending on the position of the pixel 21 in the pixel array unit 12 (FIG. 1). Except for this point, the sensor chip 11B according to Modification 2 has the same configuration as the sensor chip 11 of the above embodiment, and its action and effect are also the same.
FIGS. 12A and 12B show schematic configurations of a main part of a sensor chip (sensor chip 11C) according to Modification 3 of the above embodiment. FIG. 12A shows an example of the cross-sectional configuration of the sensor chip 11C, and FIG. 12B shows an example of its planar configuration. In the sensor chip 11C, the reflective member 73 is divided into a plurality of portions (a main portion 73m and small portions 73s). Except for this point, the sensor chip 11C according to Modification 3 has the same configuration as the sensor chip 11 of the above embodiment, and its action and effect are also the same.
FIG. 13 shows a schematic cross-sectional configuration of a main part of a sensor chip (sensor chip 11D) according to Modification 4 of the above embodiment. The sensor chip 11D has an inverted pyramid array structure (inverted pyramid array structure 41P) on the first surface S1 of the semiconductor substrate 41. Except for this point, the sensor chip 11D according to Modification 4 has the same configuration as the sensor chip 11 of the above embodiment, and its action and effect are also the same.
FIG. 14 shows a schematic cross-sectional configuration of a main part of a sensor chip (sensor chip 11E) according to Modification 5 of the above embodiment. In the sensor chip 11E, the pixel array unit 12 (FIG. 1) is provided with pixels 21V having PDs (photodiodes) 31V that receive light of wavelengths in the visible region, together with the pixels 21 having the APDs 31. Except for this point, the sensor chip 11E according to Modification 5 has the same configuration as the sensor chip 11 of the above embodiment, and its action and effect are also the same. Here, the pixel 21V corresponds to a specific example of the second pixel of the present technology, and the PD 31V corresponds to a specific example of the photodiode of the present technology.
FIG. 16 is a block diagram showing a configuration example of a distance measuring device 200, an electronic apparatus that uses the sensor chips 11, 11A, 11B, 11C, 11D, and 11E.
FIG. 17 is a diagram showing examples of use of the above-described image sensor (distance measuring image sensor 201).
- Devices provided for home appliances such as TVs, refrigerators, and air conditioners, which capture a user's gesture and operate the appliance in accordance with that gesture
- Devices provided for medical care and healthcare, such as endoscopes and devices that perform angiography by receiving infrared light
- Devices provided for security, such as surveillance cameras for crime prevention and cameras for person authentication
- Devices provided for beauty care, such as skin measuring instruments that photograph the skin and microscopes that photograph the scalp
- Devices provided for sports, such as action cameras and wearable cameras for sports applications
- Devices provided for agriculture, such as cameras for monitoring the condition of fields and crops
Furthermore, the technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
(1)
A sensor including:
a semiconductor substrate having a first surface and a second surface facing each other and provided with an avalanche photodiode;
an on-chip lens provided on the first surface side of the semiconductor substrate;
a first reflective member provided on the on-chip lens; and
a wiring layer provided on the second surface side of the semiconductor substrate and including a second reflective member.
(2)
The sensor according to (1), in which the light reflected by the second reflective member is further reflected by the first reflective member.
(3)
The sensor according to (1) or (2), including the avalanche photodiode that receives light of wavelengths in the near-infrared and infrared regions.
(4)
The sensor according to any one of (1) to (3), having a plurality of the on-chip lenses.
(5)
The sensor according to (4), having the first reflective member on each of the plurality of on-chip lenses.
(6)
The sensor according to (5), having the first reflective member arranged at a position deviated from the central portion on the on-chip lens.
(7)
The sensor according to (6), having the first reflective member arranged at the central portion on the on-chip lens.
(8)
The sensor according to any one of (1) to (7), further having an antireflection member laminated on the side of the first reflective member opposite to the on-chip lens.
(9)
The sensor according to (8), in which the antireflection member includes a carbon black film or a silicon oxide film.
(10)
The sensor according to any one of (1) to (9), in which the avalanche photodiode and the on-chip lens are provided for each pixel, and which has the pixels in which the position of the first reflective member with respect to the on-chip lens differs.
(11)
The sensor according to (10), further having a pixel array unit provided with a plurality of the pixels, in which, in the pixels in the central portion of the pixel array unit, the first reflective member is provided at the central portion of the on-chip lens, and the pixel array unit outside the central portion of the pixel array unit has the pixels in which the first reflective member is provided at a position deviated from the central portion of the on-chip lens.
(12)
The sensor according to any one of (1) to (11), in which an inverted pyramid array structure is provided on the surface of the semiconductor substrate.
(13)
The sensor according to any one of (1) to (12), in which the avalanche photodiode and the on-chip lens are provided for each pixel, and the area occupied by the first reflective member is 25% or less of the area of the pixel.
(14)
The sensor according to any one of (1) to (13), in which the first reflective member includes tungsten, silver, aluminum, gold, or copper.
(15)
The sensor according to any one of (1) to (14), in which the on-chip lens is provided for each pixel, the plurality of pixels includes a first pixel and a second pixel, in the first pixel the avalanche photodiode receives light of wavelengths in the near-infrared and infrared regions, and in the second pixel a photodiode receives light of wavelengths in the visible region.
(16)
The sensor according to (15), in which the first reflective member is provided selectively in the first pixel, out of the first pixel and the second pixel.
(17)
The sensor according to (15), further having an optical function film laminated on the first reflective member, in which the optical function film and the first reflective member are provided in the first pixel and the second pixel, transmit light of wavelengths in the visible region, and reflect light of wavelengths in the near-infrared and infrared regions.
(18)
The sensor according to any one of (1) to (17), in which the first reflective member is provided divided into a plurality of parts.
(19)
A distance measuring device including a sensor, the sensor including:
a semiconductor substrate having a first surface and a second surface facing each other and provided with an avalanche photodiode;
an on-chip lens provided on the first surface side of the semiconductor substrate;
a first reflective member provided on the on-chip lens; and
a wiring layer provided on the second surface side of the semiconductor substrate and including a second reflective member.
Claims (19)
- A sensor including: a semiconductor substrate having a first surface and a second surface facing each other and provided with an avalanche photodiode; an on-chip lens provided on the first surface side of the semiconductor substrate; a first reflective member provided on the on-chip lens; and a wiring layer provided on the second surface side of the semiconductor substrate and including a second reflective member.
- The sensor according to claim 1, in which the light reflected by the second reflective member is further reflected by the first reflective member.
- The sensor according to claim 1, including the avalanche photodiode that receives light of wavelengths in the near-infrared and infrared regions.
- The sensor according to claim 1, having a plurality of the on-chip lenses.
- The sensor according to claim 4, having the first reflective member on each of the plurality of on-chip lenses.
- The sensor according to claim 5, having the first reflective member arranged at a position deviated from the central portion on the on-chip lens.
- The sensor according to claim 6, having the first reflective member arranged at the central portion on the on-chip lens.
- The sensor according to claim 1, further having an antireflection member laminated on the side of the first reflective member opposite to the on-chip lens.
- The sensor according to claim 8, in which the antireflection member includes a carbon black film or a silicon oxide film.
- The sensor according to claim 1, in which the avalanche photodiode and the on-chip lens are provided for each pixel, and which has the pixels in which the position of the first reflective member with respect to the on-chip lens differs.
- The sensor according to claim 10, further having a pixel array unit provided with a plurality of the pixels, in which, in the pixels in the central portion of the pixel array unit, the first reflective member is provided at the central portion of the on-chip lens, and the pixel array unit outside the central portion of the pixel array unit has the pixels in which the first reflective member is provided at a position deviated from the central portion of the on-chip lens.
- The sensor according to claim 1, in which an inverted pyramid array structure is provided on the surface of the semiconductor substrate.
- The sensor according to claim 1, in which the avalanche photodiode and the on-chip lens are provided for each pixel, and the area occupied by the first reflective member is 25% or less of the area of the pixel.
- The sensor according to claim 1, in which the first reflective member includes tungsten, silver, aluminum, gold, or copper.
- The sensor according to claim 1, in which the on-chip lens is provided for each pixel, the plurality of pixels includes a first pixel and a second pixel, in the first pixel the avalanche photodiode receives light of wavelengths in the near-infrared and infrared regions, and in the second pixel a photodiode receives light of wavelengths in the visible region.
- The sensor according to claim 15, in which the first reflective member is provided selectively in the first pixel, out of the first pixel and the second pixel.
- The sensor according to claim 15, further having an optical function film laminated on the first reflective member, in which the optical function film and the first reflective member are provided in the first pixel and the second pixel, transmit light of wavelengths in the visible region, and reflect light of wavelengths in the near-infrared and infrared regions.
- The sensor according to claim 1, in which the first reflective member is provided divided into a plurality of parts.
- A distance measuring device including a sensor, the sensor including: a semiconductor substrate having a first surface and a second surface facing each other and provided with an avalanche photodiode; an on-chip lens provided on the first surface side of the semiconductor substrate; a first reflective member provided on the on-chip lens; and a wiring layer provided on the second surface side of the semiconductor substrate and including a second reflective member.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP20766455.8A EP3936838A4 (en) | 2019-03-06 | 2020-01-30 | SENSOR AND DISTANCE MEASUREMENT DEVICE |
| US17/310,841 US12235392B2 (en) | 2019-03-06 | 2020-01-30 | Sensor and distance measurement apparatus having an avalanche photodiode and on-chip lens |
| JP2021503462A JP7529652B2 (ja) | 2019-03-06 | 2020-01-30 | センサおよび測距装置 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019-040812 | 2019-03-06 | ||
| JP2019040812 | 2019-03-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020179290A1 true WO2020179290A1 (ja) | 2020-09-10 |
Family
ID=72337531
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/003325 Ceased WO2020179290A1 (ja) | 2019-03-06 | 2020-01-30 | センサおよび測距装置 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US12235392B2 (ja) |
| EP (1) | EP3936838A4 (ja) |
| JP (1) | JP7529652B2 (ja) |
| WO (1) | WO2020179290A1 (ja) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022112594A (ja) * | 2021-01-22 | 2022-08-03 | キヤノン株式会社 | 光電変換装置、光検出システム |
| JP2022170442A (ja) * | 2021-04-28 | 2022-11-10 | キヤノン株式会社 | 光電変換装置及び光検出システム |
| CN115718290A (zh) * | 2021-08-23 | 2023-02-28 | 原相科技股份有限公司 | 多光源的光机及其封装结构 |
| WO2025099884A1 (ja) * | 2023-11-08 | 2025-05-15 | ソニーセミコンダクタソリューションズ株式会社 | 光検出装置及び測距システム |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102588651B1 (ko) * | 2020-11-12 | 2023-10-12 | 한국표준과학연구원 | 광 검출 소자 및 그 제조 방법 |
| US12183754B2 (en) * | 2021-08-24 | 2024-12-31 | Globalfoundries Singapore Pte. Ltd. | Single-photon avalanche diodes with deep trench isolation |
| KR20230089689A (ko) * | 2021-12-14 | 2023-06-21 | 에스케이하이닉스 주식회사 | 이미지 센싱 장치 |
| US20230307481A1 (en) * | 2022-03-25 | 2023-09-28 | Sensors Unlimited, Inc. | Photodetector array (pda) metallization |
| US20250248156A1 (en) * | 2024-01-31 | 2025-07-31 | Taiwan Semiconductor Manufacturing Company, Ltd. | Semiconductor device and methods of manufacturing the same |
Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0677518A (ja) * | 1992-08-26 | 1994-03-18 | Nec Corp | 半導体受光素子 |
| JP2001320081A (ja) * | 2000-05-12 | 2001-11-16 | Fujitsu Ltd | 半導体受光素子 |
| JP2007249190A (ja) * | 2006-02-14 | 2007-09-27 | Fujifilm Electronic Materials Co Ltd | 光硬化性組成物、それを用いた反射防止膜、及び固体撮像素子 |
| JP2011124450A (ja) * | 2009-12-11 | 2011-06-23 | Nec Corp | 半導体受光素子 |
| US20110169117A1 (en) * | 2009-04-30 | 2011-07-14 | Massachusetts Institute Of Technology | Cross-Talk Suppression in Geiger-Mode Avalanche Photodiodes |
| JP2014154834A (ja) * | 2013-02-13 | 2014-08-25 | Panasonic Corp | 固体撮像素子 |
| WO2017126329A1 (ja) * | 2016-01-21 | 2017-07-27 | ソニー株式会社 | 撮像素子および電子機器 |
| JP2017163023A (ja) * | 2016-03-10 | 2017-09-14 | 株式会社東芝 | 光検出器およびこれを用いた被写体検知システム |
| JP2018088488A (ja) | 2016-11-29 | 2018-06-07 | ソニーセミコンダクタソリューションズ株式会社 | センサチップおよび電子機器 |
| JP2018117117A (ja) * | 2017-01-19 | 2018-07-26 | ソニーセミコンダクタソリューションズ株式会社 | 受光素子、撮像素子、および、撮像装置 |
| WO2018173872A1 (ja) * | 2017-03-24 | 2018-09-27 | ソニーセミコンダクタソリューションズ株式会社 | センサチップおよび電子機器 |
| JP2019040812A (ja) | 2017-08-28 | 2019-03-14 | カルソニックカンセイ株式会社 | 組電池 |
Family Cites Families (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09326511A (ja) * | 1996-06-05 | 1997-12-16 | Toshiba Electron Eng Corp | 光半導体素子およびその製造方法 |
| JP4048217B2 (ja) | 2000-12-19 | 2008-02-20 | ユーディナデバイス株式会社 | 半導体受光装置 |
| JP2003249632A (ja) * | 2002-02-22 | 2003-09-05 | Sony Corp | 固体撮像素子およびその製造方法 |
| JP3976700B2 (ja) * | 2003-03-24 | 2007-09-19 | 独立行政法人科学技術振興機構 | 極薄分子結晶を用いたアバランシェ増幅型フォトセンサー及びその製造方法 |
| KR20070082026A (ko) | 2006-02-14 | 2007-08-20 | 후지 필름 일렉트로닉 머트리얼즈 가부시키가이샤 | 광경화성 조성물, 그것을 사용한 반사방지막, 및고체촬상소자 |
| JP5365221B2 (ja) * | 2009-01-29 | 2013-12-11 | ソニー株式会社 | 固体撮像装置、その製造方法および撮像装置 |
| JP2013030592A (ja) * | 2011-07-28 | 2013-02-07 | Sony Corp | 光電変換素子及び撮像装置、太陽電池 |
| WO2014027588A1 (ja) * | 2012-08-14 | 2014-02-20 | ソニー株式会社 | 固体撮像装置および電子機器 |
| JP6668036B2 (ja) | 2015-10-14 | 2020-03-18 | ソニーセミコンダクタソリューションズ株式会社 | 撮像素子及びその製造方法、並びに、撮像装置及びその製造方法 |
| JP2017174936A (ja) * | 2016-03-23 | 2017-09-28 | ソニー株式会社 | 固体撮像素子及び電子機器 |
| JP7058479B2 (ja) | 2016-10-18 | 2022-04-22 | ソニーセミコンダクタソリューションズ株式会社 | 光検出器 |
| CN111830527B (zh) | 2017-01-19 | 2024-06-18 | 索尼半导体解决方案公司 | 光接收元件、成像元件和成像装置 |
| KR102432861B1 (ko) * | 2017-06-15 | 2022-08-16 | 삼성전자주식회사 | 거리 측정을 위한 이미지 센서 |
2020
- 2020-01-30 WO PCT/JP2020/003325 patent/WO2020179290A1/ja not_active Ceased
- 2020-01-30 US US17/310,841 patent/US12235392B2/en active Active
- 2020-01-30 JP JP2021503462A patent/JP7529652B2/ja active Active
- 2020-01-30 EP EP20766455.8A patent/EP3936838A4/en not_active Withdrawn
Patent Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0677518A (ja) * | 1992-08-26 | 1994-03-18 | Nec Corp | 半導体受光素子 |
| JP2001320081A (ja) * | 2000-05-12 | 2001-11-16 | Fujitsu Ltd | 半導体受光素子 |
| JP2007249190A (ja) * | 2006-02-14 | 2007-09-27 | Fujifilm Electronic Materials Co Ltd | 光硬化性組成物、それを用いた反射防止膜、及び固体撮像素子 |
| US20110169117A1 (en) * | 2009-04-30 | 2011-07-14 | Massachusetts Institute Of Technology | Cross-Talk Suppression in Geiger-Mode Avalanche Photodiodes |
| JP2011124450A (ja) * | 2009-12-11 | 2011-06-23 | Nec Corp | 半導体受光素子 |
| JP2014154834A (ja) * | 2013-02-13 | 2014-08-25 | Panasonic Corp | 固体撮像素子 |
| WO2017126329A1 (ja) * | 2016-01-21 | 2017-07-27 | ソニー株式会社 | 撮像素子および電子機器 |
| JP2017163023A (ja) * | 2016-03-10 | 2017-09-14 | 株式会社東芝 | 光検出器およびこれを用いた被写体検知システム |
| JP2018088488A (ja) | 2016-11-29 | 2018-06-07 | ソニーセミコンダクタソリューションズ株式会社 | センサチップおよび電子機器 |
| JP2018117117A (ja) * | 2017-01-19 | 2018-07-26 | ソニーセミコンダクタソリューションズ株式会社 | 受光素子、撮像素子、および、撮像装置 |
| WO2018173872A1 (ja) * | 2017-03-24 | 2018-09-27 | ソニーセミコンダクタソリューションズ株式会社 | センサチップおよび電子機器 |
| JP2019040812A (ja) | 2017-08-28 | 2019-03-14 | カルソニックカンセイ株式会社 | 組電池 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3936838A4 |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022112594A (ja) * | 2021-01-22 | 2022-08-03 | キヤノン株式会社 | 光電変換装置、光検出システム |
| US12148771B2 (en) | 2021-01-22 | 2024-11-19 | Canon Kabushiki Kaisha | Photoelectric conversion apparatus and optical detection system |
| JP7614855B2 (ja) | 2021-01-22 | 2025-01-16 | キヤノン株式会社 | 光電変換装置、光検出システム |
| JP2025038171A (ja) * | 2021-01-22 | 2025-03-18 | キヤノン株式会社 | 光電変換装置、光検出システム |
| JP2022170442A (ja) * | 2021-04-28 | 2022-11-10 | キヤノン株式会社 | 光電変換装置及び光検出システム |
| US12369423B2 (en) | 2021-04-28 | 2025-07-22 | Canon Kabushiki Kaisha | Photoelectric conversion device and photodetection system |
| JP7746028B2 (ja) | 2021-04-28 | 2025-09-30 | キヤノン株式会社 | 光電変換装置及び光検出システム |
| CN115718290A (zh) * | 2021-08-23 | 2023-02-28 | 原相科技股份有限公司 | 多光源的光机及其封装结构 |
| WO2025099884A1 (ja) * | 2023-11-08 | 2025-05-15 | ソニーセミコンダクタソリューションズ株式会社 | 光検出装置及び測距システム |
Also Published As
| Publication number | Publication date |
|---|---|
| JP7529652B2 (ja) | 2024-08-06 |
| JPWO2020179290A1 (ja) | 2020-09-10 |
| US12235392B2 (en) | 2025-02-25 |
| EP3936838A4 (en) | 2022-06-08 |
| EP3936838A1 (en) | 2022-01-12 |
| US20220120868A1 (en) | 2022-04-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7449317B2 (ja) | 撮像装置 | |
| US12027546B2 (en) | Imaging element, fabrication method, and electronic equipment | |
| US10872998B2 (en) | Chip size package, method of manufacturing the same, electronic device, and endoscope | |
| JP7529652B2 (ja) | センサおよび測距装置 | |
| JP7270616B2 (ja) | 固体撮像素子および固体撮像装置 | |
| US20210183928A1 (en) | Imaging element, method of manufacturing the same, and electronic appliance | |
| CN110431668B (zh) | 固态摄像装置和电子设备 | |
| TWI849045B (zh) | 攝像裝置及攝像系統 | |
| CN110998849B (zh) | 成像装置、相机模块和电子设备 | |
| US20240006443A1 (en) | Solid-state imaging device, imaging device, and electronic apparatus | |
| TWI872176B (zh) | 光檢測器 | |
| US20220005859A1 (en) | Solid-state imaging device and electronic apparatus | |
| JP2020064893A (ja) | センサモジュールおよび電子機器 | |
| JPWO2018155183A1 (ja) | 撮像素子および電子機器 | |
| JP2024028045A (ja) | 光検出装置 | |
| WO2023105678A1 (ja) | 光検出装置および光学フィルタ | |
| CN112136215A (zh) | 摄像装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20766455 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2021503462 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2020766455 Country of ref document: EP Effective date: 20211006 |
|
| WWG | Wipo information: grant in national office |
Ref document number: 17310841 Country of ref document: US |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 2020766455 Country of ref document: EP |