WO2019188131A1 - Semiconductor device and method for manufacturing semiconductor device - Google Patents
- Publication number: WO2019188131A1 (application PCT/JP2019/009398)
- Authority: WIPO (PCT)
- Prior art keywords
- semiconductor element
- insulating film
- semiconductor
- semiconductor chip
- film
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10F—INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
- H10F39/00—Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
- H10F39/10—Integrated devices
- H10F39/12—Image sensors
- H10W20/01
- H10W20/48
- H10W70/60
- H10W90/00
Definitions
- the present disclosure relates to a semiconductor device having a plurality of stacked semiconductor elements and a method for manufacturing the same.
- a stacked solid-state imaging device has been proposed (see, for example, Patent Document 1).
- In this solid-state imaging device, for example, a semiconductor element provided with a PD (photodiode) or the like for each pixel and a semiconductor element provided with a circuit for processing signals obtained from each pixel are stacked.
- the plurality of semiconductor elements are electrically connected by, for example, CuCu bonding.
- A method for manufacturing a semiconductor device according to an embodiment of the present disclosure includes: forming a first semiconductor element on a substrate; covering the first semiconductor element with a stopper film; forming an insulating film on the stopper film; and providing a second semiconductor element so as to face the first semiconductor element with the stopper film in between, the first semiconductor element being electrically connected to the second semiconductor element through wiring penetrating the insulating film and the stopper film.
- Since the insulating film is formed on the stopper film after the first semiconductor element is covered with the stopper film, an insulating film with high flatness is obtained.
- A semiconductor device according to an embodiment of the present disclosure includes a substrate, a first semiconductor element provided on the substrate, a stopper film covering the first semiconductor element, an insulating film provided on the stopper film, and a second semiconductor element facing the first semiconductor element with the stopper film and the insulating film interposed therebetween.
- Since the stopper film is provided between the first semiconductor element and the insulating film, the flatness of the insulating film is improved.
- In the semiconductor device and the manufacturing method of the present disclosure, the flatness of the insulating film on the stopper film can be improved, and therefore the flatness of the insulating film covering the semiconductor element can be improved.
- FIG. 2A is a schematic plan view illustrating an example of a configuration of the first semiconductor chip, the second semiconductor chip, and the third semiconductor chip illustrated in FIG. 1.
- FIG. 2B is a schematic plan view illustrating another example of the configuration of the first semiconductor chip, the second semiconductor chip, and the third semiconductor chip illustrated in FIG. 1.
- FIG. 3A is a schematic cross-sectional view illustrating one step of the method of manufacturing the imaging device shown in FIG. 1.
- FIG. 3B is a schematic cross-sectional view illustrating a step following FIG. 3A.
- FIG. 3C is a schematic cross-sectional view illustrating a step following FIG. 3B.
- FIG. 4A is a schematic cross-sectional view illustrating a step following FIG. 3C.
- FIG. 4B is a schematic cross-sectional view illustrating a step following FIG. 4A.
- FIG. 4C is a schematic cross-sectional view illustrating a step following FIG. 4B.
- FIG. 4D is a schematic cross-sectional view illustrating a step following FIG. 4C.
- FIG. 4E is a schematic cross-sectional view illustrating a step following FIG. 4D.
- FIG. 4F is a schematic cross-sectional view illustrating a step following FIG. 4E.
- FIG. 5A is a schematic cross-sectional view illustrating one step of a method of manufacturing an imaging device according to a comparative example.
- FIG. 5B is a schematic cross-sectional view illustrating a step following FIG. 5A.
- FIG. 6 is a schematic cross-sectional view illustrating a step following FIG. 5B.
- FIG. 7 is a schematic cross-sectional view illustrating another example of the step shown in FIG. 6.
- Embodiment (an imaging device having a stopper film between a first semiconductor element and a second semiconductor element)
- Application example (electronic equipment)
- Application examples
- FIG. 1 schematically illustrates an example of a cross-sectional configuration of a solid-state imaging device (imaging device 1) according to an embodiment of the present disclosure.
- the imaging device 1 is, for example, a backside illumination type CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- the imaging device 1 includes a first semiconductor chip 21 (first semiconductor element), a second semiconductor chip 11 (second semiconductor element), and a third semiconductor chip 22 (third semiconductor element).
- the first semiconductor chip 21 and the third semiconductor chip 22 are provided on a support substrate 31 (substrate).
- the second semiconductor chip 11 faces the support substrate 31 with the first semiconductor chip 21 and the third semiconductor chip 22 in between.
- a stopper film 23 and an insulating film 24 are provided between the first semiconductor chip 21 and the third semiconductor chip 22 and the second semiconductor chip 11.
- a color filter 41 and an on-chip lens 42 are provided on the second semiconductor chip 11.
- FIGS. 2A and 2B show an example of a planar configuration of the first semiconductor chip 21, the second semiconductor chip 11, and the third semiconductor chip 22.
- As shown in these figures, the second semiconductor chip 11 is provided over a wide area, and the first semiconductor chip 21 and the third semiconductor chip 22 are disposed in a region overlapping the second semiconductor chip 11 in plan view (the XY plane in FIGS. 2A and 2B).
- the chip sizes of the first semiconductor chip 21 and the third semiconductor chip 22 are smaller than the chip size of the second semiconductor chip 11.
- the sizes of the first semiconductor chip 21 and the third semiconductor chip 22 may be substantially the same (FIG. 2A) or different (FIG. 2B).
- the second semiconductor chip 11 having a large chip size has, for example, a sensor circuit.
- the second semiconductor chip 11 includes, for example, a semiconductor substrate, and a PD (photoelectric conversion unit) is provided for each pixel on the semiconductor substrate.
- the second semiconductor chip 11 has a plurality of wirings 11W and rewiring layers E11A and E11B.
- the rewiring layer E11A is for electrically connecting the wiring 11W of the second semiconductor chip 11 and the wiring of the first semiconductor chip 21 (wiring 21W described later).
- the rewiring layer E11B is for electrically connecting the wiring 11W of the second semiconductor chip 11 and the wiring of the third semiconductor chip 22 (a wiring 22W described later).
- the wiring 11W and the rewiring layers E11A and E11B are made of, for example, copper (Cu).
- The color filter 41 and the on-chip lens 42 are provided on the light receiving surface of the second semiconductor chip 11, and the first semiconductor chip 21 and the third semiconductor chip 22 are provided on the surface opposite to the light receiving surface of the second semiconductor chip 11.
- The first semiconductor chip 21 provided to face the second semiconductor chip 11 has, for example, a memory circuit electrically connected to the PD of the second semiconductor chip 11.
- the first semiconductor chip 21 has, for example, a semiconductor substrate, and a plurality of MOS (Metal Oxide Semiconductor) transistors are provided in a p-type semiconductor well region of the semiconductor substrate.
- the memory circuit is configured using, for example, the plurality of MOS transistors.
- the first semiconductor chip 21 has a plurality of wirings 21W and a rewiring layer E21.
- the wiring 21W and the rewiring layer E21 (wiring) are made of, for example, copper (Cu).
- the rewiring layer E21 penetrates the stopper film 23 and the insulating film 24 and is connected to the rewiring layer E11A of the second semiconductor chip 11.
- the electrical connection between the wiring 21W and the wiring 11W via the rewiring layers E21 and E11A is configured by, for example, CuCu bonding. That is, the first semiconductor chip 21 is electrically connected to the second semiconductor chip 11 by, for example, CuCu bonding.
- the third semiconductor chip 22 is provided so as to face the second semiconductor chip 11 together with the first semiconductor chip 21. In other words, the third semiconductor chip 22 and the first semiconductor chip 21 are arranged in the same layer.
- The third semiconductor chip 22 has, for example, a logic circuit that is electrically connected to the PD of the second semiconductor chip 11.
- the third semiconductor chip 22 has, for example, a semiconductor substrate, and a plurality of MOS transistors are provided in a p-type semiconductor well region of the semiconductor substrate.
- the logic circuit is configured using, for example, the plurality of MOS transistors.
- the third semiconductor chip 22 has a plurality of wirings 22W and a rewiring layer E22.
- the wiring 22W and the rewiring layer E22 are made of, for example, copper (Cu).
- the rewiring layer E22 penetrates the stopper film 23 and the insulating film 24 and is connected to the rewiring layer E11B of the second semiconductor chip 11.
- the electrical connection between the wiring 22W and the wiring 11W via the rewiring layers E22 and E11B is configured by, for example, CuCu bonding. That is, the third semiconductor chip 22 is electrically connected to the second semiconductor chip 11 by, for example, CuCu bonding.
- the first semiconductor chip 21 and the third semiconductor chip 22 are covered with a stopper film 23.
- the stopper film 23 covers the upper surface and side surfaces of the first semiconductor chip 21 and the third semiconductor chip 22 and is provided on the support substrate 31.
- the stopper film 23 is used as an etching stopper when the insulating film 24 is formed. In the present embodiment, since the stopper film 23 is provided, the flatness of the insulating film 24 can be improved.
- the stopper film 23 is made of a material that functions as an etching stopper for the constituent material of the insulating film 24.
- the stopper film 23 can be made of silicon nitride (SiN), silicon carbide (SiC), or the like.
- the stopper film 23 has a thin film portion 23T at a position overlapping the central portion of the first semiconductor chip 21 and the central portion of the third semiconductor chip 22 in plan view (XY plane in FIG. 1). More specifically, the thin film portion 23T is provided in the central portion of the upper surface of the first semiconductor chip 21 and the central portion of the upper surface of the third semiconductor chip 22.
- The thin film portion 23T is a portion whose thickness is smaller than that of the stopper film 23 in other portions. In other words, the stopper film 23 covering the side surfaces and the end portions of the upper surfaces of the first semiconductor chip 21 and the third semiconductor chip 22 is thicker than the stopper film 23 (thin film portion 23T) covering the central portions of those upper surfaces.
- The thin film portion 23T begins, for example, about 5 μm to 20 μm inward from the ends of the first semiconductor chip 21 and the third semiconductor chip 22. Although details will be described later, the thin film portion 23T of the stopper film 23 is produced by the reverse etching (etch-back) process performed when the insulating film 24 is formed.
- the stopper film 23 other than the thin film portion 23T has a thickness of, for example, 500 nm, and the thin film portion 23T has a thickness of, for example, 400 nm.
- the first semiconductor chip 21 and the third semiconductor chip 22 are covered with an insulating film 24 with a stopper film 23 in between.
- the insulating film 24 has a flat surface, and the second semiconductor chip 11 is provided in contact with the flat surface.
- the flat surface of the insulating film 24 is formed, for example, by a flattening process such as CMP (Chemical Mechanical Polishing) (described later).
- the insulating film 24 is made of, for example, silicon oxide (SiO).
- the thickness of the insulating film 24 is, for example, 1500 nm.
- The color filter 41 provided on the light receiving surface of the second semiconductor chip 11 is, for example, one of a red (R) filter, a green (G) filter, a blue (B) filter, and a white (W) filter, provided for each pixel. These color filters 41 are arranged in a regular color arrangement (for example, a Bayer arrangement). By providing such color filters 41, the imaging device 1 can obtain light reception data of the colors corresponding to the color arrangement.
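The Bayer arrangement mentioned above can be sketched in a few lines (hypothetical Python for illustration; the patent does not prescribe a particular layout, and the RGGB unit cell below is simply the common convention):

```python
# Sketch of a Bayer color-filter arrangement (illustrative only).
# A 2x2 unit cell of R/G/G/B filters is tiled over the pixel array,
# so green occupies half of the pixel positions.

def bayer_pattern(rows, cols):
    """Return the filter color ('R', 'G', or 'B') at each pixel position."""
    unit = [['R', 'G'],   # even rows: R G R G ...
            ['G', 'B']]   # odd rows:  G B G B ...
    return [[unit[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in bayer_pattern(4, 4):
    print(' '.join(row))
```

In this arrangement, each pixel's PD receives light of one color, and the full-color value is later reconstructed by signal processing.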
- the on-chip lens 42 on the color filter 41 is provided at a position facing the PD of the second semiconductor chip 11 for each pixel.
- the light incident on the on-chip lens 42 is condensed on the PD for each pixel.
- the lens system of the on-chip lens 42 is set to a value corresponding to the pixel size.
- Examples of the lens material of the on-chip lens 42 include an organic material and a silicon oxide film (SiO).
- the support substrate 31 supports the first semiconductor chip 21 and the third semiconductor chip 22.
- the support substrate 31 is, for example, for ensuring the strength of the first semiconductor chip 21 and the third semiconductor chip 22 in the manufacturing stage, and is constituted by, for example, a silicon (Si) substrate.
- Such an imaging apparatus 1 can be manufactured, for example, as follows (FIGS. 3A to 4F).
- the first semiconductor chip 21 and the third semiconductor chip 22 are arranged side by side on the rearrangement substrate 51.
- the first semiconductor chip 21 and the third semiconductor chip 22 are fixed to the rearrangement substrate 51 on the side opposite to the side where the wirings 21W and 22W are provided.
- the first semiconductor chip 21 and the third semiconductor chip 22 are, for example, separated from the wafer state.
- an adhesive is provided between the rearrangement substrate 51 and the first semiconductor chip 21 and the third semiconductor chip 22.
- Next, the support substrate 31 is made to face the first semiconductor chip 21 and the third semiconductor chip 22 held on the rearrangement substrate 51. Thereafter, the assembly is inverted, and the rearrangement substrate 51 is peeled from the first semiconductor chip 21 and the third semiconductor chip 22.
- Dummy chips 25A and 25B are provided on the support substrate 31.
- The dummy chips 25A and 25B are for suppressing a decrease in the flatness of the insulating film 24 due to the step between the region where the first semiconductor chip 21 and the third semiconductor chip 22 are provided and the region where they are absent.
- the following steps may be performed without providing the dummy chips 25A and 25B. Although two dummy chips 25A and 25B are shown in FIG. 3C, three or more dummy chips may be provided.
- After the first semiconductor chip 21 and the third semiconductor chip 22 are provided on the support substrate 31, as shown in FIG. 4A, a stopper film 23 and a first insulating film 24A are formed in this order over the entire surface of the support substrate 31 so as to cover the first semiconductor chip 21 and the third semiconductor chip 22.
- When the dummy chips 25A and 25B (FIG. 3C) are provided on the support substrate 31, these dummy chips 25A and 25B are also covered with the stopper film 23 and the first insulating film 24A.
- the stopper film 23 is formed, for example, by depositing silicon nitride having a thickness of 500 nm.
- the first insulating film 24A is for forming the insulating film 24 together with a second insulating film 24B described later.
- the first insulating film 24A is formed by depositing silicon oxide having a thickness tA.
- Next, reverse etching (etch-back) is performed as shown in FIG. 4B.
- As the reverse etching process, for example, a dry etching process is performed.
- This reverse etching process is performed until the stopper film 23 at the central portions of the upper surfaces of the first semiconductor chip 21 and the third semiconductor chip 22 is exposed.
- In this reverse etching process, the exposed stopper film 23 is thinned, forming the thin film portion 23T.
- At this time, the first insulating film 24A covering the end portions of the first semiconductor chip 21 and the third semiconductor chip 22 remains, and convex portions 24AC of the first insulating film 24A are formed at these end portions.
- the first insulating film 24A remains with a predetermined thickness.
- a second insulating film 24B having a thickness tB is formed on the stopper film 23 and the first insulating film 24A.
- The second insulating film 24B is for securing the removal amount needed when the convex portions 24AC of the first insulating film 24A are planarized (FIG. 4D, described later). In other words, it is sufficient that the second insulating film 24B be formed thick enough to serve as a polishing margin.
- the second insulating film 24B is formed by depositing silicon oxide.
- the thickness tB of the second insulating film 24B is 3 ⁇ m, for example.
- Next, the second insulating film 24B and the first insulating film 24A are planarized using, for example, a CMP method. Thereby, the convex portions 24AC are removed and the planarized insulating film 24 is formed (FIG. 4D). At this time, the insulating film 24 is formed with a predetermined thickness on the first semiconductor chip 21 and the third semiconductor chip 22.
- By performing the planarization of the first insulating film 24A and the second insulating film 24B with the dummy chips 25A and 25B provided on the support substrate 31 together with the first semiconductor chip 21 and the third semiconductor chip 22, film loss caused by deflection of the polishing pad can be suppressed. Therefore, the flatness of the insulating film 24 can be further improved.
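The deposit / etch-back sequence of FIGS. 4A to 4D amounts to simple thickness bookkeeping. The sketch below (hypothetical Python; the 500 nm stopper and 400 nm thin-portion values come from the text, while the over-etch amount and the 10:1 oxide-to-stopper etch selectivity are assumed parameters) illustrates why the stopper film 23 bounds the effect of etch variation: any over-etch is absorbed as a small, selectivity-reduced loss of stopper thickness rather than as a large, uncontrolled loss of insulating film.

```python
# Thickness bookkeeping for the etch-back step (illustrative sketch;
# selectivity and over-etch are assumed values, not from the patent).

def etch_back(stopper_nm, over_etch_oxide_nm, selectivity):
    """Thickness of the stopper film's thin portion 23T after the oxide
    etch reaches the stopper and then over-etches by `over_etch_oxide_nm`
    (expressed as equivalent oxide removal). `selectivity` is the oxide
    etch rate divided by the stopper etch rate."""
    stopper_loss = over_etch_oxide_nm / selectivity
    return stopper_nm - stopper_loss

# With a 500 nm stopper, a 1000 nm-equivalent over-etch, and an assumed
# 10:1 selectivity, the thin portion 23T comes out at 400 nm, consistent
# with the example thicknesses given in the text.
print(etch_back(stopper_nm=500, over_etch_oxide_nm=1000, selectivity=10))  # 400.0
```

Without a stopper (selectivity 1:1 against the same film), the same over-etch would be removed entirely from the insulating film, which is the variation the comparative example suffers from.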
- Next, the second semiconductor chip 11 fixed to a rearrangement substrate 52 is made to face the first semiconductor chip 21 and the third semiconductor chip 22, and the second semiconductor chip 11 is bonded to the first semiconductor chip 21 and the third semiconductor chip 22.
- the rearrangement substrate 52 is peeled from the second semiconductor chip 11 to complete the imaging device 1 shown in FIG.
- In the imaging apparatus 1, signal charges (for example, electrons) are acquired as follows.
- When light enters the second semiconductor chip 11 through the on-chip lens 42 and the color filter 41, the light is detected (absorbed) by the PD of each pixel, and red, green, or blue color light is photoelectrically converted.
- Of the electron-hole pairs generated in the PD, the signal charges (for example, electrons) are converted into imaging signals, which are processed by the memory circuit of the first semiconductor chip 21 and the logic circuit of the third semiconductor chip 22.
- In the imaging device 1, the first semiconductor chip 21 and the third semiconductor chip 22 are covered with the stopper film 23, and then the first insulating film 24A is formed on the stopper film 23. The stopper film 23 makes it possible to terminate the reverse etching of the first insulating film 24A at a well-defined depth. Therefore, a decrease in the flatness of the insulating film 24 due to variations in the amount of reverse etching is suppressed, and the flatness of the insulating film 24 can be improved.
- This effect will be described using a comparative example.
- FIGS. 5A and 5B show a method of manufacturing an imaging device according to the comparative example in the order of steps.
- This imaging device is manufactured as follows. First, as shown in FIG. 5A, the first semiconductor chip 21 and the third semiconductor chip 22 on the support substrate 31 are directly covered with an insulating film 124. Next, as shown in FIG. 5B, the insulating film 124 is subjected to reverse etching. This reverse etching is performed in order to improve the flatness of the insulating film 124: during the planarization of the insulating film 124, the polishing rate decreases in the vicinity of the first semiconductor chip 21 and the third semiconductor chip 22, so a global step is likely to occur in the insulating film 124.
- In the reverse etching process, the insulating film 124 is left with a thickness t100 on the first semiconductor chip 21 and the third semiconductor chip 22.
- FIGS. 6 and 7 show steps following FIG. 5B.
- When the thickness t100 is large, the insulating film 124 on the first semiconductor chip 21 and the third semiconductor chip 22 tends to rise in a convex shape after planarization (FIG. 6).
- When the thickness t100 is small, the insulating film 124 on the first semiconductor chip 21 and the third semiconductor chip 22 tends to be recessed in a concave shape after planarization (FIG. 7).
- In either case, a global step is generated and the flatness of the insulating film 124 decreases.
- The insulating film 124 having such a global step causes, for example, defocusing in the exposure process, resulting in variation in line width.
- In the present embodiment, in contrast, the stopper film 23 is provided on the first semiconductor chip 21 and the third semiconductor chip 22, and the reverse etching of the first insulating film 24A is performed until the stopper film 23 is exposed. Therefore, variations in the amount of reverse etching are suppressed.
- After this reverse etching, the first insulating film 24A and the second insulating film 24B are planarized to form the insulating film 24. The occurrence of a global step in the insulating film 24 due to variations in the amount of reverse etching is thus suppressed, and the flatness of the insulating film 24 can be improved. Thereby, in the imaging device 1, defocusing in the exposure process can be suppressed, and patterns with uniform line widths can be formed.
- As described above, in the present embodiment, the flatness of the insulating film 24 on the stopper film 23 can be improved. Therefore, the flatness of the insulating film 24 covering the first semiconductor chip 21 and the third semiconductor chip 22 can be improved.
- FIG. 8 is a functional block diagram illustrating an example of the overall configuration of the imaging apparatus 1.
- the imaging device 1 includes an element region R1 and a circuit unit 130 that drives the element region R1.
- the circuit unit 130 includes, for example, a row scanning unit 131, a horizontal selection unit 133, a column scanning unit 134, and a system control unit 132.
- the element region R1 has, for example, a plurality of pixels P that are two-dimensionally arranged in a matrix.
- a pixel drive line Lread (for example, a row selection line and a reset control line) is wired for each pixel row, and a vertical signal line Lsig is wired for each pixel column.
- the pixel drive line Lread transmits a drive signal for reading a signal from the pixel P.
- One end of the pixel drive line Lread is connected to an output end corresponding to each row of the row scanning unit 131.
- the row scanning unit 131 includes a shift register, an address decoder, and the like, and is a pixel driving unit that drives each pixel P in the element region R1 in units of rows, for example.
- a signal output from each pixel P in the pixel row selected and scanned by the row scanning unit 131 is supplied to the horizontal selection unit 133 through each of the vertical signal lines Lsig.
- the horizontal selection unit 133 is configured by an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.
- the column scanning unit 134 includes a shift register, an address decoder, and the like, and drives the horizontal selection switches in the horizontal selection unit 133 in order while scanning. By the selective scanning by the column scanning unit 134, the signal of each pixel transmitted through each of the vertical signal lines Lsig is sequentially output to the horizontal signal line 135 and is input to the signal processing unit (not shown) through the horizontal signal line 135.
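The scanning order described above (row selection by the row scanning unit 131, then column-by-column transfer by the column scanning unit 134 onto the horizontal signal line 135) can be mimicked with a short sketch (hypothetical Python; only the ordering is modeled, not the analog signal path):

```python
# Sketch of the row/column scanning order for element region R1
# (illustrative only; real readout is analog and pipelined).

def read_out(pixel_array):
    """Return pixel values in the order they would appear on the
    horizontal signal line 135: one row at a time (row scanning
    unit 131), then column by column within the selected row
    (column scanning unit 134)."""
    horizontal_signal_line = []
    for row in pixel_array:        # row scanning unit selects a row
        for value in row:          # column scanning unit scans the columns
            horizontal_signal_line.append(value)
    return horizontal_signal_line

pixels = [[11, 12], [21, 22]]
print(read_out(pixels))  # [11, 12, 21, 22]
```

This row-sequential ordering is what makes the readout a "rolling" scan: all pixels of one row are output before the next row is selected.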
- FIG. 9 shows a schematic configuration of the electronic apparatus 3 (camera) as an example.
- The electronic device 3 is, for example, a camera capable of capturing still images or moving images, and includes the imaging device 1, an optical system (optical lens) 310, a shutter device 311, a drive unit 313 that drives the imaging device 1 and the shutter device 311, and a signal processing unit 312.
- the optical system 310 guides image light (incident light) from the subject to the imaging apparatus 1.
- the optical system 310 may be composed of a plurality of optical lenses.
- the shutter device 311 controls the light irradiation period and the light shielding period to the imaging apparatus 1.
- the drive unit 313 controls the transfer operation of the imaging device 1 and the shutter operation of the shutter device 311.
- the signal processing unit 312 performs various types of signal processing on the signal output from the imaging device 1.
- the video signal Dout after the signal processing is stored in a storage medium such as a memory, or is output to a monitor or the like.
- the technology (present technology) according to the present disclosure can be applied to various products.
- For example, the technology according to the present disclosure may be applied to an in-vivo information acquisition system for a patient using a capsule endoscope.
- FIG. 10 is a block diagram illustrating an example of a schematic configuration of a patient in-vivo information acquisition system using a capsule endoscope to which the technique (present technique) according to the present disclosure can be applied.
- the in-vivo information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200.
- the capsule endoscope 10100 is swallowed by the patient at the time of examination.
- The capsule endoscope 10100 has an imaging function and a wireless communication function. While moving inside organs such as the stomach and the intestine by peristaltic motion or the like until it is naturally discharged from the patient, it sequentially captures images of the inside of the organs (hereinafter also referred to as in-vivo images) at predetermined intervals and sequentially transmits information about the in-vivo images wirelessly to the external control device 10200 outside the body.
- The external control device 10200 comprehensively controls the operation of the in-vivo information acquisition system 10001. The external control device 10200 also receives information about the in-vivo images transmitted from the capsule endoscope 10100 and, based on the received information, generates image data for displaying the in-vivo images on a display device (not shown).
- In this manner, in-vivo images of the inside of the patient's body can be obtained at any time from when the capsule endoscope 10100 is swallowed until it is discharged.
- the capsule endoscope 10100 includes a capsule-type casing 10101.
- In the casing 10101, a light source unit 10111, an imaging unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117 are housed.
- the light source unit 10111 includes a light source such as an LED (light-emitting diode), and irradiates the imaging field of the imaging unit 10112 with light.
- The imaging unit 10112 includes an imaging element and an optical system including a plurality of lenses provided in front of the imaging element. Reflected light (hereinafter referred to as observation light) of the light irradiated onto the body tissue to be observed is collected by the optical system and enters the imaging element. In the imaging element, the incident observation light is photoelectrically converted, and an image signal corresponding to the observation light is generated. The image signal generated by the imaging unit 10112 is provided to the image processing unit 10113.
- the image processing unit 10113 is configured by a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), and performs various types of signal processing on the image signal generated by the imaging unit 10112.
- the image processing unit 10113 provides the radio communication unit 10114 with the image signal subjected to signal processing as RAW data.
- the wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal that has been subjected to signal processing by the image processing unit 10113, and transmits the image signal to the external control apparatus 10200 via the antenna 10114A.
- the wireless communication unit 10114 receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A.
- the wireless communication unit 10114 provides a control signal received from the external control device 10200 to the control unit 10117.
- the power feeding unit 10115 includes a power receiving antenna coil, a power regeneration circuit that regenerates power from a current generated in the antenna coil, a booster circuit, and the like. In the power feeding unit 10115, electric power is generated using a so-called non-contact charging principle.
- the power supply unit 10116 is composed of a secondary battery, and stores the electric power generated by the power supply unit 10115.
- In FIG. 10, arrows indicating the power supply destinations from the power supply unit 10116 are omitted in order to avoid complicating the drawing; however, the power stored in the power supply unit 10116 is supplied to the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and can be used for driving them.
- The control unit 10117 includes a processor such as a CPU, and controls the driving of the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 in accordance with control signals transmitted from the external control device 10200.
- the external control device 10200 is configured by a processor such as a CPU or GPU, or a microcomputer or a control board in which a processor and a storage element such as a memory are mounted.
- the external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via the antenna 10200A.
- the capsule endoscope 10100 for example, the light irradiation condition for the observation target in the light source unit 10111 can be changed by a control signal from the external control device 10200.
- imaging conditions (for example, the frame rate or the exposure value) in the imaging unit 10112 can likewise be changed by a control signal from the external control device 10200.
- the contents of the processing in the image processing unit 10113 and the conditions under which the wireless communication unit 10114 transmits an image signal (for example, the transmission interval and the number of transmitted images) may also be changed by a control signal from the external control device 10200.
- the external control device 10200 performs various image processing on the image signal transmitted from the capsule endoscope 10100, and generates image data for displaying the captured in-vivo image on the display device.
- as the image processing, various kinds of signal processing can be performed, for example, development processing (demosaic processing), image-quality enhancement processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera-shake correction processing), and/or enlargement processing (electronic zoom processing).
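The processing chain just listed (development/demosaicing, noise reduction, enlargement) can be illustrated with a minimal NumPy sketch. This is an illustrative assumption only: the RGGB Bayer layout, the 3×3 box filter, and the 2× center-crop zoom are not details taken from the publication.

```python
import numpy as np

def demosaic_nearest(bayer):
    """Develop an RGGB Bayer mosaic into a color image (nearest-neighbor)."""
    r = bayer[0::2, 0::2]
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0  # average the two G sites
    b = bayer[1::2, 1::2]
    half = np.stack([r, g, b], axis=-1)                # half-resolution color
    return np.repeat(np.repeat(half, 2, axis=0), 2, axis=1)

def denoise_box(img):
    """Very simple NR: 3x3 box filter with edge padding."""
    pad = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = img.shape[:2]
    return sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def electronic_zoom_2x(img):
    """Enlargement processing: crop the central half and upsample 2x."""
    h, w = img.shape[:2]
    crop = img[h // 4:h // 4 + h // 2, w // 4:w // 4 + w // 2]
    return np.repeat(np.repeat(crop, 2, axis=0), 2, axis=1)
```

Each stage keeps the output shape compatible with the next, mirroring the order development → enhancement → enlargement described above.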
- the external control device 10200 controls driving of the display device to display an in-vivo image captured based on the generated image data.
- the external control device 10200 may cause the generated image data to be recorded on a recording device (not shown) or may be printed out on a printing device (not shown).
- the technology according to the present disclosure can be applied to, for example, the imaging unit 10112 among the configurations described above, thereby improving detection accuracy.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- FIG. 11 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology (present technology) according to the present disclosure can be applied.
- FIG. 11 shows a state where an operator (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic operation system 11000.
- an endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as an insufflation tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
- the endoscope 11100 includes a lens barrel 11101 in which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
- in the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
- An opening into which the objective lens is fitted is provided at the tip of the lens barrel 11101.
- a light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens.
- the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
- the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
- the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example.
- the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
- the light source device 11203 includes a light source such as an LED (light emitting diode), and supplies irradiation light to the endoscope 11100 when photographing a surgical site or the like.
- the input device 11204 is an input interface for the endoscopic surgery system 11000.
- a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
- for example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.) via the input device 11204.
- the treatment instrument control device 11205 controls the drive of the energy treatment instrument 11112 for tissue ablation, incision, blood vessel sealing, or the like.
- the pneumoperitoneum device 11206 passes gas into the body cavity via the insufflation tube 11111.
- the recorder 11207 is an apparatus capable of recording various types of information related to surgery.
- the printer 11208 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
- the light source device 11203 that supplies the endoscope 11100 with irradiation light when the surgical site is imaged can be configured by, for example, a white light source constituted by an LED, a laser light source, or a combination thereof.
- when a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
- in addition, laser light from each of the RGB laser light sources may be irradiated onto the observation target in a time-division manner, and the driving of the image sensor of the camera head 11102 may be controlled in synchronization with the irradiation timing, so that images corresponding to R, G, and B can be captured in a time-division manner. According to this method, a color image can be obtained without providing color filters on the image sensor.
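As a sketch of this time-division scheme: three monochrome frames captured in sequence under R, G, and B illumination are stacked into one color image, with optional per-channel gains in the spirit of the white-balance adjustment mentioned above. The function name and gain values are illustrative assumptions.

```python
import numpy as np

def compose_time_division_color(frame_r, frame_g, frame_b, gains=(1.0, 1.0, 1.0)):
    """Stack three sequentially captured monochrome frames into a color image.

    Each frame was captured while only one of the R/G/B lasers illuminated the
    scene, so no color filter array is needed on the image sensor."""
    channels = [f * k for f, k in zip((frame_r, frame_g, frame_b), gains)]
    return np.clip(np.stack(channels, axis=-1), 0.0, 1.0)
```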
- the driving of the light source device 11203 may also be controlled so as to change the intensity of the output light at predetermined intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the intensity changes to acquire images in a time-division manner, and combining those images, a high-dynamic-range image free of so-called blocked-up shadows and blown-out highlights can be generated.
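A minimal sketch of the synthesis step, assuming just two alternating intensities: pixels that saturate in the bright frame are taken from the dark frame, while the rest use the better-exposed bright frame scaled back by the known intensity ratio. The ratio and saturation threshold are illustrative assumptions.

```python
import numpy as np

def merge_two_exposures(dark, bright, ratio, sat=0.95):
    """Combine frames captured at light intensities 1x ('dark') and ratio-x
    ('bright') into one radiance estimate, avoiding blocked-up shadows and
    blown-out highlights."""
    radiance = bright.astype(np.float64) / ratio   # well-exposed shadows
    saturated = bright >= sat
    radiance[saturated] = dark[saturated]          # recover clipped highlights
    return radiance
```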
- the light source device 11203 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
- in the special light observation, for example, so-called narrow-band light observation (narrow band imaging) is performed, in which, by utilizing the wavelength dependence of light absorption in body tissue, light in a narrower band than the irradiation light during normal observation (that is, white light) is irradiated, and a predetermined tissue such as a blood vessel in the surface layer of the mucous membrane is thereby imaged with high contrast.
- fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating excitation light.
- in the fluorescence observation, the body tissue may be irradiated with excitation light to observe fluorescence from the body tissue itself (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the body tissue may be irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
- the light source device 11203 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
- FIG. 12 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG.
- the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
- the CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413.
- the camera head 11102 and the CCU 11201 are connected to each other by a transmission cable 11400 so that they can communicate with each other.
- the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
- the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
- the imaging device constituting the imaging unit 11402 may be one (so-called single plate type) or plural (so-called multi-plate type).
- image signals corresponding to RGB may be generated by each imaging element, and a color image may be obtained by combining them.
- the imaging unit 11402 may be configured to include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to 3D (dimensional) display. By performing the 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site.
- a plurality of lens units 11401 can be provided corresponding to each imaging element.
- the imaging unit 11402 is not necessarily provided in the camera head 11102.
- the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
- the driving unit 11403 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Thereby, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
- the communication unit 11404 is configured by a communication device for transmitting and receiving various types of information to and from the CCU 11201.
- the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
- the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
- the control signal includes information about imaging conditions, such as information designating the frame rate of the captured image, information designating the exposure value at the time of imaging, and/or information designating the magnification and focus of the captured image.
- the imaging conditions such as the frame rate, exposure value, magnification, and focus may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 11100.
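The AE function can be sketched as a simple feedback loop that multiplicatively nudges the exposure value until the frame's mean luminance reaches a target. The target value, damping gain, and clamp limits below are illustrative assumptions, not values from the publication.

```python
def auto_exposure_step(exposure, mean_luma, target=0.18, damping=0.5,
                       lo=1e-4, hi=1.0):
    """One AE iteration: scale the exposure toward the target mean luminance.

    The correction is damped (exponent < 1) so the loop converges smoothly
    instead of oscillating from frame to frame."""
    ratio = target / max(mean_luma, 1e-6)
    return min(max(exposure * ratio ** damping, lo), hi)
```

Run against a simple linear sensor model, the loop settles at the exposure that maps the scene's brightness to the target luminance.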
- the camera head control unit 11405 controls driving of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
- the communication unit 11411 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 11102.
- the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
- the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102.
- the image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
- the image processing unit 11412 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 11102.
- the control unit 11413 performs various types of control related to imaging of the surgical site by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
- the control unit 11413 causes the display device 11202 to display a captured image showing the surgical site or the like based on the image signal subjected to the image processing by the image processing unit 11412.
- the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
- the control unit 11413 can recognize surgical tools such as forceps, specific biological sites, bleeding, mist during use of the energy treatment tool 11112, and the like by detecting the shapes, colors, and the like of the edges of objects included in the captured image.
- the control unit 11413 may superimpose various types of surgery support information on the image of the surgical site using the recognition result. By superimposing the surgery support information and presenting it to the operator 11131, the burden on the operator 11131 can be reduced and the operator 11131 can proceed with the surgery reliably.
- the transmission cable 11400 for connecting the camera head 11102 and the CCU 11201 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
- in the illustrated example, communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
- FIG. 13 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
- a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (interface) 12053 are illustrated as a functional configuration of the integrated control unit 12050.
- the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
- for example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
- radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
- the body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
- the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
- the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
- the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
- the vehicle exterior information detection unit 12030 may perform detection processing of objects such as people, cars, obstacles, signs, or characters on the road surface, or distance detection processing, based on the received image.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
- the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
- the vehicle interior information detection unit 12040 detects vehicle interior information.
- a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
- the driver state detection unit 12041 includes, for example, a camera that images the driver, and based on detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
- the microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
- for example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, following traveling based on inter-vehicle distance, vehicle-speed-maintaining traveling, vehicle collision warning, vehicle lane departure warning, and the like.
- the microcomputer 12051 can also perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
- the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle outside information detection unit 12030.
- for example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
- the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
- the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
- FIG. 14 is a diagram illustrating an example of an installation position of the imaging unit 12031.
- the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
- the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
- the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
- the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
- the imaging unit 12105 provided on the upper part of the windshield in the passenger compartment is mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 14 shows an example of the shooting range of the imaging units 12101 to 12104.
- the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided in the front nose
- the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided in the side mirrors, respectively
- the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided in the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above can be obtained.
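A sketch of the superimposition step, assuming each camera image has already been projected onto a common top-down ground plane and only needs to be pasted into a shared canvas, with overlapping pixels averaged. The canvas layout and paste positions are illustrative assumptions.

```python
import numpy as np

def compose_overhead(canvas_shape, projected_views):
    """Blend top-down-projected camera views into one bird's-eye canvas.

    projected_views: iterable of (image, (row, col)) paste positions.
    Overlapping regions are averaged; uncovered pixels stay 0."""
    acc = np.zeros(canvas_shape, dtype=np.float64)
    cnt = np.zeros(canvas_shape, dtype=np.float64)
    for img, (r, c) in projected_views:
        h, w = img.shape
        acc[r:r + h, c:c + w] += img
        cnt[r:r + h, c:c + w] += 1.0
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```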
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
- based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, in particular the nearest three-dimensional object that is on the traveling path of the vehicle 12100 and is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
- furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be maintained from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like.
- in this way, cooperative control for the purpose of automated driving and the like, in which the vehicle travels autonomously without depending on the driver's operation, can be performed.
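The follow-up control described above can be sketched as a constant-time-gap policy: the commanded acceleration pulls the actual gap toward a speed-dependent desired gap while damping the closing speed, then clamps to brake/accelerate limits. All gains and limits below are illustrative assumptions, not values from the publication.

```python
def acc_command(gap, ego_speed, lead_speed, time_gap=1.5, standstill=5.0,
                k_gap=0.2, k_rel=0.4, a_min=-3.0, a_max=1.5):
    """Constant-time-gap following control (units: m, m/s, m/s^2).

    Returns the longitudinal acceleration command for automatic brake /
    automatic acceleration control."""
    desired_gap = standstill + time_gap * ego_speed
    accel = k_gap * (gap - desired_gap) + k_rel * (lead_speed - ego_speed)
    return max(a_min, min(a_max, accel))
```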
- based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles.
- for example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when there is a possibility of collision, it can perform driving assistance for collision avoidance by outputting an alarm to the driver and performing forced deceleration or avoidance steering via the drive system control unit 12010.
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the captured images of the imaging units 12101 to 12104. Such pedestrian recognition is carried out, for example, by a procedure of extracting feature points in the captured images of the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
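The pattern-matching step can be sketched as an exhaustive template search: slide a pedestrian-contour template over the (infrared) image and report the position with the smallest sum of absolute differences, accepting a detection when the best score falls below a threshold. The scoring metric and threshold are illustrative assumptions.

```python
import numpy as np

def match_template(image, template):
    """Slide `template` over `image`; return (best (row, col), best SAD)."""
    ih, iw = image.shape
    th, tw = template.shape
    best_score, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = np.abs(image[r:r + th, c:c + tw] - template).sum()
            if sad < best_score:
                best_score, best_pos = sad, (r, c)
    return best_pos, best_score

def detect_pedestrian(image, template, threshold=0.5):
    """Decide pedestrian / not-pedestrian from the best match score."""
    pos, score = match_template(image, template)
    return score <= threshold, pos
```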
- when the microcomputer 12051 determines that a pedestrian is present in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian.
- the audio image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
- the technology according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above.
- by applying the technology according to the present disclosure to the imaging unit 12031, a captured image that is easier to see can be obtained, so that driver fatigue can be reduced.
- the present disclosure is not limited to the above-described embodiments, and various modifications can be made.
- the configuration of the imaging device described in the above embodiment is an example, and other layers may be provided.
- the material and thickness of each layer are examples, and are not limited to those described above.
- in the above embodiment, the case where the imaging device 1 includes the first semiconductor chip 21, the second semiconductor chip 11, and the third semiconductor chip 22 has been described; however, the imaging device 1 only needs to include at least two semiconductor chips, and may also have four or more semiconductor chips.
- the present technology may be applied to a semiconductor device other than the imaging device.
- the present disclosure may be configured as follows.
  (1) A semiconductor device including: a substrate; a first semiconductor element provided on the substrate; a stopper film covering the first semiconductor element; a second semiconductor element facing the first semiconductor element with the stopper film interposed therebetween; an insulating film provided between the second semiconductor element and the stopper film; and a wiring that penetrates the insulating film and the stopper film and electrically connects the second semiconductor element and the first semiconductor element.
  (2) The semiconductor device according to (1), further including a third semiconductor element provided on the substrate and covered with the stopper film, in which the second semiconductor element is provided to face the first semiconductor element and the third semiconductor element.
  (3) The semiconductor device according to (1) or (2), in which the stopper film has a thin film portion having a smaller thickness than other portions at a position overlapping the first semiconductor element.
- The semiconductor device according to any one of (1) to (4), in which the insulating film has a flat surface and the second semiconductor element is provided on the flat surface of the insulating film.
- the stopper film includes silicon nitride (SiN) or silicon carbide (SiC).
- (11) The method for manufacturing a semiconductor device according to (10), in which the formation of the insulating film includes: forming a first insulating film on the stopper film; performing reverse etching on the first insulating film on the first semiconductor element until the stopper film is exposed; forming, after the reverse etching, a second insulating film on the stopper film or the first insulating film; and planarizing the second insulating film and the first insulating film.
  (12) The method of manufacturing a semiconductor device according to (11), in which the planarization of the second insulating film and the first insulating film is performed using CMP.
Landscapes
- Solid State Image Pick-Up Elements (AREA)
- Internal Circuitry In Semiconductor Integrated Circuit Devices (AREA)
Abstract
The invention relates to a semiconductor device comprising: a substrate; a first semiconductor element disposed on the substrate; a stopper film covering the first semiconductor element; a second semiconductor element opposed to the first semiconductor element with the stopper film therebetween; an insulating film disposed between the second semiconductor element and the stopper film; and a wiring that penetrates the insulating film and the stopper film and electrically connects the second semiconductor element to the first semiconductor element.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018066600A JP2019179782A (ja) | 2018-03-30 | 2018-03-30 | 半導体装置および半導体装置の製造方法 |
| JP2018-066600 | 2018-03-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019188131A1 true WO2019188131A1 (fr) | 2019-10-03 |
Family
ID=68059879
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/009398 Ceased WO2019188131A1 (fr) | 2018-03-30 | 2019-03-08 | Dispositif à semi-conducteur et procédé de fabrication de dispositif à semi-conducteur |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2019179782A (fr) |
| WO (1) | WO2019188131A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021049302A1 (fr) * | 2019-09-10 | 2021-03-18 | ソニーセミコンダクタソリューションズ株式会社 | Dispositif d'imagerie, appareil électronique, et procédé de fabrication |
| WO2025018144A1 (fr) * | 2023-07-20 | 2025-01-23 | ソニーセミコンダクタソリューションズ株式会社 | Dispositif d'imagerie, dispositif à semi-conducteur, et dispositif électronique |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20240012369A (ko) | 2021-05-21 | 2024-01-29 | 가부시끼가이샤 레조낙 | 반도체 장치의 제조 방법, 및, 반도체 장치 |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006270037A (ja) * | 2005-02-28 | 2006-10-05 | Sony Corp | ハイブリットモジュール及びその製造方法並びにハイブリット回路装置 |
| WO2012161044A1 (fr) * | 2011-05-24 | 2012-11-29 | ソニー株式会社 | Dispositif à semi-conducteurs |
| JP2015216334A (ja) * | 2014-04-21 | 2015-12-03 | ソニー株式会社 | 固体撮像素子、固体撮像素子の製造方法、並びに、電子機器 |
- 2018-03-30: JP application JP2018066600A filed; published as JP2019179782A (status: active, pending)
- 2019-03-08: PCT application PCT/JP2019/009398 filed; published as WO2019188131A1 (status: not active, ceased)
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021049302A1 (fr) * | 2019-09-10 | 2021-03-18 | Sony Semiconductor Solutions Corporation | Imaging device, electronic apparatus, and manufacturing method |
| US12324267B2 (en) | 2019-09-10 | 2025-06-03 | Sony Semiconductor Solutions Corporation | Imaging device, electronic device, and manufacturing method |
| WO2025018144A1 (fr) * | 2023-07-20 | 2025-01-23 | Sony Semiconductor Solutions Corporation | Imaging device, semiconductor device, and electronic device |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2019179782A (ja) | 2019-10-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7270616B2 (ja) | | Solid-state imaging element and solid-state imaging device |
| US10872998B2 | | Chip size package, method of manufacturing the same, electronic device, and endoscope |
| US12002825B2 | | Solid-state imaging device and electronic apparatus with improved sensitivity |
| WO2018074250A1 (fr) | | Semiconductor device, manufacturing method, and electronic unit |
| WO2018088284A1 (fr) | | Solid-state imaging element, manufacturing method, and electronic device |
| US12087796B2 | | Imaging device |
| US11715751B2 | | Solid-state imaging element, electronic apparatus, and semiconductor device |
| WO2019188131A1 (fr) | | Semiconductor device and method for manufacturing semiconductor device |
| WO2019239754A1 (fr) | | Solid-state imaging element, method for manufacturing solid-state imaging element, and electronic device |
| US20220005853A1 | | Semiconductor device, solid-state imaging device, and electronic equipment |
| WO2017122537A1 (fr) | | Light-receiving element, method for manufacturing light-receiving element, imaging element, and electronic device |
| WO2020195564A1 (fr) | | Imaging device |
| JP7422676B2 (ja) | | Imaging device |
| JPWO2020137282A1 (ja) | | Imaging element and electronic apparatus |
| US20210272995A1 | | Imaging element and electronic apparatus |
| WO2022138097A1 (fr) | | Solid-state imaging device and method for manufacturing same |
| WO2019230243A1 (fr) | | Imaging device |
| US20200335426A1 | | Semiconductor device, manufacturing method for semiconductor, and imaging unit |
| WO2019239767A1 (fr) | | Imaging device |
| JP2022094727A (ja) | | Solid-state imaging device and method for manufacturing same |
| WO2019176518A1 (fr) | | Imaging device and method for producing imaging device |
| JPWO2020059495A1 (ja) | | Imaging element, semiconductor element, and electronic apparatus |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19776449; Country of ref document: EP; Kind code of ref document: A1 |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19776449; Country of ref document: EP; Kind code of ref document: A1 |