
WO2018230367A1 - Imaging device - Google Patents


Info

Publication number
WO2018230367A1
Authority
WO
WIPO (PCT)
Prior art keywords
storage unit
address
exposure
address storage
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/021144
Other languages
English (en)
Japanese (ja)
Inventor
田中 秀樹
直樹 河津
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Semiconductor Solutions Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp filed Critical Sony Semiconductor Solutions Corp
Publication of WO2018230367A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 — Control of the SSIS exposure
    • H04N25/57 — Control of the dynamic range
    • H04N25/70 — SSIS architectures; Circuits associated therewith
    • H04N25/76 — Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 — Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H04N25/779 — Circuitry for scanning or addressing the pixel array

Definitions

  • the present technology relates to an imaging apparatus, for example, an imaging apparatus that can finely set an exposure time.
  • As solid-state image sensors, CCD (Charge Coupled Device) and CMOS (Complementary Metal Oxide Semiconductor) sensors are known.
  • As a method of expanding the dynamic range of a CMOS image sensor, it is known to continuously capture and synthesize a plurality of images having different exposure times. That is, a long-exposure image and a short-exposure image are captured individually and continuously; the long-exposure image is used for dark image areas, and the short-exposure image is used for bright areas that would be overexposed in the long-exposure image.
  • One image is then generated by a synthesis process. By combining a plurality of images with different exposures in this way, an image having a wide dynamic range without overexposure, that is, a wide-dynamic-range (HDR) image, can be obtained.
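The synthesis described above can be sketched as follows. This is an illustrative sketch, not the patent's actual synthesis circuit; the bit depth, exposure ratio, and saturation threshold are assumed values.

```python
import numpy as np

FULL_SCALE = 1023        # 10-bit sensor output (assumption)
RATIO = 16               # long exposure / short exposure (assumption)

def merge_hdr(long_img, short_img, sat_level=0.9 * FULL_SCALE):
    """Use the long exposure for dark areas; where the long exposure is
    saturated (overexposed), substitute the short exposure scaled by the
    exposure ratio."""
    long_img = long_img.astype(np.float64)
    short_img = short_img.astype(np.float64)
    overexposed = long_img >= sat_level
    return np.where(overexposed, short_img * RATIO, long_img)

scene = np.array([10.0, 100.0, 5000.0])           # radiance (arbitrary units)
long_img = np.clip(scene, 0, FULL_SCALE)          # long exposure clips at 1023
short_img = np.clip(scene / RATIO, 0, FULL_SCALE) # short exposure: 1/16 signal
print(merge_hdr(long_img, short_img))             # dark pixels from long, bright from short
```

In this toy example the saturated bright pixel is recovered from the scaled short exposure, while the dark pixels keep the low-noise long-exposure values.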
  • Patent Document 1 discloses a configuration in which two images set with a plurality of different exposure times are taken, and these images are combined to obtain an image with a wide dynamic range.
  • The configuration of Patent Document 1 needs to individually capture and then combine a long-exposure image and a short-exposure image. In this case, the exposure times used when capturing the long-exposure image and the short-exposure image must be adjusted appropriately.
  • This technique has been made in view of such a situation, and makes it possible to finely adjust the exposure time.
  • A first imaging device according to an aspect of the present technology includes a first address storage unit that stores an address of a pixel at which exposure is to start in a pixel array unit in which a plurality of pixels are arranged in an array, and a second address storage unit that stores the address transferred from the first address storage unit, and controls the start of exposure based on the address stored in the second address storage unit.
  • A second imaging device according to an aspect of the present technology includes a first exposure address storage unit that stores an address of a pixel at which exposure is to start in a pixel array unit in which a plurality of pixels are arranged in an array, a second exposure address storage unit, and a read address storage unit that stores an address of a pixel to be read.
  • That is, a first address storage unit stores an address of a pixel at which exposure is to start in a pixel array unit in which a plurality of pixels are arranged in an array, and a second address storage unit stores the address transferred from the first address storage unit; the exposure is controlled based on the address stored in the second address storage unit.
  • Similarly, a first exposure address storage unit stores an address of a pixel at which exposure is to start in a pixel array unit in which a plurality of pixels are arranged in an array, and a second exposure address storage unit and a read address storage unit that stores an address of a pixel to be read are provided.
  • the imaging device may be an independent device or an internal block constituting one device.
  • the exposure time can be finely adjusted.
  • FIG. 1 is a block diagram illustrating a configuration example of an embodiment of an imaging apparatus to which the present technology is applied.
  • The imaging device 11 includes a pixel array unit 12, an address decoder 13, a pixel timing driving unit 14, a column signal processing unit 15, and a sensor controller 16.
  • In the pixel array unit 12, a plurality of pixels 21 are arranged in an array; each pixel 21 is connected to the pixel timing driving unit 14 via a horizontal signal line and to the column signal processing unit 15 via a vertical signal line VSL.
  • The plurality of pixels 21 each output a pixel signal corresponding to the amount of light incident through an optical system (not shown), and an image of the subject formed on the pixel array unit 12 is constructed from these pixel signals.
  • The pixel 21 includes a photodiode 22 that performs photoelectric conversion, a transfer transistor 23 that transfers the charge of the photodiode 22 to a floating diffusion (FD) 24, the floating diffusion 24 that is a floating diffusion region, an amplification transistor 25 that outputs a signal via a source follower configuration, a selection transistor 26 for selecting an electronic-shutter row and a readout row, and a reset transistor 27 for resetting the floating diffusion 24.
  • the pixel array unit 12 is a pixel region in which the pixels 21 are arranged in a planar shape or a curved shape.
  • The pixel array may have any configuration as long as each unit pixel can be classified (grouped) into two different directions; it does not have to be a typical N×M matrix arranged along two mutually orthogonal straight directions. That is, a row or column of unit pixels need not form a straight line, as in a honeycomb structure, for example. In other words, the unit pixels of each row and each column need not be arranged on a straight line, and the rows and columns need not be orthogonal.
  • the address decoder 13 and the pixel timing driving unit 14 constitute a vertical scanning unit, and are controlled by the sensor controller 16 to drive each pixel of the pixel array unit 12 for each line and to read out a pixel signal.
  • The address decoder 13 decodes the address designation information supplied from the sensor controller 16 and supplies a control signal to the portion of the pixel timing driving unit 14 corresponding to the designated address.
  • The pixel timing driving unit 14 supplies control signals for driving each pixel of the pixel array unit 12 according to the logical sum of the control signal from the address decoder 13 and the pixel drive pulse from the sensor controller 16.
  • The column signal processing unit 15 performs CDS (Correlated Double Sampling) processing on the pixel signals output from the plurality of pixels 21 via the vertical signal lines VSL, thereby performing AD conversion of the pixel signals and removing reset noise.
  • the column signal processing unit 15 includes a plurality of AD converters 33 corresponding to the number of columns of the pixels 21, and can perform CDS processing in parallel for each column of the pixels 21.
  • the column signal processing unit 15 includes a constant current unit 31 that forms a load MOS unit of the source follower unit, and a single slope type DA converter 32 for analog-digital conversion of the potential of the vertical signal line VSL.
  • the AD converter 33 includes capacitors 34 and 35, a comparator 36, and a counter 37.
  • the potential of the pixel signal is applied to the capacitor 34 via the vertical signal line VSL, and the potential of the ramp wave output from the DA converter 32 is applied to the capacitor 35.
  • The comparator 36 compares the potential of the pixel signal supplied via the vertical signal line VSL with the potential of the ramp wave supplied from the DA converter 32, and outputs an inversion pulse that inverts at the timing at which the two potentials cross.
  • the counter 37 counts an AD period corresponding to the timing at which the potential of the pixel signal and the potential of the ramp wave intersect in order to convert the analog value into a digital value.
  • the sensor controller 16 controls the overall driving of the imaging device 11. For example, the sensor controller 16 generates a clock signal according to the driving cycle of each block constituting the imaging device 11 and supplies the clock signal to each block.
  • the pixel 21 is described as a unit pixel 21.
  • the unit pixels 21 are arranged in an array, for example.
  • a vertical signal line VSL for transferring a pixel signal is assigned to each unit pixel 21 for each column of the unit pixels 21.
  • operations related to reading out pixel signals are controlled for each unit pixel line.
  • the photodiode 22 of the unit pixel 21 photoelectrically converts the received light into a photocharge (here, photoelectrons) having a charge amount corresponding to the light amount, and accumulates the photocharge.
  • the anode electrode of the photodiode 22 is connected to the ground (pixel ground) of the pixel region, and the cathode electrode is connected to the floating diffusion 24 via the transfer transistor 23.
  • the transfer transistor 23 controls the reading of the photocharge from the photodiode 22.
  • the transfer transistor 23 has a drain electrode connected to the floating diffusion 24 and a source electrode connected to the cathode electrode of the photodiode 22.
  • a control signal TRG is supplied from the pixel timing driver 14 to the gate electrode of the transfer transistor 23.
  • When the control signal TRG (that is, the gate potential of the transfer transistor 23) is turned on, the photocharge accumulated in the photodiode 22 is read out and supplied to the floating diffusion 24.
  • the reset transistor 27 resets the potential of the floating diffusion 24.
  • the reset transistor 27 has a drain electrode connected to the power supply potential and a source electrode connected to the floating diffusion 24.
  • a control signal RST is supplied from the pixel timing driver 14 to the gate electrode of the reset transistor 27. When the control signal RST (that is, the gate potential of the reset transistor 27) is off, the floating diffusion 24 is disconnected from the power supply potential. When the control signal RST (that is, the gate potential of the reset transistor 27) is on, the charge of the floating diffusion 24 is discarded to the power supply potential, and the floating diffusion 24 is reset.
  • the amplification transistor 25 amplifies the potential change of the floating diffusion 24 and outputs it as an electric signal (analog signal).
  • the amplification transistor 25 has a gate electrode connected to the floating diffusion 24, a drain electrode connected to the power supply potential, and a source electrode connected to the drain electrode of the selection transistor 26.
  • the amplification transistor 25 outputs the potential of the floating diffusion 24 reset by the reset transistor 27 to the selection transistor 26 as a reset signal (reset level).
  • the amplification transistor 25 outputs the potential of the floating diffusion 24 to which the photocharge has been transferred by the transfer transistor 23 to the selection transistor 26 as a light accumulation signal (signal level).
  • the selection transistor 26 controls the output of the electric signal supplied from the amplification transistor 25 to the vertical signal line VSL.
  • the selection transistor 26 has a drain electrode connected to the source electrode of the amplification transistor 25 and a source electrode connected to the vertical signal line VSL.
  • a control signal SEL is supplied from the pixel timing driver 14 to the gate electrode of the selection transistor 26.
  • When the control signal SEL (that is, the gate potential of the selection transistor 26) is off, the amplification transistor 25 and the vertical signal line VSL are electrically disconnected, so no pixel signal is output from the unit pixel.
  • When the control signal SEL is on, the unit pixel is in the selected state: the amplification transistor 25 and the vertical signal line VSL are electrically connected, and the signal output from the amplification transistor 25 is supplied to the vertical signal line VSL as the pixel signal of the unit pixel. That is, a pixel signal is read from the unit pixel.
  • the configuration of the unit pixel 21 is arbitrary and is not limited to the example of FIG.
  • the transfer transistor 23 may be omitted.
  • the number of pixels per unit pixel is arbitrary, and may be one pixel as in the example of FIG. 1 or a plurality of pixels.
  • FIG. 2 shows a configuration example of a unit pixel in the case of having a plurality of pixels.
  • the unit pixel 21 has two photodiodes 22 (photodiode 22-1 and photodiode 22-2). That is, in this case, the unit pixel 21 is composed of two pixels.
  • the unit pixel 21 has two transfer transistors 23 (transfer transistor 23-1 and transfer transistor 23-2).
  • the transfer transistor 23-1 controls reading of the photocharge from the photodiode 22-1 based on the control signal TRG supplied from the pixel timing driving unit 14.
  • the transfer transistor 23-2 controls the reading of the photocharge from the photodiode 22-2 based on the control signal TRG supplied from the pixel timing driver 14.
  • the configuration of the floating diffusion 24, the amplification transistor 25, the selection transistor 26, the reset transistor 27, and the like is shared within the unit pixel.
  • the pixel signals of the respective pixels are transmitted through the same vertical signal line VSL.
  • The drive signals include a horizontal synchronization signal indicating one horizontal synchronization period, a TRG drive pulse for driving the transfer transistor 23 (a transfer pulse at readout and a transfer pulse at electronic shutter), an RST drive pulse for driving the reset transistor 27 (a reset pulse at electronic shutter and a reset pulse at readout), and a SEL drive pulse for driving the selection transistor 26 (a selection pulse at readout).
  • the potential of the photodiode 22 is reset by turning on the electronic shutter transfer pulse and the electronic shutter reset pulse. Thereafter, charges are accumulated in the photodiode 22 during the accumulation time, and a read pulse is issued from the sensor controller 16.
  • the potential of the floating diffusion 24 is reset by turning on a reset pulse at the time of reading, and then the potential of the pre-data phase (P phase) is AD converted. Thereafter, the charge of the photodiode 22 is transferred to the floating diffusion 24 by a transfer pulse at the time of reading, and the data phase (D phase) is AD converted.
  • the selection pulse at the time of reading is in an on state.
  • Also shown are the horizontal synchronization signal indicating one horizontal synchronization period, the potential of the ramp signal output from the DA converter 32 (solid line), the potential of the pixel signal output from the vertical signal line VSL (broken line), the inversion pulse output by the comparator 36, and an operation image of the counter 37.
  • The DA converter 32 generates a ramp wave having a first slope, in which the potential drops sequentially at a constant gradient in the P phase for reading out the reset level of the pixel signal, and a second slope, in which the potential drops sequentially at a constant gradient in the D phase for reading out the data level of the pixel signal.
  • the comparator 36 compares the potential of the pixel signal with the potential of the ramp wave, and outputs an inversion pulse that is inverted at the timing when the potential of the pixel signal and the potential of the ramp wave intersect.
  • The counter 37 counts from the timing at which the ramp wave starts to drop in the P phase to the timing at which the potential of the ramp wave falls below the potential of the pixel signal (P-phase count value), and then counts from the timing at which the ramp wave starts to drop in the D phase to the timing at which the potential of the ramp wave falls below the potential of the pixel signal (D-phase count value). The difference between the P-phase and D-phase count values is thereby acquired as a pixel signal from which reset noise has been removed.
  • AD conversion of the pixel signal is performed using the ramp wave as shown in FIG.
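The P-phase/D-phase counting described above amounts to a digital correlated double sampling. A minimal sketch follows; all voltage values (in millivolts) and the ramp step are illustrative assumptions, not values from the patent.

```python
def slope_count(signal_mv, ramp_start_mv, step_mv=1, max_count=1024):
    """Count clock cycles until the falling ramp reaches the signal level."""
    count = 0
    ramp = ramp_start_mv
    while ramp > signal_mv and count < max_count:
        ramp -= step_mv
        count += 1
    return count

reset_noise_mv = 50                            # shifts P and D levels equally
p = slope_count(1000 + reset_noise_mv, 1200)   # P phase: reset level
d = slope_count(700 + reset_noise_mv, 1200)    # D phase: data level
print(d - p)   # 300 — the reset noise cancels in the difference
```

Because the same offset appears in both counts, subtracting the P-phase count from the D-phase count removes it, which is exactly the reset-noise removal described above.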
  • The pixel arrangement of the pixel array unit 12 will be described with reference to FIG. 5. In FIG. 5A, 2×2 = 4 pixels form one unit: the upper-left pixel 21-1 in the unit is an R pixel that mainly photoelectrically converts the red (R) band, the lower-left pixel 21-2 and the upper-right pixel 21-3 are G pixels that mainly photoelectrically convert the green (G) band, and the lower-right pixel 21-4 is a B pixel that mainly photoelectrically converts the blue (B) band.
  • The pixels 21 can thus constitute one unit of a Bayer arrangement.
  • the R pixel, the G pixel, and the B pixel function as spectral sensitivity pixels having characteristics in their respective colors.
  • FIG. 5B shows another pixel arrangement.
  • In FIG. 5B, 2×2 = 4 pixels likewise form one unit: the upper-left pixel 21-1 in the unit is an R pixel, while the lower-left pixel 21-2, the upper-right pixel 21-3, and the lower-right pixel 21-4 are C pixels having a total color-matching spectral sensitivity.
  • the C pixel is a pixel having higher sensitivity than the R pixel, the G pixel, and the B pixel described above.
  • the pixel arrangement shown in FIG. 5 is an example, and the scope of application of the present technology is not limited to the pixel arrangement shown in FIG.
  • Alternatively, a C pixel may be arranged in place of one of the two G pixels.
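The 2×2 units described above can be tiled to form the color-filter layout of the whole array. A sketch, with illustrative array sizes:

```python
import numpy as np

# The 2x2 units described in the text: rows are (upper, lower), columns
# are (left, right).
bayer_unit = np.array([["R", "G"],
                       ["G", "B"]])   # FIG. 5A: R upper-left, B lower-right
rc_unit = np.array([["R", "C"],
                    ["C", "C"]])      # FIG. 5B: R upper-left, C elsewhere

def tile_units(unit, rows, cols):
    """Tile one 2x2 unit into a (2*rows, 2*cols) pixel array."""
    return np.tile(unit, (rows, cols))

print(tile_units(bayer_unit, 2, 2))   # a 4x4 Bayer-patterned array
```

The C pixel, having no color filter, passes more light, which is why the text describes it as more sensitive than the R, G, and B pixels.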
  • the imaging device 11 captures three images, and performs a generation process of a high dynamic range (HDR) image based on a synthesis process of the three images.
  • the three images are images with different exposure times.
  • a long exposure time is described as long exposure, and an image captured by long exposure is described as a long exposure image.
  • a short exposure time is described as a short exposure, and an image captured by the short exposure is described as a short exposure image.
  • An exposure time shorter than the long time exposure but longer than the short time exposure is described as a medium time exposure, and an image captured by the medium time exposure is described as a medium time exposure image.
  • a wide dynamic range image is obtained by synthesizing the long-time exposure image, the medium-time exposure image, and the short-time exposure image captured in the long-time exposure, the medium-time exposure, and the short-time exposure, respectively.
  • Although three exposure times are described here as an example, the present technology can also be applied to the case where a wide-dynamic-range image is generated by combining two images captured with two exposure times (long exposure and short exposure).
  • the long-time exposure image, the medium-time exposure image, and the short-time exposure image are captured by shifting the time. For example, after a long exposure image is captured, a medium time exposure image is captured, and after a medium time exposure image is captured, a short exposure image is captured.
  • Although a long-exposure image, a medium-exposure image, and a short-exposure image are captured in this order here, the imaging may also be performed in a different order.
  • Here, the configuration of FIG. 2, in which one unit pixel includes two photodiodes 22 (in other words, two pixels are shared), will be described as an example. The description also assumes the pixel arrangement shown in FIG. 5B, that is, the case where R pixels and C pixels are arranged, with an R pixel and a C pixel adjacent in the vertical direction included in one unit pixel and shared.
  • the R pixel 21-1, the C pixel 21-2, the R pixel 21-3, the C pixel 21-4, the R pixel 21-5, and the C pixel 21-6 are pixels arranged in the vertical direction.
  • the C pixel 21-2 and the R pixel 21-3 are shared pixels, and the C pixel 21-4 and the R pixel 21-5 are shared pixels.
  • “S” shown in the rectangle indicates the timing when the shutter is released, and “R” indicates the timing of reading.
  • the shutter is released for the R pixel 21-1 and the C pixel 21-2, and exposure is started.
  • the shutter is released for the R pixel 21-3 and the C pixel 21-4, and exposure is started.
  • the shutter is released for the R pixel 21-5 and the C pixel 21-6, and exposure is started.
  • reading from the R pixel 21-1 and the C pixel 21-2 is started.
  • the R pixel 21-1 and the C pixel 21-2 are exposed for a time T1 from time t1 to time t4, and this time T1 becomes a long time exposure T1.
  • exposure is started from time t2, and reading from the R pixel 21-3 and C pixel 21-4 is started at time t5 when the long exposure T1 has elapsed.
  • exposure is started from time t3, and readout from the R pixel 21-5 and C pixel 21-6 is started at time t6 when the long exposure T1 has elapsed.
  • imaging in medium time exposure is started.
  • the shutter is released for the R pixel 21-1 and the C pixel 21-2, and exposure is started.
  • the shutter is released for the R pixel 21-3 and the C pixel 21-4, and exposure is started.
  • the shutter is released for the R pixel 21-5 and the C pixel 21-6, and exposure is started.
  • reading from the R pixel 21-1 and the C pixel 21-2 is started.
  • the R pixel 21-1 and the C pixel 21-2 are exposed for a time T2 from time t6 to time t8, and this time T2 becomes a medium time exposure T2.
  • exposure is started from time t7, and reading from the R pixel 21-3 and C pixel 21-4 is started at time t10 when the medium time exposure T2 has elapsed.
  • exposure is started from time t3, and readout from the R pixel 21-5 and the C pixel 21-6 is started at time t13 when the medium time exposure T2 has elapsed.
  • imaging in a short exposure is started.
  • the shutter is released for the R pixel 21-1 and the C pixel 21-2, and exposure is started.
  • the shutter is released for the R pixel 21-3 and the C pixel 21-4, and exposure is started.
  • the shutter is released for the R pixel 21-5 and the C pixel 21-6, and exposure is started.
  • reading from the R pixel 21-1 and the C pixel 21-2 is started.
  • the R pixel 21-1 and the C pixel 21-2 are exposed for a time T3 from time t9 to time t11, and this time T3 becomes a short time exposure T3.
  • exposure is started from time t13, and readout from the R pixel 21-3 and C pixel 21-4 is started at time t14 when a short exposure T3 has elapsed.
  • exposure is started from time t15, and readout from the R pixel 21-5 and the C pixel 21-6 is started at time t16 when only a short exposure T3 has elapsed.
  • The long exposure T1, the medium exposure T2, and the short exposure T3 have the relationship: long exposure T1 > medium exposure T2 > short exposure T3.
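The row-sequential shutter/readout scheme described above can be sketched as follows: each pixel row starts its exposure one step after the previous row and is read out after the common exposure time, so shutter "S" and readout "R" sweep down the array. Times are in AD periods; T1, T2, T3 and the row step are illustrative values satisfying T1 > T2 > T3, not values from the patent.

```python
T1, T2, T3 = 6, 3, 1          # long > medium > short exposure (assumed)
assert T1 > T2 > T3           # the relationship stated in the text

def row_schedule(row, exposure, row_step=1, first_shutter=0):
    """Return (shutter_time, readout_time) for one row in a rolling pass:
    rows start exposure one after another and are read out after the
    common exposure time."""
    shutter = first_shutter + row * row_step
    return shutter, shutter + exposure

for row in range(3):
    print("row", row, "long:", row_schedule(row, T1),
          "medium:", row_schedule(row, T2), "short:", row_schedule(row, T3))
```

Each row's readout is offset from its shutter by the same exposure, mirroring how the text describes rows t1/t2/t3 being shuttered in sequence and read at t4/t5/t6.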
  • the shutter of the R pixel 21-1 is released at time t1. That is, at time t1, exposure is started by issuing an electronic shutter transfer pulse (STR) and an electronic shutter reset pulse (SRST) to the R pixel 21-1. At time t4, when reading is started, the read reset pulse (RRST) is turned on.
  • the shutter is released within one horizontal synchronization period, and reading is performed within one horizontal synchronization period.
  • the exposure time is equivalent to one horizontal synchronization period.
  • the exposure time is equivalent to two horizontal synchronization periods.
  • the exposure time is an integral multiple of one horizontal synchronization period.
  • The column signal processing unit 15 performs AD conversion of the pixel signals by performing correlated double sampling processing on the pixel signals output from the plurality of pixels 21 via the vertical signal line VSL. The period in which this is performed is called the AD period.
  • one AD period is one horizontal synchronization period.
  • the exposure time is an integral multiple of 1 AD period.
  • The short exposure T3 is therefore at least one AD period. Even when the appropriate exposure time is shorter than one AD period, the short exposure T3 is set to a time equivalent to one AD period, so optimal imaging may not be possible.
  • the short exposure image has been described as an example.
  • the exposure time can be set only by an integral multiple of one AD period.
  • Since the exposure time can be adjusted only in units of one AD period, the long, medium, and short exposures can only be set coarsely, and the ratio of these exposure times may not be the desired ratio. If the ratio of these exposure times is not the desired ratio, the image quality of the image after synthesis may be degraded.
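The coarse-setting problem above can be made concrete: quantizing each exposure to whole AD periods distorts the intended exposure ratios. The AD-period length and target exposures below are assumed values for illustration only.

```python
AD_PERIOD_US = 8.0                      # one AD period (assumed length)

def quantize_exposure(target_us):
    """Round a target exposure to a whole number of AD periods
    (minimum one period, as required for the short exposure)."""
    periods = max(1, round(target_us / AD_PERIOD_US))
    return periods * AD_PERIOD_US

targets = [100.0, 25.0, 6.25]           # desired 16:4:1 exposure ratio
actual = [quantize_exposure(t) for t in targets]
print(actual)                           # quantized exposures in microseconds
print(actual[0] / actual[2])            # achieved long:short ratio (not 16)
```

The intended 16:4:1 ratio collapses to 12:3:1 after quantization, which is the kind of ratio error the text says can degrade the synthesized image.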
  • The electronic-shutter transfer pulse (STR) and the electronic-shutter reset pulse (SRST) can be output at an arbitrary timing within one horizontal synchronization period. A plurality of clocks are shown to indicate that the pulses can be output at an arbitrary timing within the period; at one of these clocks, the electronic-shutter transfer pulse (STR) and the electronic-shutter reset pulse (SRST) are issued.
  • the shutter release timing is the exposure start timing
  • the exposure time can be adjusted in units of one clock.
  • the description “release the shutter” is used, but this description can be read as “start of exposure”.
  • the timing at which the shutter is released (the timing at which exposure is started) and the timing at which reading is started are fixed within the AD period.
  • Whereas the exposure time could previously be adjusted only in units of 8 μs, by making the timing at which the shutter is released variable while fixing the timing at which reading is started within the horizontal synchronization period, the exposure time can be adjusted in units of 0.02 μs.
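The figures above can be checked with a short calculation: the 8 μs adjustment unit corresponds to one horizontal synchronization period and the 0.02 μs unit to one clock, so a variable shutter position gives 400 possible exposure starts per period. Integer nanoseconds are used here to avoid floating-point issues; the period and clock lengths are the ones quoted in the text.

```python
H_PERIOD_NS = 8_000       # 8 us horizontal synchronization period
CLOCK_NS = 20             # 0.02 us clock

positions_per_period = H_PERIOD_NS // CLOCK_NS
print(positions_per_period)   # 400 shutter positions per period

# With the readout fixed and the shutter shifted by 5 clocks, an
# exposure of 3 periods shortens by 100 ns (illustrative offsets):
exposure_ns = 3 * H_PERIOD_NS - 5 * CLOCK_NS
print(exposure_ns)            # 23900 ns
```

In other words, moving the shutter inside the period gives a 400× finer adjustment granularity than moving it only in whole periods.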
  • At time t4, reading is performed from the C pixel 21-2, and at time t5 reading is performed from the R pixel 21-3. From time t4 to time t5 is a 3AD period. Similarly, reading is performed from the C pixel 21-4 at time t5, and from the R pixel 21-5 at time t6. From time t5 to time t6 is a 3AD period.
  • reading is performed every 3 AD periods.
  • the reason for the 3AD period is that AD conversion is performed in the order of long exposure, medium exposure, and short exposure.
  • Since the timing at which the shutter is released can be made variable, the exposure time is also variable. Therefore, the exposure time may coincide with the readout cycle (the 3AD period).
  • In FIG. 8, the pixel arrangement is shown on the left side, as before. FIG. 8 shows the pixels 21-1 to 21-5 arranged in the vertical direction; the C pixel 21-2 and the R pixel 21-3 are shared pixels, and the C pixel 21-4 and the R pixel 21-5 are shared pixels.
  • At time t31, the shutter is released for the R pixel 21-1 and the C pixel 21-2, and reading is started at time t32.
  • A time T31 from time t31 to time t32 is the exposure time, and this exposure time is a 3AD period.
  • At time t32, the shutter is released for the R pixel 21-3 and the C pixel 21-4, and reading is started at time t33.
  • A time T32 from time t32 to time t33 is the exposure time, and this exposure time is a 3AD period.
  • the read cycle is time T32 from time t32 to time t33. As described above, this time T32 corresponds to the exposure time of the R pixel 21-3 and the C pixel 21-4, and is a 3AD period.
  • FIG. 9 shows a state in which readout for the C pixel 21-2 and shutter for the R pixel 21-3 are performed at the same time.
  • FIG. 9 is a diagram in which an arrow of charge flow is added to the pixel 21 shown in FIG.
  • the photodiode 22-2 is a photodiode included in the C pixel 21-2
  • the photodiode 22-3 is a photodiode included in the R pixel 21-3.
  • The state in which the shutter is released for the R pixel 21-3 is a state in which the electronic-shutter transfer pulse and the electronic-shutter reset pulse are turned on. In this state, as shown in FIG. 9, the charge accumulated in the photodiode 22-3 flows toward the reset transistor 27 via the transfer transistor 23-1, and the photodiode is thereby reset.
  • The state in which reading is started for the C pixel 21-2 is a state in which the readout reset pulse is turned on. In this state, as shown in FIG. 9, the charge accumulated in the C pixel 21-2 should flow into and accumulate in the floating diffusion 24; however, since the reset transistor 27 is in the ON state, the charge is not accumulated in the floating diffusion 24 but flows toward the reset transistor 27.
  • FIG. 10, like FIG. 6, shows the pixel arrangement on the left side and the shutter timing and readout timing for each pixel. The description of timings identical to those described with reference to FIG. 6 is omitted; the description proceeds while comparing with the case shown in FIG. 6 as appropriate.
  • FIG. 6 shows a case where the exposure time is not the same as the readout cycle, in this case the 3AD period.
  • for the C pixel 21-2 shown in FIG. 6, readout is started at time t4, and for the R pixel 21-3 that is a shared pixel with the C pixel 21-2, readout is started at time t5.
  • the time from time t4 to time t5 is a 3AD period.
  • FIG. 10 shows a case where the control for shifting the readout timing is performed because the exposure time is the same as the readout cycle.
  • the exposure time can be changed by making the shutter timing variable. Further, when imaging is performed with the respective exposure times of long exposure, medium exposure, and short exposure, there are cases in which all of, two of, or one of the long, medium, and short exposure times coincide with the readout cycle (an integral multiple of the readout cycle).
  • FIG. 10 shows a case where the long exposure and the short exposure coincide with the readout cycle. For the C pixel 21-2 shown in FIG. 10, readout is started at time t54, and for the R pixel 21-3 that is a shared pixel with the C pixel 21-2, readout is started at time t55. The readout timing is controlled so that the time from time t54 to time t55 is a 4AD period.
  • readout is normally performed in a 3AD cycle, but control is performed so that the readout cycle becomes a 4AD period by delaying the readout by 1 AD period.
  • by delaying the shutter release timing by 1 AD period, the readout is delayed by 1 AD period.
  • readout is started at time t55 for the C pixel 21-4 shown in FIG. 10, and readout is started at time t56 for the R pixel 21-5 that is a shared pixel with the C pixel 21-4. The readout timing is controlled so that the time from time t55 to time t56 is a 2AD period.
  • readout is normally performed in a 3AD cycle, but control is performed so that the readout cycle becomes a 2AD period by advancing the readout by 1 AD period.
  • by advancing the shutter release timing by 1 AD period, the readout is performed 1 AD period earlier.
  • the readout timing interval is controlled to be a 4AD period or a 2AD period.
  • the exposure time is controlled to be a 3AD period by advancing or delaying the shutter release timing in accordance with the change of the readout timing.
  • the shutter timing and the readout timing are controlled so as not to overlap by shifting the readout timing.
  • FIG. 10 illustrates an example in which the control for shifting the readout timing is performed for an exposure whose exposure time is the same as the readout cycle, among long exposure, medium exposure, and short exposure.
  • even when the exposure time of only one of the long, medium, and short exposures is the same as the readout cycle, the readout timing may be controlled to be a 4AD period or a 2AD period for all of the long, medium, and short exposures.
  • when two images are combined, a 2AD period is used instead of the 3AD period; that is, when the exposure time coincides with the 2AD period, the control for shifting the readout timing is performed.
  • the 3AD period has been described here as an example, but when images are captured with different exposure times and a plurality of captured images are acquired and combined, the control for shifting the readout timing is performed when an exposure time coincides with the AD period corresponding to the number of acquired images.
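The readout-shift rule described above (delay one readout to a 4AD interval and advance the next to a 2AD interval whenever an exposure time collides with the nominal 3AD readout cycle) can be sketched as follows. This is an illustrative model only; the function name and the list representation are assumptions, not part of the patent.

```python
NOMINAL_CYCLE_AD = 3  # one readout normally every 3 AD periods (three combined images)

def adjust_readout_slots(exposure_times_ad):
    """Return per-exposure readout intervals in AD periods.

    If an exposure time is an integral multiple of the nominal readout
    cycle, the readout for that exposure is delayed by 1 AD period (4AD
    interval) and the following readout is advanced by 1 AD period (2AD
    interval), so shutter and readout timings no longer overlap.
    """
    intervals = [NOMINAL_CYCLE_AD] * len(exposure_times_ad)
    for i, t in enumerate(exposure_times_ad):
        if t % NOMINAL_CYCLE_AD == 0 and i + 1 < len(intervals):
            intervals[i] = NOMINAL_CYCLE_AD + 1       # delay readout by 1 AD
            intervals[i + 1] = NOMINAL_CYCLE_AD - 1   # advance next readout by 1 AD
    return intervals

# An exposure equal to the 3AD cycle forces the 4AD/2AD pattern of FIG. 10;
# the total readout budget over the frame is preserved.
slots = adjust_readout_slots([3, 2, 1])
```

Note that the total of the intervals stays equal to three nominal cycles, matching the observation in the text that only the positions of the readouts shift, not the overall cycle.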
  • FIG. 11 is a diagram showing an internal configuration example of the address decoder 13 (FIG. 1).
  • the address decoder 13 is provided with a shutter address storage unit 101 and a read address storage unit 102 for each line of the pixel array unit 12.
  • the shutter address storage unit 101 stores the address of the pixel that releases the shutter.
  • the read address storage unit 102 stores the address of the pixel to be read.
  • the shutter address storage unit 101 includes a first address storage unit 121 and a second address storage unit 122.
  • the address stored in the first address storage unit 121 is transferred to the second address storage unit 122 at a predetermined timing.
  • the address stored in the second address storage unit 122 is supplied to the pixel timing drive unit 14 at the subsequent stage, whereby the shutter of the pixel 21 designated by the address is released.
  • in this way, the address at the time of shutter (hereinafter referred to as the shutter address as appropriate) is managed by the two-stage address storage unit, so that the shutter can be released at a desired timing within one AD period.
  • FIG. 12 shows the horizontal synchronization signal and the electronic shutter transfer pulse.
  • the transfer pulse at the time of electronic shutter is illustrated for explaining the timing of releasing the shutter.
  • control for releasing the shutter can be performed at any of time t71, time t72, and time t73 within the AD period. In other words, control can be performed so that the shutter is released at the start, the middle, or the end of the AD period.
  • the shutter address of the pixel 21 whose shutter is to be released (the photodiode 22 in the pixel 21) is decoded in a period based on a pulse to be described later, and is stored in the first address storage unit 121 of the shutter address storage unit 101.
  • the shutter address stored in the first address storage unit 121 is transferred from the first address storage unit 121 to the second address storage unit 122 based on a pulse instructing the transfer of the shutter address from the first address storage unit 121 to the second address storage unit 122.
  • the shutter address decoded in the AD period T31 is transferred and stored from the first address storage unit 121 to the second address storage unit 122 at a time point before the AD period T32 (in the AD period T31). Then, in the AD period T32, the shutter is released based on the shutter address stored in the second address storage unit 122.
  • the shutter can be released at a desired timing of the AD period.
  • for example, even when the desired timing is early in the AD period, the shutter can be released.
  • in this case, the AD period in which decoding is performed and the AD period in which the shutter is released based on the shutter address are different AD periods.
  • when the timing at which the shutter is released is late in the AD period, the AD period in which the decoded address is transferred to the second address storage unit 122 and the AD period in which the shutter is released based on the shutter address may be the same AD period.
  • the timing at which the shutter address is transferred from the first address storage unit 121 to the second address storage unit 122 need not always be the same; it may differ depending on at which timing in the AD period the shutter is released.
  • that is, when the shutter is released early in the AD period, the transfer of the shutter address is executed in the preceding AD period, and when the shutter is released later in the AD period, the transfer of the shutter address may be performed within the same AD period as the AD period in which the shutter is released.
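The two-stage handling of the shutter address described above (decode into the first address storage unit 121, transfer to the second address storage unit 122 on the transfer pulse, then drive the shutter from the second stage) can be modeled as a minimal sketch. The class and method names here are hypothetical illustrations, not from the patent.

```python
class ShutterAddressStore:
    """Toy model of the two-stage shutter address storage unit 101."""

    def __init__(self):
        self.first = None    # first address storage unit 121 (decode target)
        self.second = None   # second address storage unit 122 (drive target)

    def set_address(self, address):
        """Shutter address set pulse: store the decoded address in stage 1."""
        self.first = address

    def transfer(self):
        """Second address transfer pulse: copy the address from stage 1 to stage 2."""
        self.second = self.first

    def fire_shutter(self):
        """Release the shutter for the pixel held in stage 2; this can happen at
        any point in the current AD period while stage 1 already accepts the
        next decoded address."""
        return self.second

store = ShutterAddressStore()
store.set_address(0x2A)   # decoded in AD period T31
store.transfer()          # transferred before AD period T32
store.set_address(0x2B)   # the next address can already be decoded into stage 1
assert store.fire_shutter() == 0x2A  # the shutter in T32 uses the transferred address
```

The point of the two stages is visible in the last two lines: decoding for the next AD period and shutter driving for the current one proceed independently.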
  • the first address storage unit 121 and the second address storage unit 122 can be configured by a latch, for example.
  • when each of the first address storage unit 121 and the second address storage unit 122 is configured by latches, as shown in FIG. 15, each becomes a 3-bit latch.
  • the first address storage unit 121 includes latches 141-1 to 141-3, and the second address storage unit 122 includes latches 142-1 to 142-3.
  • the latch 141-1 and the latch 142-1 can be configured to store a shutter address for long-time exposure, the latch 141-2 and the latch 142-2 a shutter address for medium-time exposure, and the latch 141-3 and the latch 142-3 a shutter address for short-time exposure.
  • such a latch configuration is an example and not a limitation; the first address storage unit 121 and the second address storage unit 122 may be configured by elements other than latches.
  • here, a 3-bit latch configuration that stores an address for each exposure has been described as an example, but, for example, a 2-bit latch configuration can also be used.
  • the first address storage unit 121 has three states: a shutter state, an exposure state, and an idling state. A read state is not set in the first address storage unit 121 because it has no relationship with the shutter operation, but since a read state exists, the description will be continued including the read state.
  • the shutter state is a state in which the shutter address is stored.
  • the first address storage unit 121 stores a shutter address when in the shutter state.
  • the shutter address stored in the first address storage unit 121 is transferred to the second address storage unit 122 at a predetermined timing as described above.
  • the shutter is released based on the shutter address stored in the second address storage unit 122. Exposure is started when the shutter is released.
  • the first address storage unit 121 transitions to the exposure state when a reset is executed in the shutter state. When the exposure is completed, readout from the pixel 21 is performed. Since the address related to readout is stored in the read address storage unit 102, the first address storage unit 121, which stores the shutter address, does not transition to a read state, but the read state is mentioned here as a state.
  • from the readout from the pixel 21 until the next shutter is released, the first address storage unit 121 is in the idling state.
  • in the idling state, the first address storage unit 121 does not store a shutter address.
  • the first address storage unit 121 sequentially transitions between the three states of the shutter state, the exposure state, and the idling state.
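The cycle through the three states described above can be sketched as a small table-driven state machine. The state names follow the text; the table and function are illustrative assumptions.

```python
# Three-state cycle of the first address storage unit 121:
# idling -> shutter -> exposure -> idling -> ...
TRANSITIONS = {
    "idling": "shutter",    # shutter address reset pulse: ready to store an address
    "shutter": "exposure",  # reset while holding an address: exposure begins
    "exposure": "idling",   # after readout from the pixel 21: back to idle
}

def next_state(state):
    """Advance the storage unit by one transition."""
    return TRANSITIONS[state]

state = "idling"
history = [state]
for _ in range(3):          # one full cycle returns to the idling state
    state = next_state(state)
    history.append(state)
```

Running one full cycle yields `idling → shutter → exposure → idling`, matching the sequence in the text.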
  • at first, the first address storage unit 121 is in the idling state.
  • when a shutter address reset pulse is issued, the first address storage unit 121 is reset and transitions to the shutter state. Thereafter, the shutter address is set (stored) in the first address storage unit 121 based on the shutter address set pulse.
  • when the second address transfer pulse is output to the shutter address storage unit 101, the shutter address stored in the first address storage unit 121 is transferred to and stored in the second address storage unit 122.
  • the second address transfer pulse is a pulse issued when the shutter address is transferred from the first address storage unit 121 to the second address storage unit 122, and is issued at a predetermined cycle.
  • the second address storage unit 122 shifts to the shutter state.
  • the shutter address is stored in the second address storage unit 122, so that the shutter can be released.
  • the shutter can be released at a desired timing within the AD period T93, which is a period after the AD period T92.
  • at a desired timing, an electronic shutter transfer pulse and an electronic shutter reset pulse are issued, whereby the shutter is released for the pixel 21 corresponding to the address stored in the second address storage unit 122.
  • when the shutter is released, the first address storage unit 121 is reset and transitions from the shutter state to the accumulation state. Further, when the second address transfer pulse is issued in the AD period T93, the second address storage unit 122 is reset and transitions from the shutter state to the accumulation state.
  • the state of the second address storage unit 122 transitions to a state following the first address storage unit 121.
  • in the AD period T94, when a shutter address reset pulse is issued to the first address storage unit 121, the first address storage unit 121 is reset and transitions from the accumulation state to the shutter state. When there is a decoded shutter address, the first address storage unit 121 stores the shutter address according to the shutter address set pulse.
  • in this way, the shutter address is stored in the first address storage unit 121, transferred from the first address storage unit 121 to the second address storage unit 122, and the shutter is controlled based on the shutter address stored in the second address storage unit 122.
  • FIG. 17 is a diagram illustrating another configuration of the shutter address storage unit 101.
  • the shutter address storage unit 101b shown in FIG. 17 (labeled 101b to distinguish it from the shutter address storage unit 101 shown in FIG. 11) includes a first address storage unit 121, a second address storage unit 122, and a state monitoring unit 151.
  • since the first address storage unit 121 and the second address storage unit 122 are basically similar in configuration and operation to the first address storage unit 121 and the second address storage unit 122 shown in FIG. 11, the same reference numerals are given and description thereof is omitted as appropriate.
  • the state monitoring unit 151 monitors the states of the first address storage unit 121 and the second address storage unit 122, and functions as a control unit that performs control so that the shutter address is transferred from the first address storage unit 121 to the second address storage unit 122 at an appropriate timing.
  • this configuration differs from the above case in that the state monitoring unit 151 monitors the states of the first address storage unit 121 and the second address storage unit 122 and controls the transfer of the shutter address from the first address storage unit 121 to the second address storage unit 122. This point will be explained.
  • when the second address transfer pulse is issued, the state monitoring unit 151 determines whether or not the first address storage unit 121 is in the shutter state. If the first address storage unit 121 is in the shutter state, the state monitoring unit 151 performs control so that the shutter address is transferred from the first address storage unit 121 to the second address storage unit 122, and changes the state of the second address storage unit 122 to the shutter state.
  • the state monitoring unit 151 determines whether or not the first address storage unit 121 is in the shutter state. When the second address transfer pulse is issued in the AD period T93, it is determined that the first address storage unit 121 is not in the shutter state because it is in the accumulation state.
  • when it is determined that the first address storage unit 121 is not in the shutter state, the shutter address is not transferred from the first address storage unit 121 to the second address storage unit 122, and the state of the second address storage unit 122 is changed from the shutter state to the accumulation state.
  • when the second address transfer pulse is issued, the state monitoring unit 151 determines whether or not the first address storage unit 121 is in the shutter state. If the first address storage unit 121 is in the shutter state, the state monitoring unit 151 performs control so that the shutter address is transferred from the first address storage unit 121 to the second address storage unit 122, and changes the state of the second address storage unit 122 to the shutter state.
  • as described above, a configuration may be employed in which the state monitoring unit 151 is provided, the states of the first address storage unit 121 and the second address storage unit 122 are monitored, the transfer of the shutter address is controlled according to the states, and the state of the second address storage unit 122 is changed.
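The transfer control performed on each second address transfer pulse can be sketched as follows. The function name and the plain-string states are illustrative assumptions; the text calls the post-shutter state the accumulation (exposure) state, which is written here as `"exposure"`.

```python
def on_transfer_pulse(first_state, first_addr, second_addr):
    """Model of the state monitoring unit 151's decision on a second
    address transfer pulse.

    Returns (new second-stage address, new second-stage state):
    - if stage 1 is in the shutter state, its address is transferred and
      stage 2 is put into the shutter state;
    - otherwise stage 2 is reset and made to follow stage 1's state.
    """
    if first_state == "shutter":
        return first_addr, "shutter"        # valid address: transfer it
    return None, first_state                # no valid address: reset stage 2

# A pulse while stage 1 holds an address transfers it; a pulse while stage 1
# is exposing clears stage 2 instead of transferring a stale address.
assert on_transfer_pulse("shutter", 0x10, None) == (0x10, "shutter")
assert on_transfer_pulse("exposure", None, 0x10) == (None, "exposure")
```

The second branch is what prevents the second address storage unit 122 from releasing a shutter based on an address that no longer corresponds to a pending exposure.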
  • the configuration of the shutter address storage unit 101 in the third embodiment can be the same as that of the shutter address storage unit 101b in the second embodiment shown in FIG.
  • the shutter address storage unit 101 in the third embodiment has the configuration of the shutter address storage unit 101b shown in FIG. 17, but, although not shown, it is described here as a shutter address storage unit 101c in order to distinguish it from the shutter address storage unit 101b.
  • the state monitoring unit 151 of the shutter address storage unit 101c in the third embodiment is internally configured by latches and stores, as appropriate, the shutter address stored in the first address storage unit 121. This point is different from the second embodiment.
  • the state monitoring unit 151 can also be configured to include 3-bit latches 141-1 to 141-3. That is, since the state monitoring unit 151 stores the shutter address stored in the first address storage unit 121, the state monitoring unit 151 can have the same configuration as the first address storage unit 121.
  • the operation of the shutter address storage unit 101 when the state monitoring unit 151 is configured to store the shutter address will be described with reference to FIG.
  • FIG. 18 is a diagram in which a monitoring unit reset chart and a monitoring unit set chart are added to the timing chart shown in FIG.
  • when the second address transfer pulse is issued, the state monitoring unit 151 determines whether or not the first address storage unit 121 is in the shutter state. If the first address storage unit 121 is in the shutter state, the state monitoring unit 151 performs control so that the shutter address is transferred from the first address storage unit 121 to the second address storage unit 122, and changes the state of the second address storage unit 122 to the shutter state.
  • the state monitoring unit 151 resets the stored shutter address.
  • the state monitoring unit 151 reads and stores the shutter address stored in the first address storage unit 121.
  • in other words, when the state monitoring unit 151 is reset while the first address storage unit 121 is in the shutter state, the state monitoring unit 151 stores the shutter address stored in the first address storage unit 121 and sets its own state to the shutter state.
  • the state monitoring unit 151 determines whether or not the first address storage unit 121 is in the shutter state. When the second address transfer pulse is issued, it is determined that the first address storage unit 121 is not in the shutter state because it is in the accumulation state.
  • when it is determined that the first address storage unit 121 is not in the shutter state, the state monitoring unit 151 further checks its own state to determine whether it is in the shutter state. In the AD period T113, the state monitoring unit 151 determines that it is in the shutter state.
  • in this case, the state monitoring unit 151 controls the transfer of the shutter address from the first address storage unit 121 to the second address storage unit 122.
  • when the second address transfer pulse is issued in the AD period T113, the first address storage unit 121, being in the exposure state, does not store the shutter address; therefore, the shutter address is not transferred to the second address storage unit 122, and the second address storage unit 122 is reset. As a result, the second address storage unit 122 transitions to the same exposure state as the first address storage unit 121.
  • as described above, when the state monitoring unit 151 receives the monitoring unit reset pulse, it resets the stored shutter address; however, it is set so that the reset is not performed when the first address storage unit 121 is not in the shutter state.
  • that is, the state monitoring unit 151 looks at the state of the first address storage unit 121 and controls its own state so that the reset is performed only in the shutter state.
  • alternatively, when the first address storage unit 121 is not in the shutter state, the monitoring unit reset pulse may be masked (in the figure, the cross marks indicate the mask). Since the monitoring unit reset pulse is masked, the state monitoring unit 151 does not receive it and therefore does not perform the reset operation.
  • the shutter state of the state monitoring unit 151 is maintained in the AD period T114.
  • when the second address transfer pulse is issued, the state monitoring unit 151 determines whether or not the first address storage unit 121 is in the shutter state. In this case, the first address storage unit 121 is in the shutter state, so the shutter address is transferred from the first address storage unit 121 to the second address storage unit 122. At this time, the state of the second address storage unit 122 is changed to the shutter state.
  • as described above, a configuration may be employed in which the state monitoring unit 151 is provided, the states of the first address storage unit 121 and the second address storage unit 122 are monitored, the transfer of the shutter address is controlled according to the states, and the state of the second address storage unit 122 is changed.
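The third-embodiment behavior, where the state monitoring unit 151 also holds a copy of the shutter address and the monitoring unit reset pulse is masked unless the first address storage unit 121 is in the shutter state, can be sketched roughly as follows. All class, method, and state names are assumptions for illustration.

```python
class StateMonitor:
    """Toy model of the state monitoring unit 151 with its own address latch."""

    def __init__(self):
        self.addr = None
        self.state = "idle"

    def on_reset_pulse(self, first_state, first_addr):
        """Monitoring unit reset pulse: honored only while the first address
        storage unit is in the shutter state; otherwise the pulse is masked
        and the held address survives."""
        if first_state == "shutter":
            self.addr = first_addr   # capture stage 1's shutter address
            self.state = "shutter"
        # else: pulse masked, keep current address and state

    def address_for_transfer(self, first_state, first_addr):
        """On a second address transfer pulse, supply stage 1's address if it
        is valid; otherwise fall back to the held copy."""
        if first_state == "shutter":
            return first_addr
        return self.addr if self.state == "shutter" else None

mon = StateMonitor()
mon.on_reset_pulse("shutter", 0x3C)    # capture while stage 1 holds an address
mon.on_reset_pulse("exposure", None)   # masked: the held address survives
assert mon.address_for_transfer("exposure", None) == 0x3C
```

The masking in `on_reset_pulse` corresponds to the crossed-out pulses in the timing chart: without it, the monitor's copy of the shutter address would be cleared before it could be used.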
  • in this way, the shutter can be released, in other words exposure can be started, at a desired timing within the AD period.
  • the exposure time can be adjusted more finely.
  • since the exposure time can be adjusted more finely, it is possible to capture an image with an appropriate exposure time, and the image quality can be improved.
  • also when images captured with different exposure times are combined, the exposure times can be finely adjusted, and images can be captured with appropriate exposure times to improve the image quality of the combined image.
  • the imaging apparatus 11 can be applied to various electronic devices, for example, imaging systems such as digital still cameras and digital video cameras, mobile phones having an imaging function, and other devices having an imaging function.
  • FIG. 20 is a block diagram illustrating a configuration example of an imaging device mounted on an electronic device.
  • the imaging apparatus 201 includes an optical system 202, an imaging element 203, a signal processing circuit 204, a monitor 205, and a memory 206, and can capture still images and moving images.
  • the optical system 202 includes one or more lenses, guides image light (incident light) from a subject to the image sensor 203, and forms an image on a light receiving surface (sensor unit) of the image sensor 203.
  • as the imaging device 203, the imaging apparatus 11 according to the above-described embodiment is applied.
  • in the image sensor 203, electrons are accumulated for a certain period according to the image formed on the light receiving surface via the optical system 202, and a signal corresponding to the accumulated electrons is supplied to the signal processing circuit 204.
  • the signal processing circuit 204 performs various signal processing on the pixel signal output from the image sensor 203.
  • An image (image data) obtained by performing signal processing by the signal processing circuit 204 is supplied to the monitor 205 and displayed, or supplied to the memory 206 and stored (recorded).
  • an image with improved image quality can be captured by applying the imaging apparatus 11 according to the above-described embodiment.
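The data flow of the imaging apparatus 201 described above (optics form an image on the sensor, the sensor accumulates charge into a signal, and the signal processing circuit produces an image for the monitor and the memory) can be summarized in a toy sketch. The functions and the clipping step are placeholders, not the actual optics or signal processing.

```python
def optical_system(subject_light):
    """Lenses guide the incident light to the sensor's light receiving surface."""
    return subject_light

def image_sensor(incident_light, exposure_periods):
    """Electrons accumulate for a certain period according to the formed image."""
    return [v * exposure_periods for v in incident_light]

def signal_processing(raw_signal):
    """Stand-in for the signal processing circuit 204 (here: 8-bit clipping)."""
    return [min(v, 255) for v in raw_signal]

light = [10, 40, 90]
image = signal_processing(image_sensor(optical_system(light), 3))
monitor_frame = image   # supplied to the monitor 205 and displayed
memory_record = image   # supplied to the memory 206 and recorded
# image == [30, 120, 255]
```

The same processed image is routed both to display and to storage, mirroring the monitor 205 / memory 206 split in the block diagram.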
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 21 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, a sound image output unit 12052, and an in-vehicle network I / F (Interface) 12053 are illustrated as a functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • for example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
  • the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
  • the vehicle exterior information detection unit 12030 may perform detection processing of objects such as a person, a car, an obstacle, a sign, or characters on a road surface, or distance detection processing, based on the received image.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
  • the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
  • the vehicle interior information detection unit 12040 detects vehicle interior information.
  • a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
  • the driver state detection unit 12041 includes, for example, a camera that images the driver, and the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing, based on the detection information input from the driver state detection unit 12041.
  • the microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • for example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including vehicle collision avoidance or impact mitigation, following traveling based on inter-vehicle distance, vehicle speed maintenance traveling, vehicle collision warning, vehicle lane departure warning, and the like.
  • in addition, the microcomputer 12051 can perform cooperative control for the purpose of automatic driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • the microcomputer 12051 can also output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • for example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as controlling the headlamp according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
  • the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 22 is a diagram illustrating an example of an installation position of the imaging unit 12031.
  • the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
  • the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
  • the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
  • the imaging unit 12105 provided on the upper part of the windshield in the passenger compartment is mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 22 shows an example of the shooting range of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided in the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided in the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided in the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above is obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • for example, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more).
  • the microcomputer 12051 can set an inter-vehicle distance to be secured in advance before the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like.
  • automatic brake control including follow-up stop control
  • automatic acceleration control including follow-up start control
  • cooperative control for the purpose of autonomous driving or the like autonomously traveling without depending on the operation of the driver can be performed.
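As a rough illustration of the follow-up control just described, a preceding-vehicle controller might estimate relative speed from successive distance samples and pick an action that holds the target inter-vehicle gap. The function names, thresholds, and policy below are hypothetical sketches, not values taken from the application:

```python
def relative_speed(d_prev, d_curr, dt):
    """Relative speed (m/s) from two distance samples taken dt seconds
    apart; negative means the preceding object is getting closer."""
    return (d_curr - d_prev) / dt

def follow_action(distance, rel_speed, target_gap):
    """Hypothetical follow-up policy: brake when too close or closing
    fast, accelerate when well beyond the target gap, otherwise hold."""
    if distance < target_gap or rel_speed < -1.0:
        return "brake"          # follow-up stop control territory
    if distance > 1.5 * target_gap and rel_speed >= 0.0:
        return "accelerate"     # follow-up start control territory
    return "hold"

# A preceding vehicle 20 m ahead, sampled 0.1 s apart.
v = relative_speed(20.0, 19.5, 0.1)              # -5.0 m/s, closing
print(follow_action(19.5, v, target_gap=25.0))   # brake
```

The split into a measurement step (relative speed) and a decision step (gap policy) mirrors how the microcomputer 12051 is described as first deriving distance and its temporal change, then acting on them.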
  • for example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on surrounding objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract them, and use them for automatic avoidance of obstacles.
  • for example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see.
  • the microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle; when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
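The collision-risk determination can be sketched with a time-to-collision heuristic. The scoring function and the 2-second constant below are illustrative assumptions, not values from the application:

```python
def collision_risk(distance, closing_speed):
    """Crude collision-risk score in [0, 1] from time-to-collision (TTC).
    closing_speed > 0 means the gap to the obstacle is shrinking."""
    if closing_speed <= 0:
        return 0.0                     # not closing: no collision risk
    ttc = distance / closing_speed     # seconds until contact
    return min(1.0, 2.0 / ttc)         # risk saturates as TTC drops below 2 s

def assist(risk, threshold=0.8):
    """Assistance response once the risk reaches the set value:
    alarm to the driver plus forced deceleration."""
    if risk >= threshold:
        return ["alarm", "forced_deceleration"]
    return []
```

`assist()` models the alarm-plus-forced-deceleration response the text attributes to the audio speaker 12061 / display unit 12062 and the drive system control unit 12010.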
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is carried out, for example, by a procedure for extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure for performing pattern matching processing on the series of feature points indicating the outline of an object to determine whether the object is a pedestrian.
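The two-step recognition procedure (feature-point extraction followed by pattern matching against a contour template) can be mimicked in miniature. The thresholding and matching rules here are toy stand-ins for the real image processing, not the application's algorithm:

```python
def extract_feature_points(image):
    """Stand-in for feature extraction: return the coordinates of pixels
    above a threshold (e.g. warm regions in an infrared image)."""
    return [(x, y) for y, row in enumerate(image)
                   for x, v in enumerate(row) if v > 0]

def matches_template(points, template, tol=1):
    """Toy pattern matching: every contour point of the template must have
    an extracted feature point within Manhattan distance `tol`."""
    return all(any(abs(px - tx) + abs(py - ty) <= tol for px, py in points)
               for tx, ty in template)

frame = [[0, 1, 0],
         [1, 0, 1],
         [0, 1, 0]]                       # a diamond-shaped outline
pts = extract_feature_points(frame)
print(matches_template(pts, [(1, 0), (0, 1), (2, 1), (1, 2)]))  # True
```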
  • when the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian.
  • the audio image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating the pedestrian is displayed at a desired position.
  • in this specification, the term "system" represents an entire apparatus composed of a plurality of apparatuses.
  • the present technology can also be configured as follows.
  • (1) An imaging apparatus including: a first address storage unit that stores an address of a pixel at which exposure is started in a pixel array unit in which a plurality of pixels are arranged in an array; and a second address storage unit that stores the address transferred from the first address storage unit, wherein the start of the exposure is controlled based on the address stored in the second address storage unit.
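Configuration (1) is essentially a double buffer for row addresses: writes go to the first storage unit, and only after a transfer does the address reach the second storage unit that actually drives the shutter. A minimal behavioral sketch (class and method names are illustrative, not from the application):

```python
class DoubleBufferedAddress:
    """Sketch of configurations (1)-(2): addresses are staged in a first
    storage unit and take effect only once transferred to a second one,
    so the exposure side never sees a half-written address."""
    def __init__(self):
        self.first = None    # staging storage (written by the address side)
        self.second = None   # active storage (drives exposure start)

    def store(self, address):
        self.first = address

    def transfer(self):
        """Transfer control: copy first -> second before exposure starts."""
        if self.first is not None:
            self.second = self.first

    def start_exposure(self):
        """Exposure start is controlled by the second storage unit only."""
        if self.second is None:
            raise RuntimeError("no address transferred yet")
        return f"shutter row {self.second}"

buf = DoubleBufferedAddress()
buf.store(42)                # staged; exposure side still unaffected
buf.transfer()               # now takes effect
print(buf.start_exposure())  # shutter row 42
```

Because the exposure side reads only the second storage unit, the next address can be staged at any time without disturbing an exposure already in progress.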
  • (2) The imaging apparatus according to (1), further including a transfer control unit that controls transfer of the address from the first address storage unit to the second address storage unit.
  • (3) The imaging apparatus according to (1) or (2), wherein the first address storage unit stores a decoded address, and the second address storage unit stores the address transferred from the first address storage unit at a point in time before the exposure is started.
  • (4) The imaging apparatus according to any one of (1) to (3), wherein the storage of the address in the first address storage unit and the transfer of the address from the first address storage unit to the second address storage unit are performed within one horizontal synchronization period.
  • (5) The imaging apparatus according to any one of (1) to (4), wherein the exposure is started at a predetermined timing within one horizontal synchronization period, and the predetermined timing is variable.
  • (6) The imaging apparatus according to (2), wherein the transfer control unit monitors whether or not the first address storage unit is storing the address, and transfers the address to the second address storage unit when the first address storage unit is storing the address.
  • (7) The imaging apparatus according to (2), wherein the transfer control unit stores the address stored in the first address storage unit, and resets the second address storage unit when the first address storage unit is not storing the address while the transfer control unit is storing the address.
  • (8) The imaging apparatus according to (2), wherein the first address storage unit resets the stored address when it receives a signal instructing a reset while it is storing the address.
  • (9) The imaging apparatus according to any one of (1) to (8), which generates an image obtained by combining images captured at different exposure times, wherein the timing for starting the exposure is adjusted for each of the different exposure times.
  • (10) The imaging apparatus according to (9), wherein, when the exposure time and the readout period of the signal from the pixel overlap, readout is performed with the readout timing shifted.
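Configurations (6) to (8) describe a small state machine in the transfer control unit: forward the address while the first storage unit holds one, and reset the second storage unit once the first goes empty. A behavioral sketch (modeling the storage units as single-slot dicts is an illustrative assumption):

```python
class TransferControl:
    """Sketch of configurations (6)-(8): monitor the first address storage
    unit, transfer its address to the second while it holds one, and reset
    the second once the first has been emptied."""
    def __init__(self):
        self.held = None                    # controller's own copy, per (7)

    def tick(self, first, second):
        """One monitoring cycle; each storage unit is a dict
        {'addr': value-or-None}."""
        if first['addr'] is not None:       # (6): first is storing -> transfer
            self.held = first['addr']
            second['addr'] = first['addr']
        elif self.held is not None:         # (7): first emptied while we hold
            second['addr'] = None           # a copy -> reset the second unit
            self.held = None

first = {'addr': 7}
second = {'addr': None}
tc = TransferControl()
tc.tick(first, second)
print(second['addr'])   # 7
first['addr'] = None    # e.g. cleared by a reset-instructing signal, as in (8)
tc.tick(first, second)
print(second['addr'])   # None
```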
  • (11) An imaging apparatus including: a first exposure address storage unit and a second exposure address storage unit that store addresses of pixels at which exposure is started in a pixel array unit in which a plurality of pixels are arranged in an array; and a read address storage unit that stores an address of a pixel to be read.
  • (12) The imaging apparatus according to (11), wherein each of the first exposure address storage unit and the second exposure address storage unit includes a latch.
  • (13) The imaging apparatus according to (11) or (12), which captures images with different exposure times, wherein the latch includes a number of bits corresponding to the number of different exposure times.
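Configurations (12) and (13) suggest that each row's exposure latch carries one bit per exposure time, so that, for example, the long and short exposures of a high-dynamic-range capture can be flagged independently on the same row. A toy model (class and method names are illustrative):

```python
class ExposureLatch:
    """Sketch of configurations (12)-(13): a per-row latch with one bit
    per exposure time, marking which exposures start on that row."""
    def __init__(self, n_exposures):
        self.bits = [False] * n_exposures

    def set(self, exposure_index):
        self.bits[exposure_index] = True

    def clear(self):
        self.bits = [False] * len(self.bits)

latch = ExposureLatch(n_exposures=2)   # e.g. one long and one short exposure
latch.set(0)
print(latch.bits)  # [True, False]
```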
  • (14) The imaging apparatus according to any one of (11) to (13), further including a third exposure address storage unit that has the same configuration as the first exposure address storage unit and stores the address stored in the first exposure address storage unit.
  • (15) The imaging apparatus according to (14), wherein the address stored in the first exposure address storage unit is transferred to the second exposure address storage unit and the third exposure address storage unit at a predetermined timing.
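Configurations (9) to (15) target captures in which one output image is combined from exposures of different lengths, each with its own shutter-address timing. A toy per-pixel combination illustrates the kind of merge involved; the saturation rule and weighting are illustrative assumptions, not the application's method:

```python
def combine_exposures(short, long, ratio, sat=255):
    """Merge a short and a long exposure of the same pixel row into one
    high-dynamic-range row: use the long exposure where it is not
    saturated, otherwise scale the short exposure by the exposure-time
    ratio so both are on the same brightness scale."""
    return [l if l < sat else s * ratio for s, l in zip(short, long)]

# Long exposure is 8x the short one; the second pixel saturates in it.
print(combine_exposures([10, 40], [80, 255], ratio=8))  # [80, 320]
```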
  • 11 imaging device, 12 pixel array unit, 13 address decoder, 14 pixel timing drive unit, 15 column signal processing unit, 16 sensor controller, 21 pixel, 22 photodiode, 23 transfer transistor, 24 floating diffusion, 25 amplification transistor, 26 selection transistor, 27 reset transistor, 31 constant current section, 32 DA converter, 33 AD converter, 34 and 35 capacitors, 36 comparator, 37 counter, 101 shutter address storage section, 102 read address storage section, 121 first address storage unit, 122 second address storage unit, 141 latch, 151 status monitoring unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to an imaging device that makes it possible to control the exposure time with precision. The imaging device according to the invention includes: a first address storage unit that stores the address of a pixel at which exposure is started within a pixel array section in which multiple pixels are arranged in an array; and a second address storage unit that stores an address transferred from the first address storage unit, the start of exposure being controlled on the basis of the address stored in the second address storage unit. The imaging device is further provided with a transfer control unit for controlling the transfer of addresses from the first address storage unit to the second address storage unit. The present invention can be applied, for example, to an imaging device in which a single image is generated by combining images captured with different exposure times.
PCT/JP2018/021144 2017-06-16 2018-06-01 Dispositif de capture d'images Ceased WO2018230367A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-118894 2017-06-16
JP2017118894A JP2019004382A (ja) 2017-06-16 2017-06-16 撮像装置

Publications (1)

Publication Number Publication Date
WO2018230367A1 true WO2018230367A1 (fr) 2018-12-20

Family

ID=64659525

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/021144 Ceased WO2018230367A1 (fr) 2017-06-16 2018-06-01 Dispositif de capture d'images

Country Status (2)

Country Link
JP (1) JP2019004382A (fr)
WO (1) WO2018230367A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7374630B2 (ja) * 2019-07-09 2023-11-07 キヤノン株式会社 撮像装置及びその駆動方法
JP2022070502A (ja) * 2020-10-27 2022-05-13 ソニーセミコンダクタソリューションズ株式会社 撮像装置及び電子機器

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010057097A (ja) * 2008-08-29 2010-03-11 Sony Corp 固体撮像素子およびカメラシステム
WO2011114450A1 (fr) * 2010-03-17 2011-09-22 キヤノン株式会社 Dispositif de prélèvement d'image et système de prélèvement d'image
WO2013145487A1 (fr) * 2012-03-27 2013-10-03 ソニー株式会社 Dispositif de traitement d'image, élément de capture d'image, procédé de traitement d'image et programme
WO2017018188A1 (fr) * 2015-07-24 2017-02-02 ソニーセミコンダクタソリューションズ株式会社 Capteur d'image et dispositif électronique

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113630565A (zh) * 2021-07-09 2021-11-09 中国科学院西安光学精密机械研究所 具备机内实时图像处理功能的scmos成像电路及方法
EP4216565A1 (fr) * 2022-01-21 2023-07-26 Samsung Electronics Co., Ltd. Capteur d'image et dispositif électronique le comprenant
US12200386B2 (en) 2022-01-21 2025-01-14 Samsung Electronics Co., Ltd Image sensor and electronic device comprising the same

Also Published As

Publication number Publication date
JP2019004382A (ja) 2019-01-10

Similar Documents

Publication Publication Date Title
TWI846754B (zh) 固體攝像裝置、信號處理晶片、及電子機器
CN115665570B (zh) 摄像装置
CN111149353B (zh) 固态成像元件、成像装置和控制固态成像元件的方法
WO2017163890A1 (fr) Appareil d'imagerie à semi-conducteurs, procédé de commande d'appareil d'imagerie à semi-conducteurs et dispositif électronique
JP7309713B2 (ja) コンパレータ及び撮像装置
US11284024B2 (en) Solid-state imaging device, driving method, and electronic device
WO2018230367A1 (fr) Dispositif de capture d'images
US11451725B2 (en) Solid-state imaging element, imaging apparatus, and method for controlling solid-state imaging element
TWI809037B (zh) 固體攝像元件及攝像裝置
WO2019193801A1 (fr) Élément d'imagerie à semi-conducteurs, appareil électronique et procédé de commande d'un élément d'imagerie à semi-conducteurs
WO2022172586A1 (fr) Élément d'imagerie à semi-conducteurs, dispositif d'imagerie et procédé de commande d'un élément d'imagerie à semi-conducteurs
US20240406590A1 (en) Solid-state image sensor, imaging apparatus, and control method for solid-state image sensor
WO2023166848A1 (fr) Dispositif d'imagerie, dispositif de traitement d'image, et procédé de commande de dispositif d'imagerie
WO2024009343A1 (fr) Dispositif de détection optique
JP2023005963A (ja) 固体撮像素子、電子機器、および、固体撮像素子の制御方法
US20250030959A1 (en) Solid-state imaging element, imaging device, and method for controlling solid-state imaging element
US20250203201A1 (en) Imaging apparatus and electronic device
WO2024195290A1 (fr) Élément d'imagerie à semi-conducteurs, dispositif d'imagerie, et procédé permettant de commander un élément imagerie à semi-conducteurs
WO2022091755A1 (fr) Dispositif d'imagerie et instrument électronique
WO2024157605A1 (fr) Dispositif d'imagerie et appareil électronique
KR20250163336A (ko) 고체 촬상 소자, 촬상 장치, 및 고체 촬상 소자의 제어 방법
WO2025192024A1 (fr) Capteur d'image et procédé de commande pour capteur d'image
CN118355670A (zh) 固态成像元件、成像装置和用于控制固态成像元件的方法
TW202431859A (zh) 固態攝像裝置
WO2025211006A1 (fr) Capteur d'image et dispositif d'imagerie

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18817276

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18817276

Country of ref document: EP

Kind code of ref document: A1