WO2020170565A1 - Signal Processing Method and Imaging Device - Google Patents
Signal Processing Method and Imaging Device
- Publication number
- WO2020170565A1 (PCT/JP2019/048497)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- pixels
- pixel group
- data
- exposure time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- H04N25/58—Control of the dynamic range involving two or more exposures
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B35/08—Stereoscopic photography by simultaneous recording
- H04N23/12—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
- H04N23/672—Focus control based on electronic image sensor signals based on the phase difference signals
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/583—Control of the dynamic range involving two or more exposures acquired simultaneously with different integration times
- H04N25/585—Control of the dynamic range involving two or more exposures acquired simultaneously with pixels having different sensitivities within the sensor, e.g. fast or slow pixels or pixels having different sizes
- H04N25/704—Pixels specially adapted for focusing, e.g. phase difference pixel sets
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
Definitions
- The present disclosure relates to a signal processing method and an imaging device.
- An imaging device that generates an HDR (High Dynamic Range) image, which has a wider dynamic range than an image obtained with proper exposure, is known (for example, see Patent Document 1).
- The imaging device of the present disclosure includes a plurality of pixels, each including a photoelectric conversion element, arranged in a matrix on a light receiving surface; a plurality of light receiving lenses, one provided for each group of pixels among the plurality of pixels; and a control unit that controls the exposure times of the plurality of pixels.
- The control unit controls the exposure times of the plurality of pixels so that, among the pixels corresponding to each light receiving lens, at least two pixels have the same exposure time and at least two pixels have mutually different exposure times.
- In the imaging device according to the first aspect of the present disclosure, the exposure times of the plurality of pixels are controlled so that, of the pixels corresponding to each light receiving lens, at least two pixels have the same exposure time and at least two pixels have mutually different exposure times.
- Phase difference data is generated for each exposure time, and an HDR image can be generated from the phase difference data for the different exposure times and from the plurality of image data with different exposure times.
- HDR: High Dynamic Range
- The signal processing method according to the second aspect of the present disclosure is a method for processing signals in an image pickup apparatus that includes a plurality of pixels, each including a photoelectric conversion element, arranged in a matrix on a light receiving surface, and a plurality of light receiving lenses, one provided for each group of pixels among the plurality of pixels. The method includes the following two steps: (1) controlling the exposure times of the plurality of pixels so that, among the pixels corresponding to each light receiving lens, at least two pixels have the same exposure time and at least two pixels have mutually different exposure times; and (2) generating phase difference data for each exposure time from the image data obtained under this exposure control, and generating an HDR image from the plurality of image data with different exposure times.
- In the signal processing method according to the second aspect of the present disclosure, the exposure times of the plurality of pixels are controlled so that, among the pixels corresponding to each light receiving lens, at least two pixels have the same exposure time and at least two pixels have mutually different exposure times.
- Phase difference data is then generated for each exposure time from the image data obtained under this exposure control, and an HDR image can be generated from the plurality of phase difference data with different exposure times and the plurality of image data with different exposure times.
- FIG. 3 is a diagram illustrating an example of an HDR image generation procedure in the image pickup apparatus of FIG. 1.
- A diagram illustrating an example of the imaging procedure in the imaging device of FIG. 1.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 13 is a diagram illustrating an example of a wiring layout of the pixel array section in FIG. 12.
- FIG. 14 is a diagram illustrating an example of the directions of phase differences that can be detected by the pixel array section of FIG. 12.
- FIG. 13 is a diagram illustrating an example of an HDR image generation procedure in an imaging device including the pixel array unit in FIG. 12.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 17 is a diagram illustrating an example of an HDR image generation procedure in an image pickup apparatus including the pixel array unit in FIG. 16.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 19 is a diagram illustrating an example of a procedure of generating an HDR image in an imaging device including the pixel array unit of FIG. 18.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 21 is a diagram illustrating an example of an HDR image generation procedure in an image pickup apparatus including the pixel array unit of FIG. 20.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 23 is a diagram illustrating an example of an HDR image generation procedure in an imaging device including the pixel array unit in FIG. 22.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- A diagram illustrating an example of the wiring layout of that pixel array unit.
- A diagram illustrating an example of the directions of phase differences that can be detected by that pixel array unit.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 28 is a diagram illustrating an example of a wiring layout of the pixel array unit in FIG. 27.
- FIG. 29 is a diagram illustrating an example of a procedure of generating an HDR image in an imaging device including the pixel array unit of FIGS. 27 and 28.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- A diagram illustrating an example of the wiring layout of that pixel array unit.
- A diagram illustrating an example of the directions of phase differences that can be detected by that pixel array unit.
- FIG. 31 is a diagram illustrating an example of a procedure of generating an HDR image in an imaging device including the pixel array unit of FIG. 30.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 35 is a diagram illustrating an example of a wiring layout of the pixel array unit in FIG. 34.
- FIG. 35 is a diagram illustrating an example of directions of phase differences that can be detected by the pixel array unit in FIG. 34.
- FIG. 35 is a diagram illustrating an example of an HDR image generation procedure in an imaging device including the pixel array unit in FIG. 34.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 39 is a diagram illustrating an example of a wiring layout of the pixel array unit in FIG. 38.
- FIG. 39 is a diagram illustrating an example of directions of phase differences that can be detected by the pixel array unit in FIG. 38.
- FIG. 39 is a diagram illustrating an example of directions of phase differences that can be detected by the pixel array unit in FIG. 38.
- FIG. 39 is a diagram illustrating an example of an HDR image generation procedure in an imaging device including the pixel array unit in FIG. 38.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 43 is a diagram illustrating an example of a wiring layout of the pixel array unit in FIG. 42.
- FIG. 43 is a diagram illustrating an example of directions of phase differences that can be detected by the pixel array unit in FIG. 42.
- FIG. 43 is a diagram illustrating an example of an HDR image generation procedure in an imaging device including the pixel array unit in FIG. 42.
- FIG. 9 is a diagram illustrating a modified example of the configuration of the pixel array unit in FIG. 2.
- FIG. 47 is a diagram illustrating an example of a wiring layout of the pixel array unit in FIG. 46.
- FIG. 47 is a diagram illustrating an example of directions of phase differences that can be detected by the pixel array unit in FIG. 46.
- FIG. 47 is a diagram illustrating an example of an HDR image generation procedure in an imaging device including the pixel array unit in FIG. 46.
- A block diagram showing an example of a schematic configuration of a vehicle control system.
- An explanatory diagram showing an example of the installation positions of a vehicle exterior information detection unit and an imaging unit.
- A diagram showing an example of a schematic configuration of an endoscopic surgery system.
- A block diagram showing an example of the functional configuration of a camera head and a CCU.
- Embodiment: imaging device (FIGS. 1 to 9)
- Modifications: imaging device (FIGS. 10 to 49)
- Application examples: application to a mobile body (FIGS. 50 and 51)
- FIG. 1 illustrates an example of a schematic configuration of the imaging device 1.
- The imaging device 1 is an electronic device such as a digital still camera, a video camera, a smartphone, or a tablet terminal.
- The imaging device 1 includes an image sensor 10, an arithmetic unit 20, a frame memory 30, a display unit 40, a storage unit 50, an operation unit 60, a power supply unit 70, and an optical system 80.
- The image sensor 10, the arithmetic unit 20, the frame memory 30, the display unit 40, the storage unit 50, the operation unit 60, and the power supply unit 70 are connected to one another via a bus line L.
- The optical system 80 includes one or more lenses; it guides light (incident light) from a subject to the image sensor 10 and forms an image on the light receiving surface of the image sensor 10.
- The image sensor 10 outputs a pixel signal (image data) corresponding to the light imaged on its light receiving surface via the optical system 80.
- The image sensor 10 is, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- The internal configuration of the image sensor 10 will be described in detail later.
- The arithmetic unit 20 is a signal processing circuit that processes the pixel signals (image data) output from the image sensor 10.
- The arithmetic unit 20 generates an HDR image based on the pixel signals (image data). The signal processing procedure in the arithmetic unit 20 will be described in detail later.
- The frame memory 30 temporarily holds image data (for example, HDR image data) obtained by the signal processing in the arithmetic unit 20, in units of frames.
- The display unit 40 includes, for example, a panel-type display device such as a liquid crystal panel or an organic EL (Electro Luminescence) panel, and displays moving images or still images captured by the image sensor 10.
- The storage unit 50 records the image data of moving images or still images captured by the image sensor 10 on a recording medium such as a semiconductor memory or a hard disk.
- The operation unit 60 issues operation commands for the various functions of the imaging device 1 in accordance with user operations.
- For example, the operation unit 60 drives the image sensor 10 by outputting a drive signal that controls the transfer operation of the image sensor 10 in accordance with an imaging instruction from the user.
- The power supply unit 70 supplies operating power to the image sensor 10, the arithmetic unit 20, the frame memory 30, the display unit 40, the storage unit 50, and the operation unit 60 as appropriate.
- FIG. 2 illustrates an example of a schematic configuration of the image sensor 10.
- The image sensor 10 includes a pixel array unit 110 in which a plurality of sensor pixels 111, each including a photoelectric conversion element, are two-dimensionally arranged in a matrix.
- Each sensor pixel 111 includes, for example, as shown in FIG. 3, a pixel circuit 112 that performs photoelectric conversion and a readout circuit 113 that outputs a pixel signal based on the charge output from the pixel circuit 112.
- The pixel circuit 112 includes, for example, a photodiode PD, a transfer transistor TR electrically connected to the photodiode PD, and a floating diffusion FD that temporarily holds the charge output from the photodiode PD via the transfer transistor TR.
- The photodiode PD performs photoelectric conversion and generates a charge corresponding to the amount of received light.
- The cathode of the photodiode PD is connected to the source of the transfer transistor TR, and the anode of the photodiode PD is connected to a reference potential line (e.g., ground).
- The drain of the transfer transistor TR is connected to the floating diffusion FD, and the gate of the transfer transistor TR is connected to the pixel drive line ctl1.
- In each pixel circuit 112, the floating diffusion FD is connected to the input terminal of the corresponding readout circuit 113.
- The readout circuit 113 includes, for example, a reset transistor RST, a selection transistor SEL, and an amplification transistor AMP.
- The source of the reset transistor RST (the input terminal of the readout circuit 113) is connected to the floating diffusion FD, and the drain of the reset transistor RST is connected to the power supply line VDD and the drain of the amplification transistor AMP.
- The gate of the reset transistor RST is connected to the pixel drive line ctl2.
- The source of the amplification transistor AMP is connected to the drain of the selection transistor SEL, and the gate of the amplification transistor AMP is connected to the source of the reset transistor RST.
- The source of the selection transistor SEL (the output terminal of the readout circuit 113) is connected to the vertical signal line vsl, and the gate of the selection transistor SEL is connected to the pixel drive line ctl3.
- When turned on, the transfer transistor TR transfers the charge of the photodiode PD to the floating diffusion FD.
- The reset transistor RST resets the potential of the floating diffusion FD to a predetermined potential.
- The selection transistor SEL controls the output timing of the pixel signal from the readout circuit 113.
- The amplification transistor AMP generates, as the pixel signal, a signal whose voltage corresponds to the level of the charge held in the floating diffusion FD; that is, it generates a signal whose voltage corresponds to the amount of light received by the sensor pixel 111.
- The amplification transistor AMP forms a source-follower amplifier and outputs a pixel signal whose voltage corresponds to the level of the charge generated in the photodiode PD.
- When the selection transistor SEL is turned on, the amplification transistor AMP amplifies the potential of the floating diffusion FD and outputs a voltage corresponding to that potential to the column signal processing circuit 122 (described later) via the vertical signal line vsl.
- The selection transistor SEL may instead be provided between the power supply line VDD and the amplification transistor AMP.
- In this case, the drain of the reset transistor RST is connected to the power supply line VDD and the drain of the selection transistor SEL.
- The source of the selection transistor SEL is connected to the drain of the amplification transistor AMP, and the gate of the selection transistor SEL is connected to the pixel drive line ctl3.
- The source of the amplification transistor AMP (the output terminal of the readout circuit 113) is connected to the vertical signal line vsl, and the gate of the amplification transistor AMP is connected to the source of the reset transistor RST.
- The image sensor 10 further includes a logic circuit 120 that processes the pixel signals.
- The logic circuit 120 includes, for example, a vertical drive circuit 121, a column signal processing circuit 122, a horizontal drive circuit 123, and a system control circuit 124.
- The logic circuit 120 outputs a digital value for each sensor pixel 111 to the outside.
- Based on a master clock, the system control circuit 124 generates clock signals, control signals, and the like that serve as references for the operations of the vertical drive circuit 121, the column signal processing circuit 122, the horizontal drive circuit 123, and so on, and supplies them to those circuits.
- The vertical drive circuit 121 includes, for example, a shift register and controls row scanning of the sensor pixels 111 via a plurality of pixel drive lines ctl (for example, ctl1, ctl2, ctl3, and ctlM, ctlL, ctlS described later).
- The column signal processing circuit 122 performs correlated double sampling (CDS) processing on the pixel signal output from each sensor pixel 111 in the row selected by the vertical drive circuit 121.
- By performing the CDS processing, the column signal processing circuit 122 extracts the signal level of each pixel signal and holds pixel data corresponding to the amount of light received by each sensor pixel 111.
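- To make the CDS step concrete, a minimal sketch is shown below; the function name and the assumption that the reset level is sampled before the signal level are ours, not taken from the patent.

```python
import numpy as np

# Minimal sketch of correlated double sampling (our illustration, not the
# patent's implementation). We assume the reset level of each pixel is
# sampled first and the signal level after charge transfer second.
def correlated_double_sampling(reset_levels: np.ndarray,
                               signal_levels: np.ndarray) -> np.ndarray:
    # Subtracting the two samples cancels the per-pixel offset and reset
    # (kTC) noise, leaving a level proportional to the light received.
    # In a source-follower pixel the signal level lies below the reset
    # level, so reset - signal gives a positive value.
    return reset_levels - signal_levels
```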
- The column signal processing circuit 122 includes, for example, a plurality of ADCs (analog-to-digital conversion circuits), one for each vertical signal line vsl.
- Each ADC converts the analog pixel signal output from each sensor pixel 111 in its column into a digital signal and outputs it.
- The ADC converts the potential (analog signal) of the vertical signal line vsl into a digital signal by, for example, placing a ramp-waveform voltage (ramp voltage) and a counter value in one-to-one correspondence.
- The ADC converts a change in the ramp voltage into a length of time and obtains a digital value by counting that time with a clock of a fixed period.
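- As an illustration of this single-slope conversion, here is a minimal sketch; the ramp step, bit depth, and function name are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of single-slope (ramp) AD conversion as described above.
def single_slope_adc(pixel_voltage: float,
                     ramp_step: float = 0.001,  # assumed ramp rise per clock (V)
                     n_bits: int = 10) -> int:
    count, ramp = 0, 0.0
    # The counter runs at a fixed clock while the ramp voltage rises; the
    # count at which the ramp crosses the pixel voltage is the digital value.
    while ramp < pixel_voltage and count < (1 << n_bits) - 1:
        count += 1
        ramp = count * ramp_step
    return count

# e.g., single_slope_adc(0.512) is about 512 with a 1 mV/clock ramp
```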
- The horizontal drive circuit 123 includes a shift register or the like and controls the column addressing and column scanning of the ADCs in the column signal processing circuit 122. Under the control of the horizontal drive circuit 123, the N-bit digital signals converted by the ADCs are sequentially read out to a horizontal output line and output as imaging data via that line.
- FIG. 4 shows a configuration example of the pixel array unit 110.
- The pixel array unit 110 has a plurality of color filters CF and a plurality of light receiving lenses OCL on its light receiving surface 110A.
- The color filters CF are provided one for every 2 rows × 2 columns (that is, four) of sensor pixels 111 (photodiodes PD).
- The light receiving lenses OCL are provided one for each color filter CF; that is, the light receiving lenses OCL are likewise provided one for every 2 rows × 2 columns (that is, four) of sensor pixels 111.
- The light incident on each light receiving lens OCL is condensed by that lens, passes through the corresponding color filter CF, and then enters the corresponding 2 rows × 2 columns of sensor pixels 111.
- The 2 rows × 2 columns of sensor pixels 111 corresponding to one light receiving lens OCL are referred to as a single-color sensor pixel group P1.
- The color filters CF consist of color filters CFr that transmit light in the red wavelength range, color filters CFg that transmit light in the green wavelength range, and color filters CFb that transmit light in the blue wavelength range.
- The color filters CF form a Bayer array on the light receiving surface; because of this, phase difference data can be acquired periodically in both the row direction and the column direction.
- The color filters CFr, CFg, and CFb are arranged on the light receiving surface 110A at a ratio of 1:2:1.
- The 2 rows × 2 columns of single-color sensor pixel groups P1 forming one Bayer unit are referred to as a three-color sensor pixel group P2.
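- The filter layout described above (often called a quad-Bayer arrangement) can be sketched as follows; this is our illustration of the 2 × 2-per-filter layout, and the function and variable names are assumptions:

```python
import numpy as np

# Build the color filter map: each filter covers a 2x2 block of sensor
# pixels, and the filters themselves form a Bayer (RGGB) pattern, so R:G:B
# blocks occur at a 1:2:1 ratio.
def quad_bayer_cfa(rows_p2: int, cols_p2: int) -> np.ndarray:
    bayer_unit = np.array([["R", "G"],
                           ["G", "B"]])               # one three-color group P2
    cfa = np.tile(bayer_unit, (rows_p2, cols_p2))     # Bayer pattern of filters
    cfa = cfa.repeat(2, axis=0).repeat(2, axis=1)     # each filter covers 2x2 pixels
    return cfa

print(quad_bayer_cfa(1, 1))
# [['R' 'R' 'G' 'G']
#  ['R' 'R' 'G' 'G']
#  ['G' 'G' 'B' 'B']
#  ['G' 'G' 'B' 'B']]
```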
- The exposure time of each sensor pixel 111 (photodiode PD) is controlled by the system control circuit 124.
- FIG. 4 shows an example of the exposure time of each sensor pixel 111 as controlled by the system control circuit 124.
- In each single-color sensor pixel group P1, the exposure time of two of the four sensor pixels 111 (photodiodes PD) is set to “Middle”; of the remaining two, one has its exposure time set to “Short” and the other to “Long”.
- That is, each single-color sensor pixel group P1 contains sensor pixels 111 (photodiodes PD) with three different exposure times, including two sensor pixels whose exposure times are the same.
- In other words, the system control circuit 124 controls the exposure times of the sensor pixels 111 (photodiodes PD) so that, in each single-color sensor pixel group P1, three sensor pixels have mutually different exposure times while two sensor pixels share the same exposure time.
- In each single-color sensor pixel group P1, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged on the rising-to-the-right diagonal is set to “Middle”, as shown in FIG. 4; that is, their exposure times are set equal to each other. Further, as shown in FIG. 4, the exposure time of the lower right sensor pixel 111 (photodiode PD) is set to “Short”, and that of the upper left sensor pixel 111 (photodiode PD) is set to “Long”.
- A pixel drive line ctlM is connected to each sensor pixel 111 whose exposure time is set to “Middle”. That is, of the two pixel drive lines ctlM assigned to each single-color sensor pixel group P1, one is connected to the upper right sensor pixel 111 of the group and the other to the lower left sensor pixel 111.
- A pixel drive line ctlS is connected to each sensor pixel 111 whose exposure time is set to “Short”; that is, the pixel drive line ctlS is connected to the lower right sensor pixel 111 of the group.
- A pixel drive line ctlL is connected to each sensor pixel 111 whose exposure time is set to “Long”; that is, the pixel drive line ctlL is connected to the upper left sensor pixel 111 of the group.
- The pixel array unit 110 has a pixel drive line ctlL and a pixel drive line ctlM at locations corresponding to the upper stage of each single-color sensor pixel group P1, and a pixel drive line ctlM and a pixel drive line ctlS at locations corresponding to the lower stage.
- For example, the system control circuit 124 outputs a control signal on the pixel drive line ctlM so that the exposure time of each sensor pixel 111 connected to the pixel drive line ctlM is controlled to “Middle”.
- Similarly, the system control circuit 124 outputs a control signal on the pixel drive line ctlS so that the exposure time of each sensor pixel 111 connected to the pixel drive line ctlS is controlled to “Short”.
- Further, the system control circuit 124 outputs a control signal on the pixel drive line ctlL so that the exposure time of each sensor pixel 111 connected to the pixel drive line ctlL is controlled to “Long”. In this way, the system control circuit 124 controls the exposure time of each sensor pixel 111.
- The image sensor 10 outputs the image data Ia obtained under this control.
- The image data Ia contains pixel data Sig of X rows × Y columns, corresponding to the X rows × Y columns of sensor pixels 111 in the pixel array unit 110.
- Of the X rows × Y columns of pixel data Sig, the 2 rows × 2 columns of pixel data Sig corresponding to each single-color sensor pixel group P1 contain two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- Because the 2 rows × 2 columns of pixel data Sig corresponding to each single-color sensor pixel group P1 contain two pixel data Sig1 with the same exposure time, taking the difference between those two pixel data Sig1 yields the phase difference in the rising-to-the-right direction on the light receiving surface 110A. It follows that the pixel array unit 110 is configured so that phase difference data in one direction (the rising-to-the-right direction) is obtained from each single-color sensor pixel group P1.
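- As an illustration, a minimal sketch of this layout and of extracting the rising-to-the-right phase difference from the same-exposure pair is given below; the array indexing assumes the FIG. 4 arrangement described above and is not taken verbatim from the patent.

```python
import numpy as np

# Per 2x2 single-color group (FIG. 4 layout as described): "Long" at the
# upper left, "Middle" on the rising-to-the-right diagonal, "Short" at the
# lower right. The phase difference is the difference of the two
# same-exposure "Middle" samples.
EXPOSURE_PATTERN = np.array([["L", "M"],
                             ["M", "S"]])  # one single-color sensor pixel group P1

def rising_diagonal_phase_difference(ia: np.ndarray) -> np.ndarray:
    """ia: raw frame tiled with the pattern above (even numbers of rows/cols)."""
    middle_upper_right = ia[0::2, 1::2].astype(float)  # upper right "Middle" pixels
    middle_lower_left = ia[1::2, 0::2].astype(float)   # lower left "Middle" pixels
    return middle_upper_right - middle_lower_left      # one sample per group P1
```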
- Alternatively, in each single-color sensor pixel group P1, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged on the falling-to-the-right diagonal may be set to “Middle”, as shown in FIG.; that is, their exposure times may be set equal to each other.
- In this case, of the two pixel drive lines ctlM assigned to each single-color sensor pixel group P1, one is connected to the upper left sensor pixel 111 of the group and the other to the lower right sensor pixel 111.
- The pixel drive line ctlS is connected to each sensor pixel 111 whose exposure time is set to “Short”, that is, to the lower left sensor pixel 111 of the group.
- The pixel drive line ctlL is connected to each sensor pixel 111 whose exposure time is set to “Long”, that is, to the upper right sensor pixel 111 of the group.
- In this case, the pixel array unit 110 has a pixel drive line ctlM and a pixel drive line ctlL at locations corresponding to the upper stage of each single-color sensor pixel group P1, and a pixel drive line ctlS and a pixel drive line ctlM at locations corresponding to the lower stage.
- Taking the difference between the two pixel data Sig1 with the same exposure time then yields the phase difference in the falling-to-the-right direction, so the pixel array unit 110 obtains phase difference data in one direction (the falling-to-the-right direction) from each single-color sensor pixel group P1.
- Alternatively, in each single-color sensor pixel group P1, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged side by side in the left-right direction may be set to “Middle”, as shown in FIG.; that is, their exposure times may be set equal to each other.
- In that case, for example, the exposure time of the upper two sensor pixels 111 (photodiodes PD) is set to “Middle”, while of the lower two sensor pixels 111 (photodiodes PD), one has its exposure time set to “Short” and the other to “Long”.
- In this case, of the two pixel drive lines ctlM assigned to each single-color sensor pixel group P1, one is connected to the upper left sensor pixel 111 of the group and the other to the upper right sensor pixel 111.
- The pixel drive line ctlS is connected to each sensor pixel 111 whose exposure time is set to “Short”, that is, to the lower right sensor pixel 111 of the group.
- The pixel drive line ctlL is connected to each sensor pixel 111 whose exposure time is set to “Long”, that is, to the lower left sensor pixel 111 of the group.
- In this case, the pixel array unit 110 has two pixel drive lines ctlM at locations corresponding to the upper stage of each single-color sensor pixel group P1, and a pixel drive line ctlL and a pixel drive line ctlS at locations corresponding to the lower stage.
- Here too, taking the difference between the two pixel data Sig1 with the same exposure time yields phase difference data, so the pixel array unit 110 obtains phase difference data in one direction (the left-right direction) from each single-color sensor pixel group P1.
- Conversely, the exposure time of the two lower sensor pixels 111 may be set to “Middle”; in that case, one of the upper two sensor pixels 111 has its exposure time set to “Short” and the other to “Long”.
- The pixel array unit 110 then has two pixel drive lines ctlM at locations corresponding to the lower stage of each single-color sensor pixel group P1, and a pixel drive line ctlL and a pixel drive line ctlS at locations corresponding to the upper stage.
- Again, taking the difference between the two pixel data Sig1 with the same exposure time yields phase difference data in one direction (the left-right direction) from each single-color sensor pixel group P1.
- Alternatively, in each single-color sensor pixel group P1, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged one above the other in the vertical direction may be set to “Middle”, as shown in FIG.; that is, their exposure times may be set equal to each other.
- In the illustrated example, the exposure time of the two photodiodes PD on the left side is set to “Middle”, and of the two sensor pixels 111 (photodiodes PD) on the right side, one has its exposure time set to “Short” and the other to “Long”.
- Specifically, of the two right-side sensor pixels 111 (photodiodes PD) in each single-color sensor pixel group P1, the exposure time of the upper one is set to “Long” and that of the lower one to “Short”.
- Of the two pixel drive lines ctlM assigned to each single-color sensor pixel group P1, one is connected to the upper left sensor pixel 111 of the group and the other to the lower left sensor pixel 111.
- The pixel drive line ctlS is connected to the lower right sensor pixel 111 of the group, and the pixel drive line ctlL to the upper right sensor pixel 111.
- In this case, the pixel array unit 110 has a pixel drive line ctlM and a pixel drive line ctlL at locations corresponding to the upper stage of each single-color sensor pixel group P1, and a pixel drive line ctlM and a pixel drive line ctlS at locations corresponding to the lower stage.
- Alternatively, of the two right-side sensor pixels 111 (photodiodes PD), the exposure time of the upper one may be set to “Short” and that of the lower one to “Long”.
- In that case, of the two pixel drive lines ctlM assigned to each single-color sensor pixel group P1, one is connected to the upper left sensor pixel 111 of the group and the other to the lower left sensor pixel 111; the pixel drive line ctlS is connected to the upper right sensor pixel 111, and the pixel drive line ctlL to the lower right sensor pixel 111.
- The pixel array unit 110 then has a pixel drive line ctlM and a pixel drive line ctlS at locations corresponding to the upper stage of each single-color sensor pixel group P1, and a pixel drive line ctlM and a pixel drive line ctlL at locations corresponding to the lower stage.
- Taking the difference between the two pixel data Sig1 with the same exposure time yields phase difference data, so the pixel array unit 110 obtains phase difference data in one direction (the vertical direction) from each single-color sensor pixel group P1.
- Alternatively, the exposure time of the two right-side sensor pixels 111 (photodiodes PD) may be set to “Middle”; in that case, one of the two left-side sensor pixels 111 (photodiodes PD) has its exposure time set to “Short” and the other to “Long”.
- Here too, taking the difference between the two pixel data Sig1 with the same exposure time in the 2 rows × 2 columns of pixel data Sig corresponding to each single-color sensor pixel group P1 yields phase difference data in one direction (the vertical direction).
- FIG. 8 shows an example of the signal processing procedure in the arithmetic unit 20.
- The arithmetic unit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- The arithmetic unit 20 first decomposes the image data Ia by exposure time (step S101): specifically, it separates the image data Ia into data with the exposure time “Middle” (image data Im), data with the exposure time “Long” (image data Il), and data with the exposure time “Short” (image data Is).
- Next, the arithmetic unit 20 generates phase difference data Pd1 based on the image data Im (step S102): it derives the difference between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 in the image data Im and generates the phase difference data Pd1 from those difference values.
- The arithmetic unit 20 also generates phase difference data Pd2 based on the image data Il and Im (step S102): it derives the difference between the image data Il and image data Im′ obtained by multiplying the image data Im by the exposure time ratio of the exposure times “Long” and “Middle”, and generates the phase difference data Pd2 from the derived difference values.
- Similarly, the arithmetic unit 20 generates phase difference data Pd3 based on the image data Im and Is (step S102): it derives the difference between the image data Im and image data Is′ obtained by multiplying the image data Is by the exposure time ratio of the exposure times “Middle” and “Short”, and generates the phase difference data Pd3 from the derived difference values.
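- A minimal sketch of steps S101 and S102 under the FIG. 4 layout assumed earlier is shown below; the exposure times t_l, t_m, t_s and the choice of which “Middle” sample to compare against are illustrative assumptions.

```python
import numpy as np

# Decompose the raw frame Ia by exposure time and derive Pd1/Pd2/Pd3.
# Layout assumed per the earlier sketch: Long upper left, Middle on the
# rising diagonal, Short lower right of each 2x2 single-color group.
def decompose_and_phase(ia, t_l, t_m, t_s):
    im_a = ia[0::2, 1::2].astype(float)  # "Middle", upper right of each group
    im_b = ia[1::2, 0::2].astype(float)  # "Middle", lower left of each group
    il = ia[0::2, 0::2].astype(float)    # "Long"
    is_ = ia[1::2, 1::2].astype(float)   # "Short"

    pd1 = im_a - im_b                    # same-exposure pair -> phase difference
    pd2 = il - im_a * (t_l / t_m)        # Il vs exposure-scaled Im'
    pd3 = im_a - is_ * (t_m / t_s)       # Im vs exposure-scaled Is'
    return (im_a, im_b, il, is_), (pd1, pd2, pd3)
```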
- Next, the arithmetic unit 20 converts the phase difference data Pd1 into level data Da representing the magnitude of the phase difference (step S103).
- The level data Da is, for example, expressed as a value in a range from a lower limit (for example, 0) to an upper limit (for example, 128).
- For example, the arithmetic unit 20 converts values of the phase difference data Pd1 below a predetermined range to the lower limit (for example, 0), converts values above the predetermined range to the upper limit (for example, 128), and converts values within the predetermined range to values in the range of 1 to 127 according to their magnitude.
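- A minimal sketch of this conversion to level data Da is shown below; the bounds lo and hi standing in for the “predetermined range”, and the linear in-range mapping, are our assumptions.

```python
import numpy as np

# Map phase-difference values into level data on a 0..128 scale (step S103).
def to_level_data(pd, lo, hi):
    pd = np.asarray(pd, dtype=float)
    da = np.empty_like(pd)
    da[pd <= lo] = 0                                   # below the range -> lower limit
    da[pd >= hi] = 128                                 # above the range -> upper limit
    inside = (pd > lo) & (pd < hi)
    # Linearly rescale in-range values onto 1..127 according to magnitude
    # (the patent only says "according to the size of the numerical value").
    da[inside] = 1 + (pd[inside] - lo) / (hi - lo) * 126
    return da
```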
- The arithmetic unit 20 also converts the phase difference data Pd2 and Pd3 into level data Db relating to moving bodies (step S104).
- The level data Db is likewise expressed as a value in a range from a lower limit (for example, 0) to an upper limit (for example, 128).
- For example, the arithmetic unit 20 generates the level data Db based on noise level data of the image sensor 10 (noise data) and the phase difference data Pd2 and Pd3.
- Next, the arithmetic unit 20 detects portions with a large phase difference from the obtained level data Da (step S105) and detects the presence or absence of a moving body from the obtained level data Db (step S106). Finally, the arithmetic unit 20 generates the HDR image data Ib from the image data Im, Il, and Is, the detected phase differences, and the detected moving bodies (step S107). In this way, the HDR image data Ib is generated.
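- A minimal sketch of steps S105 to S107 is shown below; the fusion rule is a plausible stand-in for illustration only, since the excerpt does not spell out the exact combination logic.

```python
import numpy as np

# Fuse the three exposures into an HDR plane. All array arguments are
# per-group planes of equal shape; t_l, t_m, t_s are the exposure times;
# sat_level and level_thresh are assumed parameters.
def fuse_hdr(il, im, is_, t_l, t_m, t_s, da, db,
             level_thresh=64, sat_level=1023):
    long_n = il / t_l      # normalize each plane to a common exposure scale
    mid_n = im / t_m
    short_n = is_ / t_s

    out = np.where(il >= sat_level, short_n, long_n)  # saturated Long -> use Short
    flagged = (da > level_thresh) | (db > level_thresh)
    out = np.where(flagged, mid_n, out)  # large phase diff or motion -> Middle only
    return out
```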
- FIG. 9 shows an example flowchart of the imaging operation in the imaging device 1.
- First, the user operates the operation unit 60 to instruct the imaging device 1 to start imaging (step S201). The operation unit 60 then transmits an imaging command to the image sensor 10 (step S202).
- On receiving the imaging command, the image sensor 10 (specifically, the system control circuit 124) executes imaging by a predetermined imaging method (step S203).
- For example, the system control circuit 124 performs exposure control on each sensor pixel 111 connected to the pixel drive line ctlM so that its exposure time becomes “Middle”, on each sensor pixel 111 connected to the pixel drive line ctlS so that its exposure time becomes “Short”, and on each sensor pixel 111 connected to the pixel drive line ctlL so that its exposure time becomes “Long”. In this way, the system control circuit 124 controls the exposure time of each sensor pixel 111.
- The image sensor 10 outputs the image data Ia of X rows × Y columns of pixels obtained under this control to the arithmetic unit 20.
- The arithmetic unit 20 performs predetermined signal processing (for example, generation of the HDR image data Ib) based on the image data Ia input from the image sensor 10 (step S204).
- The arithmetic unit 20 holds the image data obtained by this signal processing (for example, the HDR image data Ib) in the frame memory 30, and the held image data is then recorded in the storage unit 50 (step S205). In this way, imaging by the imaging device 1 is performed.
- As described above, in the present embodiment the image sensor 10 is configured to obtain phase difference data in one direction (the rising-to-the-right, falling-to-the-right, left-right, or up-down direction). The presence or absence of a phase difference and the presence or absence of a moving body can thereby be determined in that direction.
- In the above embodiment, the pixel array unit 110 is configured to obtain phase difference data in one direction from each single-color sensor pixel group P1.
- However, the pixel array unit 110 may instead be configured to obtain phase difference data in two directions from each three-color sensor pixel group P2.
- FIG. 10 shows a configuration example of the pixel array section 110 according to this modification.
- In this modification, the pixel array unit 110 is configured to obtain phase difference data in the rising-to-the-right and falling-to-the-right directions from each three-color sensor pixel group P2.
- In this modification, the single-color sensor pixel groups P1 corresponding to the color filters CFr and CFb have the same configuration as the single-color sensor pixel group P1 according to the above embodiment.
- In each three-color sensor pixel group P2, the single-color sensor pixel group P1 corresponding to one color filter CFg (hereinafter, “single-color sensor pixel group Pa”) also has the same configuration as the single-color sensor pixel group P1 according to the above embodiment.
- In each three-color sensor pixel group P2, the single-color sensor pixel group P1 corresponding to the other color filter CFg (hereinafter, “single-color sensor pixel group Pb”) has a configuration different from that of the single-color sensor pixel group P1 according to the above embodiment.
- In each single-color sensor pixel group Pa, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the rising-to-the-right direction is set to “Middle”, as shown in FIG. 10; that is, their exposure times are set equal to each other. Further, the exposure time of the lower right sensor pixel 111 (photodiode PD) is set to “Short”, and that of the upper left sensor pixel 111 (photodiode PD) is set to “Long”.
- In each single-color sensor pixel group Pb, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the falling-to-the-right direction is set to “Middle”; that is, their exposure times are set equal to each other. The exposure time of the lower left sensor pixel 111 (photodiode PD) is set to “Short”, and that of the upper right sensor pixel 111 (photodiode PD) is set to “Long”.
- In other words, the system control circuit 124 controls the exposure times of the sensor pixels 111 (photodiodes PD) so that, in each single-color sensor pixel group Pb, three sensor pixels have mutually different exposure times while two sensor pixels share the same exposure time.
- In each single-color sensor pixel group Pa, one pixel drive line ctlM is connected to the upper right sensor pixel 111 and the other pixel drive line ctlM to the lower left sensor pixel 111.
- The pixel drive line ctlS is connected to the lower right sensor pixel 111 of the single-color sensor pixel group Pa, and the pixel drive line ctlL to the upper left sensor pixel 111.
- In each single-color sensor pixel group Pb, one pixel drive line ctlM is connected to the upper left sensor pixel 111 and the other pixel drive line ctlM to the lower right sensor pixel 111.
- The pixel drive line ctlS is connected to the lower left sensor pixel 111 of the single-color sensor pixel group Pb, and the pixel drive line ctlL to the upper right sensor pixel 111.
- The pixel array unit 110 has a pixel drive line ctlL and a pixel drive line ctlM at locations corresponding to the upper stage of each single-color sensor pixel group P1, and a pixel drive line ctlM and a pixel drive line ctlS at locations corresponding to the lower stage. As a result, the presence or absence of a phase difference and the presence or absence of a moving body can be determined in two directions (the rising-to-the-right direction and the falling-to-the-right direction).
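- As an illustration, the sketch below extracts the two diagonal phase differences from one three-color group P2; the placement of the Pa and Pb groups inside the 4 × 4 block is an assumption for illustration, not specified here by the patent.

```python
import numpy as np

# Within one three-color group P2, the Pa group pairs its "Middle" pixels on
# the rising diagonal and the Pb group on the falling diagonal, so each P2
# yields phase differences in two directions.
def two_direction_phase(p2: np.ndarray):
    """p2: 4x4 raw block = one three-color sensor pixel group P2."""
    pa = p2[0:2, 2:4]  # e.g., a green group wired like Pa (rising diagonal pair)
    pb = p2[2:4, 0:2]  # e.g., the other green group wired like Pb (falling pair)
    pd_rising = pa[0, 1] - pa[1, 0]   # upper right minus lower left ("Middle" pair)
    pd_falling = pb[0, 0] - pb[1, 1]  # upper left minus lower right ("Middle" pair)
    return pd_rising, pd_falling
```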
- In each three-color sensor pixel group P2, the single-color sensor pixel group Pa may be arranged at the upper right and the single-color sensor pixel group Pb at the lower left, or the single-color sensor pixel group Pa at the lower left and the single-color sensor pixel group Pb at the upper right. Further, the plurality of single-color sensor pixel groups Pa and the plurality of single-color sensor pixel groups Pb may be arranged alternately in two directions within the light receiving surface 110A (the rising-to-the-right and falling-to-the-right directions).
- Alternatively, the pixel array unit 110 may be configured to obtain phase difference data in the left-right direction and the up-down direction from each three-color sensor pixel group P2, as shown in FIG. 11, for example.
- In this case, the pixel array unit 110 may be configured to obtain phase difference data in the left-right direction from the single-color sensor pixel group P1 corresponding to one color filter CFg in each three-color sensor pixel group P2 (hereinafter, “single-color sensor pixel group Pc”), and phase difference data in the up-down direction from the single-color sensor pixel group P1 corresponding to the other color filter CFg (hereinafter, “single-color sensor pixel group Pd”).
- In each single-color sensor pixel group Pc, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the upper stage is set to “Middle”, as shown in FIG. 11; that is, their exposure times are set equal to each other. Of the two sensor pixels 111 (photodiodes PD) arranged in the lower stage, one has its exposure time set to “Short” and the other to “Long”.
- In each single-color sensor pixel group Pd, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged on the left side is set to “Middle”, as shown in FIG. 11; that is, their exposure times are set equal to each other. Further, the exposure time of the upper right sensor pixel 111 (photodiode PD) is set to “Long”, and that of the lower right sensor pixel 111 (photodiode PD) is set to “Short”.
- In each single-color sensor pixel group Pc, one pixel drive line ctlM is connected to the upper left sensor pixel 111 and the other pixel drive line ctlM to the upper right sensor pixel 111.
- The pixel drive line ctlS is connected to the lower right sensor pixel 111 of the single-color sensor pixel group Pc, and the pixel drive line ctlL to the lower left sensor pixel 111.
- In each single-color sensor pixel group Pd, one pixel drive line ctlM is connected to the upper left sensor pixel 111 and the other pixel drive line ctlM to the lower left sensor pixel 111.
- The pixel drive line ctlS is connected to the lower right sensor pixel 111 of the single-color sensor pixel group Pd, and the pixel drive line ctlL to the upper right sensor pixel 111.
- In each single-color sensor pixel group Pc, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the lower stage may instead be set to “Middle”; that is, their exposure times may be set equal to each other. Likewise, in each single-color sensor pixel group Pd, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged on the right side may be set to “Middle”; that is, their exposure times may be set equal to each other.
- When the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the upper stage of each single-color sensor pixel group Pc is set to “Middle”, the pixel array unit 110 has two pixel drive lines ctlM at locations corresponding to the upper stage of each single-color sensor pixel group Pc, and a pixel drive line ctlS and a pixel drive line ctlL at locations corresponding to the lower stage.
- When the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the lower stage is set to “Middle”, the pixel array unit 110 has two pixel drive lines ctlM at locations corresponding to the lower stage of each single-color sensor pixel group Pc, and a pixel drive line ctlS and a pixel drive line ctlL at locations corresponding to the upper stage.
- When the exposure time of the two sensor pixels 111 (photodiodes PD) arranged on the left side of each single-color sensor pixel group Pd is set to “Middle”, the pixel array unit 110 has one pixel drive line ctlM at the location corresponding to the upper stage of each single-color sensor pixel group Pd and one pixel drive line ctlM at the location corresponding to the lower stage; the same applies when the exposure time of the two sensor pixels 111 (photodiodes PD) arranged on the right side is set to “Middle”.
- In this modification, the image sensor 10 is configured to obtain phase difference data in two directions (the left-right direction and the up-down direction). This makes it possible to determine the presence/absence of a phase difference and the presence/absence of a moving body in two directions.
- the pixel array section 110 is configured such that phase difference data in two directions can be obtained from each of the three-color sensor pixel groups P2.
- The pixel array unit 110 may instead be configured so that phase difference data in three directions is obtained from a group of 2 rows × 2 columns of three-color sensor pixel groups P2 (hereinafter referred to as the “three-color sensor pixel group P3”).
- FIG. 12 shows a configuration example of the pixel array section 110 according to this modification.
- FIG. 13 shows an example of a wiring layout of the pixel array section 110 of FIG.
- FIG. 14 shows an example of the direction of the phase difference that can be detected by the pixel array section 110 of FIG.
- the pixel array unit 110 is configured to obtain phase difference data in the upward-sloping direction, the vertical direction, and the horizontal direction from each three-color sensor pixel group P3.
- The single-color sensor pixel groups P1 corresponding to the color filters CFr and CFb (hereinafter referred to as “single-color sensor pixel groups Ph”) have the same configuration as the single-color sensor pixel group P1 according to the above-described embodiment.
- The single-color sensor pixel group P1 corresponding to one color filter CFg (hereinafter also referred to as “single-color sensor pixel group Pe”) has the same configuration as the single-color sensor pixel group P1 according to the above-described embodiment.
- The single-color sensor pixel group P1 corresponding to another color filter CFg (hereinafter referred to as “single-color sensor pixel group Pf”) has a configuration different from that of the single-color sensor pixel group P1 according to the above-described embodiment.
- The single-color sensor pixel group P1 corresponding to yet another color filter CFg (hereinafter referred to as “single-color sensor pixel group Pg”) also has a configuration different from that of the single-color sensor pixel group P1 according to the above-described embodiment.
- In each monochromatic sensor pixel group Pf, the exposure time of two sensor pixels 111 (photodiodes PD) is set to “Short”, the exposure time of one of the remaining two sensor pixels 111 (photodiodes PD) is set to “Middle”, and the exposure time of the remaining one is set to “Long”. That is, the monochromatic sensor pixel group Pf includes three types of sensor pixels 111 (photodiodes PD) having different exposure times, and includes two sensor pixels 111 (photodiodes PD) having the same exposure time.
- In other words, the system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each monochromatic sensor pixel group Pf, the exposure times of three sensor pixels 111 (photodiodes PD) differ from one another while the exposure times of two sensor pixels 111 (photodiodes PD) are the same.
- In each monochromatic sensor pixel group Pf, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the lower stage in the left-right direction is set to “Short”, as shown in FIG. 12. That is, in each monochromatic sensor pixel group Pf, the exposure times of the two sensor pixels 111 (photodiodes PD) arranged in the lower stage in the left-right direction are set to be equal to each other. Further, in each monochromatic sensor pixel group Pf, the exposure time of the upper right sensor pixel 111 (photodiode PD) is set to “Middle”, as shown in FIG. 12. Further, in each monochromatic sensor pixel group Pf, the exposure time of the upper left sensor pixel 111 (photodiode PD) is set to “Long”, as shown in FIG. 12.
- One pixel drive line ctlS is connected to the lower left sensor pixel 111 of the monochromatic sensor pixel group Pf, and the other pixel drive line ctlS is connected to the lower right sensor pixel 111 of the monochromatic sensor pixel group Pf.
- the pixel drive line ctlM is connected to the sensor pixel 111 on the upper right of the monochrome sensor pixel group Pf.
- the pixel drive line ctlL is connected to the upper left sensor pixel 111 of the monochrome sensor pixel group Pf.
- the pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the upper stage of each monochrome sensor pixel group Pf. Further, the pixel array section 110 has two pixel drive lines ctlS at locations corresponding to the lower stages of the respective monochrome sensor pixel groups Pf.
- Alternatively, in each monochrome sensor pixel group Pf, one sensor pixel 111 (photodiode PD) whose exposure time is set to “Middle” may be arranged at the upper left, and one sensor pixel 111 (photodiode PD) whose exposure time is set to “Long” may be arranged at the upper right.
- Alternatively, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Short” may be arranged in the upper stage in the left-right direction.
- In this case, the exposure time of one of the two lower sensor pixels 111 is set to “Middle”, and the exposure time of the other is set to “Long”.
- the pixel array section 110 has two pixel drive lines ctlS at locations corresponding to the upper stages of the respective monochromatic sensor pixel groups Pf.
- the pixel array unit 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the lower stage of each monochrome sensor pixel group Pf.
- In each monochromatic sensor pixel group Pg, the exposure time of two sensor pixels 111 (photodiodes PD) is set to “Long”, the exposure time of one of the remaining two sensor pixels 111 (photodiodes PD) is set to “Middle”, and the exposure time of the remaining one is set to “Short”. That is, the monochromatic sensor pixel group Pg includes three types of sensor pixels 111 (photodiodes PD) having different exposure times, and includes two sensor pixels 111 (photodiodes PD) having the same exposure time.
- In other words, the system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each single-color sensor pixel group Pg, the exposure times of three sensor pixels 111 (photodiodes PD) differ from one another while the exposure times of two sensor pixels 111 (photodiodes PD) are the same.
- In each monochromatic sensor pixel group Pg, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Long” are arranged on the left side in the vertical direction, as shown in FIG. 12. That is, in each monochromatic sensor pixel group Pg, the two sensor pixels 111 (photodiodes PD) having the same exposure time are arranged on the left side in the vertical direction, as shown in FIG. 12. Further, in each monochromatic sensor pixel group Pg, one sensor pixel 111 (photodiode PD) whose exposure time is set to “Middle” is arranged at the upper right, as shown in FIG. 12. Further, in each monochromatic sensor pixel group Pg, one sensor pixel 111 (photodiode PD) whose exposure time is set to “Short” is arranged at the lower right, as shown in FIG. 12.
- One pixel drive line ctlL is connected to the upper left sensor pixel 111 of the monochromatic sensor pixel group Pg, and the other pixel drive line ctlL is connected to the lower left sensor pixel 111 of the monochromatic sensor pixel group Pg.
- the pixel drive line ctlM is connected to the sensor pixel 111 on the upper right of the monochrome sensor pixel group Pg.
- the pixel drive line ctlS is connected to the lower right sensor pixel 111 of the monochrome sensor pixel group Pg.
- the pixel array unit 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the upper stage of each monochrome sensor pixel group Pg.
- the pixel array unit 110 has a pixel drive line ctlL and a pixel drive line ctlS at a position corresponding to the lower stage of each monochrome sensor pixel group Pg.
- Alternatively, in each monochromatic sensor pixel group Pg, one sensor pixel 111 (photodiode PD) whose exposure time is set to “Middle” may be arranged at the lower right, and one sensor pixel 111 (photodiode PD) whose exposure time is set to “Short” may be arranged at the upper right.
- the pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlS at a position corresponding to the upper stage of each monochrome sensor pixel group Pg.
- the pixel array unit 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the lower stage of each monochrome sensor pixel group Pg.
- Alternatively, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Long” may be arranged on the right side in the vertical direction.
- In this case, the exposure time of one of the two sensor pixels 111 (photodiodes PD) on the left side is set to “Middle”, and the exposure time of the other is set to “Short”.
- the image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the single-color sensor pixel group Ph includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- Similarly, among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the monochromatic sensor pixel group Pe includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- Among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the monochrome sensor pixel group Pf includes two pixel data Sig2 corresponding to the two sensor pixels 111 whose exposure time is set to “Short”, one pixel data Sig1 corresponding to the sensor pixel 111 whose exposure time is set to “Middle”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”. Further, among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the single-color sensor pixel group Pg includes two pixel data Sig3 corresponding to the two sensor pixels 111 whose exposure time is set to “Long”, one pixel data Sig1 corresponding to the sensor pixel 111 whose exposure time is set to “Middle”, and one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”.
- FIG. 15 shows an example of a signal processing procedure in the arithmetic circuit 20 in the present modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- the arithmetic circuit 20 first decomposes the image data Ia for each exposure time (step S301). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
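- A minimal sketch of this decomposition step (S301), assuming the per-pixel exposure assignment is available as a label mask of the same shape as the image; all function and variable names here are illustrative:

```python
import numpy as np

def decompose_by_exposure(ia: np.ndarray, exposure_mask: np.ndarray):
    """Split image data Ia into Im/Il/Is by per-pixel exposure label.

    ia:            X x Y array of pixel data Sig.
    exposure_mask: X x Y array of labels "M", "L", "S".
    Pixels with a different exposure are left as NaN in each plane.
    """
    planes = {}
    for label in ("M", "L", "S"):
        plane = np.full(ia.shape, np.nan, dtype=float)
        sel = exposure_mask == label
        plane[sel] = ia[sel]
        planes[label] = plane
    return planes["M"], planes["L"], planes["S"]  # Im, Il, Is
```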
- the arithmetic circuit 20 generates the phase difference data Pd11 based on the image data Im (step S302). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 corresponding to each monochromatic sensor pixel group Pe in the image data Im, and generates phase difference data Pd11 in a first direction (the upward-sloping direction) on the light receiving surface 110A from the derived difference value. Further, the phase difference data Pd12 is generated based on the image data Is (step S302).
- Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig2 corresponding to each monochromatic sensor pixel group Pf in the image data Is, and generates phase difference data Pd12 in a second direction (the horizontal direction) on the light receiving surface 110A from the derived difference value. Further, the phase difference data Pd13 is generated based on the image data Il (step S302). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig3 corresponding to each single-color sensor pixel group Pg in the image data Il, and generates phase difference data Pd13 in a third direction (the vertical direction) on the light receiving surface 110A from the derived difference value.
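- The pairwise differences behind Pd11 to Pd13 can be sketched as below; the coordinate convention for locating the same-exposure pair inside a 2 × 2 group is our assumption, not the patent's:

```python
import numpy as np

def phase_difference_for_pair(plane: np.ndarray, r: int, c: int,
                              pair=((1, 0), (0, 1))) -> float:
    """Difference of the two same-exposure pixels of one 2x2 group.

    plane: exposure-separated plane (e.g. image data Im), NaN elsewhere.
    r, c:  top-left corner of the 2x2 single-color sensor pixel group.
    pair:  offsets of the two same-exposure pixels inside the group;
           ((1, 0), (0, 1)) is an upward-sloping diagonal pair under
           our row/column convention.
    """
    (r0, c0), (r1, c1) = pair
    return float(plane[r + r0, c + c0] - plane[r + r1, c + c1])
```

A horizontal pair would use offsets such as ((1, 0), (1, 1)) and a vertical pair ((0, 0), (1, 0)), matching the Pf and Pg layouts described above.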
- the arithmetic circuit 20 also generates the phase difference data Pd14 based on the image data Il and Im (step S302). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il and image data Im′ obtained by multiplying the image data Im by the exposure time ratio between the exposure time “Long” and the exposure time “Middle”, and generates the phase difference data Pd14 from the derived difference value. The arithmetic circuit 20 also generates the phase difference data Pd15 based on the image data Im and Is (step S302).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im and image data Is′ obtained by multiplying the image data Is by the exposure time ratio between the exposure time “Middle” and the exposure time “Short”, and generates the phase difference data Pd15 from the derived difference value.
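- A sketch of how Pd14 and Pd15 can be formed, assuming the exposure-ratio scaling is a plain per-pixel multiplication (names are illustrative):

```python
import numpy as np

def motion_phase_difference(long_plane: np.ndarray, short_plane: np.ndarray,
                            t_long: float, t_short: float) -> np.ndarray:
    """Difference between two exposure planes after ratio matching.

    Scales the shorter-exposure plane by t_long / t_short so that a
    static scene yields (ideally) zero difference; residual signal
    then indicates motion between the two exposures.
    """
    matched = short_plane * (t_long / t_short)  # e.g. Im' = Im * (T_L / T_M)
    return long_plane - matched                 # e.g. Pd14 = Il - Im'
```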
- the arithmetic circuit 20 generates the level data Da for the phase difference based on the phase difference data Pd11, Pd12, Pd13 (step S303).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd11, Pd12, and Pd13, a numerical value below a predetermined range to a lower limit value (for example, 0 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd11, Pd12, and Pd13, a numerical value exceeding a predetermined range into an upper limit value (for example, 128 bits). For example, the arithmetic circuit 20 converts a numerical value within a predetermined range in the phase difference data Pd11, Pd12, Pd13 into a value within a range of 1 bit to 127 bits according to the size of the numerical value.
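- The conversion to level data Da amounts to a clamp-and-rescale. A sketch assuming a linear mapping of in-range values onto 1 to 127 (the text specifies only the endpoints and the interior range):

```python
import numpy as np

def to_level_data(pd: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Map phase difference data onto 0..128 level data.

    Values below the predetermined range [lo, hi] become the lower
    limit 0, values above it become the upper limit 128, and in-range
    values are mapped linearly onto 1..127 according to their size.
    """
    levels = np.empty(pd.shape, dtype=np.int32)
    below, above = pd < lo, pd > hi
    levels[below] = 0
    levels[above] = 128
    inside = ~(below | above)
    levels[inside] = np.round(
        1 + (pd[inside] - lo) / (hi - lo) * 126).astype(np.int32)
    return levels
```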
- the arithmetic circuit 20 converts the phase difference data Pd14 and Pd15 into level data Db for the moving body (step S304).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data (noise data) of the image sensor 10 and the phase difference data Pd14 and Pd15.
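- One plausible reading of this step, in which the noise level acts as a threshold below which a difference is treated as sensor noise rather than motion (the thresholding rule is our assumption):

```python
import numpy as np

def to_motion_level_data(pd_motion: np.ndarray, noise_level: np.ndarray,
                         lo: float, hi: float) -> np.ndarray:
    """Sketch of level data Db: differences whose magnitude does not
    exceed the sensor noise level are zeroed out, and the remainder is
    mapped with the same 0..128 scheme as level data Da."""
    significant = np.where(np.abs(pd_motion) > noise_level, pd_motion, 0.0)
    return to_level_data(significant, lo, hi)  # from the sketch above
```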
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S305). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S306). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, Is, the presence/absence of a phase difference, and the presence/absence of a moving object (step S307). In this way, the HDR image data Ib is generated.
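- Steps S305 to S307 can then be sketched as a masked blend: where level data Da or Db flags a large phase difference or a moving body, the fusion falls back to a single exposure, and elsewhere the ratio-matched exposures are combined. The fallback policy, the simple averaging, and the threshold are our illustration, not the patent's method:

```python
import numpy as np

def fuse_hdr(im, il, is_, da, db, t_m, t_l, t_s, thresh=64):
    """Sketch of steps S305-S307: flag problem areas from level data
    Da/Db, then blend the three exposure planes into HDR data Ib."""
    il_n = il * (t_m / t_l)   # match the Long plane to the Middle scale
    is_n = is_ * (t_m / t_s)  # match the Short plane to the Middle scale
    # Large phase difference or moving body -> fusion is unreliable there.
    unreliable = (da > thresh) | (db > thresh)
    blended = np.nanmean(np.stack([im, il_n, is_n]), axis=0)
    # Fall back to the Middle-exposure data where fusion is unreliable.
    return np.where(unreliable, im, blended)
```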
- As described above, in this modification, the image sensor 10 is configured so that phase difference data in three directions is obtained from the 2-row × 2-column group of three-color sensor pixel groups P2 (the three-color sensor pixel group P3). This makes it possible to determine the presence/absence of a phase difference and the presence/absence of a moving body in the three directions.
- the pixel array section 110 is configured to obtain the phase difference data in one direction from the pixel data Sig of 2 rows × 2 columns corresponding to the monochrome sensor pixel group P1.
- However, the pixel array section 110 may be configured to obtain phase difference data in two directions from the pixel data Sig of 2 rows × 2 columns corresponding to the monochromatic sensor pixel group P1.
- FIG. 16 illustrates a configuration example of the pixel array unit 110 according to this modification.
- the pixel array unit 110 is configured to obtain phase difference data in the upward-sloping direction and the downward-sloping direction from each of the three-color sensor pixel groups P2.
- The monochromatic sensor pixel groups P1 corresponding to the color filters CFr and CFb (monochromatic sensor pixel groups Ph) have the same configuration as the monochromatic sensor pixel group P1 according to the above-described embodiment.
- The single-color sensor pixel groups P1 corresponding to the color filters CFg (single-color sensor pixel groups Pa and Pb) have a configuration different from that of the single-color sensor pixel group P1 according to the above-described embodiment.
- In each monochromatic sensor pixel group Pb, the exposure time of two sensor pixels 111 (photodiodes PD) is set to “Short”, and the exposure time of the remaining two sensor pixels 111 (photodiodes PD) is set to “Middle”.
- In other words, the system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each monochromatic sensor pixel group Pb, the exposure times of the two pairs of sensor pixels 111 (photodiodes PD) differ from each other while the exposure times within each pair are the same.
- In each monochromatic sensor pixel group Pb, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Short” are arranged in a downward-sloping direction, as shown in FIG. 16. That is, in each monochromatic sensor pixel group Pb, the two sensor pixels 111 (photodiodes PD) having the same exposure time are arranged in a downward-sloping direction, as shown in FIG. 16. Further, in each monochromatic sensor pixel group Pb, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Middle” are arranged in an upward-sloping direction, as shown in FIG. 16. That is, in each monochromatic sensor pixel group Pb, the two sensor pixels 111 (photodiodes PD) having the same exposure time are also arranged in an upward-sloping direction, as shown in FIG. 16.
- One pixel drive line ctlS is connected to the upper left sensor pixel 111 of the monochromatic sensor pixel group Pb, and the other pixel drive line ctlS is connected to the lower right sensor pixel 111 of the monochromatic sensor pixel group Pb.
- One pixel drive line ctlM is connected to the lower left sensor pixel 111 of the monochromatic sensor pixel group Pb, and the other pixel drive line ctlM is connected to the upper right sensor pixel 111 of the monochromatic sensor pixel group Pb.
- the pixel array section 110 has a pixel drive line ctlM and a pixel drive line ctlS at a position corresponding to the upper stage of each monochrome sensor pixel group Pb.
- the pixel array section 110 has a pixel drive line ctlM and a pixel drive line ctlS at a position corresponding to the lower stage of each monochrome sensor pixel group Pb.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper stage of the single-color sensor pixel group Ph arranged on the right or left of the single-color sensor pixel group Pb is set to “Long”, the pixel array section 110 further has a pixel drive line ctlL at a location corresponding to the upper stage of each monochromatic sensor pixel group Pb.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower stage of the single-color sensor pixel group Ph arranged on the right or left of the single-color sensor pixel group Pb is set to “Long”, the pixel array section 110 further has a pixel drive line ctlL at a location corresponding to the lower stage of each monochromatic sensor pixel group Pb.
- In each monochromatic sensor pixel group Pa, the exposure time of two sensor pixels 111 (photodiodes PD) is set to “Long”, and the exposure time of the remaining two sensor pixels 111 (photodiodes PD) is set to “Middle”.
- In other words, the system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each monochromatic sensor pixel group Pa, the exposure times of the two pairs of sensor pixels 111 (photodiodes PD) differ from each other while the exposure times within each pair are the same.
- In each monochromatic sensor pixel group Pa, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Long” are arranged in a downward-sloping direction, as shown in FIG. 16. That is, in each monochromatic sensor pixel group Pa, the two sensor pixels 111 (photodiodes PD) having the same exposure time are arranged in a downward-sloping direction, as shown in FIG. 16. Further, in each monochromatic sensor pixel group Pa, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Middle” are arranged in an upward-sloping direction, as shown in FIG. 16. That is, in each monochromatic sensor pixel group Pa, the two sensor pixels 111 (photodiodes PD) having the same exposure time are also arranged in an upward-sloping direction, as shown in FIG. 16.
- One pixel drive line ctlL is connected to the upper left sensor pixel 111 of the monochrome sensor pixel group Pa, and the other pixel drive line ctlL is connected to the lower right sensor pixel 111 of the single-color sensor pixel group Pa.
- One pixel drive line ctlM is connected to the lower left sensor pixel 111 of the monochromatic sensor pixel group Pa, and the other pixel drive line ctlM is connected to the upper right sensor pixel 111 of the monochromatic sensor pixel group Pa.
- the pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the upper stage of each monochrome sensor pixel group Pa. Further, the pixel array section 110 has a pixel drive line ctlM and a pixel drive line ctlL at locations corresponding to the lower stages of the respective monochrome sensor pixel groups Pa.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower stage of the monochromatic sensor pixel group Ph arranged on the right or left of the monochromatic sensor pixel group Pa is set to “Short”, the pixel array section 110 further has a pixel drive line ctlS at a location corresponding to the lower stage of each monochromatic sensor pixel group Pa.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper stage of the monochromatic sensor pixel group Ph arranged on the right or left of the monochromatic sensor pixel group Pa is set to “Short”, the pixel array section 110 further has a pixel drive line ctlS at a location corresponding to the upper stage of each monochromatic sensor pixel group Pa.
- the image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the single-color sensor pixel group Ph includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- Among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the single-color sensor pixel group Pa includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle” and two pixel data Sig3 corresponding to the two sensor pixels 111 whose exposure time is set to “Long”.
- Further, among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the monochrome sensor pixel group Pb includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle” and two pixel data Sig2 corresponding to the two sensor pixels 111 whose exposure time is set to “Short”.
- FIG. 17 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- the arithmetic circuit 20 first decomposes the image data Ia for each exposure time (step S401). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
- the arithmetic circuit 20 generates the phase difference data Pd21 based on the image data Im (step S402). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 corresponding to each of the monochrome sensor pixel groups Pa and Pb in the image data Im, and generates phase difference data Pd21 in a first direction (the upward-sloping direction) on the light receiving surface 110A from the derived difference value. Further, the phase difference data Pd22 is generated based on the image data Is (step S402).
- Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig2 corresponding to each monochromatic sensor pixel group Pb in the image data Is, and generates phase difference data Pd22 in a second direction (the downward-sloping direction) on the light receiving surface 110A from the derived difference value.
- the phase difference data Pd23 is generated based on the image data Il (step S402).
- Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig3 corresponding to each monochromatic sensor pixel group Pa in the image data Il, and generates phase difference data Pd23 in the second direction (the downward-sloping direction) on the light receiving surface 110A from the derived difference value.
- the arithmetic circuit 20 also generates the phase difference data Pd24 based on the image data Il and Im (step S402). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il and image data Im′ obtained by multiplying the image data Im by the exposure time ratio between the exposure time “Long” and the exposure time “Middle”, and generates the phase difference data Pd24 from the derived difference value. The arithmetic circuit 20 also generates the phase difference data Pd25 based on the image data Im and Is (step S402).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im and image data Is′ obtained by multiplying the image data Is by the exposure time ratio between the exposure time “Middle” and the exposure time “Short”, and generates the phase difference data Pd25 from the derived difference value.
- the arithmetic circuit 20 generates the level data Da regarding the phase difference based on the phase difference data Pd21, Pd22, Pd23 (step S403).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd21, Pd22, and Pd23, a numerical value below a predetermined range to a lower limit value (for example, 0 bit).
- the arithmetic circuit 20 converts a numerical value exceeding a predetermined range in the phase difference data Pd21, Pd22, Pd23 into an upper limit value (for example, 128 bits).
- the arithmetic circuit 20 converts the phase difference data Pd24 and Pd25 into level data Db for the moving body (step S404).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data of the image sensor 10 (noise data) and the phase difference data Pd24 and Pd25.
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S405). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S406). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, Is, the presence/absence of a phase difference, and the presence/absence of a moving object (step S407). In this way, the HDR image data Ib is generated.
- the image sensor 10 is configured to obtain phase difference data in two directions from the pixel data Sig of 2 rows × 2 columns corresponding to the monochromatic sensor pixel group P1. This makes it possible to determine the presence/absence of a phase difference and the presence/absence of a moving body in two directions.
- In the above-described embodiment and its modifications, the pixel array unit 110 is configured to obtain one piece of phase difference data in one direction from the pixel data Sig of 2 rows × 2 columns corresponding to the monochrome sensor pixel group P1. However, the pixel array unit 110 may instead be configured to obtain two pieces of phase difference data in one direction from the pixel data Sig of 2 rows × 2 columns corresponding to the monochrome sensor pixel group P1.
- FIG. 18 shows a configuration example of the pixel array section 110 according to this modification.
- the pixel array section 110 is configured so that phase difference data in the vertical direction and the horizontal direction can be obtained from each of the three-color sensor pixel groups P2.
- The monochromatic sensor pixel groups P1 corresponding to the color filters CFr and CFb (monochromatic sensor pixel groups Ph) have the same configuration as the monochromatic sensor pixel group P1 according to the above-described embodiment.
- The single-color sensor pixel groups P1 corresponding to the color filters CFg (single-color sensor pixel groups Pa and Pb) have a configuration different from that of the single-color sensor pixel group P1 according to the above-described embodiment.
- In each monochromatic sensor pixel group Pb, the exposure time of two sensor pixels 111 (photodiodes PD) is set to “Short”, and the exposure time of the remaining two sensor pixels 111 (photodiodes PD) is set to “Middle”.
- In other words, the system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each monochromatic sensor pixel group Pb, the exposure times of the two pairs of sensor pixels 111 (photodiodes PD) differ from each other while the exposure times within each pair are the same.
- In each monochromatic sensor pixel group Pb, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Short” are arranged in the upper stage in the left-right direction, as shown in FIG. 18. That is, in each monochromatic sensor pixel group Pb, the two sensor pixels 111 (photodiodes PD) having the same exposure time are arranged in the upper stage in the left-right direction, as shown in FIG. 18. Further, in each monochromatic sensor pixel group Pb, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Middle” are arranged in the lower stage, as shown in FIG. 18.
- That is, in each monochromatic sensor pixel group Pb, the two sensor pixels 111 (photodiodes PD) having the same exposure time are arranged in the lower stage in the left-right direction, as shown in FIG. 18.
- One pixel drive line ctlS is connected to the upper left sensor pixel 111 of the monochromatic sensor pixel group Pb, and the other pixel drive line ctlS is connected to the upper right sensor pixel 111 of the monochromatic sensor pixel group Pb.
- One pixel drive line ctlM is connected to the lower left sensor pixel 111 of the monochromatic sensor pixel group Pb, and the other pixel drive line ctlM is connected to the lower right sensor pixel 111 of the monochromatic sensor pixel group Pb.
- the pixel array section 110 has two pixel drive lines ctlS at locations corresponding to the upper stage of each monochromatic sensor pixel group Pb and two pixel drive lines ctlM at locations corresponding to the lower stage of each monochromatic sensor pixel group Pb.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper stage of the single-color sensor pixel group Ph arranged on the right or left of the single-color sensor pixel group Pb is set to “Long”, the pixel array section 110 further has a pixel drive line ctlL at a location corresponding to the upper stage of each monochromatic sensor pixel group Pb.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower stage of the single-color sensor pixel group Ph arranged on the right or left of the single-color sensor pixel group Pb is set to “Long”, the pixel array section 110 further has a pixel drive line ctlL at a location corresponding to the lower stage of each monochromatic sensor pixel group Pb.
- Alternatively, in each monochromatic sensor pixel group Pb, the exposure time of the two lower sensor pixels 111 may be set to “Short”, and the exposure time of the two upper sensor pixels 111 (photodiodes PD) may be set to “Middle”.
- In each monochromatic sensor pixel group Pa, the exposure time of two sensor pixels 111 (photodiodes PD) is set to “Long”, and the exposure time of the remaining two sensor pixels 111 (photodiodes PD) is set to “Middle”.
- In other words, the system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each monochromatic sensor pixel group Pa, the exposure times of the two pairs of sensor pixels 111 (photodiodes PD) differ from each other while the exposure times within each pair are the same.
- In each monochromatic sensor pixel group Pa, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Long” are arranged on the left side in the vertical direction, as shown in FIG. 18. That is, in each monochromatic sensor pixel group Pa, the two sensor pixels 111 (photodiodes PD) having the same exposure time are arranged on the left side in the vertical direction, as shown in FIG. 18. Further, in each monochromatic sensor pixel group Pa, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Middle” are arranged on the right side in the vertical direction, as shown in FIG. 18. That is, in each monochromatic sensor pixel group Pa, the two sensor pixels 111 (photodiodes PD) having the same exposure time are also arranged on the right side in the vertical direction, as shown in FIG. 18.
- the pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the upper stage of each monochrome sensor pixel group Pa. Further, the pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the lower stage of each monochrome sensor pixel group Pa.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower stage of the monochromatic sensor pixel group Ph arranged on the right or left of the monochromatic sensor pixel group Pa is set to “Short”, the pixel array section 110 further has a pixel drive line ctlS at a location corresponding to the lower stage of each monochromatic sensor pixel group Pa.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper stage of the monochromatic sensor pixel group Ph arranged on the right or left of the monochromatic sensor pixel group Pa is set to “Short”, the pixel array section 110 further has a pixel drive line ctlS at a location corresponding to the upper stage of each monochromatic sensor pixel group Pa.
- Alternatively, in each monochromatic sensor pixel group Pa, the exposure time of the two sensor pixels 111 (photodiodes PD) on the right side may be set to “Long”, and the exposure time of the two sensor pixels 111 (photodiodes PD) on the left side may be set to “Middle”.
- the image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the single-color sensor pixel group Ph includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- Among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the monochromatic sensor pixel group Pb includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle” and two pixel data Sig2 corresponding to the two sensor pixels 111 whose exposure time is set to “Short”.
- Further, among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the monochromatic sensor pixel group Pa includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle” and two pixel data Sig3 corresponding to the two sensor pixels 111 whose exposure time is set to “Long”.
- FIG. 19 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- the arithmetic circuit 20 first decomposes the image data Ia for each exposure time (step S501). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
- the arithmetic circuit 20 generates the phase difference data Pd31 and Pd32 based on the image data Im (step S502). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 corresponding to each monochromatic sensor pixel group Pb in the image data Im, and generates phase difference data Pd31 in a first direction (the horizontal direction) on the light receiving surface 110A from the derived difference value. Further, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 corresponding to each monochromatic sensor pixel group Pa in the image data Im, and generates phase difference data Pd32 in a second direction (the vertical direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also generates the phase difference data Pd33 based on the image data Is (step S502). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig2 corresponding to each monochromatic sensor pixel group Pb in the image data Is, and generates phase difference data Pd33 in the first direction (the horizontal direction) on the light receiving surface 110A from the derived difference value. Further, the phase difference data Pd34 is generated based on the image data Il (step S502).
- Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig3 corresponding to each monochromatic sensor pixel group Pa in the image data Il, and generates phase difference data Pd34 in the second direction (the vertical direction) on the light receiving surface 110A from the derived difference value.
- the arithmetic circuit 20 also generates the phase difference data Pd35 based on the image data Il and Im (step S502). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il and image data Im′ obtained by multiplying the image data Im by the exposure time ratio between the exposure time “Long” and the exposure time “Middle”, and generates the phase difference data Pd35 from the derived difference value. The arithmetic circuit 20 also generates the phase difference data Pd36 based on the image data Im and Is (step S502).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im and image data Is′ obtained by multiplying the image data Is by the exposure time ratio between the exposure time “Middle” and the exposure time “Short”, and generates the phase difference data Pd36 from the derived difference value.
- the arithmetic circuit 20 generates the level data Da for the phase difference based on the phase difference data Pd31, Pd32, Pd33, Pd34 (step S503).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd31, Pd32, Pd33, Pd34, a numerical value below a predetermined range to a lower limit value (for example, 0 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd31, Pd32, Pd33, Pd34, a numerical value exceeding a predetermined range into an upper limit value (for example, 128 bits). For example, the arithmetic circuit 20 converts a numerical value within a predetermined range in the phase difference data Pd31, Pd32, Pd33, Pd34 into a value within a range of 1 bit to 127 bits according to the size of the numerical value.
- the arithmetic circuit 20 converts the phase difference data Pd35 and Pd36 into level data Db for the moving body (step S504).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data (noise data) of the image sensor 10 and the phase difference data Pd35 and Pd36.
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S505). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S506). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, Is, the presence/absence of a phase difference, and the presence/absence of a moving body (step S507). In this way, the HDR image data Ib is generated.
- In this modification, the pixel array unit 110 is configured such that two pieces of phase difference data in the first direction are obtained from the pixel data Sig of 2 rows × 2 columns corresponding to the monochromatic sensor pixel group Pb, and two pieces of phase difference data in the second direction are obtained from the pixel data Sig of 2 rows × 2 columns corresponding to the monochromatic sensor pixel group Pa. This makes it possible to determine the presence/absence of a phase difference and the presence/absence of a moving body in two directions.
- the pixel array unit 110 is configured to obtain phase difference data in four directions from the pixel data Sig of 8 rows × 8 columns corresponding to each three-color sensor pixel group P3.
- However, the pixel array unit 110 may be configured to obtain phase difference data in four directions from the pixel data Sig of 2 rows × 2 columns corresponding to one single-color sensor pixel group P1 of each three-color sensor pixel group P3.
- FIG. 20 shows a configuration example of the pixel array section 110 according to this modification.
- the pixel array unit 110 is configured so that phase difference data in the upward-sloping direction, the downward-sloping direction, the left-right direction, and the up-down direction can be obtained from the pixel data Sig of 2 rows × 2 columns corresponding to one monochromatic sensor pixel group P1 (hereinafter, “monochromatic sensor pixel group Pi”) of each three-color sensor pixel group P3.
- the monochromatic sensor pixel group Pi is the monochromatic sensor pixel group P1 corresponding to the color filter CFg.
- each single-color sensor pixel group P1 excluding the single-color sensor pixel group Pi has the same configuration as the single-color sensor pixel group P1 according to the above-described embodiment.
- each monochromatic sensor pixel group Pi has a different configuration from the monochromatic sensor pixel group P1 according to the above embodiment. Specifically, in each monochrome sensor pixel group Pi, the exposure time of the four sensor pixels 111 (photodiodes PD) is set to "Middle".
- In other words, the system control circuit 124 controls the exposure times of the sensor pixels 111 (photodiodes PD) so that the sensor pixels 111 (photodiodes PD) in the single-color sensor pixel group Pi included in each three-color sensor pixel group P3 all have the same exposure time.
- One pixel drive line ctlM is connected to the two upper-stage sensor pixels 111 of the monochromatic sensor pixel group Pi, and the other pixel drive line ctlM is connected to the two lower-stage sensor pixels 111 of the monochromatic sensor pixel group Pi.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper stage of the single-color sensor pixel group P1 arranged on the right or left of the single-color sensor pixel group Pi is set to “Long”, the pixel array section 110 further has a pixel drive line ctlL at a location corresponding to the upper stage of each monochromatic sensor pixel group Pi.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper stage of the monochromatic sensor pixel group P1 arranged on the right or left of the monochromatic sensor pixel group Pi is set to “Short”, the pixel array section 110 further has a pixel drive line ctlS at a location corresponding to the upper stage of each monochrome sensor pixel group Pi.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower stage of the single-color sensor pixel group P1 arranged on the right or left of the single-color sensor pixel group Pi is set to “Short”, the pixel array section 110 further has a pixel drive line ctlS at a location corresponding to the lower stage of each monochrome sensor pixel group Pi.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower stage of the monochromatic sensor pixel group P1 arranged on the right or left of the monochromatic sensor pixel group Pi is set to “Long”, the pixel array section 110 further has a pixel drive line ctlL at a location corresponding to the lower stage of each monochromatic sensor pixel group Pi.
- the image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to each monochromatic sensor pixel group P1 excluding the monochromatic sensor pixel group Pi includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- Further, among the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to the single-color sensor pixel group Pi includes four pixel data Sig1 corresponding to the four sensor pixels 111 whose exposure time is set to “Middle”.
- FIG. 21 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- the arithmetic circuit 20 first decomposes the image data Ia for each exposure time (step S601). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
- the arithmetic circuit 20 generates the phase difference data Pd41, Pd42, Pd43, and Pd44 based on the image data Im (step S602). Specifically, the arithmetic circuit 20 derives, for each monochromatic sensor pixel group Pi in the image data Im, the difference value between the two pixel data Sig1 arranged in the upward-sloping direction, and generates phase difference data Pd41 in a first direction (the upward-sloping direction) on the light receiving surface 110A from the derived difference value.
- Further, the arithmetic circuit 20 derives, for each monochromatic sensor pixel group Pi in the image data Im, the difference value between the two pixel data Sig1 arranged in the downward-sloping direction, and generates phase difference data Pd42 in a second direction (the downward-sloping direction) on the light receiving surface 110A from the derived difference value.
- Further, the arithmetic circuit 20 derives, for each monochromatic sensor pixel group Pi in the image data Im, the difference value between the two upper-row pixel data Sig1 arranged in the left-right direction and the difference value between the two lower-row pixel data Sig1 arranged in the left-right direction, and generates phase difference data Pd43 in a third direction (the left-right direction) on the light receiving surface 110A from the derived difference values.
- Further, the arithmetic circuit 20 derives, for each monochromatic sensor pixel group Pi in the image data Im, the difference value between the two left-column pixel data Sig1 arranged in the up-down direction and the difference value between the two right-column pixel data Sig1 arranged in the up-down direction, and generates phase difference data Pd44 in a fourth direction (the up-down direction) on the light receiving surface 110A from the derived difference values.
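- The four difference values Pd41 to Pd44 computed from one all-“Middle” 2 × 2 group can be sketched as below; the mapping of array indices to the rising and falling diagonals is our convention, not the patent's:

```python
import numpy as np

def four_direction_differences(im: np.ndarray, r: int, c: int):
    """Pd41..Pd44 for the 2x2 group Pi with top-left corner (r, c).

    All four pixels share the "Middle" exposure, so differences can
    be taken along both diagonals, both rows, and both columns.
    """
    ul, ur = im[r, c],     im[r, c + 1]
    ll, lr = im[r + 1, c], im[r + 1, c + 1]
    pd41 = ll - ur                      # upward-sloping diagonal
    pd42 = ul - lr                      # downward-sloping diagonal
    pd43 = (ul - ur, ll - lr)           # left-right: upper and lower rows
    pd44 = (ul - ll, ur - lr)           # up-down: left and right columns
    return pd41, pd42, pd43, pd44
```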
- the arithmetic circuit 20 also generates the phase difference data Pd45 based on the image data Il and Im (step S602). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il and image data Im′ obtained by multiplying the image data Im by the exposure time ratio between the exposure time “Long” and the exposure time “Middle”, and generates the phase difference data Pd45 from the derived difference value. The arithmetic circuit 20 also generates the phase difference data Pd46 based on the image data Im and Is (step S602).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im and image data Is′ obtained by multiplying the image data Is by the exposure time ratio between the exposure time “Middle” and the exposure time “Short”, and generates the phase difference data Pd46 from the derived difference value.
- the arithmetic circuit 20 generates the level data Da for the phase difference based on the phase difference data Pd41, Pd42, Pd43, Pd44 (step S603).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd41, Pd42, Pd43, Pd44, a numerical value below a predetermined range to a lower limit value (for example, 0 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd41, Pd42, Pd43, Pd44, a numerical value exceeding a predetermined range into an upper limit value (for example, 128 bits).
- the arithmetic circuit 20 converts, for example, a numerical value within a predetermined range in the phase difference data Pd41, Pd42, Pd43, Pd44 into a value within a range of 1 bit to 127 bits according to the magnitude of the numerical value.
- the arithmetic circuit 20 converts the phase difference data Pd45 and Pd46 into level data Db for the moving body (step S604).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data (noise data) of the image sensor 10 and the phase difference data Pd45 and Pd46.
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S605). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S606). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, Is, the presence/absence of a phase difference, and the presence/absence of a moving object (step S607). In this way, the HDR image data Ib is generated.
- In this modification, the pixel array section 110 is configured to obtain phase difference data in four directions from the pixel data Sig of 2 rows × 2 columns corresponding to one single-color sensor pixel group Pi of each three-color sensor pixel group P3. As a result, the presence/absence of a phase difference and the presence/absence of a moving body can be determined in the four directions.
- In the above modification, the pixel array unit 110 is configured to obtain phase difference data in four directions from the pixel data Sig of 2 rows × 2 columns corresponding to one single-color sensor pixel group Pi of each three-color sensor pixel group P3. However, the pixel array unit 110 may be configured to obtain phase difference data in four directions from the pixel data Sig of 4 rows × 4 columns corresponding to one three-color sensor pixel group P2 of each three-color sensor pixel group P3.
- FIG. 22 shows a configuration example of the pixel array section 110 according to this modification.
- the pixel array unit 110 is configured so that phase difference data in the upward-sloping direction, the downward-sloping direction, the horizontal direction, and the vertical direction is obtained from the pixel data Sig of 4 rows × 4 columns corresponding to one three-color sensor pixel group P2 (hereinafter, “three-color sensor pixel group Pj”) of each three-color sensor pixel group P3.
- each single-color sensor pixel group P1 included in each three-color sensor pixel group P2 excluding the three-color sensor pixel group Pj has the same configuration as the single-color sensor pixel group P1 according to the above-described embodiment.
- each three-color sensor pixel group Pj has a configuration different from that of the three-color sensor pixel group P2 according to the above embodiment. Specifically, in each single-color sensor pixel group P1 of each three-color sensor pixel group Pj, the exposure time of the four sensor pixels 111 (photodiodes PD) is set to “Middle”. In other words, the system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that every sensor pixel 111 (photodiode PD) in each single-color sensor pixel group P1 of the three-color sensor pixel group Pj included in each three-color sensor pixel group P3 has the same exposure time.
- In each single-color sensor pixel group P1 of the three-color sensor pixel group Pj, one pixel drive line ctlM is connected to the two upper sensor pixels 111, and the other pixel drive line ctlM is connected to the two lower sensor pixels 111.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper row of the single-color sensor pixel group P1 arranged on the right or left of the upper stage of the three-color sensor pixel group Pj is set to “Long”, the pixel array unit 110 further has a pixel drive line ctlL at a position corresponding to the upper-row sensor pixels 111 (photodiodes PD) of the single-color sensor pixel groups P1 in the upper stage of each three-color sensor pixel group Pj.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper row of the single-color sensor pixel group P1 arranged on the right or left of the upper stage of the three-color sensor pixel group Pj is set to “Short”, the pixel array unit 110 further has a pixel drive line ctlS at a position corresponding to the upper-row sensor pixels 111 (photodiodes PD) of the single-color sensor pixel groups P1 in the upper stage of each three-color sensor pixel group Pj.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower row of the single-color sensor pixel group P1 arranged on the right or left of the upper stage of the three-color sensor pixel group Pj is set to “Long”, the pixel array unit 110 further has a pixel drive line ctlL at a position corresponding to the lower-row sensor pixels 111 (photodiodes PD) of the single-color sensor pixel groups P1 in the upper stage of each three-color sensor pixel group Pj.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower row of the single-color sensor pixel group P1 arranged on the right or left of the upper stage of the three-color sensor pixel group Pj is set to “Short”, the pixel array unit 110 further has a pixel drive line ctlS at a position corresponding to the lower-row sensor pixels 111 (photodiodes PD) of the single-color sensor pixel groups P1 in the upper stage of each three-color sensor pixel group Pj.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper row of the single-color sensor pixel group P1 arranged on the right or left of the lower stage of the three-color sensor pixel group Pj is set to “Long”, the pixel array unit 110 further has a pixel drive line ctlL at a position corresponding to the upper-row sensor pixels 111 (photodiodes PD) of the single-color sensor pixel groups P1 in the lower stage of each three-color sensor pixel group Pj.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the upper row of the single-color sensor pixel group P1 arranged on the right or left of the lower stage of the three-color sensor pixel group Pj is set to “Short”, the pixel array unit 110 further has a pixel drive line ctlS at a position corresponding to the upper-row sensor pixels 111 (photodiodes PD) of the single-color sensor pixel groups P1 in the lower stage of each three-color sensor pixel group Pj.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower row of the single-color sensor pixel group P1 arranged on the right or left of the lower stage of the three-color sensor pixel group Pj is set to “Long”, the pixel array unit 110 further has a pixel drive line ctlL at a position corresponding to the lower-row sensor pixels 111 (photodiodes PD) of the single-color sensor pixel groups P1 in the lower stage of each three-color sensor pixel group Pj.
- Since the exposure time of one sensor pixel 111 (photodiode PD) included in the lower row of the single-color sensor pixel group P1 arranged on the right or left of the lower stage of the three-color sensor pixel group Pj is set to “Short”, the pixel array unit 110 further has a pixel drive line ctlS at a position corresponding to the lower-row sensor pixels 111 (photodiodes PD) of the single-color sensor pixel groups P1 in the lower stage of each three-color sensor pixel group Pj.
- The image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Of the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in each three-color sensor pixel group P2 other than the three-color sensor pixel group Pj includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- The pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the three-color sensor pixel group Pj includes four pixel data Sig1 corresponding to the four sensor pixels 111 whose exposure time is set to “Middle”.
- FIG. 23 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- The arithmetic circuit 20 first decomposes the image data Ia by exposure time (step S701). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
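- As an illustration only, the decomposition in step S701 can be sketched as follows. This is a minimal sketch, not the implementation disclosed here; the array layout and the helper name decompose_by_exposure are assumptions made for this sketch.

```python
import numpy as np

def decompose_by_exposure(ia: np.ndarray, exposure: np.ndarray):
    """Split the mosaic image data Ia into per-exposure planes Im, Il, Is.

    ia       : X-row x Y-column pixel data Sig as a float array.
    exposure : same-shape array of codes "M", "L", "S" recording which
               exposure time each sensor pixel 111 was driven with.
    Pixels that do not belong to a plane are left as NaN.
    """
    planes = {}
    for code in ("M", "L", "S"):
        plane = np.full(ia.shape, np.nan)
        mask = exposure == code
        plane[mask] = ia[mask]          # keep only this exposure's pixels
        planes[code] = plane
    return planes["M"], planes["L"], planes["S"]  # Im, Il, Is
```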
- the arithmetic circuit 20 generates the phase difference data Pd51, Pd52, Pd53, Pd54 based on the image data Im (step S702).
- Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 that correspond to each single-color sensor pixel group P1 included in the three-color sensor pixel group Pj in the image data Im and are arranged in the upward-sloping direction, and generates the phase difference data Pd51 in the first direction (the upward-sloping direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also derives the difference value between the two pixel data Sig1 that correspond to each single-color sensor pixel group P1 included in the three-color sensor pixel group Pj in the image data Im and are arranged in the downward-sloping direction, and generates the phase difference data Pd52 in the second direction (the downward-sloping direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 further derives, among the pixel data Sig1 corresponding to each single-color sensor pixel group P1 included in the three-color sensor pixel group Pj in the image data Im, the difference value between the two upper-row pixel data Sig1 arranged in the horizontal direction and the difference value between the two lower-row pixel data Sig1 arranged in the horizontal direction, and generates the phase difference data Pd53 in the third direction (the horizontal direction) on the light receiving surface 110A from the derived difference values.
- The arithmetic circuit 20 likewise derives, among the pixel data Sig1 corresponding to each single-color sensor pixel group P1 included in the three-color sensor pixel group Pj in the image data Im, the difference value between the two left-side pixel data Sig1 arranged in the vertical direction and the difference value between the two right-side pixel data Sig1 arranged in the vertical direction, and generates the phase difference data Pd54 in the fourth direction (the vertical direction) on the light receiving surface 110A from the derived difference values.
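- The four direction-wise differences above can be illustrated with a short sketch. This is a non-limiting example; the 2 × 2 block layout [[upper-left, upper-right], [lower-left, lower-right]] and the function name are assumptions.

```python
import numpy as np

def phase_differences_2x2(block: np.ndarray):
    """Direction-wise difference values for one 2x2 block of "Middle"
    pixel data Sig1 belonging to a single-color sensor pixel group P1."""
    ul, ur = block[0, 0], block[0, 1]
    ll, lr = block[1, 0], block[1, 1]
    pd_up    = ll - ur                  # upward-sloping pair   -> Pd51
    pd_down  = ul - lr                  # downward-sloping pair -> Pd52
    pd_horiz = (ul - ur, ll - lr)       # upper/lower row pairs -> Pd53
    pd_vert  = (ul - ll, ur - lr)       # left/right column pairs -> Pd54
    return pd_up, pd_down, pd_horiz, pd_vert
```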
- The arithmetic circuit 20 also generates the phase difference data Pd55 based on the image data Il and Im (step S702). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il and image data Im′, which is obtained by multiplying the image data Im by the exposure time ratio of the exposure time “Long” to the exposure time “Middle”, and generates the phase difference data Pd55 from the derived difference value. The arithmetic circuit 20 further generates the phase difference data Pd56 based on the image data Im and Is (step S702).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im and image data Is′, which is obtained by multiplying the image data Is by the exposure time ratio of the exposure time “Middle” to the exposure time “Short”, and generates the phase difference data Pd56 from the derived difference value.
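- A minimal sketch of this exposure-ratio comparison is given below; it is illustrative only, and the function name and time parameters are assumptions. For a static scene, Il and Im′ = Im × (tLong / tMiddle) should agree, so a large residual suggests a moving body.

```python
import numpy as np

def motion_phase_difference(plane_long: np.ndarray, plane_mid: np.ndarray,
                            t_long: float, t_mid: float) -> np.ndarray:
    """Residual between "Long" data Il and exposure-ratio-scaled
    "Middle" data Im' (the basis of phase difference data Pd55).
    Pd56 is obtained analogously from Im and Is with t_mid / t_short."""
    im_scaled = plane_mid * (t_long / t_mid)   # Im'
    return plane_long - im_scaled              # phase difference data Pd55
```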
- the arithmetic circuit 20 generates the level data Da for the phase difference based on the phase difference data Pd51, Pd52, Pd53, Pd54 (step S703).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd51, Pd52, Pd53, Pd54, a numerical value below a predetermined range to a lower limit value (for example, 0 bit).
- the arithmetic circuit 20 converts a numerical value exceeding a predetermined range in the phase difference data Pd51, Pd52, Pd53, Pd54 into an upper limit value (for example, 128 bits).
- the arithmetic circuit 20 converts, for example, a numerical value within the predetermined range in the phase difference data Pd51, Pd52, Pd53, Pd54 into a value within a range of 1 bit to 127 bits according to the magnitude of the numerical value.
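- The conversion to level data Da can be sketched as follows; the thresholds and the linear mapping are assumptions made for illustration, not values fixed by this disclosure.

```python
import numpy as np

def to_level_data(pd: np.ndarray, lo_thresh: float, hi_thresh: float,
                  lower: int = 0, upper: int = 128) -> np.ndarray:
    """Map phase difference data onto level data Da in [lower, upper]:
    values below lo_thresh clip to the lower limit, values above
    hi_thresh clip to the upper limit, and values in between are
    scaled linearly into 1..(upper - 1)."""
    pd = np.asarray(pd, dtype=float)
    scaled = 1 + (pd - lo_thresh) * (upper - 2) / (hi_thresh - lo_thresh)
    da = np.where(pd < lo_thresh, lower,
         np.where(pd > hi_thresh, upper, scaled))
    return np.clip(da, lower, upper)
```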
- the arithmetic circuit 20 converts the phase difference data Pd55 and Pd56 into level data Db for the moving body (step S704).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data (noise data) of the image sensor 10 and the phase difference data Pd55 and Pd56.
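- One plausible reading of this step, sketched under the assumption that the noise data acts as a per-pixel floor on the motion residuals, is the following; the comparison rule is an assumption, not the stated formula of this disclosure.

```python
import numpy as np

def to_motion_level_data(pd55: np.ndarray, pd56: np.ndarray,
                         noise_level: np.ndarray, upper: int = 128) -> np.ndarray:
    """Level data Db for the moving body: residuals at or below the
    sensor noise floor count as "no motion" (0); larger residuals
    saturate at the upper limit value."""
    residual = np.maximum(np.abs(pd55), np.abs(pd56)) - noise_level
    return np.clip(residual, 0, upper)
```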
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S705). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S706). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, Is, the presence/absence of a phase difference, and the presence/absence of a moving object (step S707). In this way, the HDR image data Ib is generated.
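- For orientation, the final composition step can be sketched as below. This is a generic HDR fusion sketch under assumptions (a saturation threshold, per-plane normalization, and a fallback to the “Middle” plane where motion was detected); this disclosure does not specify this exact rule.

```python
import numpy as np

def compose_hdr(im, il, is_, t_mid, t_long, t_short, moving, sat_level):
    """Per-pixel HDR fusion: normalize each plane by its exposure time,
    prefer "Long" where it is not saturated and "Short" where "Middle"
    is saturated, and keep the single "Middle" exposure where a moving
    body was detected to avoid ghosting."""
    norm_m, norm_l, norm_s = im / t_mid, il / t_long, is_ / t_short
    hdr = norm_m.copy()
    hdr = np.where(il < sat_level, norm_l, hdr)   # dark areas: trust Long
    hdr = np.where(im >= sat_level, norm_s, hdr)  # bright areas: trust Short
    hdr = np.where(moving, norm_m, hdr)           # motion: single exposure
    return hdr
```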
- As described above, in this modification, the pixel array unit 110 is configured to obtain phase difference data in four directions from the pixel data Sig of 4 rows × 4 columns corresponding to one three-color sensor pixel group P2 of each three-color sensor pixel group P3. As a result, the presence or absence of a phase difference and the presence or absence of a moving body can be determined in each of the four directions.
- In the above modification, the pixel array unit 110 was configured to obtain phase difference data in four directions from the pixel data Sig of 4 rows × 4 columns corresponding to one three-color sensor pixel group P2 of each three-color sensor pixel group P3. However, in this modification, the pixel array unit 110 may be configured to obtain phase difference data in four directions from the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1.
- FIG. 24 shows a configuration example of the pixel array section 110 according to this modification.
- FIG. 25 shows an example of the wiring layout of the pixel array section 110 of FIG.
- FIG. 26 shows an example of the direction of the phase difference that can be detected by the pixel array section 110 of FIG.
- The pixel array unit 110 is configured to obtain phase difference data in the upward-sloping direction, the vertical direction, and the horizontal direction from each single-color sensor pixel group P1 every n frames (n is an integer of 2 or more), for example.
- At this time, in each single-color sensor pixel group P1, the exposure time of all four sensor pixels 111 (photodiodes PD) is set to “Middle”. In other words, the system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that every sensor pixel 111 (photodiode PD) in each single-color sensor pixel group P1 has the same exposure time.
- The pixel array unit 110 is further configured to obtain image data Ia containing pixel data of three types of exposure time in each frame other than the n × k-th frames (n is an integer of 2 or more and k is an integer of 1 or more), for example.
- In each frame other than the n × k-th frames (n is an integer of 2 or more and k is an integer of 1 or more), the pixel array unit 110 has, for example, the same configuration as the pixel array unit 110 according to the above-described embodiment, as shown in FIG. 27.
- At this time, in each single-color sensor pixel group P1, the exposure time of two sensor pixels 111 (photodiodes PD) is set to “Middle”, the exposure time of one of the remaining two sensor pixels 111 (photodiodes PD) is set to “Short”, and the exposure time of the other is set to “Long”.
- The pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the upper stage of each single-color sensor pixel group P1, as shown in FIGS. 25 and 28, for example. Further, the pixel array section 110 has a pixel drive line ctlM and a pixel drive line ctlS at positions corresponding to the lower stage of each single-color sensor pixel group P1.
- When obtaining the image data Ia every n frames, the system control circuit 124 uses the pixel drive lines ctlL and ctlM provided at the position corresponding to the upper stage of each single-color sensor pixel group P1 to perform exposure control that sets the exposure time of the upper-stage sensor pixels 111 (photodiodes PD) of each single-color sensor pixel group P1 to “Middle”.
- Likewise, when obtaining the image data Ia every n frames, the system control circuit 124 uses the pixel drive lines ctlM and ctlS provided at the position corresponding to the lower stage of each single-color sensor pixel group P1 to perform exposure control that sets the exposure time of the lower-stage sensor pixels 111 (photodiodes PD) of each single-color sensor pixel group P1 to “Middle”.
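- The per-frame alternation can be sketched as a simple schedule; the function name, the pixel ordering, and the modulo convention for the n × k-th frames are assumptions made for illustration.

```python
def exposure_schedule(frame_index: int, n: int):
    """Exposure plan for one single-color sensor pixel group P1, ordered
    (pixel on ctlL, first pixel on ctlM, second pixel on ctlM, pixel on
    ctlS). Every n-th frame all four pixels are driven with "Middle" so
    that multi-direction phase differences can be taken; the remaining
    frames keep the Long/Middle/Middle/Short HDR pattern."""
    if frame_index % n == 0:            # the n * k-th frames -> image data Ia1
        return ("Middle", "Middle", "Middle", "Middle")
    return ("Long", "Middle", "Middle", "Short")  # -> image data Ia2
```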
- The image data Ia1 obtained every n frames includes the pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Of the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 includes four pixel data Sig1 corresponding to the four sensor pixels 111 whose exposure time is set to “Middle”.
- The image data Ia2 obtained in each frame other than the n × k-th frames includes the pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Of the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 includes, for example, two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- FIG. 29 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates the HDR image data Ib based on the image data Ia1 and Ia2 obtained by the image sensor 10.
- The arithmetic circuit 20 first decomposes the obtained image data Ia2 by exposure time (step S801). Specifically, the arithmetic circuit 20 decomposes the image data Ia2 into data whose exposure time is “Middle” (image data Im2), data whose exposure time is “Long” (image data Il2), and data whose exposure time is “Short” (image data Is2).
- The arithmetic circuit 20 also decomposes the obtained image data Ia1 by exposure time (step S801). Specifically, the arithmetic circuit 20 decomposes the image data Ia1 into data whose exposure time is “Middle” (image data Im1), data whose exposure time is “Long” (image data Il1), and data whose exposure time is “Short” (image data Is1).
- The arithmetic circuit 20 generates the phase difference data Pd61, Pd62, Pd63, Pd64 based on the image data Im1 (step S802). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 that correspond to each single-color sensor pixel group P1 in the image data Im1 and are arranged in the upward-sloping direction, and generates the phase difference data Pd61 in the first direction (the upward-sloping direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also derives the difference value between the two pixel data Sig1 that correspond to each single-color sensor pixel group P1 in the image data Im1 and are arranged in the downward-sloping direction, and generates the phase difference data Pd62 in the second direction (the downward-sloping direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 further derives, among the pixel data Sig1 corresponding to each single-color sensor pixel group P1 in the image data Im1, the difference value between the two upper-row pixel data Sig1 arranged in the horizontal direction and the difference value between the two lower-row pixel data Sig1 arranged in the horizontal direction, and generates the phase difference data Pd63 in the third direction (the horizontal direction) on the light receiving surface 110A from the derived difference values.
- The arithmetic circuit 20 likewise derives, among the pixel data Sig1 corresponding to each single-color sensor pixel group P1 in the image data Im1, the difference value between the two left-side pixel data Sig1 arranged in the vertical direction and the difference value between the two right-side pixel data Sig1 arranged in the vertical direction, and generates the phase difference data Pd64 in the fourth direction (the vertical direction) on the light receiving surface 110A from the derived difference values.
- The arithmetic circuit 20 also generates the phase difference data Pd65 based on the image data Il1 and Im1 (step S802). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il1 and image data Im1′, which is obtained by multiplying the image data Im1 by the exposure time ratio of the exposure time “Long” to the exposure time “Middle”, and generates the phase difference data Pd65 from the derived difference value. The arithmetic circuit 20 further generates the phase difference data Pd66 based on the image data Im1 and Is1 (step S802).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im1 and image data Is1′, which is obtained by multiplying the image data Is1 by the exposure time ratio of the exposure time “Middle” to the exposure time “Short”, and generates the phase difference data Pd66 from the derived difference value.
- the arithmetic circuit 20 generates the level data Da for the phase difference based on the phase difference data Pd61, Pd62, Pd63, Pd64 (step S803).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd61, Pd62, Pd63, Pd64, a numerical value below a predetermined range to a lower limit value (for example, 0 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd61, Pd62, Pd63, Pd64, a numerical value exceeding a predetermined range into an upper limit value (for example, 128 bits).
- the arithmetic circuit 20 converts the phase difference data Pd65 and Pd66 into the level data Db for the moving body (step S804).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data of the image sensor 10 (noise data) and the phase difference data Pd65 and Pd66.
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S805). Furthermore, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S806). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im2, Il2, Is2, the presence/absence of a phase difference, and the presence/absence of a moving object (step S807). In this way, the HDR image data Ib is generated.
- As described above, in this modification, the pixel array unit 110 is configured to obtain phase difference data in four directions from each single-color sensor pixel group P1 every n frames, for example.
- The HDR image data Ib is then generated based on the image data Ia1 obtained every n frames and the image data Ia2 obtained in each frame other than the n × k-th frames.
- the pixel array section 110 is configured to obtain the phase difference data based on the monochromatic image data.
- the pixel array unit 110 may be configured to obtain the phase difference data for each color based on the image data of all colors.
- FIG. 30 illustrates a configuration example of the pixel array unit 110 according to this modification.
- FIG. 31 shows an example of a wiring layout of the pixel array section 110 of FIG.
- FIG. 32 shows an example of directions of phase differences that can be detected by the pixel array section 110 of FIG.
- In this modification, the pixel array unit 110 is configured to obtain phase difference data in two directions for each color from two three-color sensor pixel groups P2 arranged in the row direction (hereinafter, referred to as a “three-color sensor pixel group P4”). Specifically, the pixel array unit 110 is configured to obtain phase difference data in the upward-sloping direction and the downward-sloping direction for each color from the three-color sensor pixel group P4. At this time, in the one three-color sensor pixel group P2 included in each three-color sensor pixel group P4, each single-color sensor pixel group P1 is configured to obtain phase difference data in the upward-sloping direction. In the other three-color sensor pixel group P2 included in each three-color sensor pixel group P4, each single-color sensor pixel group P1 is configured to obtain phase difference data in the downward-sloping direction.
- In each single-color sensor pixel group P1 of the one three-color sensor pixel group P2 included in each three-color sensor pixel group P4, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the upward-sloping direction is set to “Middle”, as shown in FIG. 30. Further, the exposure time of the lower-right sensor pixel 111 (photodiode PD) is set to “Short” and the exposure time of the upper-left sensor pixel 111 (photodiode PD) is set to “Long”, as shown in FIG. 30.
- The system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each single-color sensor pixel group P1 of the one three-color sensor pixel group P2 included in each three-color sensor pixel group P4, the exposure times of the two sensor pixels 111 (photodiodes PD) arranged in the upward-sloping direction are the same.
- In each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P4, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the downward-sloping direction is set to “Middle”, as shown in FIG. 30. Further, in each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P4, the exposure time of the lower-left sensor pixel 111 (photodiode PD) is set to “Short”, as shown in FIG. 30.
- In each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P4, the exposure time of the upper-right sensor pixel 111 (photodiode PD) is set to “Long”, as shown in FIG. 30.
- The system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P4, the exposure times of the two sensor pixels 111 (photodiodes PD) arranged in the downward-sloping direction are the same.
- the pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at a position corresponding to the upper stage of each monochromatic sensor pixel group P1 included in each three-color sensor pixel group P4.
- the pixel array section 110 has a pixel drive line ctlM and a pixel drive line ctlS at a position corresponding to the lower stage of each monochromatic sensor pixel group P1 included in each three-color sensor pixel group P4.
- The image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Of the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- FIG. 33 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- The arithmetic circuit 20 first decomposes the image data Ia by exposure time (step S901). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
- The arithmetic circuit 20 generates the phase difference data Pd71 and Pd72 based on the image data Im (step S902). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 of the one three-color sensor pixel group P2 included in each three-color sensor pixel group P4 in the image data Im, and generates the phase difference data Pd71 in the first direction (the upward-sloping direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also derives the difference value between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P4 in the image data Im, and generates the phase difference data Pd72 in the second direction (the downward-sloping direction) on the light receiving surface 110A from the derived difference value.
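- Because every color now carries its own pixel pairs, the per-color computation can be sketched as follows; the input layout (two NumPy arrays of shape (num_groups, 2) holding the two Sig1 values per single-color sensor pixel group, keyed by color) and the function name are assumptions made for this sketch.

```python
def per_color_diagonal_differences(im_by_color: dict):
    """Per-color phase differences from the "Middle" data Im: for each
    color, the pair in the one P2 yields the upward-sloping value
    (Pd71) and the pair in the other P2 the downward-sloping value
    (Pd72)."""
    pd71, pd72 = {}, {}
    for color, (up_pairs, down_pairs) in im_by_color.items():
        pd71[color] = up_pairs[:, 0] - up_pairs[:, 1]      # upward-sloping
        pd72[color] = down_pairs[:, 0] - down_pairs[:, 1]  # downward-sloping
    return pd71, pd72
```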
- The arithmetic circuit 20 also generates the phase difference data Pd73 based on the image data Il and Im (step S902). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il and image data Im′, which is obtained by multiplying the image data Im by the exposure time ratio of the exposure time “Long” to the exposure time “Middle”, and generates the phase difference data Pd73 from the derived difference value. The arithmetic circuit 20 further generates the phase difference data Pd74 based on the image data Im and Is (step S902).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im and image data Is′, which is obtained by multiplying the image data Is by the exposure time ratio of the exposure time “Middle” to the exposure time “Short”, and generates the phase difference data Pd74 from the derived difference value.
- the arithmetic circuit 20 generates the level data Da regarding the phase difference based on the phase difference data Pd71 and Pd72 (step S903).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, a numerical value of the phase difference data Pd71 and Pd72 that falls below a predetermined range into a lower limit value (for example, 0 bit). For example, the arithmetic circuit 20 converts a numerical value exceeding a predetermined range in the phase difference data Pd71 and Pd72 into an upper limit value (for example, 128 bits). For example, the arithmetic circuit 20 converts a numerical value within a predetermined range in the phase difference data Pd71 and Pd72 into a value within a range of 1 bit to 127 bits according to the size of the numerical value.
- the arithmetic circuit 20 converts the phase difference data Pd73 and Pd74 into level data Db for the moving body (step S904).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data (noise data) of the image sensor 10 and the phase difference data Pd73 and Pd74.
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S905). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S906). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, Is, the presence/absence of a phase difference, and the presence/absence of a moving body (step S907). In this way, the HDR image data Ib is generated.
- the pixel array unit 110 is configured to obtain phase difference data for each color based on image data of all colors. This makes it possible to determine the presence or absence of a phase difference and the presence or absence of a moving object for each color.
- the pixel array section 110 is configured to obtain phase difference data in two directions (upward to the right and downward to the right) for each color.
- the pixel array section 110 may be configured to obtain phase difference data in two directions (horizontal direction, vertical direction) for each color.
- FIG. 34 shows a configuration example of the pixel array section 110 according to this modification.
- FIG. 35 shows an example of a wiring layout of the pixel array section 110 of FIG.
- FIG. 36 shows an example of directions of phase differences detectable by the pixel array section 110 of FIG.
- In this modification, the pixel array unit 110 is configured to obtain phase difference data in two directions for each color from two three-color sensor pixel groups P2 arranged in the column direction (hereinafter, referred to as a “three-color sensor pixel group P5”). Specifically, the pixel array unit 110 is configured to obtain phase difference data in the horizontal direction and the vertical direction for each color from the three-color sensor pixel group P5. At this time, in the one three-color sensor pixel group P2 included in each three-color sensor pixel group P5, each single-color sensor pixel group P1 is configured to obtain phase difference data in the vertical direction. In the other three-color sensor pixel group P2 included in each three-color sensor pixel group P5, each single-color sensor pixel group P1 is configured to obtain phase difference data in the horizontal direction.
- In each single-color sensor pixel group P1 of the one three-color sensor pixel group P2 included in each three-color sensor pixel group P5, the two sensor pixels 111 (photodiodes PD) whose exposure time is set to “Middle” are arranged vertically on the left side. The sensor pixel 111 (photodiode PD) whose exposure time is set to “Short” is arranged at the lower right of the single-color sensor pixel group P1, and the sensor pixel 111 (photodiode PD) whose exposure time is set to “Long” is arranged at the upper right of the single-color sensor pixel group P1.
- The system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that three different exposure times are used in each single-color sensor pixel group P1 of the one three-color sensor pixel group P2 included in each three-color sensor pixel group P5 and the exposure times of the two sensor pixels 111 (photodiodes PD) arranged vertically on the left side are the same.
- In each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P5, the exposure time of the two upper sensor pixels 111 (photodiodes PD) is set to “Middle”, as shown in FIG. 34. Further, in each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P5, the exposure time of the lower-right sensor pixel 111 (photodiode PD) is set to “Short”, as shown in FIG. 34.
- In each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P5, the exposure time of the lower-left sensor pixel 111 (photodiode PD) is set to “Long”, as shown in FIG. 34.
- The system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that three different exposure times are used in each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P5 and the exposure times of the two upper sensor pixels 111 (photodiodes PD) arranged in the horizontal direction are the same.
- The pixel array section 110 has a pixel drive line ctlM and a pixel drive line ctlL at a position corresponding to the upper stage of each single-color sensor pixel group P1 of the one three-color sensor pixel group P2 included in each three-color sensor pixel group P5. The pixel array section 110 also has a pixel drive line ctlM and a pixel drive line ctlS at a position corresponding to the lower stage of each single-color sensor pixel group P1 of the one three-color sensor pixel group P2 included in each three-color sensor pixel group P5.
- The pixel array section 110 has two pixel drive lines ctlM at positions corresponding to the upper stage of each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P5. The pixel array section 110 also has a pixel drive line ctlL and a pixel drive line ctlS at a position corresponding to the lower stage of each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P5.
- The image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Of the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- FIG. 37 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- The arithmetic circuit 20 first decomposes the image data Ia by exposure time (step S1001). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
- The arithmetic circuit 20 generates the phase difference data Pd81 and Pd82 based on the image data Im (step S1002). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 of the one three-color sensor pixel group P2 included in each three-color sensor pixel group P5 in the image data Im, and generates the phase difference data Pd81 in the first direction (the vertical direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also derives the difference value between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 of the other three-color sensor pixel group P2 included in each three-color sensor pixel group P5 in the image data Im, and generates the phase difference data Pd82 in the second direction (the horizontal direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also generates the phase difference data Pd83 based on the image data Il and Im (step S1002). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il and image data Im′, which is obtained by multiplying the image data Im by the exposure time ratio of the exposure time “Long” to the exposure time “Middle”, and generates the phase difference data Pd83 from the derived difference value. The arithmetic circuit 20 further generates the phase difference data Pd84 based on the image data Im and Is (step S1002).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im and image data Is′, which is obtained by multiplying the image data Is by the exposure time ratio of the exposure time “Middle” to the exposure time “Short”, and generates the phase difference data Pd84 from the derived difference value.
- the arithmetic circuit 20 generates the level data Da regarding the phase difference based on the phase difference data Pd81 and Pd82 (step S1003).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd81 and Pd82, a numerical value below a predetermined range to a lower limit value (for example, 0 bit). For example, the arithmetic circuit 20 converts a numerical value exceeding a predetermined range in the phase difference data Pd81 and Pd82 into an upper limit value (for example, 128 bits). For example, the arithmetic circuit 20 converts a numerical value within a predetermined range in the phase difference data Pd81 and Pd82 into a value within a range of 1 bit to 127 bits according to the size of the numerical value.
- the arithmetic circuit 20 converts the phase difference data Pd83 and Pd84 into level data Db for the moving body (step S1004).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data (noise data) of the image sensor 10 and the phase difference data Pd83 and Pd84.
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S1005). Furthermore, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S1006). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, Is, the presence/absence of a phase difference, and the presence/absence of a moving body (step S1007). In this way, the HDR image data Ib is generated.
- the pixel array unit 110 is configured to obtain phase difference data in two directions (horizontal direction, vertical direction) for each color based on image data of all colors. Accordingly, it is possible to determine the presence/absence of a phase difference and the presence/absence of a moving body in two directions for each color.
- the pixel array unit 110 is configured to obtain phase difference data in two directions for each color.
- the pixel array section 110 may be configured to obtain phase difference data in three directions for each color.
- FIG. 38 shows a configuration example of the pixel array section 110 according to this modification.
- FIG. 39 shows an example of a wiring layout of the pixel array section 110 of FIG.
- FIG. 40 shows an example of directions of phase differences that can be detected by the pixel array section 110 of FIG.
- the pixel array unit 110 is configured so that phase difference data in three directions can be obtained for each color from the three-color sensor pixel group P3.
- the pixel array unit 110 is configured to obtain, from the three-color sensor pixel group P3, phase difference data in the downward sloping direction, the horizontal direction, and the vertical direction for each color.
- At this time, in the upper-left three-color sensor pixel group P2 and the lower-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 is configured to obtain phase difference data in the downward-sloping direction.
- In the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 is configured to obtain phase difference data in the horizontal direction. Further, in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 is configured to obtain phase difference data in the vertical direction.
- In each single-color sensor pixel group P1 in the upper-left three-color sensor pixel group P2 and the lower-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the downward-sloping direction is set to “Middle”, as shown in FIG. 38. Further, in each of these single-color sensor pixel groups P1, the exposure time of the lower-right sensor pixel 111 (photodiode PD) is set to “Short” and the exposure time of the upper-left sensor pixel 111 (photodiode PD) is set to “Long”, as shown in FIG. 38.
- The system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each single-color sensor pixel group P1 in the upper-left three-color sensor pixel group P2 and the lower-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, three different exposure times are used and the exposure times of two sensor pixels 111 (photodiodes PD) are the same.
- In each single-color sensor pixel group P1 in the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two lower-row sensor pixels 111 (photodiodes PD) arranged in the horizontal direction is set to “Short”, as shown in FIG. 38. The exposure time of one upper-row sensor pixel 111 (photodiode PD) is set to “Long”, and the exposure time of the other upper-row sensor pixel 111 (photodiode PD) is set to “Middle”.
- The system control circuit 124 controls the exposure times of the plurality of sensor pixels 111 (photodiodes PD) so that, in each single-color sensor pixel group P1 in the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, three different exposure times are used and the exposure times of the two lower-row sensor pixels 111 (photodiodes PD) are the same.
- In each single-color sensor pixel group P1 in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged vertically on the left side is set to “Long”, as shown in FIG. 38. The exposure time of one right-side sensor pixel 111 (photodiode PD) is set to “Middle”, and the exposure time of the other right-side sensor pixel 111 (photodiode PD) is set to “Short”.
- Note that the exposure time of the two sensor pixels 111 (photodiodes PD) arranged vertically on the right side in each single-color sensor pixel group P1 in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3 may instead be set to “Long”.
- The exposure times of the plurality of sensor pixels 111 (photodiodes PD) are controlled so that, in each single-color sensor pixel group P1 in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, three different exposure times are used and the exposure times of two sensor pixels 111 (photodiodes PD) are the same.
- The pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at positions corresponding to the upper row in each of the uppermost and second-uppermost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3. The pixel array section 110 also has a pixel drive line ctlL, a pixel drive line ctlM, and a pixel drive line ctlS at positions corresponding to the lower row in each of the uppermost and second-uppermost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3. Further, the pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at positions corresponding to the upper row in each of the third-from-the-top and bottommost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3, and a pixel drive line ctlS and a pixel drive line ctlM at positions corresponding to the lower row in each of the third-from-the-top and bottommost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3.
- The image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Of the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the upper-left and lower-right three-color sensor pixel groups P2 includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- The pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the lower-left three-color sensor pixel group P2 includes two pixel data Sig2 corresponding to the two sensor pixels 111 whose exposure time is set to “Short”, one pixel data Sig1 corresponding to the sensor pixel 111 whose exposure time is set to “Middle”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- The pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the upper-right three-color sensor pixel group P2 includes two pixel data Sig3 corresponding to the two sensor pixels 111 whose exposure time is set to “Long”, one pixel data Sig1 corresponding to the sensor pixel 111 whose exposure time is set to “Middle”, and one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”.
- FIG. 41 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- The arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- The arithmetic circuit 20 first decomposes the image data Ia by exposure time (step S1101). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
- The arithmetic circuit 20 generates the phase difference data Pd91, Pd92, Pd93 based on the image data Im (step S1102). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 included in the upper-left and lower-right three-color sensor pixel groups P2 of each three-color sensor pixel group P3 in the image data Im, and generates the phase difference data Pd91 in the first direction (the downward-sloping direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also derives the difference value between the two pixel data Sig2 corresponding to each single-color sensor pixel group P1 included in the lower-left three-color sensor pixel group P2 of each three-color sensor pixel group P3 in the image data Is, and generates the phase difference data Pd92 in the second direction (the horizontal direction) on the light receiving surface 110A from the derived difference value. Further, the arithmetic circuit 20 derives the difference value between the two pixel data Sig3 corresponding to each single-color sensor pixel group P1 included in the upper-right three-color sensor pixel group P2 of each three-color sensor pixel group P3 in the image data Il, and generates the phase difference data Pd93 in the third direction (the vertical direction) on the light receiving surface 110A from the derived difference value.
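- A distinctive point here is that each direction is measured in the exposure plane whose pixel pair is aligned with it. A minimal sketch follows; the input convention (the two pixel values of each pair for one single-color sensor pixel group P1) and the function name are assumptions.

```python
def three_direction_differences(sig1_pair, sig2_pair, sig3_pair):
    """Three direction-wise difference values for one single-color
    sensor pixel group P1 region: Pd91 from the "Middle" pair in Im,
    Pd92 from the "Short" pair in Is, and Pd93 from the "Long" pair
    in Il."""
    pd91 = sig1_pair[0] - sig1_pair[1]  # downward-sloping, from Im
    pd92 = sig2_pair[0] - sig2_pair[1]  # horizontal, from Is
    pd93 = sig3_pair[0] - sig3_pair[1]  # vertical, from Il
    return pd91, pd92, pd93
```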
- The arithmetic circuit 20 also generates the phase difference data Pd94 based on the image data Il and Im (step S1102). Specifically, the arithmetic circuit 20 derives the difference value between the image data Il and image data Im′, which is obtained by multiplying the image data Im by the exposure time ratio of the exposure time “Long” to the exposure time “Middle”, and generates the phase difference data Pd94 from the derived difference value. The arithmetic circuit 20 further generates the phase difference data Pd95 based on the image data Im and Is (step S1102).
- Specifically, the arithmetic circuit 20 derives the difference value between the image data Im and image data Is′, which is obtained by multiplying the image data Is by the exposure time ratio of the exposure time “Middle” to the exposure time “Short”, and generates the phase difference data Pd95 from the derived difference value.
- the arithmetic circuit 20 generates the level data Da for the phase difference based on the phase difference data Pd91, Pd92, Pd93 (step S1103).
- the level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 converts, for example, in the phase difference data Pd91, Pd92, and Pd93, a numerical value below a predetermined range to a lower limit value (for example, 0 bit).
- the arithmetic circuit 20 converts a numerical value exceeding a predetermined range in the phase difference data Pd91, Pd92, and Pd93 into an upper limit value (for example, 128 bits).
- the arithmetic circuit 20 converts the phase difference data Pd94 and Pd95 into level data Db for the moving body (step S1104).
- the level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0 bit) to an upper limit value (for example, 128 bit).
- the arithmetic circuit 20 generates level data Db based on the noise level data of the image sensor 10 (noise data) and the phase difference data Pd94 and Pd95.
- the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S1105). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S1106). Finally, the arithmetic circuit 20 generates HDR image data Ib from the image data Im, Il, Is, the presence/absence of a phase difference, and the presence/absence of a moving object (step S1107). In this way, the HDR image data Ib is generated.
- the pixel array unit 110 is configured to obtain phase difference data in three directions for each color. Accordingly, it is possible to determine the presence/absence of a phase difference and the presence/absence of a moving body in three directions for each color.
- the pixel array unit 110 is configured to obtain the phase difference data based on the monochromatic image data.
- the pixel array unit 110 may be configured to obtain the phase difference data for each color based on the image data of all colors.
- FIG. 42 shows a configuration example of the pixel array section 110 according to this modification.
- FIG. 43 shows an example of the wiring layout of the pixel array section 110 of FIG.
- FIG. 44 shows an example of the direction of the phase difference that can be detected by the pixel array section 110 of FIG.
- In this modification, the pixel array unit 110 is configured so that phase difference data in two directions can be obtained for each color from the three-color sensor pixel group P3. Specifically, the pixel array unit 110 is configured to obtain phase difference data in the upward-sloping direction and the downward-sloping direction for each color from the three-color sensor pixel group P3. At this time, in the upper-left three-color sensor pixel group P2 and the lower-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 has the same configuration as the single-color sensor pixel group P1 according to the above-described embodiment.
- In the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 is configured to obtain phase difference data in the upward-sloping direction and the downward-sloping direction. Likewise, in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 is configured to obtain phase difference data in the upward-sloping direction and the downward-sloping direction.
- In each single-color sensor pixel group P1 in the upper-left three-color sensor pixel group P2 and the lower-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the upward-sloping direction is set to “Middle”, as shown in FIG. 42. Further, the exposure time of the lower-right sensor pixel 111 (photodiode PD) is set to “Short” and the exposure time of the upper-left sensor pixel 111 (photodiode PD) is set to “Long”, as shown in FIG. 42.
- In each single-color sensor pixel group P1 in the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the upward-sloping direction is set to “Middle”, as shown in FIG. 42, and the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the downward-sloping direction is set to “Short”, as shown in FIG. 42.
- The exposure times of the plurality of sensor pixels 111 (photodiodes PD) are controlled so that, in each single-color sensor pixel group P1 in the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the two diagonal pairs of sensor pixels 111 (photodiodes PD) have exposure times different from each other and the two sensor pixels 111 (photodiodes PD) within each pair have the same exposure time.
- In each single-color sensor pixel group P1 in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the upward-sloping direction is set to “Middle”, as shown in FIG. 42, and the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the downward-sloping direction is set to “Long”, as shown in FIG. 42.
- The exposure times of the plurality of sensor pixels 111 (photodiodes PD) are controlled so that, in each single-color sensor pixel group P1 in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the two diagonal pairs of sensor pixels 111 (photodiodes PD) have exposure times different from each other and the two sensor pixels 111 (photodiodes PD) within each pair have the same exposure time.
- The pixel array section 110 has a pixel drive line ctlL and a pixel drive line ctlM at positions corresponding to the upper row in each of the uppermost and second-uppermost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3. The pixel array section 110 also has a pixel drive line ctlL, a pixel drive line ctlM, and a pixel drive line ctlS at positions corresponding to the lower row in each of the uppermost and second-uppermost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3. Further, the pixel array section 110 has a pixel drive line ctlL, a pixel drive line ctlM, and a pixel drive line ctlS at positions corresponding to the upper row in each of the third-from-the-top and bottommost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3, and a pixel drive line ctlS and a pixel drive line ctlM at positions corresponding to the lower row in each of the third-from-the-top and bottommost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3.
- The image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- Of the pixel data Sig of X rows × Y columns, the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the upper-left and lower-right three-color sensor pixel groups P2 includes two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”, one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to “Short”, and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to “Long”.
- The pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the lower-left three-color sensor pixel group P2 includes two pixel data Sig2 corresponding to the two sensor pixels 111 whose exposure time is set to “Short” and two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”.
- The pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the upper-right three-color sensor pixel group P2 includes two pixel data Sig3 corresponding to the two sensor pixels 111 whose exposure time is set to “Long” and two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to “Middle”.
- FIG. 45 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- The arithmetic circuit 20 first decomposes the image data Ia by exposure time (step S1201). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data whose exposure time is “Middle” (image data Im), data whose exposure time is “Long” (image data Il), and data whose exposure time is “Short” (image data Is).
- The arithmetic circuit 20 generates the phase difference data Pd101, Pd102, Pd103 based on the image data Im (step S1202). Specifically, the arithmetic circuit 20 derives the difference value between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 included in the lower-left and upper-right three-color sensor pixel groups P2 of each three-color sensor pixel group P3 in the image data Im, and generates the phase difference data Pd101 in the first direction (the upward-sloping direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also derives the difference value between the two pixel data Sig2 corresponding to each single-color sensor pixel group P1 included in the lower-left three-color sensor pixel group P2 of each three-color sensor pixel group P3 in the image data Is, and generates the phase difference data Pd102 in the second direction (the downward-sloping direction) on the light receiving surface 110A from the derived difference value. Further, the arithmetic circuit 20 derives the difference value between the two pixel data Sig3 corresponding to each single-color sensor pixel group P1 included in the upper-right three-color sensor pixel group P2 of each three-color sensor pixel group P3 in the image data Il, and generates the phase difference data Pd103 in the first direction (the upward-sloping direction) on the light receiving surface 110A from the derived difference value.
- The arithmetic circuit 20 also generates the phase difference data Pd104 based on the image data Il and Im (step S1202). Specifically, the arithmetic circuit 20 derives a difference value between the image data Il and image data Im′ obtained by multiplying the image data Im by the exposure time ratio between the exposure time "Long" and the exposure time "Middle", and generates the phase difference data Pd104 from the derived difference values. The arithmetic circuit 20 further generates the phase difference data Pd105 based on the image data Im and Is (step S1202).
- Specifically, the arithmetic circuit 20 derives a difference value between the image data Im and image data Is′ obtained by multiplying the image data Is by the exposure time ratio between the exposure time "Middle" and the exposure time "Short", and generates the phase difference data Pd105 from the derived difference values.
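- The exposure-ratio comparison just described can be sketched as follows: a static, unsaturated scene point should yield a value near zero, while motion between the two exposures leaves a residual, which is the basis of Pd104 and Pd105. The function name and signature are assumptions for illustration.

```python
def motion_phase_difference(img_long, img_mid, t_long, t_mid):
    """Difference between Il and Im' = Im x (t_long / t_mid)."""
    img_mid_scaled = img_mid * (t_long / t_mid)  # image data Im'
    return img_long - img_mid_scaled             # basis of Pd104
```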
- Next, the arithmetic circuit 20 generates level data Da for the phase difference based on the phase difference data Pd101, Pd102, and Pd103 (step S1203).
- The level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0) to an upper limit value (for example, 128).
- The arithmetic circuit 20 converts, for example, numerical values below a predetermined range in the phase difference data Pd101, Pd102, and Pd103 to the lower limit value (for example, 0).
- The arithmetic circuit 20 converts, for example, numerical values exceeding the predetermined range in the phase difference data Pd101, Pd102, and Pd103 to the upper limit value (for example, 128). For example, the arithmetic circuit 20 converts numerical values within the predetermined range in the phase difference data Pd101, Pd102, and Pd103 into values within a range of 1 to 127 according to the magnitude of the numerical value.
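- The conversion into level data Da might look like the following sketch, where values below the predetermined range clip to the lower limit, values above it clip to the upper limit, and values inside it map onto 1 to 127; the linear mapping is an assumption, since the text only fixes the endpoints.

```python
import numpy as np

def to_level_data(pd, lo, hi, lower=0, upper=128):
    """Convert phase difference data into level data Da in [lower, upper]."""
    pd = np.asarray(pd, dtype=float)
    inside = 1 + (pd - lo) * (upper - 2) / (hi - lo)  # maps [lo, hi] -> [1, 127]
    levels = np.rint(inside).astype(int)
    levels = np.where(pd < lo, lower, levels)  # below the range -> lower limit
    levels = np.where(pd > hi, upper, levels)  # above the range -> upper limit
    return levels
```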
- Next, the arithmetic circuit 20 converts the phase difference data Pd104 and Pd105 into level data Db for the moving body (step S1204).
- The level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0) to an upper limit value (for example, 128).
- Specifically, the arithmetic circuit 20 generates the level data Db based on the noise level data (noise data) of the image sensor 10 and the phase difference data Pd104 and Pd105.
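- One plausible reading of this step, sketched below, is that the motion differences are gated by the sensor noise level so that read noise is not mistaken for a moving body; the gating rule itself is an assumption made for this sketch.

```python
import numpy as np

def to_motion_level_data(pd_motion, noise_level, lower=0, upper=128):
    """Convert Pd104/Pd105 into level data Db using the noise data."""
    magnitude = np.abs(np.asarray(pd_motion, dtype=float))
    excess = np.maximum(magnitude - noise_level, 0.0)  # ignore pure noise
    levels = np.rint(excess / noise_level)             # in noise-level units
    return np.clip(levels, lower, upper).astype(int)
```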
- Next, the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S1205). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S1206). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, and Is, the presence or absence of a phase difference, and the presence or absence of a moving body (step S1207). In this way, the HDR image data Ib is generated.
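- A minimal sketch of the final composition in step S1207 is given below, assuming the three exposures have been interpolated to full resolution. The blending rule (long exposure in dark regions, short in bright ones, with a fallback to the middle exposure where a moving body was detected) is an illustrative assumption, not the disclosed algorithm; in practice the blend would be driven by the phase-difference and moving-body detections of steps S1205 and S1206.

```python
import numpy as np

def fuse_hdr(im, il, is_, t_mid, t_long, t_short, moving, threshold):
    """Compose HDR image data Ib from image data Im, Il, and Is."""
    m = im / t_mid       # bring each exposure to a common radiance scale
    l = il / t_long
    s = is_ / t_short
    hdr = np.where(m < threshold, l, s)  # long where dark, short where bright
    return np.where(moving, m, hdr)      # trust one exposure where motion is found
```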
- the pixel array unit 110 is configured to obtain phase difference data in two directions for each color. This makes it possible to determine the presence/absence of a phase difference and the presence/absence of a moving body in each of the two directions for each color.
- In Modification K above, the pixel array unit 110 was configured to obtain phase difference data in two directions (the upward-rightward direction and the downward-rightward direction) for each color based on the image data of all colors. However, in Modification K, the pixel array unit 110 may be configured to obtain phase difference data in two directions (the left-right direction and the up-down direction) for each color.
- FIG. 46 shows a configuration example of the pixel array section 110 according to this modification.
- FIG. 47 shows an example of a wiring layout of the pixel array section 110 of FIG.
- FIG. 48 shows an example of directions of phase differences that can be detected by the pixel array section 110 of FIG.
- In this modification, the pixel array unit 110 is configured so that phase difference data in two directions can be obtained for each color from each three-color sensor pixel group P3. Specifically, the pixel array unit 110 is configured to obtain phase difference data in the left-right direction and the up-down direction for each color from each three-color sensor pixel group P3. At this time, in the upper-left three-color sensor pixel group P2 and the lower-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 has the same configuration as the single-color sensor pixel group P1 according to the above embodiment.
- In the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 is configured to obtain phase difference data in the left-right direction. Further, in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, each single-color sensor pixel group P1 is configured to obtain phase difference data in the up-down direction.
- In each single-color sensor pixel group P1 in the upper-left and lower-right three-color sensor pixel groups P2 included in each three-color sensor pixel group P3, the exposure time of the two diagonally arranged sensor pixels 111 (photodiodes PD) is set to "Middle", the exposure time of the lower-right sensor pixel 111 (photodiode PD) is set to "Short", and the exposure time of the upper-left sensor pixel 111 (photodiode PD) is set to "Long", as shown in FIG. 46.
- In each single-color sensor pixel group P1 in the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the left-right direction in the upper row is set to "Short", as shown in FIG. 46. Further, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged in the left-right direction in the lower row is set to "Middle", as shown in FIG. 46.
- In other words, in each single-color sensor pixel group P1 in the lower-left three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure times of the plurality of sensor pixels 111 (photodiodes PD) are controlled so that the exposure times of the two sensor pixels 111 (photodiodes PD) arranged in the up-down direction are different from each other and the exposure times of the two sensor pixels 111 (photodiodes PD) arranged in the left-right direction are the same.
- In each single-color sensor pixel group P1 in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged vertically in the left column is set to "Long", as shown in FIG. 46. Further, in each single-color sensor pixel group P1 in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure time of the two sensor pixels 111 (photodiodes PD) arranged vertically in the right column is set to "Middle", as shown in FIG. 46.
- In other words, in each single-color sensor pixel group P1 in the upper-right three-color sensor pixel group P2 included in each three-color sensor pixel group P3, the exposure times of the plurality of sensor pixels 111 (photodiodes PD) are controlled so that the exposure times of the two sensor pixels 111 (photodiodes PD) arranged in the left-right direction are different from each other and the exposure times of the two sensor pixels 111 (photodiodes PD) arranged in the up-down direction are the same.
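- The exposure-time layout just described for one three-color sensor pixel group P3 (8 × 8 pixels) can be written out as below; the exact pixel positions are an assumption made for illustration, chosen to be consistent with the row and column descriptions above.

```python
import numpy as np

def p3_exposure_map():
    """Exposure labels for one three-color sensor pixel group P3 (8 x 8)."""
    emb = np.array([["L", "M"],
                    ["M", "S"]])   # embodiment-style P1 (upper-left, lower-right P2)
    rows = np.array([["S", "S"],
                     ["M", "M"]])  # lower-left P2: short upper row, middle lower row
    cols = np.array([["L", "M"],
                     ["L", "M"]])  # upper-right P2: long left column, middle right column
    p2 = lambda p1: np.tile(p1, (2, 2))      # a P2 is 2 x 2 single-color P1 groups
    top = np.hstack([p2(emb), p2(cols)])     # upper-left | upper-right
    bottom = np.hstack([p2(rows), p2(emb)])  # lower-left | lower-right
    return np.vstack([top, bottom])
```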
- In the pixel array section 110, the pixel drive line ctlL and the pixel drive line ctlM are provided at locations corresponding to the upper row in each of the uppermost and second uppermost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3.
- The pixel array section 110 includes the pixel drive line ctlL, the pixel drive line ctlM, and the pixel drive line ctlS at locations corresponding to the lower row in each of the uppermost and second uppermost single-color sensor pixel groups P1 included in each three-color sensor pixel group P3.
- The pixel array section 110 includes the pixel drive line ctlL, the pixel drive line ctlM, and the pixel drive line ctlS at locations corresponding to the upper row in each of the single-color sensor pixel groups P1 in the third row and the bottom row included in each three-color sensor pixel group P3.
- The pixel array section 110 includes the pixel drive line ctlS and the pixel drive line ctlM at locations corresponding to the lower row in each of the single-color sensor pixel groups P1 in the third row and the bottom row included in each three-color sensor pixel group P3.
- The image data Ia includes pixel data Sig of X rows × Y columns corresponding to the sensor pixels 111 of X rows × Y columns in the pixel array unit 110.
- In the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the upper-left and lower-right three-color sensor pixel groups P2 of each three-color sensor pixel group P3, two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to "Middle", one pixel data Sig2 corresponding to the sensor pixel 111 whose exposure time is set to "Short", and one pixel data Sig3 corresponding to the sensor pixel 111 whose exposure time is set to "Long" are included.
- In the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the lower-left three-color sensor pixel group P2 of each three-color sensor pixel group P3, two pixel data Sig2 corresponding to the two sensor pixels 111 whose exposure time is set to "Short" and two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to "Middle" are included.
- In the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1 included in the upper-right three-color sensor pixel group P2 of each three-color sensor pixel group P3, two pixel data Sig3 corresponding to the two sensor pixels 111 whose exposure time is set to "Long" and two pixel data Sig1 corresponding to the two sensor pixels 111 whose exposure time is set to "Middle" are included.
- FIG. 49 shows an example of a signal processing procedure in the arithmetic circuit 20 in this modification.
- the arithmetic circuit 20 generates HDR image data Ib based on the image data Ia obtained by the image sensor 10.
- The arithmetic circuit 20 first decomposes the image data Ia for each exposure time (step S1301). Specifically, the arithmetic circuit 20 decomposes the image data Ia into data with the exposure time "Middle" (image data Im), data with the exposure time "Long" (image data Il), and data with the exposure time "Short" (image data Is).
- Next, the arithmetic circuit 20 generates the phase difference data Pd111, Pd112, Pd113, and Pd114 based on the image data Im, Is, and Il (step S1302). Specifically, the arithmetic circuit 20 derives a difference value between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 included in the lower-left three-color sensor pixel group P2 of each three-color sensor pixel group P3 in the image data Im, and generates, from the derived difference values, the phase difference data Pd111 in a first direction (the left-right direction) on the light receiving surface 110A.
- Further, the arithmetic circuit 20 derives a difference value between the two pixel data Sig1 corresponding to each single-color sensor pixel group P1 included in the upper-right three-color sensor pixel group P2 of each three-color sensor pixel group P3 in the image data Im, and generates, from the derived difference values, the phase difference data Pd112 in a second direction (the up-down direction) on the light receiving surface 110A. Further, the arithmetic circuit 20 derives a difference value between the two pixel data Sig2 corresponding to each single-color sensor pixel group P1 included in the lower-left three-color sensor pixel group P2 of each three-color sensor pixel group P3 in the image data Is, and generates, from the derived difference values, the phase difference data Pd113 in the first direction (the left-right direction) on the light receiving surface 110A. Further, the arithmetic circuit 20 derives a difference value between the two pixel data Sig3 corresponding to each single-color sensor pixel group P1 included in the upper-right three-color sensor pixel group P2 of each three-color sensor pixel group P3 in the image data Il, and generates, from the derived difference values, the phase difference data Pd114 in the second direction (the up-down direction) on the light receiving surface 110A.
- The arithmetic circuit 20 also generates the phase difference data Pd115 based on the image data Il and Im (step S1302). Specifically, the arithmetic circuit 20 derives a difference value between the image data Il and image data Im′ obtained by multiplying the image data Im by the exposure time ratio between the exposure time "Long" and the exposure time "Middle", and generates the phase difference data Pd115 from the derived difference values. The arithmetic circuit 20 further generates the phase difference data Pd116 based on the image data Im and Is (step S1302).
- Specifically, the arithmetic circuit 20 derives a difference value between the image data Im and image data Is′ obtained by multiplying the image data Is by the exposure time ratio between the exposure time "Middle" and the exposure time "Short", and generates the phase difference data Pd116 from the derived difference values.
- Next, the arithmetic circuit 20 generates level data Da for the phase difference based on the phase difference data Pd111, Pd112, Pd113, and Pd114 (step S1303).
- The level data Da is, for example, data represented by a value within a range from a lower limit value (for example, 0) to an upper limit value (for example, 128).
- The arithmetic circuit 20 converts, for example, numerical values below a predetermined range in the phase difference data Pd111, Pd112, Pd113, and Pd114 to the lower limit value (for example, 0).
- The arithmetic circuit 20 converts, for example, numerical values exceeding the predetermined range in the phase difference data Pd111, Pd112, Pd113, and Pd114 to the upper limit value (for example, 128). For example, the arithmetic circuit 20 converts numerical values within the predetermined range in the phase difference data Pd111, Pd112, Pd113, and Pd114 into values within a range of 1 to 127 according to the magnitude of the numerical value.
- Next, the arithmetic circuit 20 converts the phase difference data Pd115 and Pd116 into level data Db for the moving body (step S1304).
- The level data Db is, for example, data represented by a value within a range from a lower limit value (for example, 0) to an upper limit value (for example, 128).
- Specifically, the arithmetic circuit 20 generates the level data Db based on the noise level data (noise data) of the image sensor 10 and the phase difference data Pd115 and Pd116.
- Next, the arithmetic circuit 20 detects a portion having a large phase difference from the obtained level data Da (step S1305). Further, the arithmetic circuit 20 detects the presence or absence of a moving body from the obtained level data Db (step S1306). Finally, the arithmetic circuit 20 generates the HDR image data Ib from the image data Im, Il, and Is, the presence or absence of a phase difference, and the presence or absence of a moving body (step S1307). In this way, the HDR image data Ib is generated.
- the pixel array unit 110 is configured to obtain phase difference data in two directions for each color. This makes it possible to determine the presence/absence of a phase difference and the presence/absence of a moving body in each of the two directions for each color.
- the technology according to the present disclosure (this technology) can be applied to various products.
- For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- FIG. 50 is a block diagram showing a schematic configuration example of a vehicle control system which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
- In FIG. 50, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated as the functional configuration of the integrated control unit 12050.
- the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
- For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, or fog lamps.
- In this case, radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, can be input to the body system control unit 12020. The body system control unit 12020 receives the input of these radio waves or signals, and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
- the vehicle exterior information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
- An imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, or the like.
- The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The imaging unit 12031 can output the electric signal as an image or as distance measurement information.
- the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
- the in-vehicle information detection unit 12040 detects in-vehicle information.
- A driver state detection unit 12041 that detects the state of the driver is connected to the in-vehicle information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or determine whether the driver is dozing off, based on the detection information input from the driver state detection unit 12041.
- The microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside or outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and output a control command to the drive system control unit 12010.
- For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, follow-up traveling based on the inter-vehicle distance, vehicle speed maintenance traveling, vehicle collision warning, vehicle lane departure warning, and the like.
- In addition, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, or the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
- Further, the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
- For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
- The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying an occupant of the vehicle or the outside of the vehicle of information.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
- the display unit 12062 may include, for example, at least one of an onboard display and a head-up display.
- FIG. 51 is a diagram showing an example of the installation position of the imaging unit 12031.
- the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, 12105 as the imaging unit 12031.
- The imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper portion of the windshield in the vehicle interior of the vehicle 12100.
- The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper portion of the windshield in the vehicle interior mainly acquire images in front of the vehicle 12100.
- the imaging units 12102 and 12103 included in the side mirrors mainly acquire images of the side of the vehicle 12100.
- the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
- the front images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic signal, a traffic sign, a lane, or the like.
- Note that FIG. 51 shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
- For example, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can thereby extract, as a preceding vehicle, the closest three-dimensional object that is on the traveling path of the vehicle 12100 and travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more).
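- The extraction rule described here can be paraphrased in code as below; the object record fields and the threshold are assumptions made for illustration, not part of the disclosure.

```python
def extract_preceding_vehicle(objects, min_speed_kmh=0.0):
    """Pick the closest object on the traveling path moving with the vehicle.

    objects : iterable of dicts with assumed keys "distance_m",
              "relative_speed_kmh" (from the temporal change of distance),
              and "on_path" (True if on the traveling path of the vehicle).
    """
    candidates = [o for o in objects
                  if o["on_path"] and o["relative_speed_kmh"] >= min_speed_kmh]
    return min(candidates, key=lambda o: o["distance_m"], default=None)
```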
- Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured in front of the preceding vehicle, and perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
- For example, the microcomputer 12051 can classify three-dimensional object data regarding three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects based on the distance information obtained from the imaging units 12101 to 12104, extract the data, and use it for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 determines the collision risk, which indicates the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, the microcomputer 12051 can assist driving for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- The microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
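- As a hedged sketch, the two-step procedure (feature point extraction, then pattern matching on the contour) could be organized as below; the helper callables extract_feature_points and matches_pedestrian_contour are hypothetical placeholders, not functions named in the disclosure.

```python
def recognize_pedestrians(infrared_image, extract_feature_points,
                          matches_pedestrian_contour):
    """Return contours judged to be pedestrians in an infrared image.

    extract_feature_points     : callable returning candidate contours,
                                 each as a series of feature points.
    matches_pedestrian_contour : callable running pattern matching on a
                                 series of feature points.
    Both callables are hypothetical stand-ins for the two procedures
    described in the text.
    """
    contours = extract_feature_points(infrared_image)
    return [c for c in contours if matches_pedestrian_contour(c)]
```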
- When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. Further, the audio/image output unit 12052 may control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
- An example of the mobile body control system to which the technology according to the present disclosure can be applied has been described above.
- The technology according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above. Specifically, the imaging device 1 according to the above embodiment and its modifications can be applied to the imaging unit 12031. By applying the technology according to the present disclosure to the imaging unit 12031, a high-definition captured image with less noise can be obtained, so that highly accurate control using the captured image can be performed in the mobile body control system.
- FIG. 52 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
- FIG. 52 illustrates a situation in which an operator (doctor) 11131 is operating on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000.
- In FIG. 52, the endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
- The endoscope 11100 includes a lens barrel 11101, a region of a predetermined length from the distal end of which is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
- In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may be configured as a so-called flexible scope having a flexible lens barrel.
- An opening in which an objective lens is fitted is provided at the tip of the lens barrel 11101.
- A light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101, and is emitted toward the observation target in the body cavity of the patient 11132 via the objective lens. Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image pickup device are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is condensed on the image pickup device by the optical system.
- the observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
- the image signal is transmitted to the camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
- The CCU 11201 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs, on the image signal, various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing).
- the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
- The light source device 11203 includes a light source such as an LED (Light Emitting Diode), for example, and supplies the endoscope 11100 with irradiation light for imaging the surgical site or the like.
- the input device 11204 is an input interface for the endoscopic surgery system 11000.
- the user can input various kinds of information and instructions to the endoscopic surgery system 11000 via the input device 11204.
- For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (the type of irradiation light, the magnification, the focal length, and the like) via the input device 11204.
- The treatment tool control device 11205 controls driving of the energy treatment tool 11112 for cauterization and incision of tissue, sealing of blood vessels, or the like.
- The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing the field of view of the endoscope 11100 and securing the working space of the operator.
- the recorder 11207 is a device capable of recording various information regarding surgery.
- the printer 11208 is a device that can print various types of information regarding surgery in various formats such as text, images, or graphs.
- the light source device 11203 that supplies the endoscope 11100 with irradiation light when imaging a surgical site can be configured by, for example, an LED, a laser light source, or a white light source configured by a combination thereof.
- When a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so that the white balance of the captured image can be adjusted in the light source device 11203.
- In this case, by irradiating the observation target with the laser light from each of the RGB laser light sources in a time-division manner, and controlling the driving of the image sensor of the camera head 11102 in synchronization with the irradiation timing, it is also possible to capture images corresponding to each of R, G, and B in a time-division manner. According to this method, a color image can be obtained without providing color filters on the image sensor.
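- The time-division color capture described above amounts to stacking three successive monochrome frames, as sketched below; the frame acquisition interface is an assumption made for this sketch.

```python
import numpy as np

def time_division_color(capture_frame):
    """Build a color image from frames taken under R, G, B laser light in turn.

    capture_frame : callable that triggers the image sensor in sync with one
                    color's irradiation timing and returns a monochrome frame
                    (a hypothetical stand-in for the synchronized drive).
    """
    r = capture_frame("R")
    g = capture_frame("G")
    b = capture_frame("B")
    return np.stack([r, g, b], axis=-1)  # color image without color filters
```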
- Further, the driving of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and combining the images, an image with a high dynamic range, without so-called blocked-up shadows and blown-out highlights, can be generated.
- the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
- In special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in the mucosal surface layer is imaged with high contrast by irradiating light in a narrower band than the irradiation light (that is, white light) in normal observation, utilizing the wavelength dependence of light absorption in body tissue.
- Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained by fluorescence generated by irradiation with excitation light. In fluorescence observation, for example, the body tissue can be irradiated with excitation light to observe fluorescence from the body tissue (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue and the body tissue can be irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
- the light source device 11203 may be configured to be able to supply the narrow band light and/or the excitation light corresponding to such special light observation.
- FIG. 53 is a block diagram showing an example of the functional configuration of the camera head 11102 and the CCU 11201 shown in FIG.
- the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405.
- the CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
- the camera head 11102 and the CCU 11201 are communicably connected to each other via a transmission cable 11400.
- the lens unit 11401 is an optical system provided at the connecting portion with the lens barrel 11101.
- the observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
- the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
- The imaging unit 11402 includes an image sensor. The number of image sensors constituting the imaging unit 11402 may be one (a so-called single-plate type) or plural (a so-called multi-plate type). In the case of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective image sensors, and a color image may be obtained by combining them. Alternatively, the imaging unit 11402 may include a pair of image sensors for respectively acquiring image signals for the right eye and the left eye corresponding to 3D (dimensional) display.
- the 3D display enables the operator 11131 to more accurately grasp the depth of the living tissue in the operation site.
- Note that, in the case of the multi-plate type, a plurality of lens units 11401 may be provided corresponding to the respective image sensors.
- Further, the imaging unit 11402 does not necessarily have to be provided in the camera head 11102. For example, the imaging unit 11402 may be provided inside the lens barrel 11101, immediately behind the objective lens.
- the drive unit 11403 is composed of an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Accordingly, the magnification and focus of the image captured by the image capturing unit 11402 can be adjusted as appropriate.
- the communication unit 11404 is composed of a communication device for transmitting/receiving various information to/from the CCU 11201.
- the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
- the communication unit 11404 receives a control signal for controlling the driving of the camera head 11102 from the CCU 11201, and supplies it to the camera head control unit 11405.
- The control signal includes, for example, information regarding imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
- Note that the imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately designated by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, the endoscope 11100 is equipped with a so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function.
- the camera head control unit 11405 controls the driving of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
- the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102.
- the communication unit 11411 receives the image signal transmitted from the camera head 11102 via the transmission cable 11400.
- the communication unit 11411 transmits a control signal for controlling the driving of the camera head 11102 to the camera head 11102.
- the image signal and the control signal can be transmitted by electric communication, optical communication, or the like.
- the image processing unit 11412 performs various types of image processing on the image signal that is the RAW data transmitted from the camera head 11102.
- the control unit 11413 performs various controls regarding imaging of a surgical site or the like by the endoscope 11100 and display of a captured image obtained by imaging the surgical site or the like. For example, the control unit 11413 generates a control signal for controlling the driving of the camera head 11102.
- The control unit 11413 also causes the display device 11202 to display a captured image of the surgical site or the like based on the image signal that has undergone image processing by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize a surgical tool such as forceps, a specific body site, bleeding, mist during use of the energy treatment tool 11112, and the like. When causing the display device 11202 to display the captured image, the control unit 11413 may superimpose various types of surgery support information on the image of the surgical site using the recognition results. By superimposing the surgery support information and presenting it to the operator 11131, the burden on the operator 11131 can be reduced, and the operator 11131 can proceed with the surgery reliably.
- the transmission cable 11400 that connects the camera head 11102 and the CCU 11201 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable of these.
- wired communication is performed using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
- the example of the endoscopic surgery system to which the technology according to the present disclosure can be applied has been described above.
- The technology according to the present disclosure can be suitably applied to the imaging unit 11402 provided in the camera head 11102 of the endoscope 11100 among the configurations described above. By applying the technology according to the present disclosure to the imaging unit 11402, the imaging unit 11402 can be reduced in size or increased in definition, so that a compact or high-definition endoscope 11100 can be provided.
- Note that the present technology may also be configured as follows.
- (1) An imaging device including: a plurality of pixels each including a photoelectric conversion element and arranged in a matrix on a light receiving surface; a plurality of light receiving lenses, one provided for each set of the plurality of pixels among the plurality of pixels; and a control unit that controls exposure times of the plurality of pixels, in which the control unit controls the exposure times of the plurality of pixels such that, among the plurality of pixels corresponding to each of the light receiving lenses, the exposure times of at least two of the pixels are the same, and the exposure times of at least two of the pixels are different from each other.
- (2) The imaging device according to (1), further including a plurality of color filters in a Bayer array, provided one for each first pixel group, where the plurality of pixels corresponding to each of the light receiving lenses is defined as the first pixel group.
- (3) The imaging device according to (2), in which the control unit controls the exposure times of the plurality of pixels such that, among the plurality of pixels corresponding to each of the light receiving lenses, the exposure times of two of the pixels are the same, and the exposure times of three of the pixels are different from one another.
- (4) The imaging device according to (3), in which the control unit controls the exposure times of the plurality of pixels such that, in the plurality of pixels corresponding to each of the light receiving lenses, two of the pixels arranged in an upward-rightward direction, a downward-rightward direction, a left-right direction, or an up-down direction on the light receiving surface have the same exposure time.
- (5) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group, the control unit controls the exposure times of the plurality of pixels such that, in a first one of the first pixel groups included in each of the second pixel groups, the exposure times of two of the pixels arranged in a first direction are the same, and, in a second one of the first pixel groups included in each of the second pixel groups, the exposure times of two of the pixels arranged in a second direction are the same.
- (6) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 2 rows × 2 columns are defined as a third pixel group, the control unit controls the exposure times of the plurality of pixels such that, in a first one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a first direction are the same; in a second one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a second direction are the same; and, in a third one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a third direction are the same.
- (7) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group, the control unit controls the exposure times of the plurality of pixels such that, in a first one of the first pixel groups included in each of the second pixel groups, the exposure times of two of the pixels arranged in a first direction are the same and the exposure times of two of the pixels arranged in a second direction are the same, and, in a second one of the first pixel groups included in each of the second pixel groups, the exposure times of two of the pixels arranged in the first direction are the same and the exposure times of two of the pixels arranged in the second direction are the same.
- (8) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group, the control unit controls the exposure times of the plurality of pixels such that, in a first one of the first pixel groups included in each of the second pixel groups, the exposure times of two of the pixels arranged in a first direction are the same and the exposure times of two other pixels arranged in the first direction, different from those two pixels, are the same, and, in a second one of the first pixel groups included in each of the second pixel groups, the exposure times of two of the pixels arranged in a second direction are the same and the exposure times of two other pixels arranged in the second direction, different from those two pixels, are the same.
- (9) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 2 rows × 2 columns are defined as a third pixel group, the control unit controls the exposure times of the plurality of pixels such that, in one of the first pixel groups included in each of the third pixel groups, the exposure times of the respective pixels are the same as one another.
- (10) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 2 rows × 2 columns are defined as a third pixel group, the control unit controls the exposure times of the plurality of pixels such that, in one of the second pixel groups included in each of the third pixel groups, the exposure times of the respective pixels are the same as one another.
- (11) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 2 rows × 2 columns are defined as a third pixel group, the control unit controls the exposure times of the plurality of pixels such that, in each of the second pixel groups included in each of the third pixel groups, the exposure times of the respective pixels are the same.
- (12) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 1 row × 2 columns are defined as a fourth pixel group, the control unit controls the exposure times of the plurality of pixels such that, in each first pixel group in a first one of the second pixel groups included in each of the fourth pixel groups, the exposure times of two of the pixels arranged in a first direction are the same, and, in each first pixel group in a second one of the second pixel groups included in each of the fourth pixel groups, the exposure times of two of the pixels arranged in a second direction are the same.
- (13) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 2 rows × 1 column are defined as a fifth pixel group, the control unit controls the exposure times of the plurality of pixels such that, in each first pixel group in a first one of the second pixel groups included in each of the fifth pixel groups, the exposure times of two of the pixels arranged in a first direction are the same, and, in each first pixel group in a second one of the second pixel groups included in each of the fifth pixel groups, the exposure times of two of the pixels arranged in a second direction are the same.
- (14) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 2 rows × 2 columns are defined as a third pixel group, the control unit controls the exposure times of the plurality of pixels such that, in each of the first pixel groups in a first one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a first direction are the same; in each of the first pixel groups in a second one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a second direction are the same; and, in each of the first pixel groups in a third one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a third direction are the same.
- (15) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 2 rows × 2 columns are defined as a third pixel group, the control unit controls the exposure times of the plurality of pixels such that, in each first pixel group in a first one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a first direction are the same and the exposure times of two of the pixels arranged in a second direction are the same, and, in each first pixel group in a second one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in the first direction are the same and the exposure times of two of the pixels arranged in the second direction are the same.
- (16) The imaging device according to (2), in which, where a plurality of the first pixel groups in 2 rows × 2 columns is defined as a second pixel group and the second pixel groups in 2 rows × 2 columns are defined as a third pixel group, the control unit controls the exposure times of the plurality of pixels such that, in each of the first pixel groups in a first one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a first direction are the same and the exposure times of two other pixels arranged in the first direction, different from those two pixels, are the same, and, in each of the first pixel groups in a second one of the second pixel groups included in each of the third pixel groups, the exposure times of two of the pixels arranged in a second direction are the same and the exposure times of two other pixels arranged in the second direction, different from those two pixels, are the same.
- (17) The imaging device according to (2), in which phase difference data is generated for each exposure time from image data obtained by exposure control by the control unit, and an HDR (High Dynamic Range) image is generated from a plurality of the phase difference data with different exposure times and a plurality of image data with different exposure times.
- (18) A signal processing method in an imaging device including a plurality of pixels each including a photoelectric conversion element and arranged in a matrix on a light receiving surface, and a plurality of light receiving lenses, one provided for each set of the plurality of pixels among the plurality of pixels, the method including: controlling exposure times of the plurality of pixels such that, among the plurality of pixels corresponding to each of the light receiving lenses, the exposure times of at least two of the pixels are the same and the exposure times of at least two of the pixels are different from each other; and generating phase difference data for each exposure time from image data obtained by the control of the exposure times, and generating an HDR (High Dynamic Range) image from a plurality of the phase difference data with different exposure times and a plurality of image data with different exposure times.
- In the imaging device and the signal processing method according to the embodiments of the present disclosure, phase difference data is generated for each exposure time from the image data obtained by the exposure control by the control unit, and an HDR image can be generated from the plurality of phase difference data with different exposure times and the plurality of image data with different exposure times, so that image quality degradation such as color loss, false coloring, and double contours can be suppressed. As a result, image quality degradation of the HDR image can be suppressed. Note that the effects of the present technology are not necessarily limited to the effects described here, and may be any of the effects described in the present specification.
Abstract
Description
(1) Controlling exposure times of a plurality of pixels such that, among the plurality of pixels corresponding to each light receiving lens, the exposure times of at least two pixels are the same, and the exposure times of at least two pixels are different from each other.
(2) Generating phase difference data for each exposure time from the image data obtained by the control of the exposure times, and generating an HDR image from the plurality of phase difference data with different exposure times and the plurality of image data with different exposure times.
1. Embodiment (imaging device): FIGS. 1 to 9
2. Modifications (imaging device): FIGS. 10 to 49
3. Application examples
Application example to mobile bodies: FIGS. 50 and 51
Application example to an endoscopic surgery system: FIGS. 52 and 53
[Configuration]
An imaging device 1 according to an embodiment of the present disclosure will be described. FIG. 1 illustrates an example of a schematic configuration of the imaging device 1. The imaging device 1 is, for example, an electronic apparatus such as a digital still camera, a video camera, a smartphone, or a tablet terminal. The imaging device 1 includes an image sensor 10, an arithmetic unit 20, a frame memory 30, a display unit 40, a storage unit 50, an operation unit 60, a power supply unit 70, and an optical system 80. The image sensor 10, the arithmetic unit 20, the frame memory 30, the display unit 40, the storage unit 50, the operation unit 60, and the power supply unit 70 are connected to one another via a bus line L.
Next, effects of the imaging device 1 according to the present embodiment will be described.
Modifications of the imaging device 1 according to the above embodiment will be described below.
In the above embodiment, the pixel array unit 110 was configured so that phase difference data in one direction is obtained from each single-color sensor pixel group P1. However, in the above embodiment, the pixel array unit 110 may be configured so that phase difference data in two directions is obtained from each three-color sensor pixel group P2.
In Modification A above, the pixel array unit 110 was configured so that phase difference data in two directions is obtained from each three-color sensor pixel group P2. However, in Modification A, the pixel array unit 110 may be configured so that phase difference data in three directions is obtained from the three-color sensor pixel groups P2 in 2 rows × 2 columns (hereinafter referred to as a "three-color sensor pixel group P3").
In the above embodiment and its modifications, the pixel array unit 110 was configured so that phase difference data in one direction is obtained from the pixel data Sig of 2 rows × 2 columns corresponding to a single-color sensor pixel group P1. However, in the above embodiment and its modifications, the pixel array unit 110 may be configured so that phase difference data in two directions is obtained from the pixel data Sig of 2 rows × 2 columns corresponding to a single-color sensor pixel group P1.
In the above embodiment and its modifications, the pixel array unit 110 was configured so that one piece of phase difference data in one direction is obtained from the pixel data Sig of 2 rows × 2 columns corresponding to a single-color sensor pixel group P1. However, in the above embodiment and its modifications, the pixel array unit 110 may be configured so that two pieces of phase difference data in one direction are obtained from the pixel data Sig of 2 rows × 2 columns corresponding to a single-color sensor pixel group P1.
In Modification B above, the pixel array unit 110 was configured so that phase difference data in four directions is obtained from the pixel data Sig of 8 rows × 8 columns corresponding to each three-color sensor pixel group P3. However, in the above embodiment, the pixel array unit 110 may be configured so that phase difference data in four directions is obtained from the pixel data Sig of 2 rows × 2 columns corresponding to one single-color sensor pixel group Pa in each three-color sensor pixel group P3.
In Modification E above, the pixel array unit 110 was configured so that phase difference data in four directions is obtained from the pixel data Sig of 2 rows × 2 columns corresponding to one single-color sensor pixel group Pa in each three-color sensor pixel group P3. However, in Modification E, the pixel array unit 110 may be configured so that phase difference data in four directions is obtained from the pixel data Sig of 4 rows × 4 columns corresponding to one three-color sensor pixel group P2 in each three-color sensor pixel group P3.
In Modification F above, the pixel array unit 110 was configured so that phase difference data in four directions is obtained from the pixel data Sig of 4 rows × 4 columns corresponding to one three-color sensor pixel group P2 in each three-color sensor pixel group P3. However, in Modification F, the pixel array unit 110 may be configured so that phase difference data in four directions is obtained from the pixel data Sig of 2 rows × 2 columns corresponding to each single-color sensor pixel group P1.
In Modifications A, B, C, D, and E above, the pixel array unit 110 was configured so that phase difference data is obtained based on single-color image data. However, in the above embodiment, the pixel array unit 110 may be configured so that phase difference data is obtained for each color based on the image data of all colors.
In Modification H above, the pixel array unit 110 was configured so that phase difference data in two directions (the upward-rightward direction and the downward-rightward direction) is obtained for each color. However, in Modification H, the pixel array unit 110 may be configured so that phase difference data in two directions (the left-right direction and the up-down direction) is obtained for each color.
In Modifications H and I above, the pixel array unit 110 was configured so that phase difference data in two directions is obtained for each color. However, in Modifications H and I, the pixel array unit 110 may be configured so that phase difference data in three directions is obtained for each color.
In Modification K above, the pixel array unit 110 was configured so that phase difference data in two directions (the upward-rightward direction and the downward-rightward direction) is obtained for each color based on the image data of all colors. However, in Modification K, the pixel array unit 110 may be configured so that phase difference data in two directions (the left-right direction and the up-down direction) is obtained for each color based on the image data of all colors.
[Application Example 1]
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
FIG. 52 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
Claims (18)
- An imaging device including: a plurality of pixels, each including a photoelectric conversion element, arranged in a matrix on a light-receiving surface; a plurality of light-receiving lenses provided one for each set of the plurality of pixels among the plurality of pixels; and a controller that controls exposure times of the plurality of pixels, wherein the controller controls the exposure times of the plurality of pixels such that at least two of the pixels corresponding to each light-receiving lens have the same exposure time, and at least two of the pixels corresponding to each light-receiving lens have mutually different exposure times.
- The imaging device according to claim 1, further including a plurality of color filters in a Bayer arrangement provided one for each first pixel group, where the plurality of pixels corresponding to each light-receiving lens is taken as a first pixel group.
- The imaging device according to claim 2, wherein the controller controls the exposure times of the plurality of pixels such that two of the pixels corresponding to each light-receiving lens have the same exposure time, and three of the pixels corresponding to each light-receiving lens have different exposure times.
- The imaging device according to claim 3, wherein the controller controls the exposure times of the plurality of pixels such that, among the pixels corresponding to each light-receiving lens, two pixels arranged on the light-receiving surface in a rising diagonal direction, a falling diagonal direction, a left-right direction, or an up-down direction have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group, the controller controls the exposure times of the plurality of pixels such that, in a first one of the first pixel groups included in each second pixel group, two pixels arranged in a first direction have the same exposure time, and in a second one of the first pixel groups included in each second pixel group, two pixels arranged in a second direction have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in two rows by two columns are taken as a third pixel group, the controller controls the exposure times of the plurality of pixels such that, in a first one of the second pixel groups included in each third pixel group, two pixels arranged in a first direction have the same exposure time; in a second one of the second pixel groups included in each third pixel group, two pixels arranged in a second direction have the same exposure time; and further, in a third one of the second pixel groups included in each third pixel group, two pixels arranged in a third direction have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group, the controller controls the exposure times of the plurality of pixels such that, in a first one of the first pixel groups included in each second pixel group, two pixels arranged in a first direction have the same exposure time and two pixels arranged in a second direction have the same exposure time, and further, in a second one of the first pixel groups included in each second pixel group, two pixels arranged in the first direction have the same exposure time and two pixels arranged in the second direction have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group, the controller controls the exposure times of the plurality of pixels such that, in a first one of the first pixel groups included in each second pixel group, two pixels arranged in a first direction have the same exposure time and two other pixels arranged in the first direction, different from those two pixels, have the same exposure time, and further, in a second one of the first pixel groups included in each second pixel group, two pixels arranged in a second direction have the same exposure time and two other pixels arranged in the second direction, different from those two pixels, have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in two rows by two columns are taken as a third pixel group, the controller controls the exposure times of the plurality of pixels such that, in one first pixel group included in each third pixel group, the pixels all have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in two rows by two columns are taken as a third pixel group, the controller controls the exposure times of the plurality of pixels such that, in one second pixel group included in each third pixel group, the pixels all have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in two rows by two columns are taken as a third pixel group, the controller controls the exposure times of the plurality of pixels such that the pixels within each second pixel group included in each third pixel group have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in one row by two columns are taken as a fourth pixel group, the controller controls the exposure times of the plurality of pixels such that, in each first pixel group of a first one of the second pixel groups included in each fourth pixel group, two pixels arranged in a first direction have the same exposure time, and in each first pixel group of a second one of the second pixel groups included in each fourth pixel group, two pixels arranged in a second direction have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in two rows by one column are taken as a fifth pixel group, the controller controls the exposure times of the plurality of pixels such that, in each first pixel group of a first one of the second pixel groups included in each fifth pixel group, two pixels arranged in a first direction have the same exposure time, and in each first pixel group of a second one of the second pixel groups included in each fifth pixel group, two pixels arranged in a second direction have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in two rows by two columns are taken as a third pixel group, the controller controls the exposure times of the plurality of pixels such that, in each first pixel group of a first one of the second pixel groups included in each third pixel group, two pixels arranged in a first direction have the same exposure time; in each first pixel group of a second one of the second pixel groups included in each third pixel group, two pixels arranged in a second direction have the same exposure time; and further, in each first pixel group of a third one of the second pixel groups included in each third pixel group, two pixels arranged in a third direction have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in two rows by two columns are taken as a third pixel group, the controller controls the exposure times of the plurality of pixels such that, in each first pixel group of a first one of the second pixel groups included in each third pixel group, two pixels arranged in a first direction have the same exposure time and two pixels arranged in a second direction have the same exposure time, and in each first pixel group of a second one of the second pixel groups included in each third pixel group, two pixels arranged in the first direction have the same exposure time and two pixels arranged in the second direction have the same exposure time.
- The imaging device according to claim 2, wherein, where a plurality of the first pixel groups arranged in two rows by two columns is taken as a second pixel group and the second pixel groups arranged in two rows by two columns are taken as a third pixel group, the controller controls the exposure times of the plurality of pixels such that, in each first pixel group of a first one of the second pixel groups included in each third pixel group, two pixels arranged in a first direction have the same exposure time and two other pixels arranged in the first direction, different from those two pixels, have the same exposure time, and further, in each first pixel group of a second one of the second pixel groups included in each third pixel group, two pixels arranged in a second direction have the same exposure time and two other pixels arranged in the second direction, different from those two pixels, have the same exposure time.
- The imaging device according to claim 2, wherein phase-difference data are generated for each exposure time from image data obtained through exposure control by the controller, and an HDR (High Dynamic Range) image is generated from a plurality of the phase-difference data having different exposure times and a plurality of image data having different exposure times.
- A signal processing method in an imaging device that includes a plurality of pixels, each including a photoelectric conversion element, arranged in a matrix on a light-receiving surface, and a plurality of light-receiving lenses provided one for each set of the plurality of pixels among the plurality of pixels, the method including: controlling the exposure times of the plurality of pixels such that at least two of the pixels corresponding to each light-receiving lens have the same exposure time, and at least two of the pixels corresponding to each light-receiving lens have mutually different exposure times; and generating phase-difference data for each exposure time from image data obtained through the exposure-time control, and generating an HDR (High Dynamic Range) image from a plurality of the phase-difference data having different exposure times and a plurality of image data having different exposure times.
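As a rough illustration of the processing recited in claims 17 and 18, the sketch below, which assumes the diagonal exposure layout from the sketch above, derives phase-difference data from the same-exposure pair in each pixel group and fuses the short, middle, and long exposures into an HDR radiance map. The saturation-weighted blend, the 10-bit full scale, and all function names are assumptions made for illustration; the claims do not fix any particular fusion formula.

```python
import numpy as np

def per_group(raw: np.ndarray) -> np.ndarray:
    """View an (H, W) mosaic as (H//2, W//2, 2, 2) blocks of 2x2 pixels."""
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).transpose(0, 2, 1, 3)

def phase_data(raw: np.ndarray) -> np.ndarray:
    """Difference of the same-exposure (falling-diagonal) pair per group.

    Both pixels saw the same exposure time, so their difference reflects
    the phase shift under the shared lens, not an exposure mismatch.
    """
    g = per_group(raw).astype(np.float64)
    return g[..., 0, 0] - g[..., 1, 1]

def fuse_hdr(raw: np.ndarray,
             times=(1.0, 4.0, 16.0),
             full_scale=1023.0) -> np.ndarray:
    """Blend short/middle/long exposures into one radiance value per group."""
    g = per_group(raw).astype(np.float64)
    t_short, t_mid, t_long = times
    short = g[..., 0, 1] / t_short                      # code 0 pixel
    mid = 0.5 * (g[..., 0, 0] + g[..., 1, 1]) / t_mid   # diagonal pair
    long_ = g[..., 1, 0] / t_long                       # code 2 pixel
    # Prefer the long exposure (best SNR) where it is not saturated and
    # fall back to the shorter exposures as it approaches full scale.
    w = np.clip(1.0 - g[..., 1, 0] / full_scale, 0.0, 1.0)
    return w * long_ + (1.0 - w) * 0.5 * (short + mid)

rng = np.random.default_rng(0)
raw = rng.integers(0, 1024, size=(8, 8)).astype(np.float64)
print(phase_data(raw).shape, fuse_hdr(raw).shape)  # (4, 4) (4, 4)
```

Generating the phase data separately for each exposure time, as the claims describe, lets focus detection keep working in both the dark and bright regions of a high-contrast scene.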
Priority Applications (7)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2021501600A JP7520804B2 (ja) | 2019-02-19 | 2019-12-11 | Signal processing method and imaging device |
| US17/429,765 US11778333B2 (en) | 2019-02-19 | 2019-12-11 | Signal processing method and imaging apparatus |
| EP19916223.1A EP3930310B1 (en) | 2019-02-19 | 2019-12-11 | Signal processing method and imaging device |
| KR1020217025404A KR102838760B1 (ko) | 2019-02-19 | 2019-12-11 | 신호 처리 방법 및 촬상 장치 |
| CN202211396691.2A CN115767292A (zh) | 2019-02-19 | 2019-12-11 | 信号处理方法和成像装置 |
| CN201980088889.4A CN113316929B (zh) | 2019-02-19 | 2019-12-11 | 信号处理方法和成像装置 |
| US18/453,433 US12167144B2 (en) | 2019-02-19 | 2023-08-22 | Signal processing method and imaging apparatus |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019027479 | 2019-02-19 | ||
| JP2019-027479 | 2019-02-19 |
Related Child Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/429,765 A-371-Of-International US11778333B2 (en) | 2019-02-19 | 2019-12-11 | Signal processing method and imaging apparatus |
| US18/453,433 Continuation US12167144B2 (en) | 2019-02-19 | 2023-08-22 | Signal processing method and imaging apparatus |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020170565A1 true WO2020170565A1 (ja) | 2020-08-27 |
Family
ID=72145051
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/048497 Ceased WO2020170565A1 (ja) | Signal processing method and imaging device | 2019-02-19 | 2019-12-11 |
Country Status (7)
| Country | Link |
|---|---|
| US (2) | US11778333B2 (ja) |
| EP (1) | EP3930310B1 (ja) |
| JP (1) | JP7520804B2 (ja) |
| KR (1) | KR102838760B1 (ja) |
| CN (2) | CN113316929B (ja) |
| TW (1) | TWI840483B (ja) |
| WO (1) | WO2020170565A1 (ja) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2023021774A1 (ja) * | 2021-08-17 | 2023-02-23 | Sony Semiconductor Solutions Corporation | Imaging device, and electronic apparatus provided with imaging device |
| WO2023105916A1 (ja) * | 2021-12-08 | 2023-06-15 | Sony Semiconductor Solutions Corporation | Solid-state imaging element, imaging device, and method for controlling solid-state imaging element |
| EP4270934A4 (en) * | 2021-02-10 | 2024-06-05 | Samsung Electronics Co., Ltd. | ELECTRONIC DEVICE COMPRISING AN IMAGE SENSOR AND ITS OPERATING METHOD |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11653101B2 (en) * | 2019-05-17 | 2023-05-16 | Samsung Electronics Co., Ltd. | Imaging system for generating high dynamic range image |
| CN115861473B (zh) * | 2022-07-19 | 2023-10-24 | Beijing Zhongguancun Kejin Technology Co., Ltd. | Model training method, apparatus, and medium for plotting decibel-detection trend graphs in real time |
| JP7444958B1 (ja) * | 2022-12-12 | 2024-03-06 | Lenovo (Singapore) Pte. Ltd. | Information processing device and control method |
| US20250220315A1 (en) * | 2023-12-27 | 2025-07-03 | Varjo Technologies Oy | High-dynamic range imaging using partial polarisation mask |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2015151794A1 (ja) * | 2014-03-31 | 2015-10-08 | Sony Corporation | Solid-state imaging device, drive control method therefor, image processing method, and electronic apparatus |
| JP2015181213A (ja) * | 2014-03-05 | 2015-10-15 | Sony Corporation | Imaging device |
| WO2017126326A1 (ja) * | 2016-01-20 | 2017-07-27 | Sony Corporation | Solid-state imaging device, method for driving same, and electronic device |
| WO2019026718A1 (en) * | 2017-08-03 | 2019-02-07 | Sony Semiconductor Solutions Corporation | Imaging apparatus and electronic device |
| JP2019027479A (ja) | 2017-07-27 | 2019-02-21 | Hitachi Automotive Systems, Ltd. | Shock absorber and cover member |
Family Cites Families (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20090087644A | 2008-02-13 | 2009-08-18 | Samsung Electronics Co., Ltd. | Pixel circuit array |
| JP5901614B6 (ja) * | 2011-04-08 | 2018-06-27 | Panasonic Intellectual Property Corporation of America | Image processing device and image processing method |
| FR2976121A1 (fr) * | 2011-05-31 | 2012-12-07 | STMicroelectronics SA | Matrix imaging device comprising at least one set of photosites with multiple integration times |
| JP2013066140A (ja) * | 2011-08-31 | 2013-04-11 | Sony Corporation | Imaging device, signal processing method, and program |
| JP2013055499A (ja) * | 2011-09-02 | 2013-03-21 | Sony Corporation | Solid-state imaging element and camera system |
| JP2013055500A (ja) * | 2011-09-02 | 2013-03-21 | Sony Corporation | Solid-state imaging element and camera system |
| US9143708B2 (en) * | 2013-12-11 | 2015-09-22 | Himax Imaging, Inc. | Dual low voltage levels for control of transfer switch device in pixel array |
| US9711553B2 (en) * | 2014-04-28 | 2017-07-18 | Samsung Electronics Co., Ltd. | Image sensor including a pixel having photoelectric conversion elements and image processing device having the image sensor |
| US9491442B2 (en) * | 2014-04-28 | 2016-11-08 | Samsung Electronics Co., Ltd. | Image processing device and mobile computing device having the same |
| US9888198B2 (en) * | 2014-06-03 | 2018-02-06 | Semiconductor Components Industries, Llc | Imaging systems having image sensor pixel arrays with sub-pixel resolution capabilities |
| US9749556B2 (en) * | 2015-03-24 | 2017-08-29 | Semiconductor Components Industries, Llc | Imaging systems having image sensor pixel arrays with phase detection capabilities |
| KR102356706B1 (ko) * | 2015-07-07 | 2022-01-27 | Samsung Electronics Co., Ltd. | Image sensor having a wide dynamic range, pixel circuit of the image sensor, and operating method of the image sensor |
| KR102796731B1 (ko) * | 2016-02-19 | 2025-04-16 | Samsung Electronics Co., Ltd. | Electronic device and method of operating the same |
| JP6762766B2 (ja) * | 2016-06-01 | 2020-09-30 | Canon Inc. | Imaging element, imaging apparatus, and imaging signal processing method |
| JP2018019296A (ja) * | 2016-07-28 | 2018-02-01 | Canon Inc. | Imaging apparatus and control method therefor |
| US10574872B2 (en) * | 2016-12-01 | 2020-02-25 | Semiconductor Components Industries, Llc | Methods and apparatus for single-chip multispectral object detection |
| KR102354991B1 (ko) * | 2017-05-24 | 2022-01-24 | Samsung Electronics Co., Ltd. | Pixel circuit and image sensor including the same |
| JP7171199B2 (ja) | 2017-08-03 | 2022-11-15 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic apparatus |
| JP6944846B2 (ja) * | 2017-10-04 | 2021-10-06 | Olympus Corporation | Imaging device and method of controlling imaging device |
| CN108462841A (zh) * | 2018-03-21 | 2018-08-28 | Shanghai Yexin Electronic Technology Co., Ltd. | Pixel array and image sensor |
| CN108632537B (zh) * | 2018-05-04 | 2020-08-21 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Control method and apparatus, imaging device, computer device, and readable storage medium |
| CN108683863B (zh) * | 2018-08-13 | 2020-01-10 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Imaging control method and apparatus, electronic device, and readable storage medium |
- 2019
- 2019-12-11 CN CN201980088889.4A patent/CN113316929B/zh active Active
- 2019-12-11 KR KR1020217025404A patent/KR102838760B1/ko active Active
- 2019-12-11 EP EP19916223.1A patent/EP3930310B1/en active Active
- 2019-12-11 CN CN202211396691.2A patent/CN115767292A/zh active Pending
- 2019-12-11 US US17/429,765 patent/US11778333B2/en active Active
- 2019-12-11 WO PCT/JP2019/048497 patent/WO2020170565A1/ja not_active Ceased
- 2019-12-11 JP JP2021501600A patent/JP7520804B2/ja active Active
- 2019-12-26 TW TW108147776A patent/TWI840483B/zh active
- 2023
- 2023-08-22 US US18/453,433 patent/US12167144B2/en active Active
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015181213A (ja) * | 2014-03-05 | 2015-10-15 | Sony Corporation | Imaging device |
| WO2015151794A1 (ja) * | 2014-03-31 | 2015-10-08 | Sony Corporation | Solid-state imaging device, drive control method therefor, image processing method, and electronic apparatus |
| JP2015201834A (ja) | 2014-03-31 | 2015-11-12 | Sony Corporation | Solid-state imaging device, drive control method therefor, image processing method, and electronic apparatus |
| WO2017126326A1 (ja) * | 2016-01-20 | 2017-07-27 | Sony Corporation | Solid-state imaging device, method for driving same, and electronic device |
| JP2019027479A (ja) | 2017-07-27 | 2019-02-21 | Hitachi Automotive Systems, Ltd. | Shock absorber and cover member |
| WO2019026718A1 (en) * | 2017-08-03 | 2019-02-07 | Sony Semiconductor Solutions Corporation | IMAGING APPARATUS AND ELECTRONIC DEVICE |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP4270934A4 (en) * | 2021-02-10 | 2024-06-05 | Samsung Electronics Co., Ltd. | ELECTRONIC DEVICE COMPRISING AN IMAGE SENSOR AND ITS OPERATING METHOD |
| US12464234B2 (en) | 2021-02-10 | 2025-11-04 | Samsung Electronics Co., Ltd. | Electronic device including image sensor and operating method thereof |
| WO2023021774A1 (ja) * | 2021-08-17 | 2023-02-23 | Sony Semiconductor Solutions Corporation | Imaging device, and electronic apparatus provided with imaging device |
| WO2023105916A1 (ja) * | 2021-12-08 | 2023-06-15 | Sony Semiconductor Solutions Corporation | Solid-state imaging element, imaging device, and method for controlling solid-state imaging element |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3930310B1 (en) | 2025-09-17 |
| JP7520804B2 (ja) | 2024-07-23 |
| US20230421913A1 (en) | 2023-12-28 |
| EP3930310A1 (en) | 2021-12-29 |
| KR20210128395A (ko) | 2021-10-26 |
| US12167144B2 (en) | 2024-12-10 |
| JPWO2020170565A1 (ja) | 2021-12-16 |
| US11778333B2 (en) | 2023-10-03 |
| KR102838760B1 (ko) | 2025-07-24 |
| CN113316929B (zh) | 2024-08-16 |
| CN115767292A (zh) | 2023-03-07 |
| EP3930310A4 (en) | 2022-04-13 |
| CN113316929A (zh) | 2021-08-27 |
| TWI840483B (zh) | 2024-05-01 |
| TW202105994A (zh) | 2021-02-01 |
| US20220132014A1 (en) | 2022-04-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7199371B2 (ja) | Solid-state imaging element and electronic device | |
| JP7520804B2 (ja) | Signal processing method and imaging device | |
| JP7600129B2 (ja) | Imaging device | |
| JP7284171B2 (ja) | Solid-state imaging device | |
| JP7341141B2 (ja) | Imaging device and electronic apparatus | |
| US12342640B2 (en) | Solid-state imaging element and imaging apparatus | |
| KR102878752B1 (ko) | Solid-state imaging device, driving method thereof, and electronic apparatus | |
| JP2019192802A (ja) | Imaging element and method of manufacturing imaging element | |
| WO2020179302A1 (ja) | Imaging device | |
| US20230005993A1 (en) | Solid-state imaging element | |
| WO2019171947A1 (ja) | Imaging element and electronic apparatus | |
| JP2019022020A (ja) | Solid-state imaging element, method for driving solid-state imaging element, and electronic apparatus | |
| CN116325782A (zh) | Imaging device | |
| KR20250166200A (ko) | Photodetection device and electronic apparatus | |
| JP2023069798A (ja) | Imaging device and electronic apparatus | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19916223; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2021501600; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2019916223; Country of ref document: EP; Effective date: 20210920 |
| | WWG | Wipo information: grant in national office | Ref document number: 2019916223; Country of ref document: EP |