Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements throughout or elements having like or similar functionality. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first", "second" and the like in the description and in the claims are used for descriptive purposes only and are not to be construed as indicating or implying relative importance; a feature defined with "first" or "second" may explicitly or implicitly include one or more such features. In the description of the present application, unless otherwise indicated, "a plurality" means two or more.
In the description of the present application, unless explicitly stated or limited otherwise, the terms "connected" and "coupled" should be interpreted broadly, for example, as a fixed connection, a removable connection, or an integral connection; as a mechanical connection or an electrical connection; or as a direct connection, an indirect connection via an intermediary, or communication between two elements. The specific meanings of the above terms in the present application can be understood in specific cases by those of ordinary skill in the art.
The pixel circuit, the image sensor, the camera module, and the electronic device according to the embodiments of the present application are described in detail below with reference to fig. 1 to 10.
As shown in fig. 1, an embodiment of the present application provides a pixel circuit 100. The pixel circuit 100 includes at least one pixel unit 102, a first exposure control circuit 104, a second exposure control circuit 106, and a region processing circuit 108.
The first exposure control circuit 104 is connected to each pixel unit 102.
Further, the first exposure control circuit 104 is a main exposure control system, and the first exposure control circuit 104 is used for controlling the light sensing of the conventional pixels. Specifically, the first exposure control circuit 104 is configured to control any one of the pixel units 102 to read out the photoelectric signal with the first exposure time length as a signal reading interval.
Further, a second exposure control circuit 106 is connected to each pixel unit 102.
Further, the second exposure control circuit 106 is a slave exposure control system, and the second exposure control circuit 106 is used for controlling the light sensing of pixels in the user's ROI (Region of Interest) so as to meet the user's requirement on the video frame rate. Specifically, the second exposure control circuit 106 is configured to control any one of the pixel units 102 to read out the photoelectric signal with the second exposure time length as a signal reading interval.
Here, the frame rate, i.e., FPS (Frames Per Second), is one of the important indicators of video quality, representing the number of image frames displayed per second. Each frame corresponds to a picture, and the frame rate is the frequency of picture updates per second: if 5 pictures are played in one second, the frame rate is 5fps. The frame rate directly affects the fluency of the video; the higher the frame rate, the smoother the picture, while at a low frame rate the picture can appear jumpy and incoherent. However, it should be noted that although a higher frame rate can provide smoother pictures, this is not equivalent to higher video quality; the selection of the frame rate depends on the photographed content and scene, there is no unified standard, and the suitable frame rate varies from case to case.
Further, the first exposure time length is longer than the second exposure time length. Thus, the frame rate of the image data read out with the second exposure time length as the signal reading interval is higher than the frame rate of the image data read out with the first exposure time length as the signal reading interval, and the image sensor corresponding to the pixel circuit 100 can output image data at high and low frame rates at the same time.
Further, the area processing circuit 108 is connected to each pixel unit 102 and the second exposure control circuit 106.
Further, the area processing circuit 108 is configured to send an enable signal to activate the second exposure control circuit 106, so that the second exposure control circuit 106 controls the pixel units 102 in the user ROI to read out the photoelectric signal with the second exposure time length as a signal reading interval. Thus, during video recording, when the user ROI needs to operate at a high frame rate, the first exposure control circuit 104 controls the pixel units 102 outside the user ROI to operate at a normal frame rate, and the second exposure control circuit 106 controls the pixel units 102 inside the user ROI to operate at a high frame rate, so as to realize a slow motion function inside the user ROI.
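The partition rule described above can be sketched as follows. This is an illustrative model only, not the circuit itself; the function name, coordinate representation, and the example ROI are assumptions introduced for clarity, using the 33ms/11ms values given later in this description.

```python
def exposure_for(pixel_xy, roi_coords, t_first_ms=33, t_second_ms=11):
    """Return the signal reading interval for a pixel unit: the short
    (second) exposure time inside the user ROI, the long (first) outside."""
    return t_second_ms if pixel_xy in roi_coords else t_first_ms

roi = {(5, 5), (5, 6)}                   # hypothetical ROI coordinate set
assert exposure_for((5, 5), roi) == 11   # inside ROI: high frame rate
assert exposure_for((0, 0), roi) == 33   # outside ROI: normal frame rate
```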
In practical applications, those skilled in the art may set the specific values of the first exposure time length and the second exposure time length according to the actual situation; for example, the first exposure time length is 33ms with a corresponding frame rate of 30fps, and the second exposure time length is 11ms with a corresponding frame rate of 90fps, which is not limited herein.
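The correspondence between exposure time length and frame rate in the example above follows from using the exposure time length as the signal reading interval; a minimal sketch (the function name is an assumption):

```python
def frame_rate_fps(exposure_ms):
    """Frame rate implied by using the exposure time length as the
    signal reading interval: 1000 ms per second / interval in ms."""
    return 1000.0 / exposure_ms

assert int(frame_rate_fps(33)) == 30   # 33 ms interval -> roughly 30 fps
assert int(frame_rate_fps(11)) == 90   # 11 ms interval -> roughly 90 fps
```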
The pixel circuit 100 according to an embodiment of the present application includes at least one pixel unit 102, a first exposure control circuit 104, a second exposure control circuit 106, and a region processing circuit 108. The first exposure control circuit 104 is connected to each pixel unit 102, the second exposure control circuit 106 is connected to each pixel unit 102, and the area processing circuit 108 is connected to each pixel unit 102 and the second exposure control circuit 106. Further, the first exposure control circuit 104 is configured to control any one of the pixel units 102 to read out the photoelectric signal with a first exposure time length as a signal reading interval, the area processing circuit 108 is configured to enable the second exposure control circuit 106, the second exposure control circuit 106 is configured to control any one of the pixel units 102 to read out the photoelectric signal with a second exposure time length as a signal reading interval, and the first exposure time length is longer than the second exposure time length. In the pixel circuit 100, two exposure control circuits are arranged to control the pixel units 102 to read out photoelectric signals according to different exposure time lengths, realizing partitioned control of the exposure time length, and hence the image frame rate, of any pixel unit 102. This makes it possible to control part of the pixel units 102 for normal light sensing while controlling the other part of the pixel units 102 for high-speed operation, so that the image sensor can simultaneously output image data at high and low frame rates.
Therefore, high-frame-rate video can be realized without adjusting the post-processing algorithm, and without adversely affecting video image quality, power consumption, or image processing capability, so that the requirements on video image quality, power consumption, and image processing capability are met while high-frame-rate video is realized.
Optionally, according to some embodiments of the application, as shown in fig. 1, the pixel circuit 100 further comprises an image analysis circuit 110.
Wherein the image analysis circuit 110 is connected to the region processing circuit 108.
Further, the image analysis circuit 110 is configured to determine at least one pixel unit 102 that reads out the photoelectric signal with the second exposure time length as a signal reading interval. Specifically, the image analysis circuit 110 is configured to determine a user ROI, i.e., a user region of interest, requiring high frame rate operation and a pixel coordinate set thereof, the pixel coordinate set including position information of all pixel units 102 in the user region of interest, and send the pixel coordinate set to the region processing circuit 108. On the basis of this, the area processing circuit 108 is configured to send an enable signal to the second exposure control circuit 106 according to the position information of all the pixel units 102 in the region of interest, so that the second exposure control circuit 106 controls all the pixel units 102 in the region of interest to read out the photoelectric signal with the second exposure time length as a signal reading interval.
Wherein the size of the user ROI is determined according to the output result of the image analysis circuit 110.
In an actual application process, the image analysis circuit 110 may perform motion detection according to the image data output previously, automatically analyze the user ROI with motion signs in the shooting scene, automatically calculate a pixel coordinate set of the user ROI, generate a control signal according to the pixel coordinate set, and transmit the control signal to the region processing circuit 108, so that the region processing circuit 108 drives the second exposure control circuit 106 to control the pixel units 102 in the user ROI to perform exposure.
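The motion detection described above might be sketched as a simple frame-differencing step that yields the pixel coordinate set of the user ROI. This is a hypothetical illustration, not the actual analysis performed by the image analysis circuit 110; the function, the threshold value, and the toy frames are assumptions.

```python
def detect_roi(prev_frame, curr_frame, threshold=16):
    """Collect coordinates of pixels whose value changed by more than
    `threshold` between two frames, as a stand-in for the user-ROI
    pixel coordinate set produced by motion detection."""
    coords = set()
    for y, (row_a, row_b) in enumerate(zip(prev_frame, curr_frame)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                coords.add((x, y))
    return coords

prev = [[0, 0, 0], [0, 0, 0]]
curr = [[0, 0, 0], [0, 200, 0]]      # one pixel with a motion sign
assert detect_roi(prev, curr) == {(1, 1)}
```

The resulting coordinate set corresponds to the position information that the image analysis circuit 110 transmits to the region processing circuit 108.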
Further, during the video recording process, the user may manually frame the region of interest, and the image analysis circuit 110 further calculates a pixel coordinate set of the user ROI according to the framed user ROI, and generates a corresponding control signal to transmit to the region processing circuit 108.
The pixel circuit 100 according to the embodiment of the application further comprises an image analysis circuit 110 connected with the region processing circuit 108, wherein the image analysis circuit 110 is used for determining a region of interest of a user and sending the position information of all the pixel units 102 in the region of interest to the region processing circuit 108, and the region processing circuit 108 is used for sending an enabling signal to the second exposure control circuit 106 according to the position information of all the pixel units 102 in the region of interest, so that the second exposure control circuit 106 controls all the pixel units 102 in the region of interest to read out photoelectric signals with a second exposure time length as a signal reading interval. In this way, the image analysis circuit 110 determines the region of interest of the user, so that the high frame rate operation on the region of interest of the user can be conveniently and accurately realized, and the video quality in a dynamic scene is improved.
Optionally, according to some embodiments of the application, as shown in fig. 1, each pixel unit 102 includes at least one pixel path 112, a first transistor 114, and a capacitor device 128.
Wherein each pixel path 112 is connected to both the first exposure control circuit 104 and the second exposure control circuit 106.
Further, the first exposure control circuit 104 is configured to control any one of the pixel paths 112 to read out the photoelectric signal with the first exposure time length as a signal reading interval, and the second exposure control circuit 106 is configured to control any one of the pixel paths 112 to read out the photoelectric signal with the second exposure time length as a signal reading interval.
Further, a first terminal of the first transistor 114 is connected to each pixel path 112, and a control terminal of the first transistor 114 is connected to both the first exposure control circuit 104 and the second exposure control circuit 106.
Further, the first transistor 114 is used to control whether electrons in the pixel path 112 flow into the capacitive device 128.
Further, a first terminal of the capacitor 128 is connected to the second terminal of the first transistor 114, a second terminal of the capacitor 128 is grounded, and a third terminal of the capacitor 128 is connected to the region processing circuit 108.
Further, the capacitor device 128 may be a floating diffusion (FD) region, which is equivalent to a capacitor, and the capacitor device 128 is responsible for carrying the charges transferred from the pixel path 112.
In the process of image capturing, different pixel paths 112 are jointly controlled by the first exposure control circuit 104 and the second exposure control circuit 106, so that the pixel paths 112 in the ROI of a user in a shooting scene are controlled to read out photoelectric signals at a first frame rate according to the high-frame-rate image capturing requirement of the user, and the pixel paths 112 in other areas in the shooting scene are controlled to read out photoelectric signals at a second frame rate. Wherein the first frame rate is greater than the second frame rate, thereby enabling a slow motion function within the user ROI.
According to the pixel circuit 100 of the embodiment of the application, each pixel unit 102 includes at least one pixel path 112, a first transistor 114, and a capacitor device 128. Wherein a first terminal of the first transistor 114 is connected to each pixel path 112, a control terminal of the first transistor 114 is connected to both the first exposure control circuit 104 and the second exposure control circuit 106, a first terminal of the capacitor device 128 is connected to a second terminal of the first transistor 114, a second terminal of the capacitor device 128 is grounded, and a third terminal of the capacitor device 128 is connected to the area processing circuit 108. Further, the first exposure control circuit 104 is configured to control any one of the pixel paths 112 to read out the photoelectric signal with the first exposure time length as a signal reading interval, and the second exposure control circuit 106 is configured to control any one of the pixel paths 112 to read out the photoelectric signal with the second exposure time length as a signal reading interval. In this way, the first exposure control circuit 104 and the second exposure control circuit 106 jointly control the exposure time of the pixel paths 112 in each pixel unit 102, so that each pixel path 112 can read out the photoelectric signal with a long or short exposure. Therefore, the pixel circuit 100 can output the image data at the position corresponding to each pixel path 112 at a high or low frame rate, realizing partitioned control of the image data output frame rate and meeting the high frame rate video recording requirement of the user region of interest.
Optionally, according to some embodiments of the present application, as shown in fig. 1, each pixel unit 102 further includes a second transistor 116, a third transistor 120, and a fourth transistor 124.
Wherein a first terminal of the second transistor 116 is connected to the first power supply 118, a second terminal of the second transistor 116 is connected to both the second terminal of the first transistor 114 and the first terminal of the capacitor 128, and a control terminal of the second transistor 116 is connected to both the first exposure control circuit 104 and the second exposure control circuit 106.
Further, the second transistor 116 acts as a reset transistor, and is responsible for clearing residual photo-generated electrons in the pixel path 112 and the capacitor device 128.
Further, a first terminal of the third transistor 120 is connected to the second power supply 122, a control terminal of the third transistor 120 is connected to both the second terminal of the first transistor 114 and the first terminal of the capacitor 128, and a second terminal of the third transistor 120 is connected to the first terminal of the fourth transistor 124.
Further, the third transistor 120 acts as a source follower responsible for transferring charge in the capacitive device 128 to the second terminal of the fourth transistor 124.
Further, a second terminal of the fourth transistor 124 is connected to a first terminal of the third power supply 126 and is used for outputting the photoelectric signal outwards, a control terminal of the fourth transistor 124 is connected to both the first exposure control circuit 104 and the second exposure control circuit 106, and a second terminal of the third power supply 126 is grounded.
Further, the fourth transistor 124 is used as a row selector and is responsible for controlling the output of the photoelectric signal of the pixel path 112, and when the fourth transistor 124 is turned on, the charges in the capacitor 128 are transferred out through the third transistor 120, so as to realize the readout of the photoelectric signal.
In the image capturing process, the first exposure control circuit 104 and the second exposure control circuit 106 jointly control the on-off of the pixel path 112, the first transistor 114, the second transistor 116, the third transistor 120 and the fourth transistor 124, so that the pixel path 112 in the ROI of the user in the shooting scene is controlled to read out the photoelectric signal at the first frame rate according to the high frame rate video recording requirement of the user, and the pixel path 112 in other areas in the shooting scene is controlled to read out the photoelectric signal at the second frame rate. Wherein the first frame rate is greater than the second frame rate, thereby enabling a slow motion function within the user ROI.
According to the pixel circuit 100 of the embodiment of the application, each pixel unit 102 further includes a second transistor 116, a third transistor 120, and a fourth transistor 124. The first end of the second transistor 116 is connected to the first power supply 118, the second end of the second transistor 116 is connected to the second end of the first transistor 114 and the first end of the capacitor 128, the control end of the second transistor 116 is connected to the first exposure control circuit 104 and the second exposure control circuit 106, the first end of the third transistor 120 is connected to the second power supply 122, the control end of the third transistor 120 is connected to the second end of the first transistor 114 and the first end of the capacitor 128, the first end of the fourth transistor 124 is connected to the second end of the third transistor 120, the second end of the fourth transistor 124 is connected to the first end of the third power supply 126, the control end of the fourth transistor 124 is connected to the first exposure control circuit 104 and the second exposure control circuit 106, the second end of the fourth transistor 124 is used for outputting a photoelectric signal, and the second end of the third power supply 126 is grounded. 
In this way, the first exposure control circuit 104 and the second exposure control circuit 106 jointly control the on-off state of the pixel path 112, the first transistor 114, the second transistor 116, the third transistor 120 and the fourth transistor 124 in each pixel unit 102, so that the exposure duration of each pixel unit 102 can be controlled, each pixel unit 102 can read out the photoelectric signal in a long exposure or a short exposure, so that the pixel circuit 100 can output the image data of the corresponding position of each pixel unit 102 in a high frame rate or a low frame rate, the partition control of the output frame rate of the image data is realized, and the high frame rate video recording requirement of the user region of interest can be realized.
In accordance with some embodiments of the application, optionally, where the pixel circuit 100 is used to control a single pixel, the pixel circuit 100 includes 1 pixel unit 102, and the pixel unit 102 includes 1 pixel path 112.
The pixel path 112 is connected to the first exposure control circuit 104, the second exposure control circuit 106, and the first terminal of the first transistor 114.
Further, one pixel circuit 100 is used to control the exposure and signal readout of a single pixel, such as an R pixel, a Gr pixel, a Gb pixel, or a B pixel. On this basis, as shown in fig. 3, for an image sensor including a plurality of single pixels, the exposure of each single pixel is controlled by its corresponding pixel circuit 100, so that each single pixel can read out the photoelectric signal according to the long or short exposure time length and thus output image data at a low or high frame rate, thereby realizing the slow motion function for the user region of interest during shooting.
According to the pixel circuit 100 of the embodiment of the application, in the case where the pixel circuit 100 is used to control a single pixel, the pixel circuit 100 includes 1 pixel unit 102, the pixel unit 102 includes 1 pixel path 112, and the pixel path 112 is connected to the first exposure control circuit 104, the second exposure control circuit 106, and the first terminal of the first transistor 114. In this way, the first exposure control circuit 104 and the second exposure control circuit 106 jointly control the exposure time of the pixel path 112, giving each single pixel two photoelectric signal readout modes, so that each single pixel can read out the photoelectric signal according to the long or short exposure time length and output image data at a low or high frame rate. This realizes partitioned control of the image data output frame rate with the single pixel as the minimum unit, and facilitates meeting the high frame rate video recording requirement of the user region of interest through a plurality of single pixels.
In accordance with some embodiments of the application, optionally, where the pixel circuit 100 is used to control M×M pixels, the pixel circuit 100 includes M pixel units 102.
Wherein each pixel unit 102 includes M pixel paths 112, i.e., the pixel circuit 100 includes M×M pixel paths 112.
Further, each pixel path 112 is connected to the first exposure control circuit 104, the second exposure control circuit 106, and the first terminal of the first transistor 114.
Further, in the case where the pixel circuit 100 is used to control an all-in-one pixel obtained by merging M×M pixels, the all-in-one pixel is treated as one pixel, and one pixel circuit 100 is used to control the exposure and signal readout of the M×M pixels in one all-in-one pixel. Here, each pixel in the all-in-one pixel corresponds to one pixel path 112, and the exposure time of each pixel path 112 is jointly controlled by the first exposure control circuit 104 and the second exposure control circuit 106, so that each pixel in the all-in-one pixel reads out the photoelectric signal according to the long or short exposure time length and outputs image data at a low or high frame rate, thereby facilitating the realization of the high frame rate video recording requirement of the user region of interest through the all-in-one pixel.
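The grouping of M×M pixels under one pixel circuit can be illustrated with a simple index mapping. This is an assumed layout for illustration only (row-major blocks); the actual wiring of the circuit is as described above.

```python
def circuit_index(x, y, m):
    """Hypothetical mapping: the pixel at (x, y) belongs to the
    all-in-one pixel (and hence the pixel circuit) at block index
    (x // m, y // m) in an M x M binning layout."""
    return (x // m, y // m)

# Four-in-one example (M = 2): pixels (0,0), (1,0), (0,1), (1,1)
# all share one pixel circuit 100.
assert {circuit_index(x, y, 2) for x in (0, 1) for y in (0, 1)} == {(0, 0)}
```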
Wherein M is a positive integer greater than 1.
In practical applications, M may be 2, that is, one pixel circuit 100 may be used to control the exposure and signal readout of one four-in-one pixel; in this case, as shown in fig. 4, one pixel circuit 100 controls the exposure and signal readout of 4 pixels, such as 4 R pixels, 4 Gr pixels, 4 Gb pixels, or 4 B pixels, in one four-in-one pixel.
Further, M may be 3, that is, one pixel circuit 100 may be used to control the exposure and signal readout of one nine-in-one pixel; in this case, as shown in fig. 5, one pixel circuit 100 controls the exposure and signal readout of 9 pixels, such as 9 R pixels, 9 Gr pixels, 9 Gb pixels, or 9 B pixels, in one nine-in-one pixel.
Further, M may be 4, that is, one pixel circuit 100 may be used to control one sixteen-in-one pixel; in this case, as shown in fig. 6, one pixel circuit 100 controls the exposure and signal readout of 16 pixels, such as 16 R pixels, 16 Gr pixels, 16 Gb pixels, or 16 B pixels, in one sixteen-in-one pixel.
In the practical application process, the specific value of M can be selected by those skilled in the art according to the practical situation, and is not particularly limited herein.
According to the pixel circuit 100 of the embodiment of the application, in the case that the pixel circuit 100 is used for controlling M×M pixels, the pixel circuit 100 includes M pixel units 102, each pixel unit 102 includes M pixel paths 112, and each pixel path 112 is connected to the first exposure control circuit 104, the second exposure control circuit 106, and the first terminal of the first transistor 114, where M is a positive integer greater than 1. In this way, the first exposure control circuit 104 and the second exposure control circuit 106 jointly control the exposure time of each pixel path 112, so that each pixel in the all-in-one pixel can be controlled to read out the photoelectric signal according to the long or short exposure time length and output image data at a low or high frame rate. This realizes independent control of the image data output frame rate of each pixel in the all-in-one pixel, and facilitates meeting the high frame rate video recording requirement of the user region of interest through the all-in-one pixel.
Optionally, according to some embodiments of the present application, as shown in fig. 1, each pixel path 112 includes a fifth transistor 130 and a photodiode 132.
The first end of the fifth transistor 130 is connected to the first end of the first transistor 114, the control end of the fifth transistor 130 is connected to both the first exposure control circuit 104 and the second exposure control circuit 106, and the second end of the fifth transistor 130 is connected to the cathode of the photodiode 132.
Further, the anode of the photodiode 132 is grounded.
Further, the photodiode 132 is used for light sensing, the photodiode 132 is controlled by the fifth transistor 130, and the fifth transistor 130 is responsible for switching the photodiode 132 to control the light sensing time of the photodiode 132.
In practical applications, in the case where the pixel unit 102 includes one pixel path 112, the pixel unit 102 has a PPD (Pinned Photodiode) structure, and a schematic structural diagram of the pixel unit 102 may be specifically shown in fig. 2. As shown in fig. 2, the pixel unit 102 includes a photodiode 132, a fifth transistor 130, a capacitor device 128, a second transistor 116, a third transistor 120, and a fourth transistor 124. The anode of the photodiode 132 is grounded, the cathode of the photodiode 132 is connected to the first end of the fifth transistor 130, the second end of the fifth transistor 130 is connected to the first end of the capacitor device 128, the second end of the second transistor 116 and the control end of the third transistor 120, the first end of the second transistor 116 is connected to the first power supply 118, the second end of the capacitor device 128 is grounded, the first end of the third transistor 120 is connected to the second power supply 122, the second end of the third transistor 120 is connected to the first end of the fourth transistor 124, the second end of the fourth transistor 124 is connected to the first end of the third power supply 126, and the second end of the third power supply 126 is grounded.
The second transistor 116 serves as a reset transistor, the fifth transistor 130 serves as a transfer gate, the fourth transistor 124 serves as a row selector, and the third transistor 120 serves as a signal amplifier. It will be appreciated that the PPD structure allows the introduction of a CDS (Correlated Double Sampling) circuit, so that the kTC noise introduced by reset (i.e., reset noise), as well as the 1/f noise (also known as flicker noise or low-frequency noise) and the offset noise introduced by MOS (Metal-Oxide-Semiconductor) transistors, can be eliminated. In different operation states, the operation modes of the components in the pixel unit 102 are as follows:
(1) Exposure: the second transistor 116 and the fifth transistor 130 are simultaneously turned on to empty the photodiode 132, and then both are turned off to start the exposure. The "electron-hole pairs" generated by the incident light are separated by the electric field of the photodiode 132, so that electrons move to the N region and holes move to the P region.
(2) Reset: at the end of the exposure, the second transistor 116 is activated, resetting the capacitor device 128 to a high level.
(3) Reset level readout: after the reset is completed, the reset level A of the capacitor device 128 is read out; it includes the offset noise and 1/f noise introduced by the MOS transistors and the kTC noise introduced by the reset. The read-out reset level signal is stored in a first capacitor.
(4) Charge transfer: the fifth transistor 130 is activated, transferring the charge from the photodiode 132 completely to the capacitor device 128 for readout.
(5) Signal level readout: the voltage signal B of the capacitor device 128 is read out to a second capacitor; it includes the signal generated by photoelectric conversion, together with the offset noise and 1/f noise introduced by the amplifier and the kTC noise introduced by the reset.
(6) Signal output: the signals stored in the two capacitors are subtracted, namely B-A. With the CDS circuit, the main noise in the pixel can thus be eliminated, and the resulting signal is a pure photoelectric signal. The photoelectric signal is output as Vout after being amplified by the third transistor 120, and is then sampled by an ADC (Analog-to-Digital Converter) to realize digital signal output.
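The noise-cancelling effect of the B-A subtraction can be sketched numerically. The numbers below are arbitrary illustrative values, not measured levels; the point is that the reset (kTC) noise and offset appear identically in both samples and therefore cancel.

```python
def cds(reset_level_a, signal_level_b):
    """Correlated double sampling: subtract the reset level A from the
    signal level B, cancelling noise terms common to both samples."""
    return signal_level_b - reset_level_a

ktc_noise, offset = 3.0, 1.5        # correlated noise terms (arbitrary units)
photo_signal = 42.0                  # charge transferred from the photodiode
A = 100.0 + ktc_noise + offset       # reset level stored in the first capacitor
B = A + photo_signal                 # signal level stored in the second capacitor
assert cds(A, B) == photo_signal     # B - A leaves the pure photoelectric signal
```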
On this basis, the first exposure control circuit 104 and the second exposure control circuit 106 jointly control the on/off states of the first transistor 114, the second transistor 116, the third transistor 120, the fourth transistor 124, and the fifth transistor 130. That is, according to the user's high frame rate video recording requirement, the pixel paths 112 in the user ROI in the shooting scene are controlled to read out photoelectric signals at a first frame rate, and the pixel paths 112 in other areas of the shooting scene are controlled to read out photoelectric signals at a second frame rate, wherein the first frame rate is greater than the second frame rate, thereby realizing the slow motion function within the user ROI.
Taking the example of the pixel circuit 100 controlling a four-in-one pixel, that is, taking the example that one pixel circuit 100 includes 4 photodiodes 132, as shown in fig. 1, the four photodiodes 132 in the pixel circuit 100 are respectively named PD1, PD2, PD3, and PD4, the four fifth transistors 130 are respectively named TG1, TG2, TG3, and TG4, the two first transistors 114 are respectively named TG5 and TG6, the two second transistors 116 are respectively named RST1 and RST2, the two capacitor devices 128 are respectively named FD1 and FD2, the two third transistors 120 are respectively named SF1 and SF2, and the two fourth transistors 124 are respectively named Rsel1 and Rsel2. If the first frame rate is 90fps, i.e., the second exposure time length is 11ms, and the second frame rate is 30fps, i.e., the first exposure time length is 33ms, the image analysis circuit 110 determines that the user ROI corresponds to PD2, and the other image areas correspond to PD1, PD3, and PD4. At this time, through the joint control of the first exposure control circuit 104 and the second exposure control circuit 106, PD2 is controlled to sense light and read out the photoelectric signal at the first frame rate of 90fps, while PD1, PD3, and PD4 are controlled to sense light and read out the photoelectric signal at the second frame rate of 30fps.
Specifically, the first exposure control circuit 104 is connected to lines K, L, N, O, P, Q, R, S, T, and U. At 0ms, the first exposure control circuit 104 pulls lines K, L, O, and P high to turn on TG5, RST1, TG2, and TG1, thereby clearing residual electrons in PD1, PD2, and FD1; at the same time, it pulls lines Q, R, T, and U high to turn on TG6, RST2, TG4, and TG3, thereby clearing residual electrons in PD3, PD4, and FD2. After the clearing, all transistors are turned off, and PD1, PD2, PD3, and PD4 enter a controlled photosensitive state.
Further, the second exposure control circuit 106 is connected to lines A, B, C, D, E, F, G, H, I, and J. At 11ms, the second exposure control circuit 106 pulls lines D, A, and C high to turn on TG2, TG5, and Rsel1, so that the photoelectric signal of PD2 is read out for the first time at 11ms. Immediately after this first readout ends, the second exposure control circuit 106 pulls lines D, A, and B high to turn on TG2, TG5, and RST1, thereby clearing residual electrons in PD2 and FD1; PD2 is then in a controlled exposure state. Further, at 22ms, the second exposure control circuit 106 pulls lines D, A, and C high again to turn on TG2, TG5, and Rsel1, so that the photoelectric signal of PD2 is read out a second time at 22ms. The above operation is repeated, and at 33ms the second exposure control circuit 106 controls PD2 to read out its photoelectric signal for the third time, while the first exposure control circuit 104 controls PD1, PD3, and PD4 to read out their photoelectric signals for the first time.
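The readout schedule described above can be sketched as follows (an illustrative simulation only; the function and variable names are assumptions, not elements of the embodiment). The ROI photodiode PD2 is read every 11ms, while PD1, PD3, and PD4 are read every 33ms, so the third ROI readout coincides with the first background readout:

```python
def readout_events(duration_ms: int, roi_interval: int = 11, bg_interval: int = 33):
    """List (time_ms, photodiodes read out) tuples over the given duration."""
    events = []
    for t in range(1, duration_ms + 1):
        pds = []
        if t % roi_interval == 0:          # ROI pixel read at the first frame rate
            pds.append("PD2")
        if t % bg_interval == 0:           # background pixels read at the second frame rate
            pds += ["PD1", "PD3", "PD4"]
        if pds:
            events.append((t, pds))
    return events

for t, pds in readout_events(33):
    print(t, pds)
# 11 ['PD2']
# 22 ['PD2']
# 33 ['PD2', 'PD1', 'PD3', 'PD4']
```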
On this basis, as shown in fig. 1, the photoelectric signals read out by PD2 at 90fps and the photoelectric signals read out by PD1, PD3, and PD4 at 30fps are converted into digital signals by the ADC, synthesized by the ISP (Image Signal Processor), and output via the MIPI (Mobile Industry Processor Interface). A special-effect video with a local slow-motion effect can thus be obtained, realizing a high-frame-rate dynamic effect in a local area without affecting the overall frame rate.
According to the pixel circuit 100 of the embodiment of the application, each pixel path 112 includes a fifth transistor 130 and a photodiode 132. The first end of the fifth transistor 130 is connected to the first end of the first transistor 114, the control end of the fifth transistor 130 is connected to both the first exposure control circuit 104 and the second exposure control circuit 106, the anode of the photodiode 132 is grounded, and the cathode of the photodiode 132 is connected to the second end of the fifth transistor 130. In this way, by jointly controlling the on/off states of the fifth transistor 130, the first transistor 114, the second transistor 116, the third transistor 120, and the fourth transistor 124 in each pixel unit 102, the first exposure control circuit 104 and the second exposure control circuit 106 can control the exposure duration of any photodiode 132. Each photodiode 132 can therefore read out its photoelectric signal under a long or short exposure, enabling the pixel circuit 100 to output the image data at the position corresponding to each photodiode 132 at a high or low frame rate. This realizes partitioned control of the image-data output frame rate with a single photodiode 132 as the minimum unit, and thus frame-rate control of any image area, meeting the user's requirement for high-frame-rate video recording of a region of interest.
Optionally, according to some embodiments of the present application, the area processing circuit 108 is further configured to control the capacitance value of the capacitive device 128 to be a first value while the pixel unit 102 reads out photoelectric signals with the second exposure time period as the signal readout interval.
Further, the area processing circuit 108 is further configured to control the capacitance value of the capacitive device 128 to be a second value while the pixel unit 102 reads out photoelectric signals with the first exposure time period as the signal readout interval.
The first value is smaller than the second value, and the ratio of the first value to the second value is equal to the ratio of the second exposure time period to the first exposure time period.
That is, the region processing circuit 108 controls the capacitance device 128 to maintain a small capacitance value when the pixel unit 102 reads out the photoelectric signal with the second exposure time period as the signal reading interval, and the region processing circuit 108 controls the capacitance device 128 to maintain a large capacitance value when the pixel unit 102 reads out the photoelectric signal with the first exposure time period as the signal reading interval.
It can be understood that, when different pixel units 102 are controlled to read out photoelectric signals with different exposure durations, the exposure values of the signals they read out differ. Therefore, to align the output brightness of the different pixel units 102, the area processing circuit 108 must correspondingly and dynamically adjust the capacitance value of the capacitive device 128 as the different pixel units 102 read out their photoelectric signals.
Illustratively, continuing the above example, through the joint control of the first exposure control circuit 104 and the second exposure control circuit 106, PD2 is controlled to sense and read out photoelectric signals at the first frame rate of 90fps, while PD1, PD3, and PD4 are controlled to sense and read out photoelectric signals at the second frame rate of 30fps. When PD1, PD3, and PD4 read out photoelectric signals, their exposure time period is 33ms, so their output signals are X×33=33X, where X is the exposure value accumulated per pixel per 1ms. When PD2 reads out its photoelectric signal, its exposure time period is 11ms, so its output signal is X×11=11X. At this time, when PD2 reads out its photoelectric signal, the area processing circuit 108 must adjust the capacitance value of FD1 to 1/3 of its original value, ensuring that the output brightness of PD1, PD2, PD3, and PD4 is uniform.
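The brightness-alignment arithmetic above can be checked with a minimal sketch (all values and names here are illustrative assumptions: the accumulated charge is modeled as proportional to exposure time, and the readout voltage as charge divided by capacitance):

```python
X = 1.0                    # assumed exposure value accumulated per pixel per ms
t_long, t_short = 33, 11   # first and second exposure time periods, in ms
C_long = 3.0               # assumed baseline capacitance, arbitrary units

# Accumulated charge is proportional to exposure time: Q = X * t.
q_long, q_short = X * t_long, X * t_short

# Output brightness is modeled as V = Q / C; scaling the capacitance by the
# exposure-time ratio 11/33 = 1/3 equalizes the two readout voltages.
C_short = C_long * (t_short / t_long)
assert abs(q_long / C_long - q_short / C_short) < 1e-9  # 33X/3 == 11X/1
```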
According to the pixel circuit 100 of the embodiment of the application, the area processing circuit 108 is further configured to control the capacitance value of the capacitor device 128 to be a first value while the pixel unit 102 reads out photoelectric signals with the second exposure time period as the signal readout interval, and to be a second value while the pixel unit 102 reads out photoelectric signals with the first exposure time period as the signal readout interval, wherein the ratio of the first value to the second value equals the ratio of the second exposure time period to the first exposure time period. This ensures consistent output brightness across all areas of each image frame, and thus guarantees video quality.
Optionally, according to some embodiments of the application, in the case where the number of pixel units 102 is greater than 1, the pixel circuit 100 further includes a sixth transistor 134, as shown in fig. 1.
Wherein the sixth transistor 134 is connected between the first transistors 114 in different pixel units 102.
Specifically, as shown in fig. 1, a first terminal of the sixth transistor 134 is connected to the second terminal of the first transistor 114 in one pixel unit 102, and a second terminal of the sixth transistor 134 is connected to the first terminal of the first transistor 114 in another pixel unit 102. Thus, by controlling the sixth transistor 134 to be turned on, electrons in the photodiodes 132 of different pixel units 102 can be transferred to the same capacitive device 128.
According to the pixel circuit 100 of the embodiment of the application, in the case where the number of pixel units 102 is greater than 1, the pixel circuit 100 further includes the sixth transistor 134, connected between the first transistors 114 in different pixel units 102. In this way, the sixth transistor 134 facilitates the combined transfer of electrons from the photodiodes 132 in different pixel units 102 to the same capacitive device 128.
Optionally, as shown in fig. 7, according to some embodiments of the present application, an image sensor 200 is further provided. Wherein the image sensor 200 comprises at least one pixel circuit 100 in any of the embodiments described above. The image sensor 200 provided in the embodiment of the present application includes the pixel circuit 100 in any of the above embodiments, and can achieve the same technical effects, and for avoiding repetition, the description is omitted here.
In practical applications, the pixel array of the image sensor 200 may be as shown in fig. 3, where the image sensor 200 includes at least two single pixels, one pixel circuit 100 controls the exposure and signal readout of one pixel, such as an R pixel, a Gr pixel, a Gb pixel, or a B pixel, and one pixel circuit 100 includes one photodiode. The upper limit on the number of single pixels included in the image sensor 200 depends on the size of the image sensor 200 and the size of each single pixel, and is not particularly limited herein.
Further, as shown in fig. 4, the pixel array of the image sensor 200 may include at least one four-in-one pixel, where one pixel circuit 100 controls the exposure and signal readout of the 4 pixels, such as 4 R pixels, 4 Gr pixels, 4 Gb pixels, or 4 B pixels, in each four-in-one pixel, and one pixel circuit 100 includes 4 photodiodes. The upper limit on the number of four-in-one pixels included in the image sensor 200 depends on the size of the image sensor 200 and the size of each four-in-one pixel, and is not particularly limited herein.
Further, as shown in fig. 5, the image sensor 200 may include at least one nine-in-one pixel, where one pixel circuit 100 controls the exposure and signal readout of the 9 pixels, such as 9 R pixels, 9 Gr pixels, 9 Gb pixels, or 9 B pixels, in each nine-in-one pixel, and one pixel circuit 100 includes 9 photodiodes. The upper limit on the number of nine-in-one pixels included in the image sensor 200 depends on the size of the image sensor 200 and the size of each nine-in-one pixel, and is not particularly limited herein.
Further, as shown in fig. 6, the image sensor 200 may include at least one sixteen-in-one pixel, where one pixel circuit 100 controls the exposure and signal readout of the 16 pixels, such as 16 R pixels, 16 Gr pixels, 16 Gb pixels, or 16 B pixels, in each sixteen-in-one pixel, and one pixel circuit 100 includes 16 photodiodes. The upper limit on the number of sixteen-in-one pixels included in the image sensor 200 depends on the size of the image sensor 200 and the size of each sixteen-in-one pixel, and is not particularly limited herein.
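The binning configurations of figs. 3 to 6 can be summarized in a small illustrative sketch (the names below are assumptions for illustration, not terms from the embodiments): each pixel circuit 100 drives an n×n group of same-color pixels and therefore contains n×n photodiodes.

```python
# Photodiodes per pixel circuit for each pixel-array configuration.
BINNING_MODES = {
    "single": 1,          # fig. 3: one photodiode per pixel circuit
    "four-in-one": 4,     # fig. 4: 2x2 same-color pixels
    "nine-in-one": 9,     # fig. 5: 3x3 same-color pixels
    "sixteen-in-one": 16, # fig. 6: 4x4 same-color pixels
}

def photodiodes_per_circuit(mode: str) -> int:
    """Number of photodiodes one pixel circuit contains in the given mode."""
    return BINNING_MODES[mode]
```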
In practical applications, the specific pixel array of the image sensor 200 can be selected by those skilled in the art according to practical situations, and is not limited herein.
Optionally, as shown in fig. 8, according to some embodiments of the present application, an image capturing module 300 is further provided. The camera module 300 includes a base 302, the image sensor 200, a filter element 304, a lens 306, a motor 308, and a signal processing circuit 310. The image capturing module 300 provided in the embodiment of the present application includes the image sensor 200 in the above embodiment, and can achieve the same technical effects, and for avoiding repetition, the description is omitted here.
Wherein the image sensor 200 is disposed on a base 302.
Further, the image sensor 200 is the core hardware of the camera module 300 and a key factor in determining its imaging quality. The main parameters of the image sensor 200 include the sensor size, the effective pixel count, and the single-pixel size. The image sensor 200 may be a CMOS (Complementary Metal Oxide Semiconductor) sensor, which is fabricated using a process similar to that of computer chips, mainly using semiconductor materials such as silicon and germanium to form a chip having N-type and P-type semiconductor structures.
Further, a filter element 304 is provided on the image sensor 200.
Further, the filter element 304 may be an infrared filter. As shown in fig. 8, the scene light converged into the camera module 300 is projected onto the filter element 304, and the light passing through the filter element 304 can be sensed by the image sensor 200. The filter element 304 filters out unwanted light projected toward the image sensor 200, preventing the image sensor 200 from producing false color and moiré, thereby improving the effective resolution and color reproduction of the image sensor 200.
Further, a lens 306 is disposed on the filter element 304.
Further, the lens 306 is used for converging light and focusing, and may be formed by combining multiple glass or plastic lens elements. For example, a 6P lens refers to a lens assembly composed of six plastic lens elements. The design and quality of the lens 306 directly affect how light reaches the image sensor 200, and thus the imaging effect.
Further, a motor 308 is connected to the lens 306.
Further, the motor 308 surrounds and secures the lens 306.
In practical applications, the upper and lower ends of the motor 308 are linked by spring plates. During focusing, the motor 308 is energized to generate an electromagnetic force that eventually balances the elastic force of the spring plates; the position of the motor 308 can thus be controlled by the applied current, pushing the motor 308 and the lens 306 to the in-focus position.
Further, the signal processing circuit 310 is connected to the image sensor 200. The signal processing circuit 310 is configured to process the output signal of the image sensor 200, thereby obtaining storable image data.
Optionally, according to some embodiments of the application, as shown in fig. 9, the signal processing circuit 310 includes an analog-to-digital converter 312 and an image processor 314.
Wherein an analog-to-digital converter 312 is connected to an output of each pixel circuit 100 in the image sensor 200.
Further, the analog-to-digital converter 312 is used for performing analog-to-digital conversion on the photoelectric signal output by the pixel circuit 100 to convert the photoelectric signal into a digital signal.
Further, an image processor 314 is coupled to the analog-to-digital converter 312.
Further, the image processor 314 is configured to process the digital signal output by the analog-to-digital converter 312 to generate storable image data.
Specifically, as shown in fig. 9, during shooting, light passes through the lens 306 and irradiates the image sensor 200. After entering the image sensor 200, the light is sensed by a plurality of photodiodes and converted into electric signals, i.e. analog signals, which are converted into digital signals by the analog-to-digital converter 312. The digital signals are further processed by the image processor 314 to generate storable image data, which is then transmitted to the memory for storage.
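The readout chain of fig. 9 can be sketched as follows (a minimal illustration under assumed parameters: a 10-bit ADC with a 1V reference, and an ISP stage reduced to a simple 10-bit to 8-bit normalization; none of these values or names come from the embodiments):

```python
def adc(analog: float, vref: float = 1.0, bits: int = 10) -> int:
    """Quantize an analog voltage in [0, vref] to a bits-wide digital code."""
    code = int(analog / vref * ((1 << bits) - 1))
    return max(0, min(code, (1 << bits) - 1))  # clamp to the valid code range

def isp(codes):
    """Placeholder ISP stage: here, just map 10-bit codes to 8-bit values."""
    return [c >> 2 for c in codes]

# Analog photodiode levels -> ADC -> ISP -> storable frame values.
frame = isp([adc(v) for v in (0.0, 0.5, 1.0)])
print(frame)  # [0, 127, 255]
```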
Optionally, according to some embodiments of the present application, as shown in fig. 10, an electronic device 400 is further provided according to an embodiment of the present application. The electronic device 400 includes the camera module 300 in any of the above embodiments. The electronic device 400 provided in the embodiment of the present application includes the camera module 300 in any of the above embodiments, and can achieve the same technical effects, and for avoiding repetition, the description is omitted here.
Further, as shown in FIG. 10, the electronic device 400 further includes a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, a processor 410, and the like.
Those skilled in the art will appreciate that the electronic device 400 may further include a power source, such as a battery, for powering the various components. The power source may be logically connected to the processor 410 through a power management system, so as to perform functions such as managing charging, discharging, and power consumption. The electronic device structure shown in fig. 10 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components, which will not be described in detail herein.
It should be appreciated that, in embodiments of the present application, the input unit 404 may include a graphics processor (Graphics Processing Unit, GPU) 4041 and a microphone 4042, where the graphics processor 4041 processes image data of still pictures or video obtained by an image capture device, such as the camera module 300, in a video capture mode or an image capture mode. The display unit 406 may include a display panel 4061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 407 includes at least one of a touch panel 4071 and other input devices 4072. The touch panel 4071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail herein.
Memory 409 may be used to store software programs as well as various data. The memory 409 may mainly include a first memory area storing programs or instructions and a second memory area storing data, wherein the first memory area may store an operating system, and application programs or instructions required for at least one function, such as a sound playing function or an image playing function. Further, the memory 409 may include volatile memory or nonvolatile memory, or both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synch-link DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 409 in embodiments of the application includes, but is not limited to, these and any other suitable types of memory.
Processor 410 may include one or more processing units and, optionally, processor 410 integrates an application processor that primarily processes operations involving an operating system, user interface, application program, etc., and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 410.
It should be noted that, the electronic device 400 in the embodiment of the present application includes a mobile electronic device and a non-mobile electronic device.
In an actual application process, the electronic device 400 may be a terminal, or a device other than a terminal. For example, the electronic device 400 may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like, which is not particularly limited in embodiments of the present application.
In the description of the present specification, reference to the terms "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the spirit and scope of the application as defined by the appended claims and their equivalents.