WO2020038072A1 - Exposure control method and apparatus, and electronic device - Google Patents
- Publication number
- WO2020038072A1 (PCT/CN2019/090476; CN2019090476W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- exposure
- night scene
- frame
- image
- jitter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
Definitions
- the present application relates to the technical field of mobile terminals, and in particular, to an exposure control method, device, and electronic device.
- This application is intended to solve at least one of the technical problems in the related technology.
- To this end, this application proposes an exposure control method: when shooting night scenes, the night scene mode is determined according to the degree of shake, and the exposure parameters used for each frame in the currently adopted night scene mode are determined, so that the night scene mode and exposure parameters are adjusted dynamically for different shooting scenes and the image quality of night scene shooting is improved.
- The present application further proposes an exposure control device.
- The present application further proposes an electronic device.
- The present application further proposes a computer-readable storage medium.
- An embodiment of one aspect of the present application provides an exposure control method, including: determining that a current shooting scene belongs to a night scene; identifying, according to the degree of jitter of an imaging device, a night scene mode applicable to the current shooting scene; determining, according to the night scene mode, exposure parameters of each frame of image to be captured; and performing exposure control using the exposure parameters.
- An embodiment of another aspect of the present application provides an exposure control device, including:
- a scene determination module configured to determine that a current shooting scene belongs to a night scene scene
- a recognition module configured to identify a night scene mode applicable to the current shooting scene according to the degree of jitter of the imaging device
- a parameter determining module configured to determine an exposure parameter of an image to be acquired in each frame according to the night scene mode
- a control module configured to perform exposure control by using the exposure parameter.
- An embodiment of another aspect of the present application provides an electronic device, including: a memory, a processor, and a computer program stored in the memory and executable on the processor.
- When the processor executes the program, the exposure control method according to the foregoing aspect is implemented.
- Another embodiment of the present application provides a computer-readable storage medium on which a computer program is stored.
- When the instructions in the storage medium are executed by a processor, the exposure control method according to the foregoing aspect is implemented.
- FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present application
- FIG. 2 is a schematic flowchart of another exposure control method according to an embodiment of the present application.
- FIG. 3 is a schematic flowchart of another exposure control method according to an embodiment of the present application.
- FIG. 4 is a schematic flowchart of still another exposure control method according to an embodiment of the present application.
- FIG. 5 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application.
- FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
- FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment.
- To this end, this embodiment provides an exposure control method: it determines that the current shooting scene belongs to a night scene, identifies the night scene mode applicable to the current shooting scene according to the degree of shaking of the imaging device, determines the exposure parameters of each frame of image to be captured according to the night scene mode, and performs exposure control with those parameters, so that different night scene modes and exposure parameters are used for different shooting scenes when shooting night scenes, improving the shooting effect.
- FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present application.
- the method includes the following steps:
- Step 101 Determine that the current shooting scene belongs to a night scene.
- As one possible implementation, an image acquisition module obtains a preview image of the current scene, image features are extracted from the preview image and input to a recognition model, and the current shooting scene is determined to be a night scene according to the scene type output by the recognition model, where the recognition model has learned the correspondence between image features and scene types.
- As another possible implementation, a user operation for scene switching is detected; when a user operation switching to a night scene is detected, the ambient brightness is measured to obtain brightness information.
- For example, a light metering module built into the electronic device detects the current ambient brightness and determines the brightness information of the current environment, and it is then determined from this brightness information whether the current shooting scene belongs to a night scene. The brightness level can be measured by the brightness index Lix_index, where a larger value of the brightness information indicates a darker scene.
- The obtained brightness information is compared with a preset brightness value: if it is greater than the preset value, the current shooting scene is determined to be a night scene; if it is less than the preset value, the scene is determined to be a non-night scene.
- In a non-night scene, imaging is performed in a high dynamic range mode, in which different exposure compensation values are set to obtain a higher dynamic range; for example, 3 frames of images can be captured with exposure compensation values in the interval [-4, +1].
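- As an illustration only (not part of the original disclosure), the following Python sketch shows one way the brightness-based check and the non-night HDR fallback described above could be arranged; the threshold value and function names are hypothetical, while the 3-frame capture and the [-4, +1] EV interval come from the text.

```python
# Illustrative sketch: brightness-index scene check plus the HDR fallback for non-night scenes.
NIGHT_SCENE_LUX_INDEX_THRESHOLD = 300  # assumed value; the patent only says larger index = darker scene

def classify_scene(lux_index: float) -> str:
    """Return 'night' when the measured brightness index exceeds the preset value."""
    if lux_index > NIGHT_SCENE_LUX_INDEX_THRESHOLD:
        return "night"
    return "non_night"

def hdr_exposure_compensation(frames: int = 3, ev_range=(-4, +1)) -> list:
    """Spread EV values evenly over the stated interval for non-night HDR capture."""
    low, high = ev_range
    step = (high - low) / (frames - 1)
    return [round(low + i * step, 2) for i in range(frames)]

# Example: hdr_exposure_compensation() -> [-4.0, -1.5, 1.0]
```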
- Step 102 Identify a night scene mode applicable to the current shooting scene according to the degree of shaking of the imaging device.
- Specifically, the degree of jitter of the imaging device is obtained. If the degree of jitter is greater than or equal to a first jitter threshold, a single-frame night scene mode is adopted; if the degree of jitter is less than the first jitter threshold and greater than a second jitter threshold, a handheld night scene mode is adopted; if the degree of jitter is less than or equal to the second jitter threshold, a tripod night scene mode is adopted. Here the first jitter threshold is greater than the second jitter threshold, the number of frames to be captured in the handheld night scene mode is greater than one, and the number of frames to be captured in the tripod night scene mode is greater than that in the handheld night scene mode.
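- The two-threshold decision of step 102 can be pictured with the following minimal Python sketch; the numeric thresholds are placeholders chosen for illustration, since the text only requires the first threshold to be larger than the second.

```python
# Illustrative mode selection based on the measured degree of jitter.
FIRST_JITTER_THRESHOLD = 0.8   # assumed
SECOND_JITTER_THRESHOLD = 0.2  # assumed (must be smaller than the first)

def select_night_mode(jitter: float) -> str:
    if jitter >= FIRST_JITTER_THRESHOLD:
        return "single_frame"   # shake too large to merge multiple frames
    if jitter > SECOND_JITTER_THRESHOLD:
        return "handheld"       # multi-frame capture, e.g. 7 frames
    return "tripod"             # more frames, e.g. 17, since shake is minimal
```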
- Step 103 Determine the exposure parameters of the images to be collected in each frame according to the night scene mode.
- The exposure parameters include the exposure duration, sensitivity, and exposure compensation value.
- the preset sensitivity of each frame of the image to be acquired is determined according to the night scene mode, wherein the preset sensitivity in the handheld night scene mode is greater than the preset sensitivity in the tripod night scene mode.
- The preset exposure compensation value of each frame to be captured is also determined according to the night scene mode; a reference exposure amount is determined from the brightness information of the preview image; the target exposure amount of each frame to be captured is determined from the reference exposure amount and that frame's preset exposure compensation value; and the exposure duration of each frame to be captured is determined from its target exposure amount and its preset sensitivity.
- In the embodiments of the present application, since shake is large in the handheld night scene mode, the number of frames captured in this mode is kept as small as possible in order to improve imaging quality and avoid the afterimages caused by shake.
- In addition, in night scene modes, the interval between the exposure compensation values of the multiple frames is set small so that both the highlight and low-light areas have suitable exposure and transition smoothly in the final composite image.
- To satisfy both the small interval between exposure compensation values and the smaller number of frames captured in the handheld night scene mode, the value range of the exposure compensation value in the handheld night scene mode is usually set smaller, so that it is smaller than the exposure compensation value range in the tripod night scene mode.
- Optionally, the degree of jitter in the handheld night scene mode can be further subdivided. In the handheld night scene mode there is a correspondence between the degree of jitter and the exposure parameters; as one possible implementation, when the handheld night scene mode is in use, this correspondence is read. Specifically:
- The degree of jitter has a positive relationship with the sensitivity in the exposure parameters, that is, the greater the degree of jitter, the higher the sensitivity.
- The degree of jitter has an inverse relationship with the exposure duration in the exposure parameters, that is, the greater the degree of jitter, the shorter the exposure duration. This is because, although a lower sensitivity produces less noise in the image, it requires a longer exposure duration for the same exposure amount; to accommodate shake and avoid afterimages caused by an overly long exposure, the sensitivity can be raised appropriately to shorten the exposure duration.
- The degree of jitter also has an inverse relationship with the value range of the exposure compensation value in the exposure parameters. This is because, when the degree of jitter is greater, the number of captured frames can be appropriately reduced to improve imaging quality and avoid afterimages, while the interval between the exposure compensation values of the multiple frames is kept small so that highlights and shadows transition smoothly in the final composite image; with fewer frames and a small interval, the range of exposure compensation values becomes smaller.
- Then, according to the degree of jitter of the imaging device, the correspondence is queried to obtain the exposure parameters of each frame of image to be captured. By further subdividing the degree of jitter in the handheld night scene mode, different exposure parameters are used for different degrees of jitter, which improves imaging quality in this mode.
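- The following hedged sketch illustrates what such a jitter-to-parameter lookup could look like; the table entries are invented for illustration, and only the direction of each relationship (higher jitter, higher sensitivity, shorter exposure, narrower compensation range) is taken from the text.

```python
# Hypothetical lookup table for the handheld night scene mode.
HANDHELD_JITTER_TABLE = [
    # (max_jitter, iso, max_exposure_ms, ev_range)
    (0.4, 200, 500, (-6, +1)),
    (0.6, 320, 250, (-5, +1)),
    (0.8, 500, 125, (-4, +1)),
]

def lookup_handheld_parameters(jitter: float):
    """Return (iso, max exposure in ms, EV range) for the first bracket the jitter falls into."""
    for max_jitter, iso, max_exposure_ms, ev_range in HANDHELD_JITTER_TABLE:
        if jitter <= max_jitter:
            return iso, max_exposure_ms, ev_range
    return HANDHELD_JITTER_TABLE[-1][1:]
```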
- Step 104 Use exposure parameters to perform exposure control.
- Specifically, according to the determined night scene mode, exposure control is performed using the determined exposure parameters of each frame of image to be captured.
- In the exposure control method of this embodiment, it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device, the exposure parameters of each frame to be captured are determined according to the night scene mode, and exposure control is performed with those parameters. The night scene mode and exposure parameters are thus adjusted dynamically for different shooting scenes, which improves the imaging quality of night scene shooting and solves the technical problem in the related art that a single night shooting mode cannot suit all shooting scenes and yields poor shooting quality in some of them.
- Based on the foregoing embodiment, this embodiment provides another exposure control method, which further explains how, in step 102, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device.
- FIG. 2 is a schematic flowchart of this exposure control method.
- As shown in FIG. 2, step 102 may include the following sub-steps:
- Step 1021 Obtain the degree of jitter of the imaging device.
- Specifically, a sensor provided in the imaging device collects displacement information, and the degree of jitter of the imaging device is determined from this displacement information.
- As one possible implementation, the sensor may be a gyroscope that outputs displacement information on the x, y, and z axes.
- The absolute values of the displacement information on the three axes are summed, and the sum is denoted S; the value of S indicates the degree of jitter of the imaging device, that is, the value of the degree of jitter equals the value of S.
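- A minimal sketch of this jitter measure, assuming the gyroscope readings are already available as a tuple (the aggregation |x| + |y| + |z| is the only part taken from the text):

```python
def jitter_degree(gyro_xyz: tuple) -> float:
    """Sum of absolute displacements on the three gyroscope axes, i.e. S in the description."""
    x, y, z = gyro_xyz
    return abs(x) + abs(y) + abs(z)

# Example: jitter_degree((0.1, -0.3, 0.05)) -> 0.45
```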
- Step 1022: Determine whether the degree of jitter is greater than or equal to the first jitter threshold. If yes, go to step 1023; if no, go to step 1024.
- Step 1023: Determine that the single-frame night scene mode is adopted.
- Specifically, if the degree of jitter is greater than or equal to the first jitter threshold, the current shake is large and the single-frame night scene mode is used directly to capture a single frame. This is because, when the degree of jitter is large, the frames of a multi-frame capture may differ greatly from one another and cannot be synthesized; therefore, when shake is large, only a single frame is captured in the single-frame night scene mode.
- Step 1024: Determine whether the degree of jitter is less than or equal to the first jitter threshold and greater than the second jitter threshold. If yes, go to step 1025; if no, go to step 1026.
- Here the second jitter threshold is smaller than the first jitter threshold.
- Step 1025: Determine that the handheld night scene mode is adopted.
- Specifically, if the degree of jitter is not greater than or equal to the first jitter threshold, the current shake is not very large. It is further judged whether the degree of jitter is less than or equal to the first jitter threshold and greater than the second jitter threshold; if so, the handheld night scene mode is adopted.
- The number of frames to be captured in the handheld night scene mode is greater than one, for example 7 frames, and the exposure parameters used for each frame are not exactly the same; the method for determining the exposure parameters is described in detail in the following embodiments.
- Step 1026: Determine that the tripod night scene mode is adopted.
- Specifically, if the current degree of jitter is not both less than or equal to the first jitter threshold and greater than the second jitter threshold, the tripod night scene mode is adopted.
- The number of frames to be captured in the tripod night scene mode is greater than that in the handheld night scene mode. In the tripod night scene mode the corresponding shake is very small, and the imaging device itself has shake-elimination strategies such as optical image stabilization (OIS); the shake in this mode is therefore very small, a longer capture time can be used to obtain a high-quality picture, and many frames, for example 17, can be captured continuously to improve imaging quality. The exposure parameters used for each frame are not exactly the same; the method for determining them is described in detail in the following embodiments.
- In the exposure control method of this embodiment, the degree of shake of the imaging device is obtained, and according to its magnitude the night scene mode to adopt is determined to be the tripod night scene mode, the handheld night scene mode, or the single-frame night scene mode. By subdividing the night scene mode in this way, different night scene shooting scenarios use different night scene modes, which improves the imaging quality of the captured images.
- Based on the foregoing embodiments, an embodiment of the present application further proposes an exposure control method that explains how the exposure parameters of each frame to be captured are determined according to the selected night scene mode.
- FIG. 3 is a schematic flowchart of this exposure control method; as shown in FIG. 3, step 103 may include the following sub-steps.
- Step 1031: Determine, according to the night scene mode, the preset sensitivity of each frame of image to be captured.
- In one scenario, if the night scene mode is the handheld night scene mode, the preset sensitivity (ISO) of each frame to be captured in that mode is determined, for example 200 iso.
- In another scenario, if the night scene mode is the tripod night scene mode, the preset sensitivity of each frame to be captured in that mode is determined, for example 100 iso.
- It should be noted that the preset sensitivities of the frames to be captured in the handheld night scene mode may be exactly the same or may differ slightly, which is not limited in this embodiment; the same applies to the preset sensitivity in the tripod night scene mode, and details are not repeated here.
- It should be understood that the preset sensitivity in the tripod night scene mode is smaller than that in the handheld night scene mode, because the degree of jitter in the tripod night scene mode is smaller. A lower sensitivity can therefore be used when capturing each frame, which allows a longer exposure duration, reduces noise in the frames to be captured, and improves imaging quality.
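- As a small illustration, the per-mode preset sensitivities could be kept in a table such as the following; the 200 iso and 100 iso values follow the examples in the description, while the single-frame value is an assumption.

```python
PRESET_ISO = {
    "handheld": 200,      # higher ISO so the exposure time can stay short under shake
    "tripod": 100,        # lower ISO, longer exposure, less noise
    "single_frame": 320,  # assumed; the text gives no value for this mode
}

def preset_sensitivity(mode: str) -> int:
    return PRESET_ISO[mode]
```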
- Step 1032 Determine a preset exposure compensation value for each frame of the image to be acquired according to the night scene mode.
- In one scenario, if the night scene mode is the tripod night scene mode, the preset exposure compensation value of each frame to be captured is determined; for example, 17 frames are to be captured, the range of the exposure compensation value EV (Exposure Compensation Value) is set to [-6, +2] EV, and the interval between the exposure compensation values of adjacent frames is set to 0.5 EV.
- In another scenario, if the night scene mode is the handheld night scene mode, the preset exposure compensation value of each frame to be captured is determined; for example, 7 frames are to be captured and the value range is set to [-6, +1] EV, with the 7 frames assigned the exposure compensation values [-6, -3, 0, +1, +1, +1, +1] EV.
- It should be noted that, in practice, the exposure compensation values may also change with the ambient brightness and the captured images, and the exposure range may be narrowed; for example, in the handheld night scene mode the range of the exposure compensation value may be adjusted to [-5, +1] EV.
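- The per-frame compensation lists quoted above can be generated as in the following sketch; the tripod list follows the stated [-6, +2] EV range at 0.5 EV steps (17 frames), and the handheld list reuses the example values for 7 frames.

```python
def tripod_ev_values(low: float = -6.0, high: float = 2.0, step: float = 0.5) -> list:
    """17 evenly spaced EV values from -6 to +2 in 0.5 EV steps."""
    count = int((high - low) / step) + 1
    return [low + i * step for i in range(count)]

HANDHELD_EV_VALUES = [-6, -3, 0, +1, +1, +1, +1]  # example from the description
```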
- Step 1033 Determine a reference exposure amount according to the brightness information of the preview image.
- Specifically, the reference exposure amount is determined according to the brightness information of the preview image. As one possible implementation, the photometry module in the electronic device measures the brightness information of the preview image corresponding to the current shooting scene, and the measured brightness information is converted at a set, lower sensitivity to determine the reference exposure amount, denoted EV0.
- For example, if the sensitivity measured by the photometry module is 500 iso, the exposure duration is 50 milliseconds (ms), and the target sensitivity is 100 iso, then after conversion the sensitivity is 100 iso and the exposure duration is 250 ms; the pair of 100 iso and 250 ms is taken as the reference exposure amount EV0.
- It should be noted that EV0 is not a fixed value but changes with the brightness information of the preview image: when the ambient brightness changes, the brightness information of the preview image changes, and the reference exposure EV0 changes accordingly.
- It should also be noted that steps 1031, 1032, and 1033 are not required to be performed in any particular order.
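- The conversion in step 1033 amounts to keeping the product of sensitivity and exposure time constant while rescaling to the lower reference sensitivity, as in this sketch (the function name is illustrative):

```python
def reference_exposure(metered_iso: float, metered_ms: float, target_iso: float = 100.0):
    """Rescale the metered (ISO, exposure time) pair to the reference ISO at constant exposure."""
    # Exposure is proportional to ISO x time, so keep the product constant.
    return target_iso, metered_ms * (metered_iso / target_iso)

# reference_exposure(500, 50) -> (100.0, 250.0), i.e. the EV0 example in the text
```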
- Step 1034 Determine the target exposure amount of the image to be acquired for each frame according to the reference exposure amount and the preset exposure compensation value of the image to be acquired for each frame.
- For example, in the handheld night scene mode, the preset exposure compensation values corresponding to the 7 frames to be captured are -6 EV, -3 EV, 0 EV, +1 EV, +1 EV, +1 EV, and +1 EV, where "+" indicates increasing the exposure relative to the reference exposure set by metering, "-" indicates decreasing it, and the number is the number of compensation steps.
- The target exposure amount of each frame to be captured is determined from that frame's preset exposure compensation value and the reference exposure amount. For example, if a frame's exposure compensation value is -6 EV and the reference exposure is EV0, its target exposure amount is EV0 × 2^(-6), that is EV0/64, which darkens that frame's capture; if a frame's exposure compensation value is +1 EV, its target exposure amount is EV0 × 2, that is twice EV0, which brightens that frame's capture.
- The target exposure amounts of the other frames to be captured are determined in the same way and are not listed one by one here; the same method also applies to each frame in the tripod night scene mode.
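- In code form, step 1034 is simply a scaling of the reference exposure by a power of two per frame, as in this illustrative sketch:

```python
def target_exposures(ev0: float, ev_values: list) -> list:
    """Scale the reference exposure EV0 by 2**ev for each frame's compensation value."""
    return [ev0 * (2.0 ** ev) for ev in ev_values]

# target_exposures(1.0, [-6, -3, 0, +1]) -> [0.015625, 0.125, 1.0, 2.0]
```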
- Step 1035 Determine the exposure duration of the image to be acquired in each frame according to the target exposure of the image to be acquired in each frame and the preset sensitivity of the image to be acquired in each frame.
- In the embodiments of the present application, the aperture value is fixed when capturing each frame in night scene mode. For each frame to be captured, the target exposure amount is jointly determined by the sensitivity and the exposure duration, so once the sensitivity is fixed the corresponding exposure duration can be determined.
- For example, if the sensitivity (ISO) and exposure duration corresponding to the reference exposure are 100 iso and 250 ms, and a frame to be captured has a preset sensitivity of 100 iso and an exposure compensation value of -3 EV, then its target exposure duration is 250 ms × 2^(-3), that is about 32 ms, so the exposure duration is reduced; similarly, when the exposure compensation value is +1 EV, the resulting exposure duration is 500 ms, so the exposure duration is increased. The exposure duration of each frame can be determined in the same way.
- With the wide dynamic range set in this way, the frames are captured with different exposure durations, so that the details of each part of the image can be clearly imaged under the control of a different exposure duration, improving the imaging effect.
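- Under the stated assumption of a fixed aperture, step 1035 reduces to the following illustrative calculation, which reproduces the 32 ms and 500 ms examples above (the function name is hypothetical):

```python
def exposure_time_ms(ev: float, preset_iso: float,
                     ref_iso: float = 100.0, ref_ms: float = 250.0) -> float:
    """Exposure time for one frame from its EV and preset ISO, given the reference pair."""
    return ref_ms * (2.0 ** ev) * (ref_iso / preset_iso)

# exposure_time_ms(-3, 100) -> 31.25 (about 32 ms); exposure_time_ms(+1, 100) -> 500.0
```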
- In the embodiments of the present application, the minimum exposure duration supported by the shutter is 10 milliseconds (ms).
- In one possible scenario, when strong light shines on the camera in the shooting scene, the metering device may mistakenly judge the current scene to be bright, so that the determined reference exposure is small, that is, the exposure duration corresponding to the reference exposure is short.
- When the exposure durations of the frames are then calculated from the reference exposure EV0, the duration calculated for the frame with exposure compensation -6 EV may fall below the preset minimum of 10 ms, for example 8 ms. In that case the exposure duration is raised from 8 ms to 10 ms and the corresponding magnification compensation factor is determined, to ensure that the darkest frame still has a certain shooting brightness.
- At the same time, every frame to be captured is raised by this same magnification compensation factor, so that the brightness of the captured frames still increases linearly and, when the captured images are later synthesized, the transition of the halo is natural, improving the effect of the synthesized image.
- In another possible scenario, when the environment is very dark, the maximum single-frame exposure duration calculated from the reference exposure amount may exceed the maximum value set for the exposure duration, for example 5 seconds. For instance, if the reference exposure has a sensitivity of 100 iso and an exposure duration of 2 seconds, and the corresponding exposure compensation value is +2 EV with a preset sensitivity of 100 iso for the frame to be captured, the calculated exposure duration is 8 seconds, which exceeds the preset maximum of 5 seconds.
- The exposure duration of that frame then needs to be reduced to the preset maximum of 5 seconds; the reduction ratio is determined and the sensitivity is adjusted by this ratio to prevent the exposure duration from being too long. After the longest exposure duration has been compressed, any further increase in exposure is obtained only by appropriately raising the sensitivity, and the sensitivity must not exceed its preset upper limit, for example 550 iso, because a higher sensitivity increases noise in the picture and lowers imaging quality.
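- The two clamping cases above can be sketched as follows; the 10 ms floor, 5 s ceiling, and 550 iso cap come from the description, while the function itself is only an illustration of the described behaviour, not the patented implementation.

```python
MIN_EXPOSURE_MS = 10.0
MAX_EXPOSURE_MS = 5000.0
MAX_ISO = 550.0

def clamp_exposures(times_ms: list, isos: list) -> list:
    """Lift all frames by one common factor if the darkest falls below the shutter floor,
    then cap over-long frames at the ceiling and recover the lost exposure by raising ISO."""
    shortest = min(times_ms)
    gain = max(1.0, MIN_EXPOSURE_MS / shortest)   # same magnification factor for every frame
    times_ms = [t * gain for t in times_ms]
    clamped = []
    for t, iso in zip(times_ms, isos):
        if t > MAX_EXPOSURE_MS:
            iso = min(iso * t / MAX_EXPOSURE_MS, MAX_ISO)
            t = MAX_EXPOSURE_MS
        clamped.append((t, iso))
    return clamped
```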
- It should be noted that, when the night scene mode is the single-frame night scene mode, the large shake makes it difficult to capture high-quality images; therefore, only one frame is captured, with an exposure duration shorter than in the handheld and tripod night scene modes, and exposure control is performed on that frame accordingly, which is not described in detail in this embodiment.
- In the exposure control method of this embodiment, the preset sensitivity and preset exposure compensation value of each frame to be captured are determined according to the night scene mode, the reference exposure amount is determined from the brightness information of the preview image, the target exposure amount of each frame is determined from the reference exposure and that frame's compensation value, and the exposure duration is determined from the target exposure amount and the preset sensitivity, thereby determining the exposure parameters of each frame to be captured.
- By setting different exposure parameters in different night scene modes, the handheld night scene mode with larger shake captures multiple frames with reduced exposure durations, while the tripod night scene mode with smaller shake captures more frames with increased exposure durations. The exposure parameters for night scene shooting are thus adjusted dynamically, improving the imaging quality of night scene shooting.
- FIG. 4 is a schematic flowchart of still another exposure control method provided by an embodiment of the present application. As shown in FIG. 4, after step 104, the method may further include the following step:
- Step 401 Acquire each frame image collected under exposure control, and synthesize each frame image to obtain an imaging image.
- Specifically, according to the determined exposure parameters of each frame to be captured, the frames captured under the control of the corresponding exposure parameters are obtained; the captured frames are aligned to eliminate the influence of shake, moving objects in the images are detected to eliminate ghosting, and the corresponding pixels of the frames are then weighted and synthesized to obtain a single target image.
- Because the frames are captured with different exposure parameters and hence different exposure durations, combining them allows the dark parts of the final output image to be compensated by the corresponding pixel information of the longer-exposure frames and the bright parts to be suppressed by the corresponding pixel information of the shorter-exposure frames, so the synthesized image has neither over-exposed nor under-exposed regions and its brightness transitions are even.
- In addition, during synthesis, the amplitude and position of the noise generated by the sensor current are random, so when multiple images are superimposed the noise tends to cancel out, further improving imaging quality.
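- As an illustration of the weighted synthesis described above (assuming 8-bit frames held as NumPy arrays, with alignment and ghost removal already done), a minimal merge could look like this:

```python
import numpy as np

def merge_frames(frames: list, weights: list) -> np.ndarray:
    """Weighted per-pixel average of aligned frames; long exposures fill shadows,
    short exposures restrain highlights, and random noise tends to cancel."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    w = np.asarray(weights, dtype=np.float32).reshape((-1,) + (1,) * (stack.ndim - 1))
    merged = (stack * w).sum(axis=0) / w.sum()
    return np.clip(merged, 0, 255).astype(np.uint8)
```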
- In the exposure control method of this embodiment of the present application, it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device, the exposure parameters of each frame to be captured are determined according to the night scene mode, exposure control is performed with those parameters, and the captured multi-frame images are synthesized to obtain the final image. When shooting night scenes, different night scene modes and exposure parameters are thus used for different shooting scenes to capture multiple frames, and combining them preserves highlight details and the corresponding transitions, improving the imaging effect.
- the present application also proposes an exposure control device.
- FIG. 5 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application.
- the device includes a scene determination module 51, an identification module 52, a parameter determination module 53, and a control module 54.
- the scene determining module 51 is configured to determine that a current shooting scene belongs to a night scene.
- the identification module 52 is configured to identify a night scene mode applicable to the current shooting scene according to the degree of shaking of the imaging device.
- a parameter determining module 53 is configured to determine an exposure parameter of an image to be acquired in each frame according to a night scene mode.
- the control module 54 is configured to perform exposure control by using an exposure parameter.
- the apparatus further includes: a synthesis module.
- a synthesizing module is configured to acquire each frame image collected under exposure control; and synthesize each frame image to obtain an imaging image.
- the foregoing identification module 52 may further include: an obtaining unit and a determining unit.
- the obtaining unit is configured to obtain a degree of jitter of the imaging device.
- The determining unit is configured to determine that the single-frame night scene mode is adopted if the degree of jitter is greater than or equal to the first jitter threshold; that the handheld night scene mode is adopted if the degree of jitter is less than the first jitter threshold and greater than the second jitter threshold; and that the tripod night scene mode is adopted if the degree of jitter is less than or equal to the second jitter threshold.
- Here the first jitter threshold is greater than the second jitter threshold, the number of frames to be captured in the handheld night scene mode is greater than one, and the number of frames to be captured in the tripod night scene mode is greater than that in the handheld night scene mode.
- the obtaining unit is specifically configured to obtain the collected displacement information from a sensor provided on the imaging device; and determine the degree of jitter of the imaging device according to the displacement information.
- As one possible implementation, the foregoing parameter determining module 53 is specifically configured to:
- determine, according to the night scene mode, the preset sensitivity of each frame of image to be captured, wherein the preset sensitivity in the handheld night scene mode is greater than the preset sensitivity in the tripod night scene mode.
- As one possible implementation, the foregoing parameter determining module 53 is further configured to:
- determine, according to the night scene mode, the preset exposure compensation value of each frame of image to be captured; determine the reference exposure amount according to the brightness information of the preview image; determine the target exposure amount of each frame from the reference exposure amount and that frame's preset exposure compensation value; and determine the exposure duration of each frame from its target exposure amount and its preset sensitivity.
- As one possible implementation, the value range of the exposure compensation value in the handheld night scene mode is smaller than the value range of the exposure compensation value in the tripod night scene mode.
- As one possible implementation, the above-mentioned parameter determination module 53 may also be specifically configured to: in the handheld night scene mode, read the correspondence between the degree of jitter and the exposure parameters; and, according to the degree of jitter of the imaging device, query the correspondence to obtain the exposure parameters of each frame of image to be captured, where the exposure parameters include the exposure duration, sensitivity, and exposure compensation value.
- As one possible implementation, the degree of jitter has a positive relationship with the sensitivity in the exposure parameters, an inverse relationship with the exposure duration in the exposure parameters, and an inverse relationship with the value range of the exposure compensation value in the exposure parameters.
- As one possible implementation, the foregoing scene determining module 51 is specifically configured to: extract image features from the preview image; input the extracted image features into the recognition model; and determine, according to the scene type output by the recognition model, that the current shooting scene belongs to a night scene, where the recognition model has learned the correspondence between image features and scene types.
- As another possible implementation, the foregoing scene determining module 51 is further configured to: detect a user operation for scene switching; when a user operation switching to a night scene is detected, detect the ambient brightness to obtain brightness information; and determine, according to the brightness information, that the current shooting scene belongs to a night scene.
- In the exposure control device of this embodiment, it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device, the exposure parameters of each frame to be captured are determined according to the night scene mode, and exposure control is performed with those parameters. When shooting night scenes, different night scene modes and exposure parameters are thus used for different shooting scenes, which improves the shooting effect and solves the technical problem in the related art that a single night shooting mode cannot suit all shooting scenes and yields poor shooting quality in some of them.
- an embodiment of the present application further provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor.
- When the processor executes the program, the exposure control method according to the foregoing method embodiments is implemented.
- FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
- the electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
- the memory 50 of the electronic device 200 stores an operating system and computer-readable instructions.
- the computer-readable instructions can be executed by the processor 60 to implement the control method in the embodiment of the present application.
- the processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200.
- The internal memory 82 of the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50.
- the display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display.
- The input device 84 may be a touch layer covering the display screen 83, or a button, trackball, or touchpad provided on the housing of the electronic device 200, or an external keyboard, trackpad, or mouse.
- the electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses).
- Those skilled in the art can understand that the structure shown in FIG. 6 is only a schematic diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the electronic device 200 to which the solution of the present application is applied.
- the specific electronic device 200 may include more or fewer components than shown in the figure, or some components may be combined, or have different component arrangements.
- Referring to FIG. 7, the electronic device 200 of the embodiments of the present application includes an image processing circuit 90. The image processing circuit 90 may be implemented using hardware and/or software components and includes various processing units that define an ISP (Image Signal Processing) pipeline.
- FIG. 7 is a schematic diagram of the image processing circuit 90 in one embodiment; as shown in FIG. 7, for ease of description, only the aspects of the image processing technology related to the embodiments of the present application are shown.
- the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
- the image data captured by the camera 93 is first processed by the ISP processor 91.
- the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
- the camera 93 may include one or more lenses 932 and an image sensor 934.
- the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
- The sensor 94 (such as a gyroscope) may provide acquired image processing parameters (such as anti-shake parameters) to the ISP processor 91 based on the sensor 94 interface type.
- the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
- the image sensor 934 may also send the original image data to the sensor 94.
- the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
- the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
- the ISP processor 91 may also receive image data from the image memory 95.
- the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
- The image memory 95 may be the memory 50, a part of the memory 50, a storage device, or an independent dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
- When receiving raw image data from the image sensor 934 interface, from the sensor 94 interface, or from the image memory 95, the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering.
- the processed image data may be sent to the image memory 95 for further processing before being displayed.
- the ISP processor 91 receives processing data from the image memory 95, and performs processing on the image data in the original domain and in the RGB and YCbCr color spaces.
- the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display screen 83) for viewing by a user and / or further processing by a graphics engine or a GPU (Graphics Processing Unit).
- the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
- the image memory 95 may be configured to implement one or more frame buffers.
- the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
- the encoded image data can be saved and decompressed before being displayed on the display 97 device.
- the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
- the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
- the statistical data may include image sensor 934 statistical information such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
- The control logic 92 may include a processing element and/or a microcontroller that executes one or more routines (such as firmware), and the one or more routines may determine the control parameters of the camera 93 and the control parameters of the ISP processor 91 according to the received statistical data.
- For example, the control parameters of the camera 93 may include sensor 94 control parameters (such as gain and integration time for exposure control, and anti-shake parameters), camera flash control parameters, lens 932 control parameters (such as focal length for focusing or zooming), or a combination of these parameters.
- the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 932 shading correction parameters.
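- Purely as an illustration of this feedback loop (the statistics keys and the update rule are invented; only the parameter names mirror the text), the control logic could be sketched as follows:

```python
from dataclasses import dataclass

@dataclass
class CameraControl:
    gain: float
    integration_time_ms: float
    anti_shake: bool
    focus_distance_mm: float

def control_logic(stats: dict, current: CameraControl) -> CameraControl:
    """Turn ISP statistics into updated camera control parameters (illustrative rule only)."""
    error = stats["ae_target"] - stats["ae_measured"]  # hypothetical auto-exposure statistics
    return CameraControl(
        gain=current.gain,
        integration_time_ms=max(1.0, current.integration_time_ms * (1 + 0.1 * error)),
        anti_shake=current.anti_shake,
        focus_distance_mm=current.focus_distance_mm,
    )
```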
- For example, the following are the steps of implementing the exposure control method by using the processor 60 in FIG. 6 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 7: 01: determine that the current shooting scene belongs to a night scene; 02: identify, according to the degree of jitter of the imaging device, the night scene mode applicable to the current shooting scene; 03: determine, according to the night scene mode, the exposure parameters of each frame of image to be captured; 04: perform exposure control using the exposure parameters.
- To implement the foregoing embodiments, an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the instructions in the storage medium are executed by a processor, the exposure control method of the foregoing method embodiments is implemented.
- In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means at least two, for example two or three, unless specifically and explicitly defined otherwise.
- Any process or method description in a flowchart or otherwise described herein can be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing the steps of a custom logic function or process, and the scope of the preferred embodiments of this application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.
- The logic and/or steps represented in a flowchart or otherwise described herein, for example an ordered list of executable instructions that may be considered to implement logical functions, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them).
- a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
- More specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection (electronic device) with one or more wirings, a portable computer disk cartridge (magnetic device), random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CDROM).
- the computer-readable medium may even be paper or other suitable medium on which the program can be printed, because, for example, by optically scanning the paper or other medium, followed by editing, interpretation, or other suitable Processing to obtain the program electronically and then store it in computer memory.
- each part of the application may be implemented by hardware, software, firmware, or a combination thereof.
- multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
- For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and the like.
- a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
- The program may be stored in a computer-readable storage medium, and when the program is executed, it includes one of the steps of the method embodiments or a combination thereof.
- each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
- the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
- the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
The present application proposes an exposure control method and apparatus and an electronic device, relating to the technical field of mobile terminals. The method includes: determining that the current shooting scene belongs to a night scene; identifying, according to the degree of jitter of the imaging device, the night scene mode applicable to the current shooting scene; determining, according to the night scene mode, the exposure parameters of each frame of image to be captured; and performing exposure control using the exposure parameters. The night scene mode and exposure parameters are thus adjusted dynamically for different shooting scenes, which improves the imaging quality of night scene shooting and solves the technical problem in the related art that a single night shooting mode cannot suit all shooting scenes and yields poor shooting quality in some of them.
Description
Cross-Reference to Related Applications
This application claims priority to Chinese Patent Application No. 201810962773.6, entitled "Exposure control method, apparatus and electronic device", filed by OPPO Guangdong Mobile Communications Co., Ltd. on August 22, 2018.
The present application relates to the technical field of mobile terminals, and in particular to an exposure control method and apparatus and an electronic device.
With the development of mobile terminal technology and image processing technology, people have increasingly high requirements for photography and hope to obtain high-quality images even at night when the ambient light is dim. In a night environment, however, the varying strength and position of light sources make the shooting environment complex and changeable, so a corresponding night scene mode is needed to match the complex and changeable night scenes and improve the effect of night scene shooting.
Summary
The present application is intended to solve at least one of the technical problems in the related art.
To this end, the present application proposes an exposure control method in which, during night scene shooting, the night scene mode is determined according to the degree of shake and the exposure parameters used for each frame in the currently adopted night scene mode are determined, so that the night scene mode and exposure parameters are adjusted dynamically for different shooting scenes and the imaging quality of night scene shooting is improved.
The present application further proposes an exposure control apparatus.
The present application further proposes an electronic device.
The present application further proposes a computer-readable storage medium.
An embodiment of one aspect of the present application provides an exposure control method, including:
determining that a current shooting scene belongs to a night scene;
identifying, according to a degree of jitter of an imaging device, a night scene mode applicable to the current shooting scene;
determining, according to the night scene mode, exposure parameters of each frame of image to be captured; and
performing exposure control using the exposure parameters.
An embodiment of another aspect of the present application provides an exposure control apparatus, including:
a scene determination module configured to determine that a current shooting scene belongs to a night scene;
an identification module configured to identify, according to a degree of jitter of an imaging device, a night scene mode applicable to the current shooting scene;
a parameter determination module configured to determine, according to the night scene mode, exposure parameters of each frame of image to be captured; and
a control module configured to perform exposure control using the exposure parameters.
An embodiment of another aspect of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the exposure control method according to the foregoing aspect is implemented.
An embodiment of another aspect of the present application provides a computer-readable storage medium on which a computer program is stored; when the instructions in the storage medium are executed by a processor, the exposure control method according to the foregoing aspect is implemented.
The above and/or additional aspects and advantages of the present application will become apparent and easy to understand from the following description of the embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of another exposure control method according to an embodiment of the present application;
FIG. 3 is a schematic flowchart of still another exposure control method according to an embodiment of the present application;
FIG. 4 is a schematic flowchart of yet another exposure control method according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the internal structure of an electronic device 200 in one embodiment; and
FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment.
The embodiments of the present application are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which identical or similar reference numerals throughout denote identical or similar elements or elements with identical or similar functions. The embodiments described below with reference to the drawings are exemplary, intended to explain the present application, and cannot be understood as limiting the present application.
The exposure control method and apparatus and the electronic device of the embodiments of the present application are described below with reference to the accompanying drawings.
At present, in night scenes, most mobile terminals shoot in a single night scene mode, while the varying strength and position of light sources in a night environment make the shooting environment complex and changeable; a single night scene mode therefore cannot suit all scenes, and the shooting effect is poor in some of them. To this end, this embodiment provides an exposure control method: it determines that the current shooting scene belongs to a night scene, identifies the night scene mode applicable to the current shooting scene according to the degree of jitter of the imaging device, determines the exposure parameters of each frame of image to be captured according to the night scene mode, and performs exposure control with those parameters, so that different night scene modes and exposure parameters are used for different shooting scenes during night scene shooting and the shooting effect is improved.
FIG. 1 is a schematic flowchart of an exposure control method according to an embodiment of the present application.
As shown in FIG. 1, the method includes the following steps.
Step 101: Determine that the current shooting scene belongs to a night scene.
As one possible implementation, according to the current shooting scene, an image acquisition module obtains a preview image of the current scene, image features are extracted from the preview image and input to a recognition model, and the current shooting scene is determined to belong to a night scene according to the scene type output by the recognition model, where the recognition model has learned the correspondence between image features and scene types.
As another possible implementation, a user operation for scene switching is detected, and when a user operation switching to a night scene is detected, the ambient brightness is measured to obtain brightness information. As one possible implementation, a light metering module built into the electronic device detects the current ambient brightness and determines the brightness information of the current environment, and whether the current shooting scene belongs to a night scene is determined according to this brightness information. For example, the brightness level can be measured by the brightness index Lix_index, where a larger value of the brightness information indicates a darker current scene. The obtained brightness information is compared with a preset brightness value: if it is greater than the preset value, the current shooting scene is determined to belong to a night scene; if it is less than the preset value, the scene is determined to be a non-night scene, and imaging is performed in a high dynamic range mode, in which different exposure compensation values are set to obtain a higher dynamic range, for example 3 frames captured with exposure compensation values in the interval [-4, +1].
Step 102: Identify, according to the degree of jitter of the imaging device, the night scene mode applicable to the current shooting scene.
Specifically, the degree of jitter of the imaging device is obtained. If the degree of jitter is greater than or equal to a first jitter threshold, the single-frame night scene mode is adopted; if it is less than the first jitter threshold and greater than a second jitter threshold, the handheld night scene mode is adopted; if it is less than or equal to the second jitter threshold, the tripod night scene mode is adopted. The first jitter threshold is greater than the second jitter threshold, the number of frames to be captured in the handheld night scene mode is greater than one, and the number of frames to be captured in the tripod night scene mode is greater than that in the handheld night scene mode.
Step 103: Determine, according to the night scene mode, the exposure parameters of each frame of image to be captured.
The exposure parameters include the exposure duration, sensitivity, and exposure compensation value.
Specifically, the preset sensitivity of each frame to be captured is determined according to the night scene mode, where the preset sensitivity in the handheld night scene mode is greater than that in the tripod night scene mode. The preset exposure compensation value of each frame to be captured is determined according to the night scene mode; the reference exposure amount is determined from the brightness information of the preview image; the target exposure amount of each frame is determined from the reference exposure amount and that frame's preset exposure compensation value; and the exposure duration of each frame is determined from its target exposure amount and its preset sensitivity.
In the embodiments of the present application, since shake is large in the handheld night scene mode, the number of frames captured in this mode is kept as small as possible in order to improve imaging quality and avoid the afterimages caused by shake. In addition, in night scene modes, so that both the highlight and low-light areas of the image have suitable exposure, the interval between the exposure compensation values of the multiple frames is set small, allowing the highlight and low-light areas of the final composite image to transition smoothly. To satisfy both the small interval between exposure compensation values and the smaller number of frames captured in the handheld night scene mode, the value range of the exposure compensation value in the handheld night scene mode is usually set small, so that it is smaller than the range in the tripod night scene mode.
Optionally, the degree of jitter in the handheld night scene mode can be further subdivided. In the handheld night scene mode there is a correspondence between the degree of jitter and the exposure parameters. As one possible implementation, when the handheld night scene mode is in use, this correspondence is read. Specifically, the degree of jitter has a positive relationship with the sensitivity in the exposure parameters, that is, the greater the degree of jitter, the higher the sensitivity; the degree of jitter has an inverse relationship with the exposure duration in the exposure parameters, that is, the greater the degree of jitter, the shorter the exposure duration. This is because, although a lower sensitivity produces less noise in the image, it requires a longer exposure duration for the same exposure amount; to accommodate shake and avoid afterimages caused by an overly long exposure, the sensitivity can be raised appropriately to shorten the exposure duration. The degree of jitter also has an inverse relationship with the value range of the exposure compensation value, because when the degree of jitter is greater the number of captured frames can be appropriately reduced to improve imaging quality and avoid afterimages; moreover, so that both highlight and low-light areas have suitable exposure, the smaller the interval between the exposure compensation values of the multiple frames, the smoother the transition between highlight and low-light areas in the final composite image. Therefore, the greater the degree of jitter, the smaller the range of exposure compensation values tends to be set, in order to satisfy both the small interval between exposure compensation values and the smaller number of captured frames.
Then, according to the degree of jitter of the imaging device, the correspondence is queried to obtain the exposure parameters of each frame to be captured. By further subdividing the degree of jitter in the handheld night scene mode, different exposure parameters are used for different degrees of jitter in this mode, which improves the imaging quality of images captured in the handheld night scene mode.
Step 104: Perform exposure control using the exposure parameters.
Specifically, according to the determined night scene mode, exposure control is performed using the determined exposure parameters of each frame to be captured.
In the exposure control method of this embodiment, it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device, the exposure parameters of each frame to be captured are determined according to the night scene mode, and exposure control is performed with those parameters. The night scene mode and exposure parameters are thus adjusted dynamically for different shooting scenes, which improves the imaging quality of night scene shooting and solves the technical problem in the related art that a single night shooting mode cannot suit all shooting scenes and yields poor shooting quality in some of them.
Based on the foregoing embodiment, this embodiment provides another exposure control method, which further explains how, in step 102, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device. FIG. 2 is a schematic flowchart of another exposure control method provided by an embodiment of the present application.
As shown in FIG. 2, step 102 may include the following sub-steps.
Step 1021: Obtain the degree of jitter of the imaging device.
Specifically, a sensor is provided in the imaging device, the collected displacement information is obtained, and the degree of jitter of the imaging device is determined from the displacement information. As one possible implementation, the sensor may be a gyroscope that outputs displacement information on the x, y, and z axes. The absolute values of the displacement information of the three axes are summed, and the sum is denoted S; the value of S indicates the degree of jitter of the imaging device, that is, the value of the degree of jitter equals the value of S.
Step 1022: Determine whether the degree of jitter is greater than or equal to the first jitter threshold; if yes, go to step 1023, otherwise go to step 1024.
Step 1023: Determine that the single-frame night scene mode is adopted.
Specifically, if the degree of jitter is greater than or equal to the first jitter threshold, the current shake is large and the single-frame night scene mode is used directly to capture a single frame. This is because, when the degree of jitter is large, the frames of a multi-frame capture may differ greatly from one another and cannot be synthesized; therefore, when shake is large, only a single frame is captured in the single-frame night scene mode.
Step 1024: Determine whether the degree of jitter is less than or equal to the first jitter threshold and greater than the second jitter threshold; if yes, go to step 1025, otherwise go to step 1026.
The second jitter threshold is smaller than the first jitter threshold.
Step 1025: Determine that the handheld night scene mode is adopted.
Specifically, if the degree of jitter is not greater than or equal to the first jitter threshold, the current shake is not very large. It is further judged whether the degree of jitter is less than or equal to the first jitter threshold and greater than the second jitter threshold; if so, the handheld night scene mode is adopted.
The number of frames to be captured in the handheld night scene mode is greater than one, for example 7 frames, and the exposure parameters used for each frame are not exactly the same; the method for determining the exposure parameters is described in detail in the following embodiments.
Step 1026: Determine that the tripod night scene mode is adopted.
Specifically, if the current degree of jitter is not both less than or equal to the first jitter threshold and greater than the second jitter threshold, the tripod night scene mode is adopted.
The number of frames to be captured in the tripod night scene mode is greater than that in the handheld night scene mode. That is, in the tripod night scene mode the corresponding shake is very small, and the imaging device itself has shake-elimination strategies such as optical image stabilization (OIS); the shake in this mode is therefore very small, a longer capture time can be used to obtain a high-quality picture, and many frames, for example 17, can be captured continuously to improve imaging quality. The exposure parameters used for each frame are not exactly the same; the method for determining them is described in detail in the following embodiments.
In the exposure control method of this embodiment, the degree of jitter of the imaging device is obtained, and according to its magnitude the night scene mode to adopt is determined to be the tripod night scene mode, the handheld night scene mode, or the single-frame night scene mode. By subdividing the night scene mode in this way, different night scene shooting scenarios use different night scene modes, which improves the imaging quality of the captured images.
Based on the foregoing embodiments, an embodiment of the present application further proposes an exposure control method that explains how the exposure parameters of each frame to be captured are determined according to the selected night scene mode. FIG. 3 is a schematic flowchart of still another exposure control method provided by an embodiment of the present application; as shown in FIG. 3, step 103 may include the following sub-steps.
Step 1031: Determine, according to the night scene mode, the preset sensitivity of each frame of image to be captured.
In one scenario, if the night scene mode is the handheld night scene mode, the preset sensitivity (iso) of each frame to be captured in that mode is determined, for example 200 iso.
In another scenario, if the night scene mode is the tripod night scene mode, the preset sensitivity of each frame to be captured in that mode is determined, for example 100 iso.
It should be noted that, in this embodiment, the preset sensitivities of the frames to be captured in the handheld night scene mode may be exactly the same or may differ slightly, which is not limited in this embodiment; the same principle applies to the preset sensitivity in the tripod night scene mode, and details are not repeated here.
It should be understood that the preset sensitivity in the tripod night scene mode is smaller than that in the handheld night scene mode, because the degree of jitter in the tripod night scene mode is smaller than in the handheld night scene mode. Since the shake is smaller, a lower sensitivity can be used when capturing each frame, which allows a longer exposure duration, reduces noise in the frames to be captured, and improves imaging quality.
Step 1032: Determine, according to the night scene mode, the preset exposure compensation value of each frame of image to be captured.
Specifically, different preset exposure compensation values of the frames to be captured are determined for different night scene modes.
In one scenario, if the night scene mode is the tripod night scene mode, the preset exposure compensation value of each frame to be captured is determined; for example, in the tripod night scene mode, 17 frames are to be captured, the range of the exposure compensation value EV (Exposure Compensation Value) is set to [-6, +2] EV, and the interval between the exposure compensation values of adjacent frames is set to 0.5 EV.
In another scenario, if the night scene mode is the handheld night scene mode, the preset exposure compensation value of each frame to be captured is determined; for example, in the handheld night scene mode, 7 frames are to be captured and the range of the exposure compensation value is set to [-6, +1] EV; the 7 frames may, for example, be assigned the exposure compensation values [-6, -3, 0, +1, +1, +1, +1] EV.
It should be noted that, in practice, the exposure compensation values may also change with the ambient brightness and the captured images, and the exposure range may be narrowed; for example, in the handheld night scene mode the range of the exposure compensation value may be adjusted to [-5, +1] EV.
Step 1033: Determine the reference exposure amount according to the brightness information of the preview image.
Specifically, the reference exposure amount is determined according to the brightness information of the preview image. As one possible implementation, the photometry module in the electronic device measures the brightness information of the preview image corresponding to the current shooting scene, and the measured brightness information is converted at a set, lower sensitivity to determine the reference exposure amount, denoted EV0. For example, if the sensitivity measured by the photometry module is 500 iso, the exposure duration is 50 milliseconds (ms), and the target sensitivity is 100 iso, then after conversion the sensitivity is 100 iso and the exposure duration is 250 ms; the pair of 100 iso and 250 ms is taken as the reference exposure amount EV0.
It should be noted that EV0 is not a fixed value but changes with the brightness information of the preview image: when the ambient brightness changes, the brightness information of the preview image changes, and the reference exposure EV0 changes accordingly.
It should be noted that steps 1031, 1032, and 1033 are not required to be performed in any particular order.
Step 1034: Determine the target exposure amount of each frame to be captured according to the reference exposure amount and that frame's preset exposure compensation value.
For example, in the handheld night scene mode, the preset exposure compensation values corresponding to the 7 frames to be captured are -6 EV, -3 EV, 0 EV, +1 EV, +1 EV, +1 EV, and +1 EV, where "+" indicates increasing the exposure relative to the reference exposure set by metering, "-" indicates decreasing it, and the corresponding number is the number of compensation steps. The target exposure amount of each frame is determined from that frame's preset exposure compensation value and the reference exposure amount. For example, if a frame's exposure compensation value is -6 EV (the number -6 being the number of compensation steps) and the reference exposure is EV0, its target exposure amount is EV0 × 2^(-6), that is EV0/64, which darkens that frame's capture; if a frame's exposure compensation value is +1 EV and the reference exposure is EV0, its target exposure amount is EV0 × 2, that is twice EV0, which brightens that frame's capture. The target exposure amounts of the other frames to be captured are determined in the same way and are not listed one by one here.
It should be noted that, in the tripod night scene mode, the target exposure amount of each frame to be captured is determined in the same way, and details are not repeated here.
Step 1035: Determine the exposure duration of each frame to be captured according to its target exposure amount and its preset sensitivity.
In the embodiments of the present application, the aperture value is fixed when capturing each frame in night scene mode. For each frame to be captured, the target exposure amount is jointly determined by the sensitivity and the exposure duration, so once the sensitivity is determined the corresponding exposure duration can be determined.
For example, if the sensitivity (ISO) and exposure duration corresponding to the reference exposure are 100 iso and 250 ms, and a frame to be captured has a preset sensitivity of 100 iso and an exposure compensation value of -3 EV, then its target exposure duration is 250 ms × 2^(-3), that is about 32 ms, so the exposure duration is reduced; similarly, when the exposure compensation value is +1 EV, the resulting exposure duration is 500 ms, so the exposure duration is increased. The exposure duration of each frame can be determined in the same way. With the wide dynamic range set in this way, the frames to be captured are acquired with different exposure durations, so that the details of each part of the image can be clearly imaged under the control of a different exposure duration, improving the imaging effect.
In the embodiments of the present application, the minimum exposure duration supported by the shutter is 10 milliseconds (ms). In one possible scenario, when strong light shines on the camera in the shooting scene, the metering device may mistakenly judge the current scene to be bright, so that the determined reference exposure is small, that is, the exposure duration corresponding to the reference exposure is short. When the exposure durations of the frames are then calculated from the reference exposure EV0, the duration calculated for the frame with exposure compensation -6 EV may fall below the preset minimum of 10 ms, for example 8 ms; in that case the exposure duration is raised from 8 ms to 10 ms and the corresponding magnification compensation factor is determined, to ensure that the darkest frame still has a certain shooting brightness. At the same time, every frame to be captured is raised by this same magnification compensation factor, so that the brightness of the captured frames increases linearly and, when the captured images are later synthesized, the transition of the halo is natural, improving the effect of the synthesized image.
In another possible scenario, when the environment is very dark, the maximum single-frame exposure duration calculated from the reference exposure amount may exceed the maximum value set for the exposure duration, for example 5 seconds. For instance, if the reference exposure has a sensitivity of 100 iso and an exposure duration of 2 seconds, and the corresponding exposure compensation value is +2 EV with a preset sensitivity of 100 iso for the frame to be captured, the calculated exposure duration is 8 seconds, which exceeds the preset maximum of 5 seconds. The exposure duration of that frame then needs to be reduced to the preset maximum of 5 seconds; the reduction ratio is determined and the sensitivity is adjusted by this ratio to prevent the exposure duration from being too long. After the longest exposure duration has been compressed, if the exposure amount still needs to be adjusted, the shooting brightness is increased only by appropriately raising the sensitivity, and the sensitivity must not exceed its preset upper limit, for example 550 iso, because setting a higher sensitivity increases noise in the picture and lowers imaging quality.
It should be noted that, when the night scene mode is the single-frame night scene mode, the large shake makes it difficult to capture high-quality images; therefore, only one frame is captured, with an exposure duration shorter than in the handheld and tripod night scene modes, and exposure control is performed on the image to be captured accordingly, which is not described in detail in this embodiment.
In the exposure control method of this embodiment of the present application, the preset sensitivity and preset exposure compensation value of each frame to be captured are determined according to the night scene mode, the reference exposure amount is determined from the brightness information of the preview image, the target exposure amount of each frame is determined from the reference exposure amount, and the exposure duration is determined from the target exposure amount and the preset sensitivity of each frame, thereby determining the exposure parameters of each frame to be captured. By setting different exposure parameters in different night scene modes, the handheld night scene mode with larger shake captures multiple frames with reduced exposure durations, while the tripod night scene mode with smaller shake captures more frames with increased exposure durations. The exposure parameters for night scene shooting are thus adjusted dynamically in night scene mode, improving the imaging quality of night scene shooting.
Based on the foregoing embodiments, an embodiment of the present application further proposes an exposure control method. FIG. 4 is a schematic flowchart of yet another exposure control method provided by an embodiment of the present application; as shown in FIG. 4, after step 104, the method may further include the following step.
Step 401: Obtain the frames captured under exposure control and synthesize them to obtain the final image.
Specifically, according to the determined exposure parameters of each frame to be captured, the frames captured under the control of the corresponding exposure parameters are obtained; the captured frames are aligned to eliminate the influence of shake, moving objects in the images are detected to eliminate ghosting, and the corresponding pixels of the frames are then weighted and synthesized to obtain a single target image. Because the captured frames use different exposure parameters and hence different exposure durations, combining them allows the dark parts of the final output image to be compensated by the corresponding pixel information of the longer-exposure frames and the bright parts to be suppressed by the corresponding pixel information of the shorter-exposure frames; the final synthesized image therefore has neither over-exposed nor under-exposed regions, its brightness transitions are even, and the imaging effect is good. Moreover, during synthesis, the amplitude and position of the noise generated by the current are random, so when multiple images are superimposed the noise cancels out, improving imaging quality.
In the exposure control method of this embodiment of the present application, it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device, the exposure parameters of each frame to be captured are determined according to the night scene mode, exposure control is performed with those parameters, and the captured multi-frame images are synthesized to obtain the final image. When shooting night scenes, different night scene modes and exposure parameters are used for different shooting scenes to capture multiple frames, and combining them preserves highlight details and the corresponding transitions, improving the imaging effect.
To implement the foregoing embodiments, the present application further proposes an exposure control apparatus.
FIG. 5 is a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application.
As shown in FIG. 5, the apparatus includes a scene determination module 51, an identification module 52, a parameter determination module 53, and a control module 54.
The scene determination module 51 is configured to determine that the current shooting scene belongs to a night scene.
The identification module 52 is configured to identify, according to the degree of jitter of the imaging device, the night scene mode applicable to the current shooting scene.
The parameter determination module 53 is configured to determine, according to the night scene mode, the exposure parameters of each frame of image to be captured.
The control module 54 is configured to perform exposure control using the exposure parameters.
Further, in a possible implementation of the embodiments of the present application, the apparatus further includes a synthesis module.
The synthesis module is configured to obtain the frames captured under exposure control and synthesize them to obtain the final image.
In a possible implementation of the embodiments of the present application, the identification module 52 may further include an obtaining unit and a determining unit.
The obtaining unit is configured to obtain the degree of jitter of the imaging device.
The determining unit is configured to determine that the single-frame night scene mode is adopted if the degree of jitter is greater than or equal to the first jitter threshold; that the handheld night scene mode is adopted if the degree of jitter is less than the first jitter threshold and greater than the second jitter threshold; and that the tripod night scene mode is adopted if the degree of jitter is less than or equal to the second jitter threshold.
Here the first jitter threshold is greater than the second jitter threshold, the number of frames to be captured in the handheld night scene mode is greater than one, and the number of frames to be captured in the tripod night scene mode is greater than that in the handheld night scene mode.
As one possible implementation, the obtaining unit is specifically configured to obtain the collected displacement information from a sensor provided in the imaging device and to determine the degree of jitter of the imaging device from the displacement information.
As one possible implementation, the parameter determination module 53 is specifically configured to:
determine, according to the night scene mode, the preset sensitivity of each frame of image to be captured,
where the preset sensitivity in the handheld night scene mode is greater than that in the tripod night scene mode.
As one possible implementation, the parameter determination module 53 is further configured to:
determine, according to the night scene mode, the preset exposure compensation value of each frame of image to be captured;
determine the reference exposure amount according to the brightness information of the preview image;
determine the target exposure amount of each frame to be captured according to the reference exposure amount and that frame's preset exposure compensation value; and
determine the exposure duration of each frame to be captured according to its target exposure amount and its preset sensitivity.
As one possible implementation, the value range of the exposure compensation value in the handheld night scene mode is smaller than that in the tripod night scene mode.
As one possible implementation, the parameter determination module 53 may also be specifically configured to:
in the handheld night scene mode, read the correspondence between the degree of jitter and the exposure parameters; and
query the correspondence according to the degree of jitter of the imaging device to obtain the exposure parameters of each frame of image to be captured, where the exposure parameters include the exposure duration, sensitivity, and exposure compensation value.
As one possible implementation, the degree of jitter has a positive relationship with the sensitivity in the exposure parameters, an inverse relationship with the exposure duration in the exposure parameters, and an inverse relationship with the value range of the exposure compensation value in the exposure parameters.
As one possible implementation, the scene determination module 51 is specifically configured to:
extract image features from the preview image; input the extracted image features into the recognition model; and determine, according to the scene type output by the recognition model, that the current shooting scene belongs to a night scene, where the recognition model has learned the correspondence between image features and scene types.
As another possible implementation, the scene determination module 51 is further configured to:
detect a user operation for scene switching; when a user operation switching to a night scene is detected, detect the ambient brightness to obtain brightness information; and determine, according to the brightness information, that the current shooting scene belongs to a night scene.
It should be noted that the foregoing explanation of the method embodiments also applies to the apparatus of this embodiment and is not repeated here.
In the exposure control apparatus of this embodiment, it is determined that the current shooting scene belongs to a night scene, the night scene mode applicable to the current shooting scene is identified according to the degree of jitter of the imaging device, the exposure parameters of each frame to be captured are determined according to the night scene mode, and exposure control is performed with those parameters. When shooting night scenes, different night scene modes and exposure parameters are used for different shooting scenes, which improves the shooting effect and solves the technical problem in the related art that a single night shooting mode cannot suit all shooting scenes and yields poor shooting quality in some of them.
To implement the foregoing embodiments, an embodiment of the present application further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the exposure control method according to the foregoing method embodiments is implemented.
FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment. The electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81. The memory 50 of the electronic device 200 stores an operating system and computer-readable instructions. The computer-readable instructions can be executed by the processor 60 to implement the control method of the embodiments of the present application. The processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200. The internal memory 82 of the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50. The display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display, and the input device 84 may be a touch layer covering the display screen 83, a button, trackball, or touchpad provided on the housing of the electronic device 200, or an external keyboard, trackpad, or mouse. The electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses). Those skilled in the art can understand that the structure shown in FIG. 6 is only a schematic diagram of the part of the structure related to the solution of the present application and does not constitute a limitation on the electronic device 200 to which the solution of the present application is applied; a specific electronic device 200 may include more or fewer components than shown in the figure, combine certain components, or have a different component arrangement.
Referring to FIG. 7, the electronic device 200 of the embodiments of the present application includes an image processing circuit 90. The image processing circuit 90 may be implemented using hardware and/or software components and includes various processing units that define an ISP (Image Signal Processing) pipeline. FIG. 7 is a schematic diagram of the image processing circuit 90 in one embodiment; as shown in FIG. 7, for ease of description, only the aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in FIG. 7, the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92. The image data captured by the camera 93 is first processed by the ISP processor 91, which analyzes the image data to capture image statistics that can be used to determine one or more control parameters of the camera 93. The camera 93 may include one or more lenses 932 and an image sensor 934. The image sensor 934 may include a color filter array (such as a Bayer filter); it can obtain the light intensity and wavelength information captured by each imaging pixel and provide a set of raw image data that can be processed by the ISP processor 91. The sensor 94 (such as a gyroscope) may provide acquired image processing parameters (such as anti-shake parameters) to the ISP processor 91 based on the sensor 94 interface type. The sensor 94 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of these interfaces.
此外,图像传感器934也可将原始图像数据发送给传感器94,传感器94可基于传感器94接口类型把原始图像数据提供给ISP处理器91,或者传感器94将原始图像数据存储到图像存储器95中。
The ISP processor 91 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12 or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the raw image data and collect statistics about the image data. The image processing operations may be performed with the same or different bit-depth precision.
The ISP processor 91 may also receive image data from the image memory 95. For example, the sensor 94 interface sends the raw image data to the image memory 95, and the raw image data in the image memory 95 is then provided to the ISP processor 91 for processing. The image memory 95 may be the memory 50, a part of the memory 50, a storage device, or an independent dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving the raw image data from the image sensor 934 interface, from the sensor 94 interface, or from the image memory 95, the ISP processor 91 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 95 for additional processing before being displayed. The ISP processor 91 receives the processed data from the image memory 95 and processes the image data in the raw domain and in the RGB and YCbCr color spaces. The image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include the display screen 83) for viewing by the user and/or for further processing by a graphics engine or GPU (Graphics Processing Unit). In addition, the output of the ISP processor 91 may also be sent to the image memory 95, and the display 97 may read image data from the image memory 95. In one embodiment, the image memory 95 may be configured to implement one or more frame buffers. Moreover, the output of the ISP processor 91 may be sent to an encoder/decoder 96 to encode/decode the image data. The encoded image data may be saved and decompressed before being displayed on the display 97. The encoder/decoder 96 may be implemented by a CPU, a GPU or a coprocessor.
The statistics determined by the ISP processor 91 may be sent to the control logic unit 92. For example, the statistics may include image sensor 934 statistics such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation and lens 932 shading correction. The control logic unit 92 may include a processing element and/or a microcontroller that executes one or more routines (such as firmware); the one or more routines may determine, according to the received statistics, control parameters of the camera 93 and control parameters of the ISP processor 91. For example, the control parameters of the camera 93 may include sensor 94 control parameters (such as gain, integration time for exposure control, anti-shake parameters, and the like), camera flash control parameters, lens 932 control parameters (such as the focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for auto white balance and color adjustment (for example, during RGB processing), as well as lens 932 shading correction parameters.
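For illustration, the statistics-to-parameters feedback can be sketched as a simple auto-exposure update; the target luma, the limits and the split between integration time and gain are illustrative assumptions rather than the firmware routines referred to above.

```python
# Minimal sketch: nudge sensor integration time and gain toward a target mean luma
# computed from the ISP's AE statistics (all constants are illustrative).
def update_ae(stats_mean_luma: float, integration_time: float, gain: float,
              target_luma: float = 118.0, max_time: float = 1 / 10, max_gain: float = 16.0):
    error = target_luma / max(stats_mean_luma, 1.0)   # >1 means the image is too dark
    # Prefer adjusting integration time; fall back to gain once the time limit is hit.
    new_time = min(integration_time * error, max_time)
    residual = error * integration_time / new_time
    new_gain = min(max(gain * residual, 1.0), max_gain)
    return new_time, new_gain
```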
For example, the following steps implement the exposure control method using the processor 60 in FIG. 6 or the image processing circuit 90 in FIG. 7 (specifically, the ISP processor 91); an illustrative end-to-end sketch follows the list:
01: determining that the current shooting scene belongs to a night scene;
02: identifying, according to the degree of jitter of the imaging device, a night scene mode applicable to the current shooting scene;
03: determining, according to the night scene mode, the exposure parameters of each frame of image to be captured;
04: performing exposure control using the exposure parameters.
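For illustration only, the four steps can be strung together by reusing the illustrative helpers sketched earlier in this description (is_night_scene, jitter_degree, select_night_mode, plan_exposures, fuse_night_frames); capture_frame is an assumed placeholder for whatever camera/ISP capture interface is actually used.

```python
# Minimal end-to-end sketch of steps 01-04, reusing the earlier illustrative helpers.
def night_capture(preview, model, displacement_samples, capture_frame):
    if not is_night_scene(preview, model):                 # 01: scene determination
        return capture_frame(iso=100, exposure_time_s=1 / 60)
    jitter = jitter_degree(displacement_samples)           # 02: mode from jitter degree
    mode = select_night_mode(jitter)
    if mode.name == "single_frame":                        # severe shake: one frame only
        return capture_frame(iso=1600, exposure_time_s=1 / 30)
    plans = plan_exposures(mode.name, preview.mean())      # 03: per-frame exposure parameters
    frames = [capture_frame(iso=p["iso"], exposure_time_s=p["exposure_time_s"])
              for p in plans]                              # 04: exposure control per frame
    return fuse_night_frames(frames)                       # synthesize the captured frames
```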
To implement the above embodiments, the embodiments of the present application further provide a computer-readable storage medium on which a computer program is stored; when the instructions in the storage medium are executed by a processor, the exposure control method described in the foregoing method embodiments is implemented.
In the description of this specification, descriptions referring to the terms "one embodiment", "some embodiments", "example", "specific example", "some examples" and the like mean that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, the schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the described specific features, structures, materials or characteristics may be combined in a suitable manner in any one or more embodiments or examples. In addition, where no contradiction arises, those skilled in the art may combine the different embodiments or examples described in this specification and the features of the different embodiments or examples.
In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality of" means at least two, such as two or three, unless otherwise specifically defined.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment or portion of code that includes one or more executable instructions for implementing the steps of a custom logic function or process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, which should be understood by those skilled in the art to which the embodiments of the present application belong.
The logic and/or steps represented in a flowchart or otherwise described herein may, for example, be considered a sequenced list of executable instructions for implementing logic functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate or transmit a program for use by, or in connection with, an instruction execution system, apparatus or device. More specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection portion (electronic device) having one or more wirings, a portable computer disk cartridge (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that parts of the present application may be implemented by hardware, software, firmware or a combination thereof. In the above embodiments, multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented by hardware, as in another embodiment, any one of the following technologies known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
A person of ordinary skill in the art can understand that all or some of the steps carried in the methods of the above embodiments may be completed by instructing relevant hardware through a program, and the program may be stored in a computer-readable storage medium; when executed, the program includes one of, or a combination of, the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated in one processing module, or each unit may physically exist separately, or two or more units may be integrated in one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. Although the embodiments of the present application have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limiting the present application, and those of ordinary skill in the art may make changes, modifications, substitutions and variations to the above embodiments within the scope of the present application.
Claims (20)
- An exposure control method, characterized in that the method comprises the following steps: determining that a current shooting scene belongs to a night scene; identifying, according to a degree of jitter of an imaging device, a night scene mode applicable to the current shooting scene; determining, according to the night scene mode, an exposure parameter of each frame of image to be captured; and performing exposure control by using the exposure parameter.
- The exposure control method according to claim 1, characterized in that the identifying, according to a degree of jitter of an imaging device, a night scene mode applicable to the current shooting scene comprises: obtaining the degree of jitter of the imaging device; if the degree of jitter is greater than or equal to a first jitter threshold, determining that a single-frame night scene mode is used; if the degree of jitter is less than the first jitter threshold and greater than a second jitter threshold, determining that a handheld night scene mode is used; and if the degree of jitter is less than or equal to the second jitter threshold, determining that a tripod night scene mode is used; wherein the first jitter threshold is greater than the second jitter threshold, the number of frames of images to be captured in the handheld night scene mode is greater than one, and the number of frames of images to be captured in the tripod night scene mode is greater than that in the handheld night scene mode.
- The exposure control method according to claim 2, characterized in that the determining, according to the night scene mode, an exposure parameter of each frame of image to be captured comprises: determining, according to the night scene mode, a preset sensitivity of each frame of image to be captured; wherein the preset sensitivity in the handheld night scene mode is greater than the preset sensitivity in the tripod night scene mode.
- The exposure control method according to claim 3, characterized in that the determining, according to the night scene mode, an exposure parameter of each frame of image to be captured further comprises: determining, according to the night scene mode, a preset exposure compensation value of each frame of image to be captured; determining a reference exposure amount according to brightness information of the preview image; determining a target exposure amount of each frame of image to be captured according to the reference exposure amount and the preset exposure compensation value of each frame of image to be captured; and determining an exposure duration of each frame of image to be captured according to the target exposure amount and the preset sensitivity of each frame of image to be captured.
- The exposure control method according to claim 4, characterized in that the value range of the exposure compensation value in the handheld night scene mode is smaller than the value range of the exposure compensation value in the tripod night scene mode.
- The exposure control method according to any one of claims 2 to 5, characterized in that the obtaining the degree of jitter of the imaging device comprises: obtaining collected displacement information from a sensor provided in the imaging device; and determining the degree of jitter of the imaging device according to the displacement information.
- The exposure control method according to any one of claims 2 to 5, characterized in that the determining, according to the night scene mode, an exposure parameter of each frame of image to be captured comprises: in the handheld night scene mode, reading a correspondence between the degree of jitter and the exposure parameter; and querying the correspondence according to the degree of jitter of the imaging device to obtain the exposure parameter of each frame of image to be captured, the exposure parameter comprising an exposure duration, a sensitivity and an exposure compensation value.
- The exposure control method according to claim 7, characterized in that the degree of jitter has a positive relationship with the sensitivity in the exposure parameter; the degree of jitter has an inverse relationship with the exposure duration in the exposure parameter; and the degree of jitter has an inverse relationship with the value range of the exposure compensation value in the exposure parameter.
- The exposure control method according to any one of claims 1 to 8, characterized in that, after the performing exposure control by using the exposure parameter, the method further comprises: obtaining each frame of image captured under the exposure control; and synthesizing the frames of images to obtain an imaged image.
- The exposure control method according to any one of claims 1 to 8, characterized in that the determining that a current shooting scene belongs to a night scene comprises: performing image feature extraction on the preview image; and inputting the extracted image features into a recognition model and determining, according to the scene type output by the recognition model, that the current shooting scene belongs to a night scene; wherein the recognition model has learned the correspondence between image features and scene types.
- The exposure control method according to any one of claims 1 to 8, characterized in that the determining that a current shooting scene belongs to a night scene comprises: detecting a user operation for scene switching; when a user operation of switching to a night scene is detected, detecting ambient brightness to obtain brightness information; and determining, according to the brightness information, that the current shooting scene belongs to a night scene.
- An exposure control device, characterized in that the device comprises: a scene determination module configured to determine that a current shooting scene belongs to a night scene; a recognition module configured to identify, according to a degree of jitter of an imaging device, a night scene mode applicable to the current shooting scene; a parameter determination module configured to determine, according to the night scene mode, an exposure parameter of each frame of image to be captured; and a control module configured to perform exposure control by using the exposure parameter.
- The exposure control device according to claim 12, characterized in that the recognition module comprises: an obtaining unit configured to obtain the degree of jitter of the imaging device; and a determining unit configured to: if the degree of jitter is greater than or equal to a first jitter threshold, determine that a single-frame night scene mode is used; if the degree of jitter is less than the first jitter threshold and greater than a second jitter threshold, determine that a handheld night scene mode is used; and if the degree of jitter is less than or equal to the second jitter threshold, determine that a tripod night scene mode is used; wherein the first jitter threshold is greater than the second jitter threshold, the number of frames of images to be captured in the handheld night scene mode is greater than one, and the number of frames of images to be captured in the tripod night scene mode is greater than that in the handheld night scene mode.
- The exposure control device according to claim 13, characterized in that the parameter determination module is specifically configured to: determine, according to the night scene mode, a preset sensitivity of each frame of image to be captured; wherein the preset sensitivity in the handheld night scene mode is greater than the preset sensitivity in the tripod night scene mode.
- The exposure control device according to claim 14, characterized in that the parameter determination module is further configured to: determine, according to the night scene mode, a preset exposure compensation value of each frame of image to be captured; determine a reference exposure amount according to brightness information of the preview image; determine a target exposure amount of each frame of image to be captured according to the reference exposure amount and the preset exposure compensation value of each frame of image to be captured; and determine an exposure duration of each frame of image to be captured according to the target exposure amount and the preset sensitivity of each frame of image to be captured.
- The exposure control device according to claim 15, characterized in that the value range of the exposure compensation value in the handheld night scene mode is smaller than the value range of the exposure compensation value in the tripod night scene mode.
- The exposure control device according to any one of claims 13 to 16, characterized in that the obtaining unit is specifically configured to: obtain collected displacement information from a sensor provided in the imaging device; and determine the degree of jitter of the imaging device according to the displacement information.
- The exposure control device according to any one of claims 13 to 16, characterized in that the parameter determination module is further configured to: in the handheld night scene mode, read a correspondence between the degree of jitter and the exposure parameter; and query the correspondence according to the degree of jitter of the imaging device to obtain the exposure parameter of each frame of image to be captured, the exposure parameter comprising an exposure duration, a sensitivity and an exposure compensation value.
- An electronic device, characterized by comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein when the processor executes the program, the exposure control method according to any one of claims 1 to 11 is implemented.
- A computer-readable storage medium having a computer program stored thereon, characterized in that, when the program is executed by a processor, the exposure control method according to any one of claims 1 to 11 is implemented.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201810962773.6A CN109040609B (zh) | 2018-08-22 | 2018-08-22 | Exposure control method and apparatus, electronic device and computer-readable storage medium |
| CN201810962773.6 | 2018-08-22 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020038072A1 true WO2020038072A1 (zh) | 2020-02-27 |
Family
ID=64627979
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2019/090476 Ceased WO2020038072A1 (zh) | Exposure control method and apparatus, and electronic device | 2018-08-22 | 2019-06-10 |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN109040609B (zh) |
| WO (1) | WO2020038072A1 (zh) |
Families Citing this family (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109040609B (zh) * | 2018-08-22 | 2021-04-09 | Oppo广东移动通信有限公司 | Exposure control method and apparatus, electronic device and computer-readable storage medium |
| CN108833804A (zh) * | 2018-09-20 | 2018-11-16 | Oppo广东移动通信有限公司 | Imaging method and apparatus, and electronic device |
| CN109618102B (zh) * | 2019-01-28 | 2021-08-31 | Oppo广东移动通信有限公司 | Focusing processing method and apparatus, electronic device and storage medium |
| CN110060213B (zh) * | 2019-04-09 | 2021-06-15 | Oppo广东移动通信有限公司 | Image processing method and apparatus, storage medium and electronic device |
| CN110072051B (zh) * | 2019-04-09 | 2021-09-03 | Oppo广东移动通信有限公司 | Image processing method and apparatus based on multiple frames of images |
| CN110113539A (zh) * | 2019-06-13 | 2019-08-09 | Oppo广东移动通信有限公司 | Exposure control method and apparatus, electronic device and storage medium |
| CN110536057B (zh) * | 2019-08-30 | 2021-06-08 | Oppo广东移动通信有限公司 | Image processing method and apparatus, electronic device, and computer-readable storage medium |
| CN112532857B (zh) * | 2019-09-18 | 2022-04-12 | 华为技术有限公司 | Time-lapse photography shooting method and device |
| CN110708475B (zh) * | 2019-11-27 | 2021-08-24 | 维沃移动通信有限公司 | Exposure parameter determination method, electronic device and storage medium |
| CN111654594B (zh) * | 2020-06-16 | 2022-05-17 | Oppo广东移动通信有限公司 | Image shooting method, image shooting apparatus, mobile terminal and storage medium |
| CN111988523B (zh) * | 2020-08-14 | 2022-05-13 | RealMe重庆移动通信有限公司 | Super night scene image generation method and apparatus, terminal and readable storage medium |
| CN112911109B (zh) * | 2021-01-20 | 2023-02-24 | 维沃移动通信有限公司 | Electronic device and shooting method |
| CN113660425B (zh) * | 2021-08-19 | 2023-08-22 | 维沃移动通信(杭州)有限公司 | Image processing method and apparatus, electronic device and readable storage medium |
| CN116723408B (zh) | 2022-02-28 | 2024-05-14 | 荣耀终端有限公司 | Exposure control method and electronic device |
| CN115016354A (zh) * | 2022-06-20 | 2022-09-06 | 上海千眼科技发展有限公司 | Autonomously adjustable night-vision device for environmental reconnaissance |
| CN115719316A (zh) * | 2022-11-24 | 2023-02-28 | 哲库科技(上海)有限公司 | Image processing method and apparatus, electronic device and computer-readable storage medium |
| CN119254904A (zh) * | 2024-02-24 | 2025-01-03 | 荣耀终端有限公司 | Image processing method, chip system and electronic device |
| CN120769167A (zh) * | 2024-03-30 | 2025-10-10 | 荣耀终端股份有限公司 | Anti-shake shooting method, electronic device and storage medium |
| CN118540592B (zh) * | 2024-07-26 | 2025-01-14 | 比亚迪股份有限公司 | Automatic exposure control method and apparatus, electronic device, vehicle and storage medium |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101241294B (zh) * | 2007-02-06 | 2010-09-01 | 亚洲光学股份有限公司 | Image capture control method and apparatus thereof |
| CN101262567B (zh) * | 2008-04-07 | 2010-12-08 | 北京中星微电子有限公司 | Automatic exposure method and apparatus |
| CN106375676A (zh) * | 2016-09-20 | 2017-02-01 | 广东欧珀移动通信有限公司 | Photographing control method and apparatus for terminal device, and terminal device |
| CN108322669B (zh) * | 2018-03-06 | 2021-03-23 | Oppo广东移动通信有限公司 | Image acquisition method and apparatus, imaging apparatus, and readable storage medium |
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20090051783A1 (en) * | 2007-08-23 | 2009-02-26 | Samsung Electronics Co., Ltd. | Apparatus and method of capturing images having optimized quality under night scene conditions |
| CN101795358A (zh) * | 2009-01-30 | 2010-08-04 | 佳能株式会社 | Image capture apparatus and control method therefor |
| CN103002224A (zh) * | 2011-09-09 | 2013-03-27 | 佳能株式会社 | Image pickup apparatus and control method therefor |
| CN103227896A (zh) * | 2012-01-26 | 2013-07-31 | 佳能株式会社 | Electronic apparatus and electronic apparatus control method |
| CN109040609A (zh) * | 2018-08-22 | 2018-12-18 | Oppo广东移动通信有限公司 | Exposure control method and apparatus, and electronic device |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112818732A (zh) * | 2020-08-11 | 2021-05-18 | 腾讯科技(深圳)有限公司 | Image processing method and apparatus, computer device and storage medium |
| CN112818732B (zh) * | 2020-08-11 | 2023-12-12 | 腾讯科技(深圳)有限公司 | Image processing method and apparatus, computer device and storage medium |
| CN114697595A (zh) * | 2020-12-28 | 2022-07-01 | 北京小米移动软件有限公司 | Video recording method and apparatus, and storage medium |
| CN112911165A (zh) * | 2021-03-02 | 2021-06-04 | 杭州海康慧影科技有限公司 | Endoscope exposure method and apparatus, and computer-readable storage medium |
| CN112911165B (zh) * | 2021-03-02 | 2023-06-16 | 杭州海康慧影科技有限公司 | Endoscope exposure method and apparatus, and computer-readable storage medium |
| CN114222075A (zh) * | 2022-01-28 | 2022-03-22 | 广州华多网络科技有限公司 | Mobile-terminal image processing method and apparatus, device, medium and product |
| CN118555493A (zh) * | 2024-07-29 | 2024-08-27 | 福瑞泰克智能系统有限公司 | Exposure control method and apparatus for backlit scenes, device and storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| CN109040609A (zh) | 2018-12-18 |
| CN109040609B (zh) | 2021-04-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN109068067B (zh) | | Exposure control method and apparatus, and electronic device |
| WO2020038072A1 (zh) | | Exposure control method and apparatus, and electronic device |
| US11582400B2 (en) | | Method of image processing based on plurality of frames of images, electronic device, and storage medium |
| AU2019326496B2 (en) | | Method for capturing images at night, apparatus, electronic device, and storage medium |
| CN108683862B (zh) | | Imaging control method and apparatus, electronic device, and computer-readable storage medium |
| CN109005364B (zh) | | Imaging control method and apparatus, electronic device, and computer-readable storage medium |
| JP6911202B2 (ja) | | Imaging control method and imaging apparatus |
| CN110072052B (zh) | | Image processing method and apparatus based on multiple frames of images, and electronic device |
| CN109788207B (zh) | | Image synthesis method and apparatus, electronic device, and readable storage medium |
| CN109194882B (zh) | | Image processing method and apparatus, electronic device, and storage medium |
| WO2020057199A1 (zh) | | Imaging method and apparatus, and electronic device |
| WO2020034737A1 (zh) | | Imaging control method and apparatus, electronic device, and computer-readable storage medium |
| CN108833802B (zh) | | Exposure control method and apparatus, and electronic device |
| CN110191291B (zh) | | Image processing method and apparatus based on multiple frames of images |
| CN108712608B (zh) | | Shooting method and apparatus for terminal device |
| WO2020029732A1 (zh) | | Panoramic shooting method and apparatus, and imaging device |
| CN107509044B (zh) | | Image synthesis method and apparatus, computer-readable storage medium, and computer device |
| CN108683863B (zh) | | Imaging control method and apparatus, electronic device, and readable storage medium |
| WO2020207261A1 (zh) | | Image processing method and apparatus based on multiple frames of images, and electronic device |
| WO2020029679A1 (zh) | | Control method and apparatus, imaging device, electronic device, and readable storage medium |
| CN110264420A (zh) | | Image processing method and apparatus based on multiple frames of images |
| CN110166709A (zh) | | Night scene image processing method and apparatus, electronic device, and storage medium |
| CN108900785A (zh) | | Exposure control method and apparatus, and electronic device |
| EP3836532A1 (en) | | Control method and apparatus, electronic device, and computer readable storage medium |
| CN108833803A (zh) | | Imaging method and apparatus, and electronic device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19852869; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19852869; Country of ref document: EP; Kind code of ref document: A1 |