CN109246362A - A kind of image processing method and mobile terminal - Google Patents
A kind of image processing method and mobile terminal
- Publication number: CN109246362A (application CN201710297673.1A)
- Authority: CN (China)
- Legal status: Granted (the status listed by Google is an assumption, not a legal conclusion)
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
Abstract
An embodiment of the invention discloses an image processing method and a mobile terminal. The method comprises: obtaining, through a first camera and a second camera, N depth-of-field layers corresponding to an image to be processed, where N is a natural number greater than or equal to 2; determining a target exposure amount corresponding to each depth-of-field layer, where the target exposure amounts corresponding to at least two depth-of-field layers are different; and processing each depth-of-field layer according to its corresponding target exposure amount.
Description
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and a mobile terminal.
Background
With the continuous development of mobile terminals such as smart phones and tablet computers, mobile terminals are no longer limited to a single communication function but have become devices integrating leisure, communication and entertainment. For example, a typical mobile terminal is provided with a camera to meet the user's photographing and video-recording requirements.
The development trend of cameras on mobile terminals has gradually shifted from a single camera to dual cameras, and mobile terminal manufacturers now treat dual cameras as a standard configuration. Dual-camera hardware can record the depth-of-field information of a photo, sense the distance of the photographed object, separate the object from the background, and even blur the background. In addition, pictures with a 3D effect can be taken, and functions such as 360-degree panoramic shooting can be provided.
Under normal daylight, images captured by a camera are relatively uniformly exposed: the brightness of distant objects differs little from that of nearby objects, so the image shows low contrast and lacks clarity. Under dim light at night, human eyes can still distinguish objects clearly, but a camera cannot: the captured image is unevenly exposed, distant objects are generally dark, and nearby objects image somewhat better than distant ones.
In conventional image processing, when an image to be processed undergoes exposure processing, the same processing is applied to all depth-of-field layers of the image: for example, the exposure amount of every depth-of-field layer is adjusted from its current value to a single target exposure amount.
In the process of implementing the invention, the inventor found at least the following problem in the prior art:
the existing image processing method applies the same exposure processing to every depth-of-field layer of the image to be processed, so the image processing effect is poor.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present invention are expected to provide an image processing method and a mobile terminal in which different exposure processing is applied to each depth-of-field layer of the image to be processed, so that a good image processing effect is achieved.
In order to achieve the above purpose, the technical solution of the embodiment of the present invention is realized as follows:
the embodiment of the invention provides an image processing method, which comprises the following steps:
acquiring N depth-of-field layers corresponding to an image to be processed through a first camera and a second camera; wherein N is a natural number greater than or equal to 2;
determining the target exposure corresponding to each depth of field layer; wherein the target exposure corresponding to at least two depth layers is different;
and processing each depth-of-field layer according to its corresponding target exposure amount.
In the above embodiment, the determining the target exposure amount corresponding to each depth of field layer includes:
determining the reference exposure corresponding to each depth layer;
setting exposure coefficients corresponding to all the depth-of-field layers;
and obtaining the target exposure amount corresponding to each depth of field layer according to the reference exposure amount and the exposure coefficient corresponding to each depth of field layer.
In the above embodiment, the determining the reference exposure amount corresponding to each depth layer includes:
acquiring target brightness parameters corresponding to each depth-of-field layer;
and searching the reference exposure corresponding to each target brightness parameter according to the corresponding relation between the target brightness parameter and the reference exposure stored in advance.
In the above embodiment, the setting of the exposure coefficient corresponding to each depth layer includes:
dividing the N depth of field layers into M groups of depth of field layers; wherein M is a natural number greater than or equal to 2;
acquiring an exposure index corresponding to the image to be processed;
and setting exposure coefficients corresponding to all the depth layers in the M groups of depth layers according to the exposure index and a preset exposure index threshold.
In the above embodiment, the dividing the N depth layers into M groups of depth layers includes:
determining a depth-of-field boundary layer in the N depth-of-field layers according to the average brightness of the image to be processed which is stored in advance and the reference exposure corresponding to each depth-of-field layer;
dividing the N depth-of-field layers into a first depth-of-field layer and a second depth-of-field layer according to the depth-of-field boundary layer;
dividing all the first depth-of-field layers and all the second depth-of-field layers into P groups and Q groups respectively; wherein, P and Q are both natural numbers which are more than or equal to 1, and the sum of P and Q is M.
An embodiment of the present invention further provides a mobile terminal, where the mobile terminal includes: the device comprises an acquisition unit, a determination unit and a processing unit; wherein,
the acquisition unit is used for acquiring N depth-of-field layers corresponding to the image to be processed through the first camera and the second camera; wherein N is a natural number greater than or equal to 2;
the determining unit is configured to determine the target exposure amount corresponding to each depth-of-field layer; wherein the target exposure amounts corresponding to at least two depth-of-field layers are different;
and the processing unit is configured to process each depth-of-field layer according to its corresponding target exposure amount.
In the above embodiment, the determining unit includes: a determining subunit, a setting subunit and an obtaining subunit; wherein,
the determining subunit is configured to determine a reference exposure amount corresponding to each depth layer;
the setting subunit is configured to set an exposure coefficient corresponding to each depth-of-field layer;
and the acquisition subunit is used for acquiring the target exposure amount corresponding to each depth of field layer according to the reference exposure amount and the exposure coefficient corresponding to each depth of field layer.
In the foregoing embodiment, the determining subunit is specifically configured to obtain target brightness parameters corresponding to each depth-of-field layer; and searching the reference exposure corresponding to each target brightness parameter according to the corresponding relation between the target brightness parameter and the reference exposure stored in advance.
In the above embodiment, the setting subunit is specifically configured to divide the N depth layers into M groups of depth layers; wherein M is a natural number greater than or equal to 2; acquiring the exposure index corresponding to the to-be-processed image; and setting exposure coefficients corresponding to all the depth layers in the M groups of depth layers according to the exposure index and a preset exposure index threshold.
In the above embodiment, the setting subunit is specifically configured to determine a depth-of-field boundary layer in the N depth-of-field layers according to the pre-stored average brightness of the image to be processed and the reference exposure amount corresponding to each depth-of-field layer; dividing the N depth-of-field layers into a first depth-of-field layer and a second depth-of-field layer according to the depth-of-field boundary layer; dividing all the first depth-of-field layers and all the second depth-of-field layers into P groups and Q groups respectively; wherein, P and Q are both natural numbers which are more than or equal to 1, and the sum of P and Q is M.
Therefore, in the technical solution of the embodiments of the invention, the first camera and the second camera acquire N depth-of-field layers corresponding to the image to be processed, the target exposure amount corresponding to each depth-of-field layer is determined, and finally each depth-of-field layer is processed according to its target exposure amount. That is, the mobile terminal may apply different exposure processing to each depth-of-field layer of the image, whereas in the prior art the mobile terminal applies the same exposure processing to every depth-of-field layer. Compared with the prior art, the image processing method and mobile terminal provided by the embodiments of the invention therefore achieve a better image processing effect; in addition, the technical solution is simple to implement, easy to popularize and widely applicable.
Drawings
FIG. 1 is a schematic flow chart illustrating an implementation of an image processing method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of the composition of N depth-of-field layers corresponding to an image to be processed in the embodiment of the present invention;
FIG. 3 is a schematic flow chart illustrating an implementation method for determining a target exposure corresponding to each depth-of-field layer according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart illustrating an implementation method for determining a reference exposure corresponding to each depth-of-field layer according to an embodiment of the present invention;
fig. 5 is a schematic flow chart of an implementation method for setting exposure coefficients corresponding to each depth-of-field layer in the embodiment of the present invention;
fig. 6 is a schematic diagram of a first component structure of the mobile terminal according to the embodiment of the present invention;
fig. 7 is a schematic diagram of a second component structure of the mobile terminal in the embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
Fig. 1 is a schematic flow chart illustrating an implementation of an image processing method according to an embodiment of the present invention. As shown in fig. 1, the image processing method may include the steps of:
step 101, acquiring N depth-of-field layers corresponding to an image to be processed through a first camera and a second camera; wherein N is a natural number of 2 or more.
In a specific embodiment of the present invention, the mobile terminal may obtain, through the first camera and the second camera, the image to be processed and the N depth-of-field layers corresponding to it. Specifically, two cameras may be disposed at the rear of the mobile terminal: the first camera serves as the main camera and the second as the auxiliary camera. The first camera uses a color sensor, characterized by good color reproduction; the image it captures is the image to be processed. The second camera uses a black-and-white sensor, characterized by stronger light sensitivity and a wide dynamic range, suitable for capturing detail; the image it captures is mainly used as a reference for calculating the depth of field. The two cameras are arranged in parallel and separated by a certain distance.
Specifically, in the embodiment of the present invention, the mobile terminal may obtain the N depth-of-field layers corresponding to the image to be processed through the two cameras. Fig. 2 is a schematic structural diagram of the N depth-of-field layers corresponding to an image to be processed in the embodiment of the present invention. As shown in fig. 2, the N depth-of-field layers corresponding to the image to be processed are respectively: depth_1, depth_2, depth_3, …, depth_N, where N is a natural number greater than or equal to 2; depth_1 denotes the first depth-of-field layer, depth_2 the second, depth_3 the third, …, and depth_N the Nth. For example, if the depth of field of the image to be processed is 20 meters, the mobile terminal may set the layer at a depth of 1 m as depth_1, the layer at a depth of 2 m as depth_2, …, and the layer at a depth of 20 m as depth_20; these 20 depth-of-field layers are all the depth-of-field layers corresponding to the image to be processed.
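The layer split in the 20-meter example above can be sketched as follows. This is a minimal illustration only: the function name, the uniform one-meter layer width, and the flat list-of-pixel-depths representation are assumptions, not part of the patent.

```python
def split_into_depth_layers(depths, layer_width_m=1.0, max_depth_m=20.0):
    """Group per-pixel depths (in meters) into depth-of-field layers.

    Returns a list of n_layers lists; layer i collects the indices of the
    pixels whose depth falls in [i*layer_width_m, (i+1)*layer_width_m),
    mirroring the text's example of a 20 m scene cut into 20 one-meter
    layers depth_1 ... depth_20.
    """
    n_layers = int(-(-max_depth_m // layer_width_m))  # ceiling division
    layers = [[] for _ in range(n_layers)]
    for idx, d in enumerate(depths):
        i = int(d // layer_width_m)
        if 0 <= i < n_layers:
            layers[i].append(idx)
    return layers
```

In practice the depth values would come from stereo disparity between the two cameras; here they are simply given as input.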
Step 102, determining target exposure corresponding to each depth of field layer; and the target exposure amount corresponding to at least two depth layers is different.
In the specific embodiment of the invention, after acquiring the N depth layers corresponding to the image to be processed, the mobile terminal can determine the target exposure amount corresponding to each depth layer; and the target exposure amount corresponding to at least two depth layers is different. Preferably, in an embodiment of the present invention, the target exposure amount corresponding to each depth-of-field layer may include: gain and exposure time Exp.
Fig. 3 is a flowchart illustrating an implementation method for determining a target exposure amount corresponding to each depth-of-field layer according to an embodiment of the present invention. As shown in fig. 3, the method for determining the target exposure amount corresponding to each depth of field layer by the mobile terminal may include the following steps:
and 102a, determining the reference exposure corresponding to each depth layer.
In the specific embodiment of the invention, the mobile terminal can determine the reference exposure corresponding to each depth layer; the reference exposure amount corresponding to each depth layer may include: gain, exposure time exp and brightness lum.
Preferably, in an embodiment of the present invention, each depth-of-field layer may contain relatively clear and relatively blurred regions. The mobile terminal may select a local region from the relatively clear regions of each depth-of-field layer in advance and then determine the reference exposure of the selected region; the reference exposure of the region selected in each depth-of-field layer serves as the reference exposure of that layer.
Fig. 4 is a flowchart illustrating an implementation method for determining the reference exposure corresponding to each depth-of-field layer according to an embodiment of the present invention. As shown in fig. 4, the method for determining the reference exposure amount corresponding to each depth layer by the mobile terminal may include the following steps:
step 401, obtaining target brightness parameters corresponding to each depth layer.
In the specific embodiment of the invention, the mobile terminal can acquire the target brightness parameters corresponding to each depth layer at a specific storage position; preferably, in the embodiment of the present invention, the mobile terminal may obtain a target brightness parameter corresponding to a certain local area of each depth layer at a specific storage location, where the target brightness parameter is a target brightness parameter corresponding to the depth layer.
Step 402, according to the pre-stored corresponding relationship between the target brightness parameters and the reference exposure, searching the reference exposure corresponding to each target brightness parameter.
In the specific embodiment of the present invention, after obtaining each target brightness parameter, the mobile terminal may look up the pre-stored correspondence between target brightness parameters and reference exposures to obtain the reference exposure corresponding to each target brightness parameter.
Specifically, in a specific embodiment of the present invention, the mobile terminal may pre-store the corresponding relationship between the target brightness parameter and the reference exposure amount; the value interval of the target brightness parameter may be: 0-255, each target brightness parameter can correspond to a set of reference exposures. Specifically, the correspondence relationship between the target brightness parameter and the reference exposure amount, which are pre-saved by the mobile terminal, can be as shown in the following table 1:
| Y | gain | exp | lum |
| 0 | gain0 | exp0 | lum0 |
| 1 | gain1 | exp1 | lum1 |
| 2 | gain2 | exp2 | lum2 |
| … | … | … | … |
| 255 | gain255 | exp255 | lum255 |
TABLE 1
As shown in Table 1 above, the target brightness parameter is Y, and the reference exposure comprises the gain, the exposure time exp and the brightness lum: gain0, gain1, gain2, …, gain255 are the gains, exp0, exp1, exp2, …, exp255 the exposure times, and lum0, lum1, lum2, …, lum255 the brightness values corresponding to target brightness parameters 0, 1, 2, …, 255, respectively.
In the embodiment of the present invention, the reference exposure corresponding to each target brightness parameter can be found directly from Table 1. When a target brightness parameter is greater than 255, it may be clamped or normalized into the 0-255 range before the lookup.
According to the analysis, through the steps 401 to 402, the mobile terminal may obtain the target brightness parameter corresponding to each depth-of-field layer at the specific storage location, and obtain the reference exposure corresponding to each target brightness parameter according to the pre-stored corresponding relationship between the target brightness parameter and the reference exposure.
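Steps 401 to 402 amount to a table lookup in the shape of Table 1. In the sketch below the table contents are placeholder values, and clamping out-of-range brightness is only one of the handling options the text mentions.

```python
# Hypothetical pre-stored table mapping a target brightness parameter Y
# (0-255) to a reference exposure (gain, exposure time exp, brightness lum),
# in the shape of Table 1; the concrete values are placeholders.
REFERENCE_EXPOSURE = {
    y: {"gain": 1.0 + y / 255.0, "exp": 0.001 * (y + 1), "lum": float(y)}
    for y in range(256)
}

def reference_exposure_for(target_y):
    """Look up the reference exposure for a target brightness parameter,
    clamping values outside 0-255 into range first (one way to handle the
    greater-than-255 case mentioned in the text)."""
    y = max(0, min(255, int(round(target_y))))
    return REFERENCE_EXPOSURE[y]
```

A real terminal would pre-store calibrated gain/exp/lum triples; only the lookup-plus-clamp mechanism is the point here.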
Step 102b, setting the exposure coefficient corresponding to each depth-of-field layer.
In a specific embodiment of the present invention, the mobile terminal may set a corresponding exposure coefficient for each depth-of-field layer, and then obtain a target exposure amount corresponding to each depth-of-field layer according to a reference exposure amount and an exposure coefficient corresponding to each depth-of-field layer. Specifically, the mobile terminal may divide the N depth of field layers into a first depth of field layer and a second depth of field layer according to the brightness corresponding to each depth of field layer, then divide all the first depth of field layers and all the second depth of field layers into a plurality of groups, and finally set the exposure coefficient corresponding to each depth of field layer in the M groups of depth of field layers according to the exposure index corresponding to the image to be processed and the preset exposure index threshold. Fig. 5 is a schematic flow chart of an implementation method for setting an exposure coefficient corresponding to each depth layer in the embodiment of the present invention. As shown in fig. 5, the method for setting the exposure coefficient corresponding to each depth layer by the mobile terminal may include the following steps:
step 501, dividing N depth of field layers into M groups of depth of field layers; wherein M is a natural number of 2 or more.
In a specific embodiment of the present invention, the mobile terminal determines a depth-of-field boundary layer among the N depth-of-field layers according to the pre-stored average brightness of the image to be processed and the reference exposure corresponding to each depth-of-field layer, and uses the boundary layer to divide the N depth-of-field layers into two types: first-type and second-type depth-of-field layers. All first-type depth-of-field layers are divided into P groups and all second-type depth-of-field layers into Q groups, where P and Q are natural numbers greater than or equal to 1 and the sum of P and Q is M.
Specifically, in the specific embodiment of the present invention, the mobile terminal may determine one of the N depth-of-field layers as the depth-of-field boundary layer and use it as the boundary between the first-type and second-type depth-of-field layers. The boundary layer is determined according to the average brightness of the image to be processed, which the mobile terminal may obtain from a specific storage location. Each depth-of-field layer whose depth is smaller than that of the boundary layer is assigned to the first type; each layer whose depth is larger is assigned to the second type; the boundary layer itself is assigned to neither type.
Specifically, in the specific embodiment of the present invention, the mobile terminal searches all the layer brightnesses for the one equal to or closest to the average brightness and determines the corresponding depth-of-field layer as the boundary layer. For example, if the brightness in the reference exposure of depth_i equals the average brightness of the image to be processed, depth_i is determined to be the boundary layer; depth_1, depth_2, …, depth_i-1 then constitute the first-type layers and depth_i+1, depth_i+2, …, depth_N the second-type layers, where i is a natural number greater than or equal to 1. Preferably, when the brightness equal to or closest to the average brightness is not unique, any of the corresponding depth-of-field layers may be chosen as the boundary layer.
Specifically, in the specific embodiment of the present invention, after dividing the N depth-of-field layers into the two types, the mobile terminal may further divide all first-type layers into P groups and all second-type layers into Q groups, where P and Q are natural numbers greater than or equal to 1 and the sum of P and Q is M; the size of each group may be chosen arbitrarily. For example, suppose the mobile terminal has acquired 10 depth-of-field layers of the image to be processed and determines that the 6th layer is the boundary layer, so that the 1st to 5th layers are first-type and the 7th to 10th layers are second-type. The mobile terminal may then divide the 1st to 3rd layers into a first group, the 4th and 5th layers into a second group, and the 7th to 10th layers into a third group.
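The boundary-layer selection and grouping of step 501 can be sketched as follows, reproducing the 10-layer example from the text. The function name and the list-of-group-sizes interface are assumptions; the patent leaves the group sizes to be "determined according to any number".

```python
def split_and_group_layers(layer_lums, average_lum, first_group_sizes, second_group_sizes):
    """Choose as boundary layer the layer whose reference brightness is
    closest to the image's average brightness, split the remaining layers
    into first-type (nearer) and second-type (farther) layers, and cut
    each side into the caller-requested groups.

    layer_lums        : brightness lum from each layer's reference exposure
    first_group_sizes : sizes of the P groups of first-type layers
    second_group_sizes: sizes of the Q groups of second-type layers
    """
    boundary = min(range(len(layer_lums)),
                   key=lambda i: abs(layer_lums[i] - average_lum))
    first = list(range(0, boundary))                     # first-type layer ids
    second = list(range(boundary + 1, len(layer_lums)))  # second-type layer ids
    groups, pos = [], 0
    for size in first_group_sizes:                       # the P groups
        groups.append(first[pos:pos + size]); pos += size
    pos = 0
    for size in second_group_sizes:                      # the Q groups
        groups.append(second[pos:pos + size]); pos += size
    return boundary, groups
```

With 10 layers, a boundary at the 6th layer, and group sizes (3, 2) and (4,), this yields exactly the three groups of the text's example.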
Step 502, acquiring the exposure index corresponding to the image to be processed.
In an embodiment of the present invention, the mobile terminal may obtain the exposure index corresponding to the image to be processed at a specific storage location. The exposure index can represent the exposure degree of the image to be processed, and then the mobile terminal can judge the light condition of the image to be processed according to the exposure index and a preset exposure index threshold; wherein the light conditions include: bright light and dim light.
Step 503, setting an exposure coefficient corresponding to each depth layer in the M groups of depth layers according to the exposure index and a preset exposure index threshold.
In a specific embodiment of the present invention, after obtaining the exposure index and the preset exposure index threshold, the mobile terminal may set, according to the exposure index and the preset exposure index threshold, an exposure coefficient corresponding to each depth layer in the M sets of depth layers.
Specifically, in the specific embodiment of the present invention, the mobile terminal may determine the setting range of the exposure coefficient for each depth-of-field layer of the first and second types. When the exposure index is greater than or equal to the exposure index threshold, i.e. the light condition of the image to be processed is dim light, the mobile terminal may set the exposure coefficient of each first-type layer to a value between 0 and 1 and that of each second-type layer to a value greater than 1. Conversely, when the exposure index is smaller than the threshold, i.e. the light condition is bright light, it may set the exposure coefficient of each first-type layer to a value greater than 1 and that of each second-type layer to a value between 0 and 1. For example, suppose the first type comprises depth_1, depth_2, …, depth_i-1 and the second type comprises depth_i+1, depth_i+2, …, depth_N.
Under dim light, the mobile terminal sets the exposure coefficients ratio_1, ratio_2, …, ratio_i-1 of depth_1, depth_2, …, depth_i-1 to values between 0 and 1, and the exposure coefficients ratio_i+1, ratio_i+2, …, ratio_N of depth_i+1, depth_i+2, …, depth_N to values greater than 1; under bright light, ratio_1, ratio_2, …, ratio_i-1 are set to values greater than 1 and ratio_i+1, ratio_i+2, …, ratio_N to values between 0 and 1. The exposure coefficient of the boundary layer depth_i is set to 1 in both cases, i.e. the boundary layer is left unchanged during exposure processing.
Specifically, after determining the setting range of the exposure coefficient corresponding to each of the first type of depth-of-field layer and the second type of depth-of-field layer, the mobile terminal may further set different setting ranges of the exposure coefficient for different depth-of-field layer groups, so as to more accurately set the exposure coefficient corresponding to each depth-of-field layer. For example, when the light condition of the image to be processed is dim light, the mobile terminal has divided all depth layers into 6 groups, wherein all depth layers in the 1 st group to the 4 th group are the first type of depth layers, and all depth layers in the 5 th group and the 6 th group are the second type of depth layers. At this time, the mobile terminal may set the exposure coefficient corresponding to each depth layer in the 1 st group of depth layers to a numerical value of 0 to 0.3, the exposure coefficient corresponding to each depth layer in the 2 nd group of depth layers to a numerical value of 0.3 to 0.5, the exposure coefficient corresponding to each depth layer in the 3 rd group of depth layers to a numerical value of 0.5 to 0.8, the exposure coefficient corresponding to each depth layer in the 4 th group of depth layers to a numerical value of 0.8 to 1, the exposure coefficient corresponding to each depth layer in the 5 th group of depth layers to a numerical value of 1 to 1.5, and the exposure coefficient corresponding to each depth layer in the 6 th group of depth layers to a numerical value of 1.5 to 1.8.
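The per-group coefficient assignment of step 503 can be sketched as follows, using the six illustrative dim-light ranges from the example above. Assigning each layer the midpoint of its group's range is an assumption of this sketch; the text only prescribes the ranges themselves.

```python
# Illustrative per-group coefficient ranges for the dim-light example in the
# text: groups 1-4 are first-type (coefficients below 1), groups 5-6 are
# second-type (coefficients above 1).
DIM_LIGHT_RANGES = [(0.0, 0.3), (0.3, 0.5), (0.5, 0.8),
                    (0.8, 1.0), (1.0, 1.5), (1.5, 1.8)]

def set_exposure_coefficients(groups, group_ranges):
    """Give every depth-of-field layer in group k the midpoint of that
    group's coefficient range. The caller selects the ranges for the
    current light condition (dim if the exposure index is at or above the
    threshold, bright otherwise)."""
    coeffs = {}
    for layer_ids, (lo, hi) in zip(groups, group_ranges):
        for layer in layer_ids:
            coeffs[layer] = (lo + hi) / 2.0
    return coeffs
```

Any other rule that stays inside each group's range (e.g. scaling with depth within the group) would serve equally well.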
And 102c, obtaining the target exposure amount corresponding to each depth of field layer according to the reference exposure amount and the exposure coefficient corresponding to each depth of field layer.
In the specific embodiment of the invention, the mobile terminal sets the exposure coefficient corresponding to each depth of field layer, obtains the reference exposure corresponding to each depth of field layer, and can calculate the target exposure corresponding to each depth of field layer according to the exposure coefficient and the reference exposure corresponding to each depth of field layer.
Specifically, in an embodiment of the present invention, the target exposure amount corresponding to each depth-of-field layer is a product of an exposure coefficient corresponding to each depth-of-field layer and a reference exposure amount corresponding to the exposure coefficient. Specifically, the target exposure amount for each depth layer is as shown in table 2 below:
| Gain | Exp |
|---|---|
| Gain_1=ratio_1×gain_1 | Exp_1=ratio_1×exp_1 |
| Gain_2=ratio_2×gain_2 | Exp_2=ratio_2×exp_2 |
| Gain_3=ratio_3×gain_3 | Exp_3=ratio_3×exp_3 |
| … | … |
| Gain_i=ratio_i×gain_i | Exp_i=ratio_i×exp_i |
| … | … |
| Gain_N=ratio_N×gain_N | Exp_N=ratio_N×exp_N |

TABLE 2
In the expressions in Table 2 above, ratio_1, ratio_2, ratio_3, …, ratio_i, …, ratio_N are the exposure coefficients corresponding to the depth-of-field layers; the lowercase gain_1, gain_2, gain_3, …, gain_i, …, gain_N and exp_1, exp_2, exp_3, …, exp_i, …, exp_N are the gains and exposure times in the reference exposure amounts corresponding to the depth-of-field layers; and the capitalized Gain_1, Gain_2, Gain_3, …, Gain_i, …, Gain_N and Exp_1, Exp_2, Exp_3, …, Exp_i, …, Exp_N are the gains and exposure times in the target exposure amounts corresponding to the depth-of-field layers.
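The products in Table 2 amount to a simple per-layer scaling of the reference gain and exposure time. A minimal sketch (function and parameter names are hypothetical):

```python
def target_exposures(ratios, ref_gains, ref_exps):
    """Compute the target exposure amount for each depth-of-field layer.

    Per Table 2: Gain_k = ratio_k * gain_k and Exp_k = ratio_k * exp_k.
    Returns a list of (target_gain, target_exposure_time) pairs.
    """
    return [(r * g, r * e) for r, g, e in zip(ratios, ref_gains, ref_exps)]
```

For example, a layer with reference gain 1.5, reference exposure time 10 ms, and coefficient 2.0 would receive a target gain of 3.0 and a target exposure time of 20 ms.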
According to the above analysis, the mobile terminal can determine the target exposure amount corresponding to each depth-of-field layer of the image to be processed through steps 102a to 102c. Specifically, the mobile terminal may obtain the target exposure amount corresponding to each depth-of-field layer according to the reference exposure amount and the exposure coefficient corresponding to that depth-of-field layer.
And 103, processing the depth of field layers corresponding to the target exposure quantities.
In a specific embodiment of the present invention, the mobile terminal may process each corresponding depth-of-field layer according to each target exposure amount. Specifically, after obtaining the target exposure amount corresponding to each depth-of-field layer, the mobile terminal may perform exposure processing on each depth-of-field layer of the image to be processed: the gain Gain and the exposure time Exp of each depth-of-field layer are adjusted according to the target exposure amount corresponding to that layer, yielding a new depth-of-field layer. The mobile terminal then fuses all the new depth-of-field layers together to obtain a new image, thereby applying different exposure processing to different depth-of-field layers of the image to be processed.
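One way to realise step 103 — adjusting each layer by its coefficient and fusing the results — is sketched below. This is an illustration only: a real camera pipeline would re-expose or re-gain each layer rather than scale pixel values in software, the NumPy arrays stand in for image layers, and the per-layer masks are assumed to partition the image:

```python
import numpy as np

def process_and_fuse(image, layer_masks, ratios):
    """Apply each layer's exposure coefficient to its pixels and fuse.

    image:        H x W float array in [0, 1] (the image to be processed)
    layer_masks:  list of boolean H x W arrays, one per depth-of-field
                  layer; assumed to partition the image
    ratios:       exposure coefficient per layer
    """
    out = np.zeros_like(image)
    for mask, ratio in zip(layer_masks, ratios):
        # brighten (ratio > 1) or darken (ratio < 1) only this layer's pixels
        out[mask] = np.clip(image[mask] * ratio, 0.0, 1.0)
    return out
```

Because the masks partition the image, writing each processed layer into the output array is itself the fusion step.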
According to the image processing method provided by the embodiment of the present invention, N depth-of-field layers corresponding to an image to be processed are first acquired through the first camera and the second camera, the target exposure amount corresponding to each depth-of-field layer is then determined, and finally the depth-of-field layer corresponding to each target exposure amount is processed. That is, with the image processing method of the embodiment of the present invention, the mobile terminal may perform different exposure processing on each depth-of-field layer of the image, whereas in the prior art the mobile terminal performs the same exposure processing on every depth-of-field layer. Compared with the prior art, the image processing method provided by the embodiment of the present invention therefore applies different exposure processing to each depth-of-field layer of the image to be processed and achieves a better image processing effect; in addition, the technical solution of the embodiment is simple to implement, easy to popularize, and widely applicable.
Fig. 6 is a schematic diagram of a first component structure of the mobile terminal in the embodiment of the present invention. As shown in fig. 6, the mobile terminal includes: an acquisition unit 601, a determination unit 602, and a processing unit 603; wherein,
the acquiring unit 601 is configured to acquire N depth-of-field layers corresponding to an image to be processed through a first camera and a second camera; wherein N is a natural number greater than or equal to 2;
the determining unit 602 is configured to determine a target exposure amount corresponding to each depth-of-field layer; wherein the target exposure amounts corresponding to at least two depth-of-field layers are different;
the processing unit 603 is configured to process the depth-of-field layer corresponding to each target exposure amount.
Fig. 7 is a schematic diagram of a second component structure of the mobile terminal in the embodiment of the present invention. As shown in fig. 7, the determining unit 602 includes: a determination subunit 6021, a setting subunit 6022, and an acquisition subunit 6023; wherein,
the determining subunit 6021 is configured to determine the reference exposure amount corresponding to each depth layer;
the setting subunit 6022 is configured to set exposure coefficients corresponding to the depth layers;
the obtaining subunit 6023 is configured to obtain the target exposure amount corresponding to each depth-of-field layer according to the reference exposure amount and the exposure coefficient corresponding to each depth-of-field layer.
Further, the determining subunit 6021 is specifically configured to obtain target brightness parameters corresponding to each depth layer; and searching the reference exposure corresponding to each target brightness parameter according to the corresponding relation between the target brightness parameter and the reference exposure stored in advance.
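The look-up performed by the determining subunit can be sketched as a pre-stored correspondence table keyed by the target brightness parameter. The table contents and key names below are hypothetical, purely to illustrate the mechanism:

```python
# Hypothetical pre-stored correspondence:
# target brightness parameter -> reference exposure (gain, exposure time in s)
BRIGHTNESS_TO_REF_EXPOSURE = {
    "low":    (8.0, 1 / 30),
    "medium": (4.0, 1 / 60),
    "high":   (1.0, 1 / 250),
}

def reference_exposure(target_brightness):
    """Look up the reference exposure amount for one depth-of-field layer."""
    return BRIGHTNESS_TO_REF_EXPOSURE[target_brightness]
```

Applying this per layer — one look-up per target brightness parameter — yields the reference exposure amount corresponding to each depth-of-field layer.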
Further, the setting subunit 6022 is specifically configured to divide the N depth-of-field layers into M groups of depth-of-field layers, wherein M is a natural number greater than or equal to 2; acquire an exposure index corresponding to the image to be processed; and set the exposure coefficients corresponding to all the depth-of-field layers in the M groups according to the exposure index and a preset exposure index threshold.
Further, the setting subunit 6022 is specifically configured to determine a depth-of-field boundary layer among the N depth-of-field layers according to the pre-stored average brightness of the image to be processed and the reference exposure amount corresponding to each depth-of-field layer; divide the N depth-of-field layers into first-type depth-of-field layers and second-type depth-of-field layers according to the depth-of-field boundary layer; and divide all the first-type depth-of-field layers and all the second-type depth-of-field layers into P groups and Q groups respectively, wherein P and Q are both natural numbers greater than or equal to 1 and the sum of P and Q is M.
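The grouping just described — splitting the N layers at the boundary layer and then dividing the two types into P and Q groups — could be sketched as follows. The function name and the even-split policy are assumptions; the patent does not specify how layers are distributed among the groups:

```python
def group_depth_layers(layers, boundary_index, p, q):
    """Split layers at the boundary layer, then chunk each side.

    Returns (first_type_groups, second_type_groups): the layers in front
    of the boundary divided into p groups, those behind it into q groups
    (p + q = M).  The boundary layer itself belongs to neither type.
    """
    first = layers[:boundary_index]
    second = layers[boundary_index + 1:]

    def chunk(seq, n):
        # distribute len(seq) items over n groups as evenly as possible
        k, r = divmod(len(seq), n)
        out, start = [], 0
        for i in range(n):
            size = k + (1 if i < r else 0)
            out.append(seq[start:start + size])
            start += size
        return out

    return chunk(first, p), chunk(second, q)
```

With, say, 7 layers and the boundary at index 3, P = Q = 2 gives two groups of near layers and two groups of far layers, to which the per-group coefficient ranges can then be assigned.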
In practical applications, the obtaining unit 601, the determining unit 602, and the processing unit 603 may be implemented by a Central Processing Unit (CPU), a Microprocessor (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like inside the mobile terminal.
In addition, each unit in the embodiment may exist alone physically, or two or more units may be integrated into one unit. The integrated unit can be realized in a form of hardware or a form of a software functional module.
Based on this understanding, the part of the technical solution of this embodiment that in essence contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a mobile terminal device (which may be a personal mobile terminal, a server, a network device, or the like) or a processor to execute all or part of the steps of the method of this embodiment. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.
Specifically, the mobile terminal program instructions corresponding to the image processing method in this embodiment may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive. When the mobile terminal program instructions corresponding to the image processing method in the storage medium are read and executed by an electronic device, the method comprises the following steps:
acquiring N depth-of-field layers corresponding to an image to be processed through a first camera and a second camera; wherein N is a natural number greater than or equal to 2;
determining the target exposure corresponding to each depth of field layer; wherein the target exposure corresponding to at least two depth layers is different;
and processing the depth-of-field layers corresponding to the target exposure quantities.
The mobile terminal provided by the embodiment of the present invention first acquires N depth-of-field layers corresponding to an image to be processed through the first camera and the second camera, then determines the target exposure amount corresponding to each depth-of-field layer, and finally processes the depth-of-field layer corresponding to each target exposure amount. That is, the mobile terminal according to the embodiment of the present invention may perform different exposure processing on each depth-of-field layer of the image, whereas in the prior art the mobile terminal performs the same exposure processing on every depth-of-field layer. Compared with the prior art, the mobile terminal provided by the embodiment of the present invention therefore applies different exposure processing to each depth-of-field layer of the image to be processed and achieves a better image processing effect; in addition, the technical solution of the embodiment is simple to implement, easy to popularize, and widely applicable.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.
Claims (10)
1. An image processing method, characterized in that the method comprises:
acquiring N depth-of-field layers corresponding to an image to be processed through a first camera and a second camera; wherein N is a natural number greater than or equal to 2;
determining the target exposure corresponding to each depth of field layer; wherein the target exposure corresponding to at least two depth layers is different;
and processing the depth-of-field layers corresponding to the target exposure quantities.
2. The method according to claim 1, wherein the determining the target exposure amount corresponding to each depth layer comprises:
determining the reference exposure corresponding to each depth layer;
setting exposure coefficients corresponding to all the depth-of-field layers;
and obtaining the target exposure amount corresponding to each depth of field layer according to the reference exposure amount and the exposure coefficient corresponding to each depth of field layer.
3. The method according to claim 2, wherein the determining the reference exposure amount corresponding to each depth layer comprises:
acquiring target brightness parameters corresponding to each depth-of-field layer;
and searching the reference exposure corresponding to each target brightness parameter according to the corresponding relation between the target brightness parameter and the reference exposure stored in advance.
4. The method according to claim 2, wherein the setting of the exposure coefficient corresponding to each depth layer comprises:
dividing the N depth of field layers into M groups of depth of field layers; wherein M is a natural number greater than or equal to 2;
acquiring an exposure index corresponding to the image to be processed;
and setting exposure coefficients corresponding to all the depth layers in the M groups of depth layers according to the exposure index and a preset exposure index threshold.
5. The method according to claim 4, wherein said dividing the N depth layers into M sets of depth layers comprises:
determining a depth-of-field boundary layer in the N depth-of-field layers according to the average brightness of the image to be processed which is stored in advance and the reference exposure corresponding to each depth-of-field layer;
dividing the N depth-of-field layers into a first depth-of-field layer and a second depth-of-field layer according to the depth-of-field boundary layer;
dividing all the first depth-of-field layers and all the second depth-of-field layers into P groups and Q groups respectively; wherein, P and Q are both natural numbers which are more than or equal to 1, and the sum of P and Q is M.
6. A mobile terminal, characterized in that the mobile terminal comprises: the device comprises an acquisition unit, a determination unit and a processing unit; wherein,
the acquisition unit is used for acquiring N depth-of-field layers corresponding to the image to be processed through the first camera and the second camera; wherein N is a natural number greater than or equal to 2;
the determining unit is used for determining the target exposure amount corresponding to each depth-of-field layer; wherein the target exposure amounts corresponding to at least two depth-of-field layers are different;
and the processing unit is used for processing the depth-of-field layer corresponding to each target exposure amount.
7. The mobile terminal according to claim 6, wherein the determining unit comprises: a determining subunit, a setting subunit and an obtaining subunit; wherein,
the determining subunit is configured to determine a reference exposure amount corresponding to each depth layer;
the setting subunit is configured to set an exposure coefficient corresponding to each depth-of-field layer;
and the acquisition subunit is used for acquiring the target exposure amount corresponding to each depth of field layer according to the reference exposure amount and the exposure coefficient corresponding to each depth of field layer.
8. The mobile terminal according to claim 7, wherein the determining subunit is specifically configured to obtain a target brightness parameter corresponding to each depth layer; and searching the reference exposure corresponding to each target brightness parameter according to the corresponding relation between the target brightness parameter and the reference exposure stored in advance.
9. The mobile terminal according to claim 9, wherein the setting subunit is specifically configured to divide the N depth layers into M groups of depth layers; wherein M is a natural number greater than or equal to 2; acquire an exposure index corresponding to the image to be processed; and set exposure coefficients corresponding to all the depth layers in the M groups of depth layers according to the exposure index and a preset exposure index threshold.
10. The mobile terminal according to claim 9, wherein the setting subunit is specifically configured to determine a depth-of-field boundary layer in the N depth-of-field layers according to a pre-stored average brightness of the image to be processed and a reference exposure amount corresponding to each depth-of-field layer; dividing the N depth-of-field layers into a first depth-of-field layer and a second depth-of-field layer according to the depth-of-field boundary layer; dividing all the first depth-of-field layers and all the second depth-of-field layers into P groups and Q groups respectively; wherein, P and Q are both natural numbers which are more than or equal to 1, and the sum of P and Q is M.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710297673.1A CN109246362B (en) | 2017-04-28 | 2017-04-28 | Image processing method and mobile terminal |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710297673.1A CN109246362B (en) | 2017-04-28 | 2017-04-28 | Image processing method and mobile terminal |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN109246362A true CN109246362A (en) | 2019-01-18 |
| CN109246362B CN109246362B (en) | 2021-03-16 |
Family
ID=65082756
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201710297673.1A Active CN109246362B (en) | 2017-04-28 | 2017-04-28 | Image processing method and mobile terminal |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN109246362B (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101621629A (en) * | 2008-06-30 | 2010-01-06 | 睿致科技股份有限公司 | Automatic exposure method |
| CN104010212A (en) * | 2014-05-28 | 2014-08-27 | 华为技术有限公司 | Method and device for multi-layer synthesis |
| CN105578026A (en) * | 2015-07-10 | 2016-05-11 | 宇龙计算机通信科技(深圳)有限公司 | A shooting method and user terminal |
| US20160191896A1 (en) * | 2014-12-31 | 2016-06-30 | Dell Products, Lp | Exposure computation via depth-based computational photography |
| CN106408518A (en) * | 2015-07-30 | 2017-02-15 | 展讯通信(上海)有限公司 | Image fusion method and apparatus, and terminal device |
| CN106550184A (en) * | 2015-09-18 | 2017-03-29 | 中兴通讯股份有限公司 | Photo processing method and device |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101621629A (en) * | 2008-06-30 | 2010-01-06 | 睿致科技股份有限公司 | Automatic exposure method |
| CN104010212A (en) * | 2014-05-28 | 2014-08-27 | 华为技术有限公司 | Method and device for multi-layer synthesis |
| US20160191896A1 (en) * | 2014-12-31 | 2016-06-30 | Dell Products, Lp | Exposure computation via depth-based computational photography |
| CN105578026A (en) * | 2015-07-10 | 2016-05-11 | 宇龙计算机通信科技(深圳)有限公司 | A shooting method and user terminal |
| CN106408518A (en) * | 2015-07-30 | 2017-02-15 | 展讯通信(上海)有限公司 | Image fusion method and apparatus, and terminal device |
| CN106550184A (en) * | 2015-09-18 | 2017-03-29 | 中兴通讯股份有限公司 | Photo processing method and device |
Also Published As
| Publication number | Publication date |
|---|---|
| CN109246362B (en) | 2021-03-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN104301624B (en) | A kind of image taking brightness control method and device | |
| KR102302352B1 (en) | Adjustment method for automatic exposure control of region of interest, terminal device and non-transitory computer-readable storage medium | |
| CN105144233B (en) | Reference picture selection for moving ghost image filtering | |
| CN107454343B (en) | Photographic method, camera arrangement and terminal | |
| CN110942427B (en) | Image noise reduction method, device, equipment, and storage medium | |
| CN104486552B (en) | A kind of method and electronic equipment obtaining image | |
| US20160328853A1 (en) | Image processing method and apparatus | |
| CN112788251B (en) | Image brightness processing method and device, and image processing method and device | |
| CN108234858B (en) | Image blurring processing method and device, storage medium and electronic equipment | |
| CN105243371A (en) | Human face beauty degree detection method and system and shooting terminal | |
| WO2011014236A1 (en) | Digital image brightness adjustment using range information | |
| CN110706162B (en) | Image processing method, device and computer storage medium | |
| CN110784659B (en) | Exposure control method and device and storage medium | |
| CN111161299A (en) | Image segmentation method, computer program, storage medium, and electronic device | |
| CN114339060A (en) | Exposure adjusting method and device, storage medium and electronic equipment | |
| CN113709365B (en) | Image processing method, device, electronic equipment and storage medium | |
| CN107909551A (en) | Image processing method, device, computer installation and computer-readable recording medium | |
| CN106454140B (en) | A kind of information processing method and electronic equipment | |
| CN114531551B (en) | Image processing method and device, electronic equipment and storage medium | |
| CN112995633B (en) | Image white balance processing method and device, electronic equipment and storage medium | |
| CN111179158A (en) | Image processing method, image processing apparatus, electronic device, and medium | |
| CN109246362B (en) | Image processing method and mobile terminal | |
| CN111800568B (en) | Light supplement method and device | |
| CN113870300B (en) | Image processing method, device, electronic device and readable storage medium | |
| CN112087556B (en) | Dark light imaging method and device, readable storage medium and terminal equipment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||