
WO2020000262A1 - Light source estimation method, image processing method, and related products - Google Patents


Info

Publication number
WO2020000262A1
WO2020000262A1 (PCT/CN2018/093144; CN2018093144W)
Authority
WO
WIPO (PCT)
Prior art keywords
color
cluster
clusters
degree
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2018/093144
Other languages
English (en)
Chinese (zh)
Inventor
林威丞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to PCT/CN2018/093144 priority Critical patent/WO2020000262A1/fr
Priority to CN201880095117.9A priority patent/CN112313946A/zh
Publication of WO2020000262A1 publication Critical patent/WO2020000262A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Definitions

  • the present application relates to the field of image processing, and in particular, to a light source estimation method, an image processing method, and related products.
  • Daylight is a natural light source. Besides natural light sources such as daylight, artificial light sources are produced by different lamps, and daylight and the light sources of different lamps have different color temperatures.
  • Color temperature is a physical quantity that characterizes the color of a light source. It is defined by heating a black body (an ideal black radiator, similar to a closed carbon block that reflects no incident light) to a certain temperature, at which it emits light whose color changes with the black body's temperature. When the color of the light emitted by a light source is the same as the color of the black body, the temperature of the black body is the color temperature of the light source.
  • the color temperature of a light source can be classified, from high to low, into high color temperature, medium color temperature, and low color temperature.
  • the color of high color temperature light sources is light blue, and the color of low color temperature light sources is light yellow.
  • AWB (automatic white balance) correction technology can calculate the color temperature of the light source in the shooting environment, correct the color cast of the image accordingly, and strive to make white objects in the original scene appear white in the image.
  • traditional AWB correction technology cannot determine whether mixed color temperatures are present in the environment, so usually only one light source color temperature can be estimated as the basis for correcting the color cast of the image. For example, if the light source color temperature calculated by the AWB correction technology is closer to the high color temperature in the environment, the low color temperature area in the corrected image will be yellowish; if the calculated light source color temperature is closer to the low color temperature, the high color temperature area in the image will be bluish.
  • although the human eye is not sensitive to differences in light source color in a mixed color temperature scene (the brain perceives the light sources as close to white), in such a scene the light blue and light yellow light source colors in the captured image may make the user feel that the colors are wrong. Therefore, how to estimate the number of light sources with different color temperatures in the environment has become an urgent problem.
  • the embodiments of the present application provide a light source estimation method, an image processing method, and related products.
  • an embodiment of the present application provides a method for estimating a light source, which may include: dividing an image into m sub-blocks (the m sub-blocks may be all the same size, partially the same size, or different from each other, and the shape of the sub-blocks may be square, rectangular, or another shape), where m is an integer greater than 1.
  • the information group includes brightness information and color information.
  • the image may be an original image or another image captured by a camera.
  • the color brightness three-dimensional space includes two color dimensions and one brightness dimension.
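  • The sub-block mapping described above can be sketched as follows. The patent does not fix a particular color space; this example assumes (R/G, B/G) chromaticity ratios for the two color dimensions and Rec. 601 luma for the brightness dimension, and the function name and grid size are illustrative only.

```python
import numpy as np

def image_to_color_brightness_samples(image, grid=(8, 8)):
    """Split an HxWx3 RGB image into grid[0]*grid[1] sub-blocks and map each
    block to one (color1, color2, brightness) sample, here (R/G, B/G, Y).
    The (R/G, B/G) axes are an assumed choice; the text only requires two
    color dimensions and one brightness dimension."""
    h, w, _ = image.shape
    bh, bw = h // grid[0], w // grid[1]
    samples = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            block = image[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].astype(float)
            mr, mg, mb = block.reshape(-1, 3).mean(axis=0)  # per-channel means
            eps = 1e-6  # avoid division by zero on dark blocks
            y = 0.299 * mr + 0.587 * mg + 0.114 * mb  # Rec. 601 luma
            samples.append((mr / (mg + eps), mb / (mg + eps), y))
    return np.array(samples)  # shape (m, 3), m = grid[0] * grid[1]
```

  • Each row of the returned array is one color brightness sample point; the number of sub-blocks m equals the grid cell count.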
  • each of the k layers corresponds to a brightness interval (continuous brightness interval), where P is an integer greater than 1, and k is a positive integer.
  • a second blending degree of the number of clusters P corresponding to the image is determined based on the first blending degree of the number of clusters P corresponding to each of the k layers. It can be understood that when k is equal to 1, the m color brightness samples fall into the same layer, so only the first blending degree of the number of clusters P corresponding to that single layer is obtained; in this case, the first blending degree of the number of clusters P corresponding to this layer is the second blending degree of the number of clusters P corresponding to the image.
  • the blending degree between the two color luminance sample clusters can indicate the closeness of the connection between the two color luminance sample clusters.
  • the greater the degree of blending between two clusters, the tighter the connection between the color brightness samples of the two clusters; the smaller the degree of blending between two clusters, the looser the connection between the color brightness samples of the two clusters.
  • the number of color brightness samples included in different layers of the k layers may be all the same, partly the same, or different from each other.
  • the heights of the brightness intervals corresponding to different layers in the k layers may be all the same, partially the same, or different from each other.
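  • As a minimal sketch of the layering step, the following partitions samples into k equal-height brightness intervals; the text allows intervals of equal or differing heights, so the equal split and the helper name are assumptions.

```python
import numpy as np

def split_into_layers(samples, k):
    """Partition (m, 3) color brightness samples into k layers along the
    brightness dimension (column 2), using k equal-height continuous
    brightness intervals."""
    y = samples[:, 2]
    lo, hi = y.min(), y.max()
    edges = np.linspace(lo, hi, k + 1)
    edges[-1] = hi + 1e-9  # include the maximum brightness in the last layer
    layers = []
    for i in range(k):
        mask = (y >= edges[i]) & (y < edges[i + 1])
        layers.append(samples[mask])
    return layers
```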
  • when the second blending degree of the corresponding number of clusters P of the image is less than or equal to the blending degree threshold, it may be estimated that the number of light sources with different color temperatures in the shooting environment corresponding to the image is not P (in this case, there may be a single color temperature light source or a mixed color temperature light source in the shooting environment corresponding to the image).
  • this light source estimation method maps the color brightness information groups of the image's sub-blocks to a color brightness three-dimensional space to obtain multiple color brightness sample points in that space, performs layered clustering on the sample points, computes the blending degree of the whole image from the blending degrees of the sample point clusters of each layer, and on that basis estimates whether multiple light sources with different color temperatures exist in the shooting environment corresponding to the image, which lays a foundation for correcting the image under conditions that include multiple light sources with different color temperatures.
  • determining the second blending degree of the number of clusters P corresponding to the image based on the first blending degree of the number of clusters P corresponding to each of the k layers may include: summing or weighted-summing the first blending degrees of the number of clusters P corresponding to each layer to obtain the second blending degree of the number of clusters P corresponding to the image.
  • the weighted summation weight of each layer may be determined based on the height of the brightness interval of the layer. For example, a layer whose brightness interval has a relatively greater height may be given a larger weighted summation weight, and a layer whose brightness interval has a relatively smaller height may be given a smaller weighted summation weight.
  • the weighted summation weight of the first blending degree corresponding to the number of clusters P of each layer may be determined based on the number of color brightness samples of each layer. For example, a layer with a relatively large number of color brightness samples may be given a larger weighted summation weight, and a layer with a relatively small number of color brightness samples may be given a smaller weighted summation weight.
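  • The summation and weighted-summation options above can be illustrated as follows; the helper name, the normalization of the weights, and the use of per-layer sample counts as weights are assumptions.

```python
def second_blending_degree(first_degrees, weights=None):
    """Combine per-layer first blending degrees into the image-level second
    blending degree, either by plain summation or by weighted summation
    (weights could be, e.g., per-layer sample counts)."""
    if weights is None:
        return sum(first_degrees)  # plain summation
    total = sum(weights)           # normalize weights so they sum to 1
    return sum(d * w / total for d, w in zip(first_degrees, weights))
```

  • For example, with first blending degrees 0.2 and 0.4 and sample counts 1 and 3, the weighted result is (0.2·1 + 0.4·3) / 4 = 0.35.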
  • the weighted summation weight of each layer can also be determined based on other parameters.
  • calculating the first blending degree of the number of clusters P corresponding to the i-th layer in the k layers may include: calculating a third blending degree between every two clusters in the P clusters of the i-th layer; and summing or weighted-summing the third blending degrees between every two clusters in the P clusters to obtain the first blending degree of the number of clusters P corresponding to the i-th layer.
  • the i-th layer is any one of the k layers.
  • when P is equal to 2, calculating the first blending degree of the number of clusters P corresponding to the i-th layer in the k layers may include: calculating a third blending degree between the two clusters of the i-th layer, wherein the first blending degree of the number of clusters P corresponding to the i-th layer is equal to the third blending degree between the two clusters.
  • the i-th layer is any one of the k layers.
  • calculating the third blending degree between the cluster gi and the cluster gj may include: inserting a continuous array of measurement cells between the center point of the cluster gi and the center point of the cluster gj, where the number of measurement cells in the continuous array is T.
  • counting the number Q of measurement cells that contain color brightness samples belonging to the cluster gi or the cluster gj.
  • a third blending degree between the cluster gi and the cluster gj is determined as Q/T, where T and Q are integers, T is greater than 0, and Q is greater than or equal to 0.
  • the cluster gi and the cluster gj are any two clusters among the P clusters in the i-th layer.
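  • A geometric sketch of the Q/T computation, under the stated arrangement that the cells lie along the segment joining the two cluster centers with their length direction perpendicular to it. T, cell_len, and the hit test are illustrative parameters (the text suggests Dist_D65_D50 for the cell length).

```python
import numpy as np

def third_blending_degree(gi, gj, T=16, cell_len=1.0):
    """Compute the Q/T blending degree between clusters gi and gj, given as
    arrays of (c1, c2) color coordinates. T rectangular cells are laid end
    to end along the segment joining the cluster centers, each cell's length
    direction perpendicular to that segment; Q counts the cells hit by at
    least one sample of either cluster."""
    ci, cj = gi.mean(axis=0), gj.mean(axis=0)  # cluster center points
    v = cj - ci
    L = np.linalg.norm(v)
    u = v / L                                  # unit vector along center line
    pts = np.vstack([gi, gj]) - ci
    t = pts @ u                                # position along the center line
    d = np.abs(pts @ np.array([-u[1], u[0]])) # perpendicular distance
    # a sample hits a cell if it lies between the centers and within half
    # the cell length of the center line
    inside = (t >= 0) & (t <= L) & (d <= cell_len / 2)
    hit = np.zeros(T, dtype=bool)
    idx = np.minimum((t[inside] / L * T).astype(int), T - 1)
    hit[idx] = True
    Q = int(hit.sum())
    return Q / T
```

  • With only the two center samples present, only the first and last cells are hit, giving a low blending degree; a dense bridge of samples between the centers fills many cells and gives a high one.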
  • the length of a single measurement cell may be Dist_D65_D50.
  • the width of a single measurement cell can be Dist_D65_D50 / 32.
  • the length and width of the measurement cell may also be set to other values, and the specific values can be chosen based on the needs of the scene.
  • the length direction of the continuously arranged measurement cells may be perpendicular to a line connecting the center point of the cluster gi and the center point of the cluster gj.
  • the length direction of the continuously arranged measurement cells may also not be perpendicular to the line connecting the center point of the cluster gi and the center point of the cluster gj (the angle between the length direction and the line connecting the center points may range, for example, from 60° to 90°).
  • the two color dimensions included in the color brightness three-dimensional space are a first color dimension and a second color dimension, wherein the first color dimension coordinate of the center point of any cluster is equal to the average value of the first color dimension coordinates of all color brightness samples in that cluster, and the second color dimension coordinate of the center point of that cluster is equal to the average value of the second color dimension coordinates of all color brightness samples in that cluster.
  • for example, the first color dimension coordinate of the center point of the cluster gi is equal to the average value of the first color dimension coordinates of all color brightness samples in the cluster gi, and the second color dimension coordinate of the center point of the cluster gi is equal to the average value of the second color dimension coordinates of all color brightness samples in the cluster gi.
  • in some embodiments, the number of clusters P is successively set to candidate values P1, P2, …, PX, and it is estimated that Y light sources with different color temperatures exist in the shooting environment, where X is an integer greater than 2 and Y is an integer greater than 2 and less than or equal to X.
  • a first blending degree of the number of clusters P1 corresponding to each layer in the k layers is calculated; a second blending degree of the number of clusters P1 corresponding to the image is determined based on the first blending degrees of the number of clusters P1 corresponding to each layer in the k layers; and the second blending degree of the corresponding number of clusters P1 of the image is compared with the blending degree threshold of the number of clusters P1.
  • likewise, a first blending degree of the number of clusters P2 corresponding to each of the k layers is calculated; a second blending degree of the number of clusters P2 corresponding to the image is determined based on these first blending degrees; and the second blending degree of the corresponding number of clusters P2 of the image is compared with the blending degree threshold of the number of clusters P2.
  • in the case where the second blending degree of the corresponding number of clusters P1 of the image is greater than the blending degree threshold of the number of clusters P1, and the second blending degree of the corresponding number of clusters P2 of the image is less than or equal to the blending degree threshold of the number of clusters P2, it can be estimated that P1 light sources with different color temperatures exist in the shooting environment of the image.
  • in the case where the second blending degree of the corresponding number of clusters P1 of the image is greater than the blending degree threshold of the number of clusters P1, and the second blending degree of the corresponding number of clusters P2 of the image is also greater than the blending degree threshold of the number of clusters P2, the relative differences are compared: if the difference for the number of clusters P2 (which can be expressed as (the second blending degree of the corresponding number of clusters P2 of the image - the blending degree threshold of the number of clusters P2) / that blending degree threshold) is greater than the corresponding relative difference for the number of clusters P1, it is estimated that the shooting environment corresponding to the image includes P2 light sources with different color temperatures.
  • an embodiment of the present application provides another method for estimating a light source, which may include:
  • Step S1 Divide the image into m sub-blocks, where m is an integer greater than 1.
  • Step S2 Obtain m color brightness information groups of the m sub-blocks, where each color brightness information group corresponds to one sub-block, and the color brightness information group includes brightness information and color information.
  • Step S3 Map the m color brightness information groups to a color brightness three-dimensional space to obtain m color brightness samples located in the color brightness three-dimensional space, each color brightness sample point corresponding to one color brightness information group.
  • the color brightness three-dimensional space includes two color dimensions and one brightness dimension.
  • Step S4 Divide the m color brightness samples into k layers along the brightness dimension.
  • Step S5 Assign a value that has not been selected from the set of candidate numbers of light sources with different color temperatures to P.
  • Step S6 Divide each of the k layers into P color brightness sample point clusters, and calculate a first blending degree of the number of clusters P corresponding to each of the k layers.
  • Each of the k layers corresponds to a brightness interval, where P is an integer greater than 1, and k is a positive integer.
  • Step S7 Determine a second blending degree of the number of clusters P corresponding to the image based on the first blending degree of the number of clusters P corresponding to each of the k layers.
  • Step S8 Compare the second blending degree of the corresponding number of clusters P of the image with the blending degree threshold. In a case where the second blending degree of the corresponding number of clusters P of the image is greater than the blending degree threshold, it is estimated that there are at least P light sources with different color temperatures in the shooting environment corresponding to the image. Return to step S5.
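  • Steps S4 through S8 can be sketched end to end as below, for samples already mapped into the color brightness space (steps S1 to S3). A small k-means stands in for the unspecified clustering step, the Q/T measure is reduced to one-dimensional bins along the center line, the pairwise degrees are averaged rather than summed for scale invariance, and the candidate set, layer count, bin count, and threshold are all assumed values.

```python
import numpy as np

def estimate_light_sources(samples, candidates=(2, 3), k=2, threshold=0.5, T=16):
    """Runnable sketch of steps S4-S8 on (m, 3) samples in (c1, c2, y) space.
    Returns the largest candidate P whose second blending degree exceeds the
    threshold, or None if no candidate does."""
    rng = np.random.default_rng(0)

    def kmeans(pts, P):  # S6: cluster one layer into P clusters
        centers = pts[rng.choice(len(pts), P, replace=False)].copy()
        for _ in range(10):
            lbl = np.argmin(((pts[:, None, :] - centers) ** 2).sum(-1), axis=1)
            for p in range(P):
                if (lbl == p).any():
                    centers[p] = pts[lbl == p].mean(axis=0)
        return [pts[lbl == p] for p in range(P)]

    def pair_degree(gi, gj):  # simplified Q/T between two clusters
        ci, cj = gi.mean(0), gj.mean(0)
        v = cj - ci
        L = np.linalg.norm(v)
        if L == 0:
            return 1.0  # coincident centers: fully blended
        t = (np.vstack([gi, gj]) - ci) @ (v / L)
        inside = (t >= 0) & (t <= L)
        bins = np.minimum((t[inside] / L * T).astype(int), T - 1)
        return len(set(bins.tolist())) / T

    # S4: split into k layers along brightness (equal-height intervals)
    y = samples[:, 2]
    edges = np.linspace(y.min(), y.max() + 1e-9, k + 1)
    layers = [samples[(y >= edges[i]) & (y < edges[i + 1]), :2]
              for i in range(k)]

    best = None
    for P in candidates:  # S5: try each candidate light source count
        first = []
        for layer in layers:
            if len(layer) < P:
                continue
            clusters = [c for c in kmeans(layer, P) if len(c)]
            degs = [pair_degree(a, b)
                    for i, a in enumerate(clusters)
                    for b in clusters[i + 1:]]
            if degs:
                first.append(sum(degs) / len(degs))  # S6: first blending degree
        second = sum(first) / max(len(first), 1)     # S7: second blending degree
        if second > threshold:                       # S8: at least P sources
            best = P
    return best
```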
  • an embodiment of the present application further provides an image processing method, including:
  • the light source estimation method according to any one of the first aspect or the second aspect is performed.
  • the image is corrected according to P light sources with different color temperatures, wherein the correction includes at least one of the following corrections: automatic white balance correction, color correction, saturation correction, or contrast correction.
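  • As a toy illustration of how the estimated light sources could drive white balance correction, the following derives per-source channel gains from cluster centers given as (R/G, B/G) pairs using the gray-world idea that a neutral surface should satisfy R = G = B; the text does not specify this mapping, so it is entirely an assumption.

```python
def awb_gains_per_source(cluster_centers):
    """For each estimated light source, given its cluster center as an
    (R/G, B/G) pair, return (R, G, B) channel gains that would neutralize
    a gray surface lit by that source."""
    gains = []
    for rg, bg in cluster_centers:
        gains.append((1.0 / rg, 1.0, 1.0 / bg))  # (R, G, B) channel gains
    return gains
```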
  • an embodiment of the present application further provides a light source estimation device.
  • the light source estimation device may include a segmentation unit, an acquisition unit, a mapping unit, a calculation unit, a determination unit, a comparison unit, and an estimation unit.
  • the segmentation unit is configured to segment an image into m sub-blocks, where m is an integer greater than 1.
  • the obtaining unit is configured to obtain m color brightness information groups of the m sub-blocks, each color brightness information group corresponding to one sub-block, and the color brightness information group includes brightness information and color information.
  • a mapping unit configured to map the m color brightness information groups to a color brightness three-dimensional space to obtain m color brightness samples located in the color brightness three-dimensional space, where each color brightness sample corresponds to one color brightness information group, and the color brightness three-dimensional space includes two color dimensions and one brightness dimension.
  • a calculation unit configured to divide the m color brightness sample points into k layers along the brightness dimension, divide each of the k layers into P color brightness sample clusters, and calculate the first blending degree of the number of clusters P corresponding to each of the k layers.
  • Each of the k layers corresponds to a brightness interval (continuous brightness interval).
  • P is an integer greater than 1
  • k is a positive integer.
  • the determining unit is configured to determine a second blending degree of the number of clusters P corresponding to the image based on a first blending degree of the number of clusters P corresponding to each of the k layers.
  • the comparison unit is configured to compare the second blending degree of the corresponding grouping number P of the image with the blending degree threshold.
  • an estimation unit configured to, when the second blending degree of the corresponding number of clusters P of the image is greater than the blending degree threshold, estimate that at least P light sources with different color temperatures exist in the shooting environment corresponding to the image.
  • the estimation unit may be further configured to, when the second blending degree of the corresponding number of clusters P of the image is less than or equal to the blending degree threshold, estimate that the number of light sources with different color temperatures in the shooting environment corresponding to the image is not P.
  • the determining unit is specifically configured to perform summing or weighted summing processing on the first blending degrees of the number of clusters P corresponding to each of the k layers to obtain the second blending degree of the number of clusters P corresponding to the image.
  • the P is greater than 2; in terms of calculating a first blending degree of the number of clusters P corresponding to the i-th layer in the k layers, the calculation unit is specifically configured to: calculate a third blending degree between every two clusters in the P clusters of the i-th layer; and sum or weighted-sum the third blending degrees between every two clusters in the P clusters to obtain the first blending degree of the number of clusters P corresponding to the i-th layer, where the i-th layer is any one of the k layers.
  • the calculation unit is specifically configured to:
  • a continuous array of measurement cells is inserted between the center point of the cluster gi and the center point of the cluster gj, and the number of the continuous array measurement cells is T.
  • the cluster gi and the cluster gj are any two of the P clusters in the i-th layer.
  • counting the number Q of measurement cells that contain color brightness sample points belonging to the cluster gi or the cluster gj; and determining that the third blending degree between the cluster gi and the cluster gj is Q/T, where T and Q are integers, T is greater than 0, and Q is greater than or equal to 0.
  • a length direction of the continuously arranged measurement cells is perpendicular to a line connecting a center point of the cluster gi and a center point of the cluster gj.
  • the two color dimensions included in the three-dimensional color luminance space are a first color dimension and a second color dimension, wherein a first color dimension coordinate of a center point of any cluster is equal to the any cluster The average value of the first color dimension coordinates of all color brightness samples in the cluster, and the second color dimension coordinate of the center point of the any cluster is equal to the average of the second color dimension coordinates of all color brightness samples in the any cluster value.
  • the calculation unit, the determination unit, the comparison unit, and the estimation unit may perform the calculation, determination, comparison, and estimation for each of X candidate numbers of clusters, where X is an integer greater than 2, and Y is an integer greater than 2 and less than or equal to X.
  • an embodiment of the present application further provides a light source estimation device.
  • the light source estimation device may include a segmentation circuit, an acquisition circuit, a mapping circuit, a calculation circuit, a determination circuit, a comparison circuit, and an estimation circuit.
  • a segmentation circuit is used to segment an image into m sub-blocks, where m is an integer greater than 1.
  • the obtaining circuit is configured to obtain m color brightness information groups of the m sub-blocks, each color brightness information group corresponding to one sub-block, and the color brightness information group includes brightness information and color information.
  • a mapping circuit configured to map the m color brightness information groups to a color brightness three-dimensional space to obtain m color brightness samples located in the color brightness three-dimensional space, where each color brightness sample corresponds to one color brightness information group, and the color brightness three-dimensional space includes two color dimensions and one brightness dimension.
  • a calculation circuit configured to divide the m color brightness sample points into k layers along the brightness dimension, divide each of the k layers into P color brightness sample clusters, and calculate the first blending degree of the number of clusters P corresponding to each of the k layers.
  • Each of the k layers corresponds to a brightness interval (continuous brightness interval), where P is an integer greater than 1, and k is a positive integer.
  • a determining circuit is configured to determine a second blending degree of the number of clusters P corresponding to the image based on a first blending degree of the number of clusters P corresponding to each of the k layers.
  • the comparison circuit is configured to compare the second blending degree of the corresponding number of clusters P of the image with the blending degree threshold.
  • An estimation circuit is configured to estimate that at least P light sources with different color temperatures exist in the shooting environment corresponding to the image when the second blending degree of the corresponding clustering number P of the image is greater than the blending degree threshold.
  • the determining circuit is specifically configured to perform summing or weighted summing processing on the first blending degrees of the number of clusters P corresponding to each of the k layers to obtain the second blending degree of the number of clusters P corresponding to the image.
  • the P is greater than 2; in terms of calculating a first blending degree of the number of clusters P corresponding to the i-th layer in the k layers, the calculation circuit is specifically configured to: calculate a third blending degree between every two clusters in the P clusters of the i-th layer; and sum or weighted-sum the third blending degrees between every two clusters in the P clusters to obtain the first blending degree of the number of clusters P corresponding to the i-th layer, where the i-th layer is any one of the k layers.
  • when the P is equal to 2, in terms of calculating a first blending degree of the number of clusters P corresponding to the i-th layer in the k layers, the calculation circuit may be specifically configured to: calculate a third blending degree between the two clusters of the i-th layer, wherein the first blending degree of the number of clusters P corresponding to the i-th layer is equal to the third blending degree between the two clusters.
  • the i-th layer is any one of the k layers.
  • the calculation circuit is specifically configured to:
  • a continuous array of measurement cells is inserted between the center point of the cluster gi and the center point of the cluster gj, and the number of the continuously arrayed measurement cells is T; where the cluster gi and the cluster gj are the i-th Any two of the P groupings of the layer;
  • counting the number Q of measurement cells that contain color brightness sample points belonging to the cluster gi or the cluster gj; and determining that the third blending degree between the cluster gi and the cluster gj is Q/T, where T and Q are integers, T is greater than 0, and Q is greater than or equal to 0.
  • a length direction of the continuously arranged measurement cells is perpendicular to a line connecting a center point of the cluster gi and a center point of the cluster gj.
  • the two color dimensions included in the three-dimensional color luminance space are a first color dimension and a second color dimension, wherein a first color dimension coordinate of a center point of any cluster is equal to the any cluster The average value of the first color dimension coordinates of all color brightness samples in the cluster, and the second color dimension coordinate of the center point of the any cluster is equal to the average of the second color dimension coordinates of all color brightness samples in the any cluster value.
  • the calculation circuit, the determination circuit, the comparison circuit, and the estimation circuit may perform the calculation, determination, comparison, and estimation for each of X candidate numbers of clusters, where X is an integer greater than 2, and Y is an integer greater than 2 and less than or equal to X.
  • an embodiment of the present application further provides an image processing apparatus, including:
  • the light source estimation device is any one of the light source estimation devices provided in the fourth aspect or the fifth aspect.
  • the correction device is configured to correct the image according to P light sources with different color temperatures, and the correction includes at least one of the following corrections: automatic white balance correction, color correction, saturation correction, or contrast correction.
  • the correction device may be, for example, an image signal processor (ISP).
  • the ISP may include at least one of the following correction circuits: an automatic white balance correction circuit, a color correction circuit, a saturation correction circuit, or a contrast correction circuit.
  • an embodiment of the present application further provides a light source estimation device.
  • the light source estimation device includes a processor and a memory coupled to each other.
  • the memory stores a computer program.
  • the processor is configured to call the computer program stored in the memory to execute any one of the light source estimation methods provided in the first aspect or the second aspect.
  • an embodiment of the present application further provides an image processing apparatus.
  • the image processing apparatus includes a processor and a memory coupled to each other.
  • the memory stores a computer program.
  • the processor is configured to call the computer program stored in the memory to execute any one of the image processing methods provided by the third aspect.
  • an embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and the computer program is executed by related hardware to complete any one of the light source estimation methods provided in the first aspect or the second aspect.
  • an embodiment of the present application further provides a computer-readable storage medium. The computer-readable storage medium stores a computer program, and the computer program is executed by related hardware to complete any one of the image processing methods provided by the third aspect.
  • an embodiment of the present application further provides a computer program product, and when the computer program product runs on a computer, the computer is caused to execute any one of the light source estimation methods provided in the first aspect or the second aspect.
  • an embodiment of the present application further provides a computer program product, and when the computer program product runs on a computer, the computer is caused to execute any one of the image processing methods provided in the third aspect.
  • FIG. 1A is a schematic diagram of a device system architecture according to an example of the present application.
  • FIG. 1B is a schematic structural diagram of an image processing component provided as an example in an embodiment of the present application.
  • FIG. 1C is a color temperature level division method of a standard light source provided by way of example in the embodiment of the present application.
  • FIG. 2 is a schematic diagram of mapping a color information group to a two-dimensional color coordinate plane according to an example of the present application.
  • FIG. 3A and FIG. 3B are schematic diagrams of the sub-block division of several images provided by way of example in the embodiment of the present application.
  • FIG. 4 is a schematic diagram of the distribution of color brightness samples in a mixed color temperature light source scene provided by way of example in the embodiment of the present application.
  • FIG. 5 is a schematic diagram of the distribution of color brightness samples in a monochrome temperature light source scene provided by way of example in the embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a light source estimation method according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of color brightness sample points layered along the brightness dimension according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of clustering of color brightness samples according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of inserting a continuously arranged measurement cell between the center points of two color brightness sample point clusters according to an embodiment of the present application.
  • FIG. 10 is a schematic flowchart of an image processing method according to an embodiment of the present application.
  • FIG. 11A is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
  • FIG. 11B is a schematic structural diagram of another image processing apparatus according to an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of a light source estimation device according to an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of another light source estimation device according to an embodiment of the present application.
  • FIG. 14 is a schematic diagram of another light source estimation device according to an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
  • FIG. 16 is a schematic structural diagram of another image processing apparatus according to an embodiment of the present application.
  • FIG. 1A is a schematic diagram of a system architecture provided by the present application.
  • the system 100 includes a camera 110, an image processing unit 120, and a memory 130.
  • the camera 110 is configured to capture an image (original image).
  • the image processing unit 120 is configured to perform some related processing on the image obtained by the camera 110 (for example, perform automatic white balance correction, color correction, saturation correction, and / or contrast correction on the image obtained by the camera 110).
  • the memory 130 is used to store some program code or data related to image processing.
  • the image processing component 120 may be composed of one or more processors; or, the image processing component 120 may include both one or more processors and some hardware circuits; or, the image processing component 120 may not include a processor but may include some hardware circuits.
  • the image processing section 120 includes, for example, an image signal processor 121, a light source estimation device 122, and the like.
  • the light source estimation device 122 can be used to estimate the existence of a light source in the image shooting environment (for example, it can be estimated whether a monochrome temperature light source or a mixed color temperature light source exists in the image shooting environment).
  • the estimation result, output by the light source estimation device 122, of the light source existence in the image shooting environment can be used by the image signal processor 121 to perform related image processing (for example, automatic white balance correction, color correction, saturation correction, and/or contrast correction of the image).
  • the camera 110, the image signal processor 121, the light source estimation device 122, and the like are physically set independently as an example.
  • some components may be physically integrated as a whole.
  • the light source estimation device 122 may be integrated into the image signal processor 121, and the light source estimation device 122 may also be integrated into the camera 110.
  • the image signal processor may have the function of the light source estimation device described in the above example, and the image signal processor integrated with the light source estimation device can still be called an image signal processor.
  • the camera can also have the function of the light source estimation device described in the above example, and the camera integrated with the light source estimation device can still be called a camera, and so on in other cases.
  • the standard light source generally refers to a light source whose color temperature is a standard color temperature.
  • the color temperature level of a standard light source can be divided into three levels: high color temperature, medium color temperature, and low color temperature, and each of these levels can be further divided into several sub-levels.
  • FIG. 1C illustrates a color temperature level division method of a standard light source by way of example. In the example shown in FIG. 1C, the color temperature is divided into 10 sub-levels; several light sources are classified as high color temperature, CWF, TL84, and U30 are classified as medium color temperature, and A and H are classified as low color temperature. The color of high color temperature light sources is light blue, and the color of low color temperature light sources is light yellow.
  • the image is cut into n × m sub-blocks, and all pixels of each sub-block are added up to obtain the color average value (R, G, B) of the sub-block. Based on this color average value, the color information group (R/G, B/G) of the sub-block can be obtained; alternatively, (R, G, B) can be converted to the (Y, Cb, Cr) or (Y, U, V) color space, in which case the color information group can also be expressed as (Cb, Cr) or (U, V).
  • the color information group of each sub-block is mapped onto a two-dimensional color coordinate plane (for example, as shown in FIG. 2), forming color sample points located in the two-dimensional color coordinate plane, where each color sample point corresponds to one color information group.
  • a block and a sub-block are relative concepts: dividing a block (image block) yields the sub-blocks of that block, and further dividing a sub-block yields the sub-blocks of that sub-block. That is, a block is composed of sub-blocks, and sub-blocks are obtained by segmenting a block. Of course, both blocks and sub-blocks can be called blocks (image blocks).
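As a minimal Python sketch of the sub-block statistics described above (splitting an image into n × m sub-blocks and deriving each sub-block's (R/G, B/G) color information group from the per-channel averages), assuming a NumPy (H, W, 3) image layout; the function name and test data are illustrative, not taken from this application:

```python
import numpy as np

def color_info_groups(image, n, m):
    """Split an (H, W, 3) RGB image into n x m sub-blocks and return an
    (n, m, 2) array of (R/G, B/G) color information groups."""
    h, w, _ = image.shape
    groups = np.zeros((n, m, 2))
    for i in range(n):
        for j in range(m):
            block = image[i * h // n:(i + 1) * h // n,
                          j * w // m:(j + 1) * w // m]
            r, g, b = block.reshape(-1, 3).mean(axis=0)  # color average (R, G, B)
            groups[i, j] = (r / g, b / g)                # color information group
    return groups

img = np.full((8, 8, 3), (120.0, 60.0, 30.0))  # uniform test image
print(color_info_groups(img, 2, 2)[0, 0])      # each sub-block gives (R/G, B/G) = (2.0, 0.5)
```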
  • each small dot represents the color sample point of one sub-block's color information group
  • nine large dots represent calibration points of nine standard light sources.
  • the color samples within the range enclosed by the dotted line can be regarded as close enough to the nine light sources.
  • These color samples can be regarded as valid color samples.
  • the effective color samples can be used as the basis for subsequent calculations. The color samples outside the range enclosed by the dotted line can be regarded as invalid color samples, and invalid color samples are not used as the basis for subsequent calculations.
  • all color samples can also be regarded as valid color samples.
  • the main goal of the above AWB correction method is to calculate the color temperature of the light source. Therefore, all color samples within the range enclosed by the dashed line can be summed and averaged to obtain Avg(R/G, B/G); the color temperature of the only light source existing in the image shooting environment is then estimated according to the relative positional relationship between the calibration points of the standard light sources (such as the calibration points of the nine standard light sources in FIG. 2) and Avg(R/G, B/G); based on Avg(R/G, B/G) and the estimated color temperature of this only light source, the gain values of the three RGB channels, namely (R-gain, G-gain, B-gain), are obtained by conversion; finally, (R-gain, G-gain, B-gain) is multiplied by the (R, G, B) of each pixel in the image to correct the color cast caused by this sole light source, thus completing the AWB correction.
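The averaging and gain steps above can be sketched as follows; this is a simplified illustration that assumes the gains simply normalize the average chromaticity so that R = G = B for the estimated illuminant (the conversion in a real ISP may be more involved, and all names are illustrative):

```python
def awb_gains(avg_rg, avg_bg):
    """Derive (R-gain, G-gain, B-gain) from Avg(R/G, B/G), keeping G fixed."""
    return (1.0 / avg_rg,  # pushes the average R/G toward 1
            1.0,
            1.0 / avg_bg)  # pushes the average B/G toward 1

def apply_gains(pixel, gains):
    """Multiply a pixel's (R, G, B) by the per-channel gains."""
    return tuple(c * k for c, k in zip(pixel, gains))

# A warm color cast: on average, too much red and too little blue.
gains = awb_gains(avg_rg=2.0, avg_bg=0.5)
print(apply_gains((120.0, 60.0, 30.0), gains))  # -> (60.0, 60.0, 60.0), neutral gray
```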
  • the above AWB correction method can estimate the color temperature of the only light source in the image shooting environment (for example, it can specifically output 5000 K as the color temperature of this unique light source). However, when there are multiple light sources in the shooting environment and the color temperature of each light source is different (for example, light sources with high, medium, and low color temperatures exist at the same time), the above AWB correction method cannot effectively determine whether there are multiple light sources with different color temperatures in the image shooting environment; it can usually only estimate the color temperature of a single assumed light source as the basis for correcting the color cast of the image.
  • the only light source can be called a single light source.
  • multiple different color temperatures can be called mixed color temperatures.
  • Multiple different color temperature light sources can be called mixed color temperature light sources.
  • the embodiments of the present application further provide some light source estimation methods. These light source estimation methods strive to estimate whether there are multiple light sources with different color temperatures in the image shooting environment.
  • the inventors of the present application have found through extensive research that, when there are multiple light sources with different color temperatures in the image shooting environment, the physical characteristics of the projections of these different color temperature light sources usually differ from those of a single light source's projection. Therefore, the physical characteristics of the light source projection can be analyzed to determine whether there are multiple light sources with different color temperatures in the image shooting environment, and this idea helps to overcome misjudgment of the light source color temperature caused by the color of the object itself.
  • some solutions in the embodiments of the present application use color information and brightness information of an image to estimate whether there are two or more light sources with different color temperatures in an image shooting environment.
  • the color and brightness information of an image can be obtained, for example, by dividing the image into m sub-blocks, where m is an integer greater than 1, and acquiring m color luminance information groups of the m sub-blocks. Each color luminance information group corresponds to one sub-block (that is, there is a one-to-one correspondence between the m sub-blocks and the m color luminance information groups), and each color luminance information group includes luminance information and color information.
  • the image may be an original image obtained by a camera or other images.
  • the color information and brightness information of the image include m color brightness information groups of m sub-blocks of the image.
  • m may be equal to 2, 3, 4, 8, 12, 16, 32, 64, 128, 256, or other values.
  • the sizes of the m sub-blocks may be all the same, partially the same, or different from each other.
  • the shape of the sub-blocks can be square, rectangular, or other shapes.
  • in the example shown in FIG. 3A, the image is divided into m sub-blocks of the same size, and in the example shown in FIG. 3B, the image is divided into m sub-blocks of different sizes.
  • the sub-block segmentation method of the image is not limited to the method illustrated in FIGS. 3A and 3B.
  • the color information may be expressed in one of the following forms: (R/G, B/G), (Cb, Cr), or (U, V), and the brightness information may be expressed as a BV (brightness value). Therefore, the color luminance information group of a sub-block can be expressed, for example, as (R/G, B/G, BV), (Cb, Cr, BV), or (U, V, BV).
  • the m color brightness information groups may be mapped to a color brightness three-dimensional space to obtain m color brightness sample points located in the color brightness three-dimensional space.
  • Each color brightness sample point corresponds to one color brightness information group (that is, there is a one-to-one correspondence between the m color brightness sample points and the m color brightness information groups in the color brightness three-dimensional space), and the color brightness three-dimensional space includes two color dimensions and one brightness dimension. Further, by performing clustering processing on the m color brightness sample points located in the color brightness three-dimensional space, multiple color brightness sample point clusters can be obtained. Alternatively, the m color brightness sample points located in the color brightness three-dimensional space are first layered along the brightness dimension, and the color brightness sample points of each layer are then clustered, so that multiple color brightness sample point clusters can be obtained for each layer.
  • the inventors of the present application analyzed the distribution characteristics of relevant color brightness sample points and found that, in a mixed color temperature scene, there is a strong degree of blending between the color brightness sample point clusters, whereas in a monochrome temperature scene, there is a weaker degree of blending between the color brightness sample point clusters.
  • the blending degree between two color luminance sample point clusters can indicate the closeness of the connection between the two clusters. The greater the blending degree between the two clusters, the closer the connection between the color brightness sample points of the two clusters; the smaller the blending degree between the two clusters, the looser the connection between the color brightness sample points of the two clusters.
  • this is illustrated below through two experimental examples.
  • the right image in FIG. 4 is a captured image.
  • the shooting environment is illuminated both by high color temperature daylight through the window and by a low color temperature A light source; that is, mixed color temperatures exist in the shooting environment.
  • the left figure in FIG. 4 shows the distribution of the color brightness samples of each sub-block of the image in the three-dimensional space of color brightness.
  • the horizontal plane of the color brightness three-dimensional space carries the two color dimensions, and the vertical direction carries the brightness dimension.
  • the 9 points at the upper left are the calibration points of the standard light sources, and the points at the lower left are the color brightness sample points of the sub-blocks of the image on the right.
  • the color brightness sample points are divided into two clusters (Group1 and Group2): Group1 belongs to a high color temperature, Group2 belongs to a low color temperature, and there is obvious blending between the two clusters.
  • this blending characteristic is mainly caused by light sources with high and low color temperatures being projected onto the same object (such as the floor), where the two light sources produce a fusion effect in both color and brightness.
  • the right image of Figure 5 is the captured image.
  • in the shooting environment there is only a single high color temperature daylight source and no other artificial light source; that is, no mixed color temperatures exist in the shooting environment.
  • the color brightness samples of this wood texture wall are distributed in the low color temperature region.
  • the left figure in FIG. 5 shows the distribution of the color brightness sample points of each sub-block in the color brightness three-dimensional space.
  • the 9 points on the upper left are labeled points of the standard light source, and the points on the lower left are sample points of the color brightness of each sub-block on the right.
  • here too, Group1 belongs to a high color temperature and Group2 belongs to a low color temperature; however, the blending between the two clusters is weak.
  • FIG. 6 is a schematic flowchart of a light source estimation method according to an embodiment of the present application.
  • a light source estimation method may be implemented in the system architecture shown in FIG. 1A or FIG. 1B.
  • the light source estimation method may be mainly performed by the image processing unit 120, and specifically, for example, by the light source estimation device in the image processing unit 120.
  • the method can specifically include the following steps:
  • the image may be an original image or another image captured by a camera.
  • Each color luminance information group corresponds to one sub-block (that is, a one-to-one correspondence between the m sub-blocks and m color luminance information groups), and the color luminance information group includes luminance information and color information.
  • each color brightness sample point corresponds to a color brightness information group, that is, there is a one-to-one correspondence between m color brightness sample points and m color brightness information groups in a color brightness three-dimensional space.
  • the three-dimensional space of color brightness includes two color dimensions and one brightness dimension.
  • for example, when m is equal to 128, the image is cut into 128 sub-blocks, so the color brightness information groups of the 128 sub-blocks can be obtained, for a total of 128 color brightness information groups. Mapping the 128 color brightness information groups into the color brightness three-dimensional space yields 128 color brightness sample points. Each color brightness sample point corresponds to one color brightness information group, and each color brightness information group corresponds to one sub-block.
  • the m color brightness sample points are divided into k layers along the brightness dimension, and each of the k layers is divided into P color brightness sample groupings.
  • for the clustering, any one of the following clustering algorithms may be used: the k-means algorithm, a hierarchical clustering algorithm, or a density-based clustering algorithm (for example, DBSCAN). Of course, other clustering algorithms may also be used for clustering the color brightness sample points, which is not specifically limited in this embodiment.
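As one concrete possibility, the clustering of one layer's color sample points could be sketched with a small k-means loop (any of the algorithms listed above could be substituted); pure Python, with illustrative names and data:

```python
import math
import random

def kmeans(points, P, iters=20, seed=0):
    """Cluster 2-D points into P clusters; returns (clusters, centers)."""
    rng = random.Random(seed)
    centers = rng.sample(points, P)  # initialize centers from the samples
    for _ in range(iters):
        clusters = [[] for _ in range(P)]
        for p in points:  # assign each point to its nearest center
            idx = min(range(P), key=lambda i: math.dist(p, centers[i]))
            clusters[idx].append(p)
        centers = [tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]  # recompute cluster centers
    return clusters, centers

# Two well-separated groups of (R/G, B/G) samples from one brightness layer.
layer = [(0.5, 1.6), (0.52, 1.58), (1.8, 0.4), (1.82, 0.42)]
clusters, centers = kmeans(layer, P=2)
print(sorted(len(c) for c in clusters))  # -> [2, 2]
```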
  • when k is equal to 1, the m color luminance sample points are all divided into the same layer, so the layering action is not actually performed.
  • Each of the k layers corresponds to a brightness interval.
  • P is an integer greater than 1.
  • K is a positive integer.
  • P may be equal to 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 13, 15, 20, or 35 or other values.
  • k may be equal to 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 13, 17, or other values.
  • the number of color brightness samples included in different layers of the k layers may be all the same, partly the same, or different from each other.
  • the heights of the brightness intervals corresponding to different layers in the k layers may be all the same, partially the same, or different from each other.
  • the color brightness sample points are divided into k layers along the brightness dimension, with the layers ordered from dark to bright, and the numbers of color brightness sample points in the layers are S1, S2, ..., Sk, respectively.
  • the number of color brightness samples in different layers may be all the same, some of them may be the same or different from each other.
  • the number of color brightness sample points in brighter layers may be larger or smaller; for example, the number of color brightness sample points in each layer may be made equal.
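The equal-count layering mentioned above could be sketched by sorting the samples along the brightness dimension and splitting them, from dark to bright, into k layers of (roughly) equal size; the names and data are illustrative:

```python
def layer_by_brightness(samples, k):
    """samples: list of (rg, bg, bv) tuples; returns k layers, dark to bright."""
    ordered = sorted(samples, key=lambda s: s[2])  # sort along the brightness dimension
    base, extra = divmod(len(ordered), k)
    layers, start = [], 0
    for i in range(k):
        size = base + (1 if i < extra else 0)  # spread any remainder over early layers
        layers.append(ordered[start:start + size])
        start += size
    return layers

samples = [(1.0, 1.0, bv) for bv in (9, 2, 7, 4, 5, 1, 8, 3)]
layers = layer_by_brightness(samples, k=2)
print([s[2] for s in layers[0]])  # darkest layer holds BV values [1, 2, 3, 4]
```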
  • the blending degree threshold here is a blending degree threshold corresponding to the number of clusters P, that is, when the value of the number of clusters P is different, the blending threshold may also be different.
  • the blending degree threshold may not correspond to the number of clusters P, that is, when the value of the number of clusters P is different, the blending threshold may not change.
  • the blending threshold may be an empirical value or obtained based on experimental data, which is not limited in this application.
  • if the second blending degree of the image corresponding to the number of clusters P is greater than the blending degree threshold, it is estimated that there are P light sources with different color temperatures in the shooting environment of the image; that is, a suitable light source number P is obtained.
  • if the second blending degree of the image corresponding to the number of clusters P is less than or equal to the blending degree threshold, it may be estimated that there are no P light sources with different color temperatures in the shooting environment corresponding to the image.
  • the present application may further combine other technical means to estimate whether there is only a single light source or multiple light sources with different color temperatures in the shooting environment corresponding to the image.
  • the blending degree between the two color luminance sample clusters can indicate the closeness of the connection between the two color luminance sample clusters.
  • the greater the blending degree between the two clusters, the closer the connection between the color brightness sample points of the two clusters; the smaller the blending degree between the two clusters, the looser the connection between the color brightness sample points of the two clusters.
  • this light source estimation method maps the color brightness information groups of the image's sub-blocks into the color brightness three-dimensional space to obtain multiple color brightness sample points, performs layered clustering processing on these sample points, calculates the blending degree of the whole image from the blending degrees of the color brightness sample point clusters of each layer, and on this basis estimates whether there are multiple light sources with different color temperatures in the shooting environment corresponding to the image. This lays a foundation for image correction in scenarios that include multiple light sources with different color temperatures: for example, it makes targeted image correction possible when multiple light sources with different color temperatures are present, which in turn helps to improve the rationality of image correction.
  • for example, for each value of P in 2, ..., Y, it may be evaluated whether there are P light sources with different color temperatures in the shooting environment. Not every value of P is determined as the final number of light sources; rather, each value of P is a hypothesized light source number, and this value is applied to the light source estimation method to determine whether the hypothesis is appropriate.
  • based on the first blending degree corresponding to the number of clusters P1, the second blending degree of the image corresponding to the number of clusters P1 is determined, and the second blending degree of the image corresponding to the number of clusters P1 is compared with the blending degree threshold of the number of clusters P1.
  • when the second blending degree of the image corresponding to the number of clusters P1 is greater than the blending degree threshold of the number of clusters P1, and the second blending degree of the image corresponding to the number of clusters P2 is less than or equal to the blending degree threshold of the number of clusters P2, it can be estimated that there are P1 light sources with different color temperatures in the shooting environment of the image.
  • when the second blending degree of the image corresponding to the number of clusters P1 is greater than the blending degree threshold of the number of clusters P1, and the second blending degree of the image corresponding to the number of clusters P2 is also greater than the blending degree threshold of the number of clusters P2, the two candidates can be compared by their margins: if the difference between the second blending degree corresponding to the number of clusters P2 and the blending degree threshold of the number of clusters P2 (this difference can be expressed, for example, as (the image's second blending degree corresponding to the number of clusters P2 - the blending degree threshold of the number of clusters P2) / the blending degree threshold of the number of clusters P2) is greater than the corresponding difference for the number of clusters P1, it is estimated that the shooting environment corresponding to the image includes P2 light sources with different color temperatures. That is, the number of clusters whose second blending degree exceeds its corresponding blending degree threshold by the largest margin is selected.
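The selection rule described above, picking the candidate cluster number whose second blending degree exceeds its threshold by the largest relative margin, could be sketched as follows (all per-P values and names are hypothetical):

```python
def pick_light_source_count(second_blend, thresholds):
    """Both arguments map a candidate P -> value; returns the chosen P or None."""
    best_p, best_margin = None, 0.0
    for p, blend in second_blend.items():
        thr = thresholds[p]
        if blend > thr:
            margin = (blend - thr) / thr  # relative excess over the threshold
            if margin > best_margin:
                best_p, best_margin = p, margin
    return best_p

second_blend = {2: 0.36, 3: 0.50}  # hypothetical second blending degrees per P
thresholds = {2: 0.30, 3: 0.25}    # hypothetical blending degree thresholds per P
print(pick_light_source_count(second_blend, thresholds))  # -> 3
```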
  • determining the second blending degree of the image corresponding to the number of clusters P based on the first blending degree corresponding to the number of clusters P of each of the k layers may include: summing or weighted-summing the first blending degrees corresponding to the number of clusters P of the k layers to obtain the second blending degree of the image corresponding to the number of clusters P.
  • the weighted summation weight of the first blending degree corresponding to the number of clusters P of each layer can be determined based on the height of the brightness interval of each layer. For example, a layer whose brightness interval height is relatively larger may have a relatively larger weighted summation weight, and a layer whose brightness interval height is relatively smaller may have a relatively smaller weighted summation weight.
  • alternatively, the weighted summation weight of the first blending degree corresponding to the number of clusters P of each layer may be determined based on the number of color brightness sample points of each layer. For example, a layer with a relatively larger number of color brightness sample points may have a relatively larger weighted summation weight, and a layer with a relatively smaller number of color brightness sample points may have a relatively smaller weighted summation weight.
  • the weighted summation weight of each layer can also be determined based on other parameters.
  • calculating the first blending degree corresponding to the number of clusters P of each of the k layers may include: calculating the third blending degree between every two clusters among the P clusters of the i-th layer, and summing or weighted-summing the third blending degrees between every two of the P clusters to obtain the first blending degree corresponding to the number of clusters P of the i-th layer.
  • the i-th layer is any one of the k layers.
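The two aggregation steps, summing pairwise third blending degrees into a layer's first blending degree and (optionally weighted) summing the k layers' first blending degrees into the image's second blending degree, could be sketched as follows; all numeric values and names are hypothetical:

```python
from itertools import combinations

def first_blending_degree(pairwise_third, P):
    """pairwise_third maps a cluster pair (i, j), i < j, to its third blending degree."""
    return sum(pairwise_third[pair] for pair in combinations(range(P), 2))

def second_blending_degree(layer_firsts, weights=None):
    """Weighted sum of each layer's first blending degree (plain sum by default)."""
    if weights is None:
        weights = [1.0] * len(layer_firsts)
    return sum(w * f for w, f in zip(weights, layer_firsts))

layer1 = {(0, 1): 0.4, (0, 2): 0.1, (1, 2): 0.2}  # hypothetical pairwise values
layer2 = {(0, 1): 0.3, (0, 2): 0.0, (1, 2): 0.1}
firsts = [first_blending_degree(layer1, P=3), first_blending_degree(layer2, P=3)]
print(second_blending_degree(firsts, weights=[0.5, 0.5]))  # roughly 0.55
```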
  • alternatively, calculating the first blending degree corresponding to the number of clusters P of the i-th layer in the k layers may include: calculating the third blending degree between the two clusters of the i-th layer, wherein the first blending degree corresponding to the number of clusters P of the i-th layer is equal to the third blending degree between the two clusters.
  • the i-th layer is any one of the k layers.
  • calculating the third blending degree between cluster gi and cluster gj may include: inserting continuously arranged measurement cells between the center point of cluster gi and the center point of cluster gj, where the number of continuously arranged measurement cells is T; counting the number Q of measurement cells that contain color brightness sample points of clusters gi and gj; and determining that the third blending degree between cluster gi and cluster gj is Q/T.
  • the T and the Q are integers.
  • the T is greater than 0 and the Q is greater than or equal to 0.
  • the cluster gi and the cluster gj are any two clusters among the P clusters in the i-th layer.
  • alternatively, the third blending degree between every two clusters can also be calculated in the same color plane.
  • in this case, calculating the third blending degree between cluster gi and cluster gj includes: projecting cluster gi and cluster gj onto the same color plane (that is, the brightness values of the color brightness sample points in cluster gi and cluster gj are set to the same value).
  • Continuously arranged measurement cells are inserted between the center point of the cluster gi and the center point of the cluster gj that are projected onto the same color plane.
  • the number of the continuously arranged measurement cells is T.
  • the number Q of measurement cells containing color brightness sample points of clusters gi and gj is counted, and the third blending degree between cluster gi and cluster gj is determined to be Q/T.
  • the T is greater than 0 and the Q is greater than or equal to 0, and T and Q are integers.
  • the cluster gi and the cluster gj are any two clusters among the P clusters in the i-th layer.
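A simplified two-dimensional sketch of the Q/T measure: project the two clusters onto the same color plane, insert T contiguous measurement cells along the segment between the cluster center points, count the number Q of cells containing at least one sample point, and return Q/T. For brevity, this sketch assigns each sample to a cell via its projection onto the center-to-center segment and omits the cell-width check, so it simplifies the geometry described above; all names and data are illustrative.

```python
def third_blending_degree(cluster_a, cluster_b, T=10):
    """Approximate Q/T blending degree between two 2-D clusters of (x, y) points."""
    def center(pts):
        return (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))
    (x1, y1), (x2, y2) = center(cluster_a), center(cluster_b)
    dx, dy = x2 - x1, y2 - y1
    occupied = set()
    for px, py in cluster_a + cluster_b:
        # Position of the sample along the C1 -> C2 segment, normalized to [0, 1].
        t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
        if 0.0 <= t <= 1.0:
            occupied.add(min(int(t * T), T - 1))  # index of the cell it falls into
    return len(occupied) / T

gi = [(0.0, 0.0), (0.1, 0.0), (0.45, 0.0)]  # cluster gi, with one straggler sample
gj = [(0.9, 0.0), (1.0, 0.0)]               # cluster gj
print(third_blending_degree(gi, gj, T=10))  # -> 0.2 (2 of the 10 cells are occupied)
```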
  • for any color brightness sample point in cluster gi, the distance between the sample point and the center point of cluster gi is less than or equal to the distance between the sample point and the center point of cluster gj.
  • the length of a single measurement cell may be equal to, for example, Dist_D65_D50, Dist_D75_D65, Dist_D55_D50, or another empirical value.
  • Dist_D75_D65 represents the distance between the calibration points of the standard light sources D75 and D65 in the color plane.
  • Dist_D65_D50 represents the distance between the calibration points of the standard light sources D65 and D50 in the color plane.
  • Dist_D55_D50 represents the distance between the calibration points of the standard light sources D55 and D50 in the color plane; and so on in other cases.
  • the width of a single measurement cell may be equal to length × 1/32, length × 1/20, length × 1/16, length × 1/19, or another empirical value; for example, the width of a single measurement cell may be Dist_D65_D50/32 or Dist_D75_D65/32, and so on in other cases.
  • the third blending degree between every two clusters can be calculated in the color brightness three-dimensional space, or it can be calculated in the same color plane.
  • in the latter case, the two clusters are projected onto the same color plane to obtain all the sample points of the two clusters located on the same color plane.
  • in the three-dimensional case, a single measurement cell is a three-dimensional measurement cell having a length, a width, and a height; the height may be greater than or equal to the layer height of the layer where the two clusters are located.
  • in the planar case, a single measurement cell is a measurement cell with a length and a width but no height.
  • for the length and width of a single measurement cell, refer to the examples above.
  • suppose the center points of the two clusters projected onto the same color plane are C1 and C2 respectively.
  • the points in the figure represent the color brightness sample points of the two clusters.
  • each box indicates a measurement cell.
  • the number above a measurement cell indicates the number of color brightness sample points of the two clusters that fall into that cell: a 1 means one sample point of the two clusters falls into the cell, a 0 means no sample point falls into the cell, a 3 means three sample points fall into the cell, and so on in other cases.
  • some or all of the measurement cells inserted between C1 and C2 may contain color brightness sample points. If the counted number of measurement cells containing color brightness sample points is Q, and the total number of measurement cells inserted between C1 and C2 is T, then the third blending degree between the two clusters can be calculated as Q / T (which may also be expressed as (Q / T) × 100%).
  • the length direction of the continuously arranged measurement cells may be perpendicular to the line connecting the center point of the cluster gi and the center point of the cluster gj (for example, as shown in FIG. 9).
  • alternatively, the length direction of the continuously arranged measurement cells may not be perpendicular to that line; for example, the included angle between the length direction and the line connecting the two center points may range from 60° to 90°.
  • the two color dimensions included in the three-dimensional color brightness space are a first color dimension and a second color dimension. The first color dimension coordinate of the center point of the cluster gi is equal to the average of the first color dimension coordinates of all the color brightness sample points in the cluster gi, and the second color dimension coordinate of the center point of the cluster gi is equal to the average of the second color dimension coordinates of all the color brightness sample points in the cluster gi.
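As an illustrative sketch (not part of the claimed embodiments), the Q / T calculation described above may be expressed as follows. The cell geometry (cells tiled along the line C1-C2, with their length perpendicular to it) and the occupancy rule (a cell counts toward Q if at least one sample point falls inside it) follow the examples above; the function name and coordinate handling are assumptions for illustration.

```python
import math

def third_blending_degree(samples, c1, c2, cell_len, cell_width):
    """Estimate the blending degree between two clusters as Q / T.

    samples    -- list of (x, y) color-plane points from both clusters
    c1, c2     -- the two cluster center points on the same color plane
    cell_len   -- measurement-cell length (e.g. Dist_D65_D50),
                  perpendicular to the line C1-C2
    cell_width -- measurement-cell width (e.g. Dist_D65_D50 / 32),
                  along the line C1-C2
    """
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0
    # Unit vectors along C1->C2 (u) and perpendicular to it (v).
    ux, uy = dx / dist, dy / dist
    vx, vy = -uy, ux
    # T consecutive cells of width `cell_width` tile the segment C1-C2.
    T = max(1, int(dist / cell_width))
    occupied = set()
    for (x, y) in samples:
        # Coordinates of the sample relative to C1, in the (u, v) frame.
        rx, ry = x - c1[0], y - c1[1]
        along = rx * ux + ry * uy    # position along C1->C2
        across = rx * vx + ry * vy   # offset perpendicular to C1->C2
        idx = int(along / cell_width)
        if 0 <= idx < T and abs(across) <= cell_len / 2:
            occupied.add(idx)      # this cell contains a sample point
    Q = len(occupied)
    return Q / T
```

With four cells between C1 = (0, 0) and C2 = (1, 0) and a sample point in each cell, the sketch returns 4 / 4 = 1.0, matching the Q / T rule above.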
  • the color brightness sample points in the color brightness three-dimensional space are divided into k layers along the brightness dimension.
  • the number of color brightness samples for each layer is S1, S2 ... Sk. Please refer to FIG. 7.
  • the horizontal plane (color plane) in FIG. 7 is two color dimensions, and the vertical direction is one brightness dimension.
  • the four planes perpendicular to the brightness dimension divide the color brightness three-dimensional space into five layers, so that multiple color brightness samples are distributed in these five layers.
  • the layer division rules have been described in the previous embodiments, and will not be repeated here.
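As a sketch of the layer division, equal-width brightness intervals are assumed here purely for illustration (the actual layer-division rules are those described in the previous embodiments); the sample format, with two color coordinates followed by one brightness coordinate, is also an assumption.

```python
def divide_into_layers(samples, k, y_min=0.0, y_max=1.0):
    """Split (c1, c2, Y) color-brightness samples into k layers along
    the brightness dimension, using equal-width brightness intervals."""
    layers = [[] for _ in range(k)]
    step = (y_max - y_min) / k
    for s in samples:
        y = s[2]
        # Index of the brightness interval; clamp the top edge into layer k-1.
        idx = min(int((y - y_min) / step), k - 1)
        layers[idx].append(s)
    return layers
```

For example, with k = 5 (as in FIG. 7), a sample with brightness 0.95 falls into the topmost layer.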
  • the color brightness samples in each layer are grouped.
  • the plane in FIG. 8 is a color plane formed by two color dimensions.
  • FIG. 8 specifically shows a result of dividing a plurality of color luminance samples included in a layer into two clusters, and the center points of each cluster are respectively C1 and C2.
  • the clustering algorithm can refer to the introduction of the previous embodiment.
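As noted above, the clustering algorithm itself is described in the previous embodiment; purely for illustration, a minimal k-means-style sketch is given below (the initialization, iteration count, and function name are assumptions). The center of each cluster is the per-dimension mean of its members' color coordinates, matching the center-point definition used in this application.

```python
import math

def kmeans_color_plane(points, P, iters=20):
    """Minimal k-means sketch: group color-plane points into P clusters.
    Each center is the mean of its members' color coordinates."""
    centers = list(points[:P])  # naive initialization: first P points
    for _ in range(iters):
        groups = [[] for _ in range(P)]
        for p in points:
            # Assign each point to its nearest center.
            i = min(range(P), key=lambda j: math.dist(p, centers[j]))
            groups[i].append(p)
        # Recompute each center as the mean of its cluster's points.
        centers = [
            (sum(q[0] for q in g) / len(g), sum(q[1] for q in g) / len(g))
            if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return groups, centers
```

Two well-separated point groups on the color plane converge to two clusters whose centers C1 and C2 are the respective coordinate means, as in FIG. 8.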
  • if the distance between the center points of two clusters is less than or equal to a minimum distance threshold Cent_Dist_min, the two clusters may be excluded from the calculation of the image blending degree.
  • here, the case where the distance between the center points of the two clusters is greater than Cent_Dist_min (and is therefore included in the calculation) is taken as an example.
  • the distance between the calibration points of the standard light sources D65 and D50 on the color planes (the R/G and B/G planes) is Dist_D65_D50.
  • the minimum distance threshold may be set as Cent_Dist_min = Dist_D65_D50 × 70%, and this is used as the criterion to decide whether this layer is included in the calculation of the image blending degree.
  • alternatively, the two clusters may always be included in the calculation of the image blending degree; in this case, the minimum distance threshold can be regarded as Cent_Dist_min = 0.
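A sketch of this gating rule, using the 70% ratio from the example above as a default; the ratio, the function name, and the boolean return convention are illustrative assumptions.

```python
import math

def layer_included(c1, c2, dist_d65_d50, ratio=0.70):
    """Include a layer's cluster pair in the image blending-degree
    calculation only if the distance between the two cluster centers
    exceeds Cent_Dist_min = ratio * Dist_D65_D50.
    Setting ratio = 0 includes every pair (Cent_Dist_min = 0)."""
    cent_dist_min = ratio * dist_d65_d50
    return math.dist(c1, c2) > cent_dist_min
```

For example, two centers separated by the full D65-D50 distance pass the gate, while two centers separated by half of it do not.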
  • continuously arranged measurement cells are inserted between C1 and C2, that is, a plurality of measurement cells arranged consecutively are inserted between C1 and C2.
  • the length of a single measurement cell may be Dist_D65_D50, and the width of a single measurement cell may be Dist_D65_D50 / 32.
  • the length and width of a single measurement cell actually used may also be other suitable values, which are not limited in this application.
  • suppose the number of consecutively arranged measurement cells is T (in the example shown in FIG. 9, T represents the total number of measurement cells inserted between C1 and C2).
  • when P is greater than 2, the blending degree between every two clusters of each layer can be calculated according to the above example.
  • for example, suppose a certain layer is divided into four clusters (g1, g2, g3, g4).
  • L (g1, g2) represents the blending degree between g1 and g2, and so on.
  • the blending degree of each layer is summed (or weighted-summed) to obtain the blending degree of the entire image. Specifically, the blending degree of each layer is multiplied by a corresponding weighting value and the products are summed to obtain the image blending degree Total_Con_P (where P indicates how many clusters each layer is divided into). If the image blending degree exceeds the blending threshold Th_P (the blending threshold Th_P corresponds to the cluster number P), it can be estimated that there are P light sources of different color temperatures in the image shooting environment.
  • for example, if Total_Con_2 and Total_Con_3 are greater than their corresponding blending thresholds and Total_Con_4 is less than the corresponding blending threshold Th_4, it can be estimated that there are three light sources of different color temperatures in the image shooting environment.
  • if Total_Con_2 is greater than the corresponding blending threshold Th_2, while Total_Con_3 and Total_Con_4 are less than their corresponding blending thresholds, it can be estimated that there are two light sources of different color temperatures in the image shooting environment.
  • if Total_Con_4 is less than the corresponding blending threshold Th_4, while Total_Con_2 and Total_Con_3 are greater than their corresponding blending thresholds, then the cluster number whose image blending degree exceeds its corresponding blending threshold by the largest margin (that is, for which the difference between the image blending degree and the blending threshold is largest) is determined as the number of light sources with different color temperatures.
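The decision rule in the preceding paragraphs can be sketched as follows; the data layout, the default of one light source when no threshold is exceeded, and the function name are assumptions for illustration.

```python
def estimate_light_source_count(layer_blend, weights, thresholds):
    """Decision rule sketched from the text.

    layer_blend -- {P: [first blending degree of each layer for P clusters]}
    weights     -- per-layer weights for the weighted sum
    thresholds  -- {P: Th_P}
    Returns the estimated number of distinct color-temperature light
    sources: among candidates whose image blending degree Total_Con_P
    exceeds Th_P, the P with the largest excess; 1 if none exceeds.
    """
    best_p, best_excess = 1, 0.0
    for P, per_layer in layer_blend.items():
        # Weighted sum of the per-layer blending degrees: Total_Con_P.
        total = sum(w * b for w, b in zip(weights, per_layer))
        excess = total - thresholds[P]
        if excess > 0 and excess > best_excess:
            best_p, best_excess = P, excess
    return best_p
```

For instance, if Total_Con_2 exceeds Th_2 while Total_Con_3 and Total_Con_4 do not, the sketch returns 2, matching the example above.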
  • the image may then be correspondingly corrected according to the P light sources with different color temperatures.
  • FIG. 10 is a schematic flowchart of an image processing method according to an embodiment of the present application.
  • An image processing method may be implemented in the system architecture shown in FIG. 1A or FIG. 1B.
  • the image processing method may be mainly performed by the image processing component 120.
  • steps related to light source estimation in the image processing method may be mainly performed by the light source estimation device 121 in the image processing component 120, and steps related to correction in the image processing method may be mainly performed by the image signal processor 122 in the image processing component 120.
  • the method may specifically include:
  • the light source estimation method may be any one of the light source estimation methods provided in the foregoing embodiments.
  • the image is corrected according to the P light sources with different color temperatures.
  • alternatively, the image is corrected according to the single-color-temperature light source.
  • the correction may include at least one of the following corrections: automatic white balance correction, color correction, saturation correction, or contrast correction.
  • the correction may be performed by an image signal processor (ISP).
  • the light source estimation method can be performed by a light source estimation device.
  • a structural diagram of a specific image processing apparatus may be shown in FIG. 11A.
  • the color shift of the light source is corrected by the AWB correction circuit.
  • white objects will appear as white as possible, but other colors may not be accurate.
  • a color correction (CC) circuit corrects each color to the correct color through color correction.
  • the saturation correction circuit can further use a color enhancement (CE) mechanism to specify a specific color in the image and increase or decrease its saturation to complete the saturation correction, thereby improving the color style of the image.
  • a contrast correction circuit (such as Gamma) is used to correct the contrast of the image brightness.
  • the order of the circuits in the ISP in FIG. 11A can be adjusted and changed. For example, the order shown in the example in FIG. 11A can be adjusted to obtain the structure in FIG. 11B; of course, it can also be adjusted to other orders as needed, which this application does not detail.
  • after estimating the number of light sources, the light source estimation device transmits information carrying that number to the ISP, so that the ISP performs the corrections using the information; the execution order of the different types of corrections can be adjusted and changed. For specific correction methods, refer to other existing literature, which is not described in this application.
  • the captured image may appear light blue (caused by a high color temperature light source) and light yellow (caused by a low color temperature light source).
  • in an image, the color of the light source appears more intense than it does to the eye, because the brain assumes the light source is white and people are therefore not sensitive to the color of the light source. For images taken in a multi-color-temperature light source environment, an excessively strong light source color will be perceived by the user as a color error, especially the light blue of a high color temperature.
  • therefore, the AWB can be adjusted so that the light source correction is biased toward a high color temperature to reduce the light-blue color cast; the color intensity of CC and CE can be reduced to weaken the color of the light source; and the Gamma contrast can be reduced to narrow the brightness difference between the high and low color temperature light sources. In this way, photos taken in a multi-color-temperature light source environment are closer to the scene seen by the human eye.
  • an embodiment of the present application further provides a light source estimation device 1200.
  • the light source estimation device 1200 may include:
  • the segmentation unit 1210 is configured to segment an image into m sub-blocks, where m is an integer greater than 1.
  • the obtaining unit 1220 is configured to obtain m color brightness information groups of the m sub-blocks, each color brightness information group corresponding to one sub-block, and the color brightness information group includes brightness information and color information.
  • a mapping unit 1230 is configured to map the m color brightness information groups to a color brightness three-dimensional space to obtain m color brightness samples located in the color brightness three-dimensional space, where each color brightness sample is associated with one The color brightness information group corresponds, and the color brightness three-dimensional space includes two color dimensions and one brightness dimension.
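An illustrative sketch of the segmentation, obtaining, and mapping steps performed by units 1210-1230. The specific choice of R/G and B/G as the two color dimensions and of mean luma as the brightness dimension, as well as the function name and pixel layout, are assumptions for illustration; the application does not fix them here.

```python
def image_to_color_brightness_samples(pixels, m_rows, m_cols):
    """Segment an RGB image into m = m_rows * m_cols sub-blocks and map
    each block to one (R/G, B/G, Y) color-brightness sample.

    pixels -- 2-D list of (R, G, B) tuples.
    """
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // m_rows, w // m_cols   # sub-block height and width
    samples = []
    for br in range(m_rows):
        for bc in range(m_cols):
            rs = gs = bs = 0
            # Average the channels over this sub-block.
            for y in range(br * bh, (br + 1) * bh):
                for x in range(bc * bw, (bc + 1) * bw):
                    r, g, b = pixels[y][x]
                    rs += r; gs += g; bs += b
            n = bh * bw
            r_avg, g_avg, b_avg = rs / n, gs / n, bs / n
            # One brightness value (Rec.601-style luma, assumed here).
            luma = 0.299 * r_avg + 0.587 * g_avg + 0.114 * b_avg
            g_avg = max(g_avg, 1e-6)    # avoid division by zero
            samples.append((r_avg / g_avg, b_avg / g_avg, luma))
    return samples
```

Each resulting sample is one point in the three-dimensional color brightness space: two color-dimension coordinates and one brightness coordinate.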
  • a calculation unit 1240 is configured to, when the m color brightness samples are divided into k layers along the brightness dimension and each of the k layers is divided into P color brightness sample clusters, calculate a first blending degree of the cluster number P corresponding to each of the k layers.
  • Each of the k layers corresponds to a brightness interval (continuous brightness interval).
  • P is an integer greater than 1
  • k is a positive integer.
  • a determining unit 1250 is configured to determine a second blending degree of the grouping number P corresponding to the image based on a first blending degree of the grouping number P corresponding to each of the k layers.
  • the comparing unit 1260 is configured to compare the second blending degree of the corresponding grouping number P of the image with the blending degree threshold.
  • the estimation unit 1270 is configured to estimate that at least P light sources of different color temperatures exist in the shooting environment corresponding to the image when the second blending degree of the cluster number P corresponding to the image is greater than the blending degree threshold.
  • in addition, the estimation unit 1270 may be further configured to estimate, when the second blending degree of the cluster number P corresponding to the image is less than the blending degree threshold, that the number of light sources with different color temperatures in the shooting environment corresponding to the image is not P.
  • the determining unit 1250 is specifically configured to perform summing or weighted summing on the first blending degrees of the cluster number P corresponding to each of the k layers to obtain the second blending degree of the cluster number P corresponding to the image.
  • the P may be greater than 2. In calculating the first blending degree of the cluster number P corresponding to the i-th layer among the k layers, the calculation unit 1240 is specifically configured to calculate a third blending degree between every two clusters among the P clusters of the i-th layer, and to sum or weighted-sum the third blending degrees between every two clusters among the P clusters to obtain the first blending degree of the cluster number P corresponding to the i-th layer, where the i-th layer is any one of the k layers.
  • the calculation unit 1240 is specifically configured to insert continuously arranged measurement cells between the center point of the cluster gi and the center point of the cluster gj, where the number of continuously arranged measurement cells is T.
  • the cluster gi and the cluster gj are any two clusters among the P clusters in the i-th layer.
  • the calculation unit 1240 may count the number of measurement cells Q including the color brightness sample points in the cluster gi and the cluster gj, and determine that the third blending degree between the cluster gi and the cluster gj is Q / T, wherein the T and the Q are integers, the T is greater than 0 and the Q is greater than or equal to 0.
  • a length direction of the continuously arranged measurement cells is perpendicular to a line connecting a center point of the cluster gi and a center point of the cluster gj.
  • the two color dimensions included in the three-dimensional color brightness space are a first color dimension and a second color dimension. The first color dimension coordinate of the center point of any cluster is equal to the average of the first color dimension coordinates of all the color brightness sample points in that cluster, and the second color dimension coordinate of the center point of that cluster is equal to the average of the second color dimension coordinates of all the color brightness sample points in that cluster.
  • the calculation unit, the determination unit, the comparison unit, and the estimation unit may perform the above calculation, determination, comparison, and estimation for each of X candidate cluster numbers P, where X is an integer greater than 2.
  • all units in FIG. 12 may be implemented by software code (specifically, by software code executed by a processor); or some units in FIG. 12 may be implemented by software code and the other units by hardware circuits; or all units in FIG. 12 may be implemented by hardware circuits. In the example shown in FIG. 13, all units in FIG. 12 are implemented by hardware circuits.
  • an embodiment of the present application further provides a light source estimation device 1300.
  • the light source estimation device 1300 may be implemented by hardware circuits, and may include a segmentation circuit 1310, an acquisition circuit 1320, a mapping circuit 1330, a calculation circuit 1340, a determining circuit 1350, a comparison circuit 1360, and an estimation circuit 1370.
  • any circuit may include multiple transistors, logic gates, or basic circuit logic units.
  • a segmentation circuit 1310 is configured to segment an image into m sub-blocks, where m is an integer greater than 1.
  • the obtaining circuit 1320 is configured to obtain m color brightness information groups of the m sub-blocks, each color brightness information group corresponding to one sub-block, and the color brightness information group includes brightness information and color information.
  • a mapping circuit 1330 is configured to map the m color brightness information groups to a three-dimensional color brightness space to obtain m color brightness samples located in the three-dimensional color brightness space, where each color brightness sample corresponds to one color brightness information group, and the three-dimensional color brightness space includes two color dimensions and one brightness dimension.
  • a calculation circuit 1340 is configured to, when the m color brightness samples are divided into k layers along the brightness dimension and each of the k layers is divided into P color brightness sample clusters, calculate a first blending degree of the cluster number P corresponding to each of the k layers.
  • Each of the k layers corresponds to a brightness interval (continuous brightness interval).
  • P is an integer greater than 1
  • k is a positive integer.
  • a determining circuit 1350 is configured to determine a second blending degree of the number of clusters P corresponding to the image based on a first blending degree of the number of clusters P corresponding to each of the k layers.
  • the comparison circuit 1360 is configured to compare the second blending degree and the blending degree threshold of the corresponding grouping number P of the image.
  • an estimation circuit 1370 is configured to estimate that at least P light sources of different color temperatures exist in the shooting environment corresponding to the image when the second blending degree of the cluster number P corresponding to the image is greater than the blending degree threshold.
  • the estimation circuit 1370 may be further configured to estimate, when the second blending degree of the cluster number P corresponding to the image is less than the blending degree threshold, that the number of light sources with different color temperatures in the shooting environment corresponding to the image is not P.
  • the determining circuit 1350 is specifically configured to perform summing or weighted summing on the first blending degrees of the cluster number P corresponding to each of the k layers to obtain the second blending degree of the cluster number P corresponding to the image.
  • the P may be greater than 2. In calculating the first blending degree of the cluster number P corresponding to the i-th layer among the k layers, the calculation circuit 1340 is specifically configured to calculate a third blending degree between every two clusters among the P clusters of the i-th layer, and to sum or weighted-sum the third blending degrees between every two clusters among the P clusters to obtain the first blending degree of the cluster number P corresponding to the i-th layer, where the i-th layer is any one of the k layers.
  • the calculation circuit 1340 is specifically configured to insert continuously arranged measurement cells between the center point of the cluster gi and the center point of the cluster gj, where the number of continuously arranged measurement cells is T.
  • the cluster gi and the cluster gj are any two clusters among the P clusters in the i-th layer.
  • the calculation circuit 1340 is configured to count the number Q of measurement cells containing color brightness sample points of the cluster gi and the cluster gj, and determine that the third blending degree between the cluster gi and the cluster gj is Q / T, where the T and the Q are integers, the T is greater than 0, and the Q is greater than or equal to 0.
  • a length direction of the continuously arranged measurement cells is perpendicular to a line connecting a center point of the cluster gi and a center point of the cluster gj.
  • the two color dimensions included in the three-dimensional color brightness space are a first color dimension and a second color dimension. The first color dimension coordinate of the center point of any cluster is equal to the average of the first color dimension coordinates of all the color brightness sample points in that cluster, and the second color dimension coordinate of the center point of that cluster is equal to the average of the second color dimension coordinates of all the color brightness sample points in that cluster.
  • the calculation circuit, the determining circuit, the comparison circuit, and the estimation circuit may perform the above calculation, determination, comparison, and estimation for each of X candidate cluster numbers P, where X is an integer greater than 2.
  • an embodiment of the present application further provides a light source estimation device 1400.
  • the light source estimation device 1400 includes a processor 1410 and a memory 1420 that are coupled to each other.
  • a computer program is stored in the memory 1420.
  • the processor 1410 is configured to call the computer program stored in the memory 1420 to execute any one of the light source estimation methods provided in the embodiments of the present invention. For details, refer to the previous embodiments.
  • the processor 1410 may include a central processing unit (CPU) or other processors, such as a digital signal processor (DSP), a microprocessor, a microcontroller, or a neural network calculator.
  • the components of the light source estimation device are coupled together, for example, through a bus system.
  • the bus system may include a data bus, a power bus, a control bus, and a status signal bus.
  • the various buses are marked as the bus system 1430 in the figure.
  • the light source estimation method disclosed in the foregoing embodiment of the present application may be applied to the processor 1410, or implemented by the processor 1410.
  • the processor 1410 may be an integrated circuit chip with image signal processing capability.
  • each step of the above light source estimation method may be completed by an integrated logic circuit of hardware in the processor 1410 or by instructions in the form of software. That is, in addition to a computing unit executing software instructions, the processor 1410 may include other hardware accelerators, such as an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
  • the processor 1410 may implement or execute various light source estimation methods, steps, and logic block diagrams disclosed in the embodiments of the present application. The steps of the light source estimation method disclosed in the embodiments of the present application can be directly implemented as hardware, software, or a combination of hardware and software modules.
  • the software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register.
  • the storage medium is located in the memory 1420.
  • the processor 1410 can read information in the memory 1420 and complete the steps of the foregoing method in combination with its hardware.
  • the processor 1410 may be used, for example, to: divide an image into m sub-blocks, where m is an integer greater than 1; obtain m color brightness information groups of the m sub-blocks, where each color brightness information group corresponds to one sub-block and includes brightness information and color information; map the m color brightness information groups to a three-dimensional color brightness space to obtain m color brightness samples located in the three-dimensional color brightness space, where each color brightness sample corresponds to one color brightness information group, and the three-dimensional color brightness space includes two color dimensions and one brightness dimension; when the m color brightness samples are divided into k layers along the brightness dimension, calculate a first blending degree of the cluster number P corresponding to each of the k layers, where each layer corresponds to a brightness interval, the P is an integer greater than 1, and the k is a positive integer; determine the second blending degree of the cluster number P corresponding to the image based on the first blending degrees; compare the second blending degree with the blending degree threshold; and estimate, when the second blending degree is greater than the blending degree threshold, that at least P light sources of different color temperatures exist in the shooting environment corresponding to the image.
  • an embodiment of the present application further provides an image processing apparatus 1500.
  • the image processing apparatus 1500 includes a processor 1510 and a memory 1520 coupled to each other.
  • a computer program is stored in the memory 1520.
  • the processor 1510 is configured to call a computer program stored in the memory 1520 to execute any image processing method provided by an embodiment of the present invention. In addition to light source estimation, this method also performs the previously mentioned correction operations.
  • an embodiment of the present application further provides an image processing device 1600.
  • the image processing device 1600 includes a light source estimation device 1610 and a correction device 1620 coupled to each other.
  • the light source estimation device 1610 may be, for example, the light source estimation device 1200 or 1300 or 1400.
  • the correction device 1620 is configured to correct the image according to P light sources with different color temperatures.
  • the correction includes at least one of the following corrections: automatic white balance correction, color correction, saturation correction, or contrast correction.
  • the correction device may be, for example, an image signal processor (ISP).
  • the ISP may include at least one of the following correction circuits: an automatic white balance correction circuit 1621, a color correction circuit 1622, a saturation correction circuit 1623, or a contrast correction circuit 1624.
  • the automatic white balance correction circuit 1621 may be used to perform automatic white balance correction on an image according to P light sources with different color temperatures.
  • the color correction circuit 1622 can be used to perform color correction on an image according to P light sources with different color temperatures.
  • the saturation correction circuit 1623 can be used to perform saturation correction on an image according to P light sources with different color temperatures.
  • the contrast correction circuit 1624 can be used to perform contrast correction on an image according to P light sources with different color temperatures.
  • An embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and the computer program is executed by related hardware to complete any one of the light source estimation methods provided by the embodiments of the present invention.
  • an embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and the computer program is executed by related hardware to complete any one of the image processing methods provided by the embodiments of the present invention.
  • An embodiment of the present application further provides a computer program product, wherein when the computer program product runs on a computer, the computer is caused to execute any one of the light source estimation methods provided by the embodiments of the present invention.
  • an embodiment of the present application further provides a computer program product, and when the computer program product runs on a computer, the computer is caused to execute any one of the image processing methods provided by the embodiments of the present invention.
  • the disclosed device may be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the above units is only a logical function division.
  • multiple units or components may be combined or integrated.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be electrical or other forms.
  • the functional units in the embodiments of the present application may be integrated into one processing unit, or each of the units may exist separately physically, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware or in the form of software functional unit.
  • the technical solution of the present application, or the part that contributes to the existing technology, or all or part of the technical solution, may essentially be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, and specifically a processor in the computer device) to perform all or some of the steps of the foregoing methods of the embodiments of the present application.
  • the foregoing storage medium may include various media that can store program code, such as a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Spectrometry And Color Measurement (AREA)

Abstract

The present application relates to a light source estimation method, an image processing method, and related products. The light source estimation method maps the color brightness information groups of the image's segmented sub-blocks to a three-dimensional color brightness space to obtain a plurality of color brightness sample points in that space, performs layered clustering on the plurality of color brightness sample points, calculates the blending degree of the color brightness sample clusters of each layer so as to calculate the blending degree of the entire image, and estimates, on the basis of the blending degree, whether a plurality of light sources of different color temperatures exist in the shooting environment of the image. Since the light source estimation method can, to a certain extent, estimate whether a plurality of light sources of different color temperatures exist in the shooting environment of the image, it provides a basis for performing corresponding image correction in the case where a plurality of light sources of different color temperatures are involved, so that targeted image correction can be performed in that case, which helps improve the pertinence and rationality of the image correction.
PCT/CN2018/093144 2018-06-27 2018-06-27 Light source estimation method, image processing method and related products Ceased WO2020000262A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/093144 WO2020000262A1 (fr) Light source estimation method, image processing method and related products
CN201880095117.9A CN112313946A (zh) Light source estimation method, image processing method and related products

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/093144 WO2020000262A1 (fr) Light source estimation method, image processing method and related products

Publications (1)

Publication Number Publication Date
WO2020000262A1 true WO2020000262A1 (fr) 2020-01-02

Family

ID=68984561

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/093144 Ceased WO2020000262A1 (fr) Light source estimation method, image processing method and related products

Country Status (2)

Country Link
CN (1) CN112313946A (fr)
WO (1) WO2020000262A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090097745A1 (en) * 2007-10-11 2009-04-16 Korea Advanced Institute Of Science And Technology Method of performing robust auto white balance
CN103929632A (zh) * 2014-04-15 2014-07-16 Zhejiang Uniview Technologies Co., Ltd. Automatic white balance correction method and apparatus
CN105430367A (zh) * 2015-12-30 2016-03-23 Zhejiang Uniview Technologies Co., Ltd. Automatic white balance method and apparatus
CN106791758A (zh) * 2016-12-07 2017-05-31 Zhejiang Dahua Technology Co., Ltd. Method and apparatus for determining mixed color temperature of natural light in an image

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8229215B2 (en) * 2007-12-03 2012-07-24 Omnivision Technologies, Inc. Image sensor apparatus and method for scene illuminant estimation
JP5818668B2 (ja) * 2011-12-16 2015-11-18 Toshiba Corporation Auto white balance adjustment system
CN105959662B (zh) * 2016-05-24 2017-11-24 Shenzhen Infinova Technology Co., Ltd. Adaptive white balance adjustment method and apparatus
CN107959851B (zh) * 2017-12-25 2019-07-19 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Color temperature detection method and apparatus, computer-readable storage medium, and computer device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHENG-FU YANG ET AL: "Fuzzy neural system for estimating the color temperature of digitally captured image with FPGA implementation", 2015 IEEE 10TH CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA), 15 June 2015 (2015-06-15), pages 410 - 415, XP032815907, DOI: 10.1109/ICIEA.2015.7334148 *

Also Published As

Publication number Publication date
CN112313946A (zh) 2021-02-02

Similar Documents

Publication Publication Date Title
CN111798467B (zh) Image segmentation method, apparatus, device, and storage medium
CN106412547B (zh) Image white balance method and apparatus based on convolutional neural network, and computing device
CN111587573B (zh) Image processing method and apparatus, and computer storage medium
CN112788322B (zh) Adaptive white balance processing method and apparatus, medium, and electronic device
TWI777536B (zh) Enhanced training method and apparatus for image recognition model
CN111369659B (zh) Texture mapping method, apparatus, and device based on three-dimensional model
CN105898263B (zh) Image white balance method, apparatus, and computing device
WO2022257396A1 (fr) Method and apparatus for determining color fringe pixels in an image, and computer device
CN110213556B (zh) Automatic white balance method and system for monochromatic scenes, storage medium, and terminal
CN114745532B (zh) White balance processing method and apparatus for mixed color temperature scenes, storage medium, and terminal
WO2022199710A1 (fr) Image fusion method and apparatus, computer device, and storage medium
CN113301318A (zh) Image white balance processing method and apparatus, storage medium, and terminal
CN105681775A (zh) White balance method and apparatus
TW201830337A (zh) Method and device for performing automatic white balance on an image
CN115587948A (zh) Image dark-field correction method and device
CN111670575B (zh) Image white balance processing method and apparatus
CN110033412A (zh) Image processing method and apparatus
CN112492286A (zh) Automatic white balance correction method and apparatus, and computer storage medium
CN109064490B (zh) Moving target tracking method based on MeanShift
CN114723610A (zh) Intelligent image processing method, apparatus, device, and storage medium
JP7003776B2 (ja) Color determination method, apparatus, and electronic device
US9131200B2 (en) White balance adjusting method with scene detection and device thereof
WO2020000262A1 (fr) Light source estimation method, image processing method, and related products
CN113177886B (zh) Image processing method and apparatus, computer device, and readable storage medium
CN119181026B (zh) Hyperspectral image change detection method, apparatus, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18924631

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18924631

Country of ref document: EP

Kind code of ref document: A1