HK1203011B - Devices and methods for automated self-training of auto white balance in electronic cameras - Google Patents

Info

Publication number: HK1203011B
Application number: HK15103354.7A
Authority: HK (Hong Kong)
Prior art keywords: white balance, color, electronic camera, images, real
Other languages: Chinese (zh)
Other versions: HK1203011A1 (en)
Inventor: Changmeng Liu
Original Assignee: 豪威科技股份有限公司
Application filed by 豪威科技股份有限公司
Publication of HK1203011A1 (en)
Publication of HK1203011B (en)

Abstract

The invention provides devices and methods for automated self-training of auto white balance in an electronic camera. The method for calibrating auto white balance in an electronic camera includes: (a) obtaining a plurality of color values from a respective plurality of images of real-life scenes captured by the electronic camera under a first illuminant; (b) invoking an assumption about a true color value of at least portions of the real-life scenes; and (c) determining, based upon the difference between the true color value and the average of the color values, a plurality of final auto white balance parameters for a respective plurality of illuminants including the first illuminant. The electronic camera device includes an image sensor for capturing images of real-life scenes, a non-volatile memory holding machine-readable instructions that include a partially calibrated auto white balance parameter set and auto white balance self-training instructions, and a processor for processing the real-life images according to the self-training instructions to produce a fully calibrated auto white balance parameter set specific to the electronic camera.

Description

Devices and methods for automated self-training of auto white balance in electronic cameras
Technical Field
The present invention relates to the field of auto white balance (color balance) technology for electronic cameras and, more particularly, to devices and methods for automated self-training of auto white balance in electronic cameras.
Background
White balancing is the process of removing unrealistic color casts from images captured by an electronic camera, so that the images provide a realistic color representation of a scene. For example, an object in a scene that appears white to the human eye should also appear white after white balancing of the raw output of the image sensor. The human eye is very good at judging what is white under different light sources, but image sensors often have great difficulty doing so and frequently produce unsightly blue, orange, or green color casts. Different illuminants, i.e., light sources, have unique spectral characteristics. The spectral characteristics of a given illuminant may be represented by its color temperature: the temperature of an ideal black-body radiator that radiates light of a hue comparable to that of the light source. The color temperature thus represents the relative warmth or coolness of white light. As the color temperature increases, the emitted light shifts toward shorter wavelengths, i.e., toward the blue portion of the visible spectrum, and the hue becomes cooler.
An image sensor capturing a scene illuminated by a given illuminant produces a raw image whose colors are affected by the color temperature of that illuminant. Therefore, many electronic cameras use automatic white balance (AWB) to correct the color output of the image sensor according to the illuminant. To apply AWB, an electronic camera must have AWB parameters for each illuminant, often expressed as gains for the color channels. The AWB unit of the electronic camera first determines which illuminant illuminates the scene, and then applies the AWB parameters for that illuminant to the image of the scene to provide an image with a more realistic representation of the scene's colors.
Generally, to generate a set of AWB parameters for an electronic camera, the camera captures images of gray objects (e.g., specially fabricated gray cards) under illuminants whose color temperatures span the range encountered in actual use. For example, images are captured under four different reference illuminants: a D65 light source, corresponding to midday daylight, with a color temperature of 6504 K; a cold white fluorescent (CWF) tube with a color temperature of 4230 K; a TL84 fluorescent tube with a color temperature of 4000 K; and illuminant A (an incandescent tungsten lamp) with a color temperature of 2856 K. Ideally, a manufacturer of electronic cameras with AWB functionality would perform this calibration procedure for each camera manufactured. However, doing so is generally too expensive. Common practice in the image sensor industry is to calibrate one, or a small number, of electronic cameras, called gold modules, under the various lighting conditions and then apply the resulting AWB parameter set to all other image sensors. However, individual sensors differ from one another in their spectral characteristics (e.g., the quantum efficiency, color filter array, and infrared cut filter of the image sensor). Applying the gold-module AWB parameter set to all other image sensors therefore frequently produces errors.
Disclosure of Invention
In one embodiment, a method of calibrating automatic white balance in an electronic camera includes: (a) obtaining a plurality of first color values from a plurality of first images of a plurality of real scenes captured by the electronic camera under a first illuminant; (b) invoking an assumption about true color values of at least portions of the real scenes; and (c) determining, based on the difference between the true color value and the average of the first color values, a plurality of final automatic white balance parameters for a respective plurality of illuminants including the first illuminant.
In one embodiment, an electronic camera device includes: (a) an image sensor for capturing real images of real scenes; (b) a non-volatile memory holding a plurality of machine-readable instructions, the instructions including a partially calibrated set of automatic white balance parameters and a plurality of automatic white balance self-training instructions; and (c) a processor for processing the real images according to the self-training instructions to generate a fully calibrated set of automatic white balance parameters, wherein the fully calibrated set of automatic white balance parameters is specific to the electronic camera.
Drawings
FIG. 1 shows an exemplary scenario 100 for automated self-training of an electronic camera including a self-training module, according to one embodiment.
Fig. 2 is a diagram showing exemplary AWB parameters for a plurality of exemplary illuminants, in accordance with one embodiment.
Fig. 3 shows an exemplary electronic camera including modules for automated self-training of AWB parameters, in accordance with one embodiment.
Fig. 4 shows an exemplary memory of an electronic camera including a module for automated self-training of AWB parameters, in accordance with one embodiment.
Fig. 5 illustrates an exemplary method for calibrating the AWB parameter set of an electronic camera, utilizing in part automated self-training by the electronic camera through imaging of real scenes, in accordance with one embodiment.
Fig. 6 is a diagram showing an exemplary conversion, performed in the method of fig. 5 for an exemplary plurality of illuminants, in which the base AWB parameter set is converted into the initial AWB parameter set, according to an embodiment.
Fig. 7 is a diagram showing an exemplary conversion, performed in the method of fig. 5 for an exemplary plurality of illuminants, in which the initial AWB parameter set is converted into the final AWB parameter set, in accordance with one embodiment.
Fig. 8 illustrates an exemplary method for calibrating the AWB parameter for a reference illuminant through imaging of a gray card, according to one embodiment.
FIG. 9 shows an exemplary method for performing the automated self-training portion of the method of FIG. 5 using a gray world assumption, in accordance with one embodiment.
Fig. 10 is a diagram illustrating an exemplary method for identifying an exemplary illuminant, according to one embodiment.
FIG. 11 shows an exemplary method for performing the automated self-training portion of the method of FIG. 5 using a universal face tone hypothesis, according to one embodiment.
Fig. 12 illustrates an exemplary method for calibrating the AWB parameter for a reference illuminant through imaging of a sample set of human faces, in accordance with one embodiment.
Detailed Description
Disclosed herein are devices and methods for calibrating the AWB parameters of an electronic camera based in part on automated self-training of the camera during initial use by the actual user. Automated self-training completes the AWB calibration procedure to provide fully calibrated AWB functionality while sparing the manufacturer very costly calibration expenditures. The AWB calibration procedure includes at least three main steps. First, a gold-module electronic camera is used to generate a base set of AWB parameters covering illuminants spanning a range of color temperatures. The base AWB parameter set is applied to all electronic cameras associated with the gold-module electronic camera, for example all cameras of the same model or all cameras from the same manufacturing line. The AWB parameter for a single reference illuminant (e.g., the D65 illuminant) is then calibrated for each individual electronic camera. After this step, the camera is delivered to the user. Finally, a second AWB parameter, for another illuminant, is calibrated through automated self-training of the electronic camera during normal use by the user. After calibration of the second AWB parameter by automated self-training, the entire AWB parameter set is transformed in accordance with the two calibrated AWB parameters.
Fig. 1 shows an exemplary scenario 100 for automated self-training of an electronic camera 110. The electronic camera includes a self-training module 120 and an AWB parameter set 130. A user captures images of real scenes 150. The self-training module 120 analyzes the imagery of the real scenes 150 to update the AWB parameter set 130 from an initial AWB parameter set (provided with the electronic camera) to a final AWB parameter set (used for automatic white balancing of imagery captured after automated self-training). In one embodiment, the initial AWB parameter set is a base AWB parameter set obtained from calibration of an associated gold-module electronic camera. In another embodiment, the initial AWB parameter set is obtained by adjusting such a base AWB parameter set in accordance with a local calibration performed by the manufacturer of the electronic camera 110.
Fig. 2 is a graph 200 showing exemplary AWB parameters for a plurality of exemplary illuminants. The graph 200 includes AWB parameters 220, 222, 224, and 226 for the respective illuminants D65, TL84, CWF, and A. In one embodiment, the AWB parameters 220, 222, 224, and 226 are the base AWB parameters obtained from calibration of a gold-module electronic camera by capturing images under illuminants D65, TL84, CWF, and A. The graph 200 places the AWB parameters 220, 222, 224, and 226 in a two-dimensional space spanned by the horizontal axis 210 and the vertical axis 212. Color is assumed to be defined by the relative intensities of three primary color components, e.g., red (R), green (G), and blue (B), as output by the RGB image sensors most commonly used in electronic cameras. Each of the horizontal axis 210 and the vertical axis 212 represents a color ratio, so that a point in the space spanned by the two axes represents an ordered pair [x, y] of color ratios, and an ordered pair of color ratios defines a color composition. Examples of ordered pairs of color ratios include [G/B, G/R], [(R·B)/G², B/R], [log(G/B), log(G/R)], [log((R·B)/G²), log(B/R)], and derivatives thereof. Hereinafter, the ordered pair of color ratios is assumed to be [G/B, G/R]. Other ordered pairs of color ratios, including the examples above formed from other sets of primary colors, may be used without departing from the scope hereof.
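As an illustration of the ordered-pair representation, here is a minimal Python sketch (an assumption for illustration, not code from the patent; it presumes an H x W x 3 RGB image stored as a NumPy array) of computing [G/B, G/R] for an image:

```python
import numpy as np

def color_ratios(image: np.ndarray) -> tuple[float, float]:
    """Compute the [G/B, G/R] ordered pair of color ratios for an RGB image.

    `image` is an H x W x 3 array with channels ordered R, G, B; the ratios
    of the channel means locate the image in the space of Fig. 2.
    """
    r = image[..., 0].mean()
    g = image[..., 1].mean()
    b = image[..., 2].mean()
    return g / b, g / r
```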
As is evident from the dispersion of the AWB parameters 220, 222, 224, and 226 in the graph 200, the individual illuminants D65, TL84, CWF, and A have different color compositions. For example, illuminant D65 (label 220) is shifted toward the blue end of the visible spectrum, while illuminant A (label 226) is shifted toward the red and green portions of the visible spectrum. The illuminants TL84, CWF, and A appear redder and less blue than illuminant D65. This illustrates the importance of white balancing the images captured by an electronic camera according to the illuminant illuminating the scene. For example, without white balancing, an image captured under illuminant A may exhibit a red color cast. White balance of an image captured under illuminant A is achieved by modifying the colors of the image according to the ordered pair of color ratios associated with illuminant A in graph 200. Under the assumed ordered pair [G/B, G/R], the blue and red color components of the image are multiplied by the respective color ratios of the horizontal axis 210 and the vertical axis 212. By characterizing the illuminant in terms of the color ratios G/B and G/R, the graph 200, or any equivalent representation thereof, directly provides the color gains to be used to white balance the image. Other ordered pairs (e.g., [(R·B)/G², B/R]) provide the same color gains after simple algebraic manipulation.
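Continuing the sketch above (same assumptions; `awb_params` is a hypothetical name for one entry of an AWB parameter set), white balancing then amounts to scaling the red and blue channels by the two ratios:

```python
import numpy as np

def apply_awb(image: np.ndarray, awb_params: tuple[float, float]) -> np.ndarray:
    """White balance an RGB image using a [G/B, G/R] AWB parameter.

    The blue channel is scaled by G/B and the red channel by G/R, so that a
    neutral (gray) surface ends up with equal R, G, and B intensities.
    """
    g_over_b, g_over_r = awb_params
    balanced = image.astype(np.float64)
    balanced[..., 0] *= g_over_r  # red channel
    balanced[..., 2] *= g_over_b  # blue channel
    return balanced
```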
Fig. 3 shows an exemplary electronic camera 300. Electronic camera 300 is one embodiment of electronic camera 110 of fig. 1 and includes the self-training module 120 of fig. 1. The electronic camera 300 includes an image sensor 310 for capturing images formed thereon through an objective lens 320. Electronic camera 300 also includes a processor 330, a memory 340, and an interface 380. Processor 330 is communicatively coupled to image sensor 310, memory 340, and interface 380. The memory 340 includes the AWB parameter set 130 of fig. 1, a plurality of machine-readable instructions 350, and a data store 360, and may include both volatile and non-volatile memory. In certain embodiments, the instructions 350 and the AWB parameter set 130 are stored in a non-volatile portion of the memory 340, while portions of the data store 360 are provided in volatile memory. The processor 330 processes the images captured by the image sensor 310 according to the instructions 350. The electronic camera 300 further includes an optional power supply 385 and a housing 390 for, respectively, powering and environmentally protecting the components of the electronic camera 300. During automated self-training of the electronic camera 300 for automatic white balance, images captured by the image sensor 310 are processed by the processor 330 according to self-training instructions included in the instructions 350 to update the AWB parameter set 130 from the initially provided AWB parameter set to the final AWB parameter set.
For example, the processor 330 analyzes the captured images according to the instructions 350 and stores images deemed suitable for AWB self-training to the data store 360 based thereon. When a sufficient number of images suitable for AWB self-training have been stored in the data store 360, the processor 330 analyzes the stored images according to instructions 350 to determine a final set of AWB parameters. During this process, the temporary values and results generated by processor 330 may be stored to data store 360 or maintained in a working memory not shown in FIG. 3. The processor 330 then stores the final AWB parameter set as the AWB parameter set 130.
Processor 330, instructions 350, and data store 360 together form one embodiment of the self-training module 120 of fig. 1. All three may also serve functions unrelated to AWB self-training. For instance, the processor 330 may automatically white balance images captured after completion of the self-training, according to the instructions 350. In one example of use, all images captured during AWB self-training are stored to the data store 360. After completion of AWB self-training, the stored images may be automatically white balanced by the processor 330 according to the instructions 350 and using the final AWB parameter set 130. Accordingly, properly white balanced versions of the imagery captured during AWB self-training may be made available to the user of the electronic camera 300.
Images captured by the image sensor 310 and optionally white balanced by the processor 330 may be output to a user via the interface 380. Interface 380 may include, for example, a display and a wired or wireless communication port. Interface 380 may further be used to receive instructions and other data from external sources, such as a user.
Fig. 4 shows an exemplary memory 400, which is an embodiment of memory 340 of electronic camera 300 (fig. 3). The memory 400 includes the AWB parameter set 130 (figs. 1 and 3), a plurality of instructions 450, and a data store 460. Instructions 450 are an embodiment of instructions 350 (fig. 3) and include several components whose tasks are discussed later in this disclosure. The instructions 450 include color value extraction instructions 451 for extracting color information from an image, such as the primary color intensities discussed in connection with fig. 2; color ratio calculation instructions 452 for calculating color ratios, such as those discussed in relation to fig. 2, from color values determined using the color value extraction instructions 451; and color-ratio-to-AWB-parameter calculation instructions 453 for computing AWB parameters from the color ratios determined using the color ratio calculation instructions 452, as discussed in relation to fig. 2. The instructions 450 further include: illuminant identification instructions 454 for identifying the illuminant under which an image was captured, for example by the image sensor 310 (fig. 3) of the electronic camera 300; face detection instructions 455 for detecting faces in an image; and AWB parameter conversion instructions 456 for converting a partially calibrated AWB parameter set, initially provided as the base set generated by gold-module calibration, into a final AWB parameter set. A processor, such as processor 330 (fig. 3), executes instructions 451 through 456. The memory 400 also includes a plurality of hypotheses 480 for use in automated AWB self-training based on imagery of real scenes. Hypotheses 480 may include gray world hypothesis instructions 481 and/or universal face tone hypothesis instructions 482.
Data store 460 is one embodiment of data store 360 (FIG. 3). Data storage 460 includes image storage 461, color value storage 462, and color ratio storage 463. A processor, such as processor 330 of fig. 3, may access all of these storage elements. The image storage 461 stores images captured by an image sensor (e.g., the image sensor 310 of FIG. 3). The color value store 462 stores color values generated by, for example, the processor 330 of FIG. 3 in accordance with the color value extraction instructions 451. Color ratio storage 463 is used to store color ratios generated by, for example, processor 330 of fig. 3, according to color ratio calculation instructions 452.
In certain embodiments, the data store 460 also includes an initial AWB parameter set 464, a locally calibrated set of AWB parameters derived, either by the manufacturer or on board the camera, from information provided by the manufacturer with the electronic camera, such as electronic camera 300 (fig. 3). In such embodiments, the AWB parameter set 130 is the base AWB parameter set obtained from calibration of the associated gold-module electronic camera. In accordance with the AWB parameter conversion instructions 456, the initial AWB parameter set 464 may be generated, for example by the processor 330 (fig. 3), from the base AWB parameter set 130 and the manufacturer-provided information stored in the memory 400. In other embodiments, an electronic camera, such as electronic camera 300 (fig. 3), having memory 400 is provided by the manufacturer with the AWB parameter set 130 already being an initial AWB parameter set resulting from a local calibration of that camera. In this case, the separate initial AWB parameter set 464 is not needed.
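As a rough, hypothetical sketch of how the contents of memory 400 might be organized (the names mirror the figure labels and are illustrative assumptions, not an implementation from the patent):

```python
from dataclasses import dataclass, field

@dataclass
class AWBParameter:
    """One [G/B, G/R] gain pair for a single illuminant (a point in Fig. 2)."""
    illuminant: str  # e.g., "D65", "TL84", "CWF", "A"
    g_over_b: float
    g_over_r: float

@dataclass
class Memory400:
    """Sketch of the memory layout of Fig. 4."""
    awb_parameter_set: list[AWBParameter]                         # parameter set 130
    initial_awb_parameter_set: list[AWBParameter] | None = None   # optional set 464
    color_values: list[tuple[float, float, float]] = field(default_factory=list)  # store 462
    color_ratios: list[tuple[float, float]] = field(default_factory=list)         # store 463
```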
Fig. 5 shows an exemplary method 500 for calibrating the AWB parameter set of an electronic camera using automated self-training by the electronic camera through imaging of real scenes. The automated self-training may be performed by a user during normal use of the electronic camera and completes the local calibration performed by the camera manufacturer. The method 500 may be implemented in the electronic camera 110 of fig. 1 or the electronic camera 300 of fig. 3.
In step 510, a base set of AWB parameters is obtained from calibration of an associated gold-module electronic camera under a plurality of illuminants. The graph 200 of fig. 2 shows an exemplary base AWB parameter set with four AWB parameters 220, 222, 224, and 226 for the four respective illuminants D65, TL84, CWF, and A. In one example, the manufacturer of the electronic camera 300 (fig. 3) stores the base AWB parameter set to the electronic camera 300 as the AWB parameter set 130 (figs. 1 and 3). The processor 330 of the electronic camera 300 may then retrieve the AWB parameter set 130 from the memory 340 as needed.
In step 520, the electronic camera captures images under a reference illuminant, which is one of the illuminants used to generate the base AWB parameter set obtained in step 510. For example, before shipping electronic camera 300 (fig. 3) to a user, its manufacturer captures a plurality of images under the D65 illuminant using electronic camera 300. In step 530, the images captured in step 520 are analyzed to determine the AWB parameter for the reference illuminant, where the AWB parameter is specifically calibrated for the electronic camera, such as the electronic camera 300 (fig. 3).
In step 540, the base AWB parameter set obtained in step 510 is converted into an initial AWB parameter set such that the initial AWB parameter for the reference illuminant is the one obtained in step 530. In one embodiment, step 540 is performed by the manufacturer, and the generated initial AWB parameter set is stored to an electronic camera, such as electronic camera 300 (fig. 3), for example as the AWB parameter set 130 (figs. 1 and 3). In another embodiment, the initial AWB parameter generated in step 530 for the reference illuminant is stored to the electronic camera, for example to the memory 340 (fig. 3) of the camera 300 (fig. 3). In this embodiment, the base AWB parameter set obtained in step 510 is also stored in the electronic camera, for example as the AWB parameter set 130 (figs. 1 and 3) of the electronic camera 300 (fig. 3), and the conversion of the base AWB parameter set to the initial AWB parameter set is performed on board the electronic camera. For example, the processor 330 (fig. 3) of the electronic camera 300 (fig. 3), having the memory 400 (fig. 4) implemented as the memory 340 (fig. 3), performs the conversion of the AWB parameter set 130 in accordance with the AWB parameter conversion instructions 456, and then stores the generated AWB parameter set to the memory 400 (fig. 4) as the initial parameter set 464 (fig. 4).
In step 550, images of real scenes are captured using the electronic camera. Step 550 is performed, for example, by a user capturing imagery of real scenes using electronic camera 300 (fig. 3) with memory 400 (fig. 4) implemented as memory 340 (fig. 3). The processor 330 (fig. 3) receives the real images from the image sensor 310 (fig. 3) and either stores them in the image store 461 (fig. 4) or maintains them in working memory for further processing in the subsequent step 555. In step 555, the electronic camera analyzes the real images captured in step 550. Real images captured under a given first illuminant are used to calibrate the AWB parameter for that first illuminant. The first illuminant is one of the illuminants used to generate the base AWB parameter set obtained in step 510, or one substantially similar thereto, and is different from the reference illuminant used in step 530. Step 555 is performed, for example, by processor 330 (fig. 3) of electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3). The processor 330 (fig. 3) analyzes images received from the image sensor 310 (fig. 3) or retrieved from the image store 461 (fig. 4), identifies the illuminant of each real image according to the illuminant identification instructions 454 (fig. 4), and selects the real images captured under, for example, illuminant A for further processing according to the instructions 450 (fig. 4) to determine a calibrated AWB parameter for illuminant A. Steps 550 and 555 may be performed concurrently with step 540 or sequentially relative to it.
In step 560, the initial AWB parameter set generated in step 540 is further converted in accordance with the calibration of the AWB parameter for the first illuminant generated in step 555. This results in a final AWB parameter set specifically calibrated for this particular electronic camera. The final AWB parameter set includes the calibrated AWB parameters for the reference and first illuminants generated in steps 540 and 555, respectively. Step 560 is performed, for example, by processor 330 (fig. 3) of electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3). The processor 330 (fig. 3) obtains the initial AWB parameter set from the AWB parameter set 130 (figs. 1 and 3) or the initial AWB parameter set 464, and then converts it in accordance with the AWB parameter conversion instructions 456 (fig. 4).
Steps 550, 555, and 560 constitute the automated self-training portion of the calibration of the AWB parameters for the electronic camera.
Fig. 6 is a graph 600 showing an exemplary conversion performed in step 540 (fig. 5) of method 500 for an exemplary plurality of illuminants. Graph 600 shows the transformation of the base AWB parameters obtained in step 510 (fig. 5) into the initial AWB parameter set formed in step 540 (fig. 5), where the transformation is performed in the color ratio parameter space discussed in relation to fig. 2. Graph 600 builds on graph 200 of fig. 2, which shows the base AWB parameter set. Step 530 (fig. 5) provides the AWB parameter for the reference illuminant, specifically calibrated for the electronic camera in question. In the graph 600, the reference illuminant is assumed to be the D65 illuminant. In step 540 (fig. 5), the base AWB parameter set is converted such that the base AWB parameter for illuminant D65 (label 220) moves to the position of the specifically calibrated AWB parameter for illuminant D65 (label 620) obtained in step 530 (fig. 5). This results in an initial AWB parameter set consisting of the specifically calibrated AWB parameter 620 for the D65 illuminant and converted AWB parameters 622, 624, and 626 for the respective illuminants TL84, CWF, and A.
Fig. 7 is a graph 700 showing an exemplary conversion performed in step 560 (fig. 5) of method 500 for an exemplary plurality of illuminants. Graph 700 builds on graph 600 (fig. 6), wherein the AWB parameters 620, 622, 624, and 626 of fig. 6 constitute the initial AWB parameter set. Step 560 (fig. 5) converts the initial AWB parameter set into a final AWB parameter set that includes the specifically calibrated AWB parameter 620 and, for illuminant A, the specifically calibrated AWB parameter 726 generated in step 555 (fig. 5). The remaining AWB parameters, not specifically calibrated using the electronic camera in question, are transformed accordingly. In the non-limiting example shown in graph 700, the initial AWB parameter set is converted by a rotation 730 followed by a scaling 740. The rotation 730 rotates the initial AWB parameter set about an axis of rotation coinciding with the specifically calibrated AWB parameter 620. The scaling 740 scales the rotated parameter set along line 770 such that the AWB parameter 620 is unaffected by the scaling and the initial AWB parameter 626 ends up at the position of the specifically calibrated AWB parameter 726. The initial AWB parameters 622 and 624 are thus rotated and scaled to generate final AWB parameters 722 and 724. The result is a final AWB parameter set consisting of the final AWB parameters 620, 722, 724, and 726 for the respective illuminants D65, TL84, CWF, and A.
In certain embodiments, the transformations performed in steps 540 and 560 (fig. 5) of method 500, as illustrated by the examples of graphs 600 (fig. 6) and 700 (fig. 7), are performed by applying matrix operations to the AWB parameter sets in the two-dimensional color ratio space. Steps 540 and 560 (fig. 5) of method 500 may be performed using two respective matrix operations, one implementing the conversion of step 540 (fig. 5) and the other the conversion of step 560 (fig. 5). Alternatively, the transformations of steps 540 and 560 (fig. 5) are performed using a single matrix operation, where the applied matrix is the product of the two respective matrices associated with the transformations of steps 540 (fig. 5) and 560 (fig. 5).
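A sketch of such a transformation as a single matrix operation (illustrative names and an assumed formulation, not the patent's implementation; it places the AWB parameters in the [G/B, G/R] plane and implements the rotation 730 and scaling 740 of fig. 7, leaving the reference parameter fixed):

```python
import numpy as np

def transform_awb_set(params: np.ndarray, ref: np.ndarray,
                      anchor_old: np.ndarray, anchor_new: np.ndarray) -> np.ndarray:
    """Rotate and scale a set of 2-D AWB parameters about the reference point.

    `params` is an N x 2 array of [G/B, G/R] pairs; `ref` is the calibrated
    reference-illuminant parameter (the fixed point of the transform);
    `anchor_old` and `anchor_new` are the initial and self-training-calibrated
    parameters of the second illuminant. The transform maps anchor_old onto
    anchor_new while leaving `ref` unchanged, as in Fig. 7.
    """
    u = anchor_old - ref
    v = anchor_new - ref
    angle = np.arctan2(v[1], v[0]) - np.arctan2(u[1], u[0])  # rotation 730
    scale = np.linalg.norm(v) / np.linalg.norm(u)            # scaling 740
    c, s = np.cos(angle), np.sin(angle)
    m = scale * np.array([[c, -s], [s, c]])                  # single combined matrix
    return (params - ref) @ m.T + ref
```

Subtracting `ref` before applying the matrix corresponds to the translation of the reference parameter to the origin suggested in the following paragraph.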
In one embodiment, the initial AWB parameter set generated in step 540 is further translated to place the AWB parameter for the reference illuminant at the origin of the coordinate system in which the conversion is performed. Referring to the example of graph 600 (fig. 6), the AWB parameters 620, 622, 624, and 626 are translated such that AWB parameter 620 lies at the origin. This simplifies the subsequent manipulation of the initial AWB parameter set performed in step 560 (fig. 5).
The complete AWB calibration procedure for an electronic camera thus amounts to a camera-specific transformation of the base AWB parameter set. The specific calibration of the AWB parameter for the reference illuminant (step 530 of fig. 5) provides a first anchor point, while the specific calibration of another AWB parameter through automated self-training (step 555 of fig. 5) provides a second anchor point. In some embodiments, the two illuminants used for specific calibration of AWB parameters lie at opposite extremes of the color temperature range. This may improve the accuracy of the final AWB parameter set.
Fig. 8 shows an exemplary method 800 for performing steps 520 and 530 (fig. 5) of method 500. In step 810, which is an embodiment of step 520 (fig. 5), images of a gray card illuminated by the reference illuminant are captured by the electronic camera. For example, the electronic camera 300 of fig. 3 captures images of a gray card illuminated by the D65 illuminant. In step 820, the color of each image of the gray card is determined. In one embodiment, functionality on board the electronic camera performs step 820. For example, the processor 330 of the electronic camera 300 (fig. 3), having the memory 400 (fig. 4) implemented as the memory 340 (fig. 3), processes the captured images according to the color value extraction instructions 451 (fig. 4). In another embodiment, step 820 is performed by functionality external to the electronic camera (e.g., equipment at a manufacturing facility), such as electronic camera 300 (fig. 3), in which case step 820 may be performed before complete assembly of the electronic camera. In step 830, the colors obtained in step 820 are averaged to determine the average color of the images of the gray card illuminated by the reference illuminant. Step 830 may be performed externally to the electronic camera, such as electronic camera 300 (fig. 3), or on board the electronic camera, for example by processor 330 (fig. 3) of electronic camera 300 in accordance with instructions 350 (fig. 3).
The average color obtained in step 830 may differ from the actual color of the gray card; for example, it may be shifted toward red or blue. In step 840, the AWB parameter for the reference illuminant is calibrated such that the calibrated AWB parameter, when applied to the average color determined in step 830, produces gray, i.e., the actual color of the gray card. In one embodiment, step 840 is performed on board the electronic camera; for example, the processor 330 (fig. 3) of the electronic camera 300 executes step 840 according to the instructions 350 (fig. 3). In another embodiment, step 840 is performed externally to the electronic camera.
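Under the [G/B, G/R] convention of fig. 2, step 840 reduces to computing the gains that neutralize the averaged gray-card color. A minimal sketch (the helper name is illustrative, not from the patent):

```python
def awb_from_gray_card(avg_rgb: tuple[float, float, float]) -> tuple[float, float]:
    """Derive the [G/B, G/R] AWB parameter from the average gray-card color.

    Multiplying the blue channel by G/B and the red channel by G/R maps the
    measured average color back to neutral gray.
    """
    r, g, b = avg_rgb
    return g / b, g / r
```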
The method 800 illustrates batch processing of the images in steps 810, 820, and 830: all images are processed by step 810, then all by step 820, then all by step 830. Without departing from the scope hereof, the images may instead be processed sequentially through two, or all, of steps 810, 820, and 830.
Fig. 9 shows an exemplary method 900 for performing step 555 (fig. 5) of method 500. The method 900 is part of automated self-training based on real imagery and utilizes the so-called gray world assumption. The gray world assumption states that, given an image with a sufficient amount of color variation, the averages of its primary color components (e.g., the R, G, and B components) converge to a common gray level. In general, this assumption is a reasonable approximation, since any given real scene typically contains many color variations. A single real scene may nevertheless have a color composition that does not average to gray, such as a scene consisting primarily of blue sky. However, during normal use of the electronic camera, the camera will likely capture images of a wide variety of real scenes, such that the average color of the captured images is indeed gray.
In step 910, a color value is determined for each real image captured by the electronic camera. In one embodiment, the color value of a real image is the average color of the image. Step 910 is performed, for example, by processor 330 of electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3): the processor 330 (fig. 3) either receives images from the image sensor 310 (fig. 3) or retrieves images from the image store 461 (fig. 4), and processes the images according to the color value extraction instructions 451 (fig. 4). In step 920, the color values obtained in step 910 are evaluated to identify the real images captured under the first illuminant. In one embodiment, a real image whose associated color value lies within a certain range of the color value of a gray card illuminated by the first illuminant is considered to have been captured under the first illuminant. Step 920 is performed, for example, by processor 330 of electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3): the processor 330 (fig. 3) obtains color values from the color value store 462 (fig. 4) and processes them according to the illuminant identification instructions 454 (fig. 4) to identify, for example, the real images captured under illuminant A. The processor 330 (fig. 3) then stores the real images captured under the first illuminant, or a record thereof, in the image store 461 (fig. 4) and/or stores the associated color values in the color value store 462 (fig. 4).
Fig. 10 is a graph 1000 illustrating step 920 (fig. 9) of method 900 for one exemplary first illuminant, illuminant A of graph 200 (fig. 2). The graph 1000 is identical to the graph 200 of fig. 2, except that it also shows a range 1010 of color values near the AWB parameter 226; color values within this range are interpreted as arising from real images captured under illuminant A.
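A sketch of the identification rule of step 920 (the threshold, the choice of a Euclidean ball as the "range", and the names are illustrative assumptions, not values from the patent):

```python
import numpy as np

def captured_under(color_ratio: tuple[float, float],
                   illuminant_ratio: tuple[float, float],
                   tolerance: float = 0.1) -> bool:
    """Classify an image as captured under an illuminant when its [G/B, G/R]
    pair falls within a range around the illuminant's gray-card color ratios,
    as with range 1010 of Fig. 10."""
    delta = np.subtract(color_ratio, illuminant_ratio)
    return float(np.linalg.norm(delta)) <= tolerance
```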
Returning to fig. 9, in step 930, the average color value of the real images captured under the first illuminant is determined, the images contributing to the average being those identified in step 920. Step 930 is performed, for example, by processor 330 of electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3): the processor 330 (fig. 3) retrieves the appropriate color values from the color value store 462 (fig. 4) and calculates the average color value according to the color value extraction instructions 451 (fig. 4).
Step 940 invokes the gray world assumption discussed above. For example, processor 330 of electronic camera 300 (fig. 3), having memory 400 (fig. 4) implemented as memory 340 (fig. 3), invokes the gray world assumption by retrieving the gray world hypothesis instructions 481 from the instructions 450 of the memory 400. In step 950, the camera-specific calibrated AWB parameter for the first illuminant is determined using the gray world assumption invoked in step 940: the AWB parameter is determined such that, when applied to the real images captured under the first illuminant, it renders the average color of those images gray. In some embodiments, the average color value obtained in step 930 is expressed as color ratios; for example, as an ordered pair of color ratios defining the relative intensities of the three primary color components, as discussed in relation to fig. 2. The camera-specific calibrated AWB parameter can then be calculated from the ordered pair of color ratios. Step 950 is performed, for example, by processor 330 of electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3): the processor 330 (fig. 3) obtains color values from the color value store 462 (fig. 4), derives color ratios according to the color ratio calculation instructions 452 (fig. 4), and stores the color ratios in the color ratio store 463 (fig. 4). Next, the processor 330 (fig. 3) processes the color ratios stored in the color ratio store 463 (fig. 4) according to the color-ratio-to-AWB-parameter calculation instructions 453 (fig. 4) to generate the camera-specific calibrated AWB parameter for the first illuminant.
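A minimal sketch of steps 930 and 950 combined (the helper name is hypothetical; it assumes the per-image average RGB colors of the images identified in step 920):

```python
import numpy as np

def awb_from_gray_world(color_values: list[tuple[float, float, float]]) -> tuple[float, float]:
    """Derive the camera-specific [G/B, G/R] AWB parameter for the first
    illuminant from the average colors of the identified real images.

    Under the gray world assumption the averaged scene color is truly gray,
    so the gains that neutralize it constitute the calibrated parameter.
    """
    r, g, b = np.mean(np.asarray(color_values, dtype=np.float64), axis=0)
    return g / b, g / r
```

The computation mirrors the gray-card case of method 800; the difference is that the neutral reference is supplied by the statistical assumption rather than by a physical gray card.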
Method 900 illustrates batch processing of the images in steps 910 and 920: all images are processed by step 910, then all by step 920. In one embodiment, an electronic camera, such as electronic camera 300 (fig. 3), is preconfigured to capture a certain number (e.g., 100 or 1000) of real images before performing method 900. Without departing from the scope hereof, the real images may instead be processed sequentially through steps 910 and 920, rather than performing step 910 on all real images first and then step 920 on all of them. This may be extended to sequential performance of steps 550 (fig. 5), 910, and 920, which allows an electronic camera (e.g., electronic camera 300 of fig. 3) to continuously assess the amount of data available for performing the subsequent steps of method 900. In addition, sequential capture and processing of the images in steps 550 (fig. 5), 910, and 920 reduces storage requirements: only the color values extracted from an image, not the full image, need to be stored for self-training. In one example, an image is captured in step 550 (fig. 5) by electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3). The processor 330 (fig. 3) performs steps 910 and 920 on this image; if the image was captured under the first illuminant, the processor 330 (fig. 3) determines the color value of the image according to the color value extraction instructions 451 (fig. 4) and stores it in the color value store 462 (fig. 4).
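One way to realize this sequential, storage-light processing is a running accumulator of color values (an illustrative sketch; the class and threshold are assumptions, not from the patent):

```python
class ColorValueAccumulator:
    """Accumulate per-image average colors for one illuminant without storing
    full images, supporting the sequential variant of steps 550/910/920."""

    def __init__(self) -> None:
        self.count = 0
        self.sum_rgb = [0.0, 0.0, 0.0]

    def add(self, avg_rgb: tuple[float, float, float]) -> None:
        self.count += 1
        self.sum_rgb = [s + c for s, c in zip(self.sum_rgb, avg_rgb)]

    def ready(self, minimum: int = 50) -> bool:
        # e.g., proceed to step 930 once enough qualifying images are seen
        return self.count >= minimum

    def average(self) -> tuple[float, float, float]:
        r, g, b = (s / self.count for s in self.sum_rgb)
        return r, g, b
```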
In one embodiment, an electronic camera (e.g., the electronic camera 300 of fig. 3) is preconfigured to continue to step 930 once a certain number (e.g., 50 or 500) of real images have been identified in step 920. In some embodiments, self-training occurs gradually: as the number of images captured by the electronic camera increases, steps 550 (fig. 5), 910 and 920, and 560 (fig. 5) are performed multiple times. This results in a gradually improving final AWB parameter set, since the accuracy of the gray world assumption increases with the number of different scenes imaged by the electronic camera. In a further embodiment, the self-training consisting of steps 550 (fig. 5), 910 and 920, and 560 (fig. 5) is repeated periodically throughout the life of the electronic camera.
Fig. 11 shows an exemplary method 1100 for performing step 555 (fig. 5) of method 500. The method 1100 is part of automated self-training based on real imagery and relies on the assumption that all human faces, regardless of race or ethnicity, have essentially the same hue. Hue relates to color perception and represents the degree to which a color is similar to or different from a set of primary colors. Hue may be expressed in terms of primary color components such as R, G, and B, as illustrated by Preucil's equation, discussed below.
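The equation itself is elided in the source text; a commonly cited form of Preucil's hue formula (assumed here to be the one intended) locates the hue within each 60° sextant of the hue circle as

$$h = \frac{M - L}{H - L},$$

where $H$, $M$, and $L$ denote the highest, middle, and lowest of the R, G, and B components, and which components play the roles of $H$ and $L$ determines the sextant.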
Method 1100 is similar to method 900 (fig. 9), which uses the gray world assumption, except that method 1100 includes identifying faces in the real images and derives the AWB parameter using the universal face tone assumption.
The first two steps of method 1100 are steps 910 and 920 of method 900 (fig. 9). After performing steps 910 and 920, the method 1100 continues to step 1125, which uses a face detection algorithm to select the subset of the real images, identified in step 920 as captured under the first illuminant, that also include at least one face. Step 1125 is performed, for example, by processor 330 of electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3). The processor 330 (fig. 3) retrieves the real images identified in step 920 from the image store 461 (fig. 4) and processes them according to the face detection instructions 455 (fig. 4). The processor 330 (fig. 3) then stores in the image store 461 (fig. 4) the real images captured under the first illuminant that also include at least one face, or a record of these images. In step 1130, the average color of the faces in the real images selected in step 1125 is determined according to the color value extraction instructions 451, for example by the processor 330 (fig. 3) of the electronic camera 300 having the memory 400 (fig. 4) implemented as the memory 340 (fig. 3).
Step 1140 invokes the universal face tone assumption discussed above. For example, the assumption is invoked by the processor 330 of the electronic camera 300 (fig. 3), having the memory 400 (fig. 4) implemented as the memory 340 (fig. 3), which retrieves the universal face tone hypothesis instructions 482 from the instructions 450 of the memory 400. In step 1150, the camera-specific calibrated AWB parameter for the first illuminant is determined using the universal face tone assumption invoked in step 1140: the AWB parameter is determined such that, when applied to real images captured under the first illuminant and including at least one face, it renders the average hue of the faces equal to the universal face hue. Note that the average hue of a face can be extracted from its average color using Preucil's equation discussed above. In certain embodiments, the average color obtained in step 1130 is expressed as color ratios; for example, as an ordered pair of color ratios defining the relative intensities of the three primary color components, as discussed in relation to fig. 2. The camera-specific calibrated AWB parameter can then be calculated from the ordered pair of color ratios. Step 1150 is performed, for example, by processor 330 of electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3): the processor 330 (fig. 3) retrieves the colors from the color value store 462 (fig. 4), derives the color ratios according to the color ratio calculation instructions 452 (fig. 4), and stores them in the color ratio store 463 (fig. 4). Next, the processor 330 (fig. 3) processes the stored color ratios according to the color-ratio-to-AWB-parameter calculation instructions 453 (fig. 4) to generate the camera-specific calibrated AWB parameter for the first illuminant.
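A sketch of step 1150 (the universal face tone reference values below are placeholders; the patent does not state numeric values, and the helper name is hypothetical):

```python
# Placeholder [G/B, G/R] ratios of the universal face tone under neutral light;
# an actual implementation would calibrate this reference (see method 1200).
UNIVERSAL_FACE_RATIOS = (1.4, 0.8)  # hypothetical values for illustration

def awb_from_face_tone(avg_face_rgb: tuple[float, float, float]) -> tuple[float, float]:
    """Derive the [G/B, G/R] AWB parameter so that the average face color,
    once white balanced, exhibits the universal face tone.

    With gains g_b and g_r applied to B and R, the balanced ratios are
    (G/B)/g_b and (G/R)/g_r; setting them equal to the universal ratios
    gives the gains directly.
    """
    r, g, b = avg_face_rgb
    measured = (g / b, g / r)
    return (measured[0] / UNIVERSAL_FACE_RATIOS[0],
            measured[1] / UNIVERSAL_FACE_RATIOS[1])
```

The only difference from the gray world case is that the balanced face is mapped to the universal face tone rather than to neutral gray.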
The method 1100 illustrates batch processing of the images in steps 910, 920, and 1125: all images are processed by step 910, then all by step 920, then all by step 1125. In one embodiment, an electronic camera (e.g., the electronic camera 300 of fig. 3) is preconfigured to capture a certain number (e.g., 100 or 1000) of real images before the method 1100 is performed. Without departing from the scope hereof, instead of propagating the full set of real images as a group through steps 910, 920, and 1125, the real images may be processed sequentially through two, or all, of steps 910, 920, and 1125. This may be extended to sequential performance of steps 550 (fig. 5), 910, 920, and 1125, which allows an electronic camera (e.g., electronic camera 300 of fig. 3) to continuously assess the amount of data available for performing the subsequent steps of method 1100. In addition, sequential capture and processing of the images in steps 550 (fig. 5), 910, 920, and 1125 reduces storage requirements: only the color values extracted from an image, not the full image, need to be stored for self-training. In one example, an image is captured in step 550 (fig. 5) by electronic camera 300 (fig. 3) having memory 400 (fig. 4) implemented as memory 340 (fig. 3). The processor 330 (fig. 3) then performs steps 910 and 920 on this image and, if appropriate, step 1125. If the image was captured under the first illuminant and includes at least one face, the processor 330 (fig. 3) extracts a color value representing the hue of the faces in the image according to the color value extraction instructions 451 (fig. 4) and stores this color value in the color value store 462 (fig. 4).
In one embodiment, an electronic camera (e.g., electronic camera 300 of fig. 3) is preconfigured to continue to step 1130 once a certain number (e.g., 50 or 500) of real images have been identified in step 1125. In some embodiments, self-training occurs gradually: as the number of images captured by the electronic camera increases, steps 550 (fig. 5), 910, 920, and 1125, and 560 (fig. 5) are performed multiple times. This may result in a gradually improving final AWB parameter set as the number of different scenes imaged by the electronic camera increases. In a further embodiment, the self-training consisting of step 550 (fig. 5), steps 910, 920, and 1125, and step 560 (fig. 5) is repeated periodically throughout the life of the electronic camera.
Compared to self-training based on the gray world assumption, self-training based on the universal face tone assumption may require fewer real images to provide a correct calibration of the AWB parameter for the first illuminant. This is because every individual face has a hue closely approximating the universal face hue, whereas a large number of real images is likely required before the average color composition of the images approaches gray. On the other hand, a user may initially employ an electronic camera (e.g., the electronic camera 300 of fig. 3) to capture images of real scenes that include no faces. In some embodiments, an electronic camera (e.g., the electronic camera 300 of fig. 3) therefore includes both gray world hypothesis instructions and universal face tone hypothesis instructions, and selects between the two hypotheses depending on the type of images captured, as sketched below.
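A sketch of such a selection rule (the thresholds are illustrative assumptions, not values from the patent):

```python
def choose_hypothesis(num_face_images: int, num_scene_images: int) -> str:
    """Select the self-training hypothesis from the available image mix.

    Face-based training converges with fewer images, so prefer it when enough
    face images exist; otherwise fall back to the gray world assumption.
    """
    if num_face_images >= 50:        # cf. the step-1130 threshold
        return "universal_face_tone"
    if num_scene_images >= 500:      # gray world needs more scene variety
        return "gray_world"
    return "keep_collecting"
```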
Fig. 12 shows an exemplary method 1200 for performing steps 520 and 530 (fig. 5) of method 500. Method 1200 is an alternative to method 800 of fig. 8 and utilizes the universal face tone assumption to calibrate the AWB parameter for the reference illuminant. In step 1210, the electronic camera captures images of a set of sample faces (actual faces or reproductions thereof) illuminated by the reference illuminant. For example, the electronic camera 300 of fig. 3 captures images of a set of sample faces illuminated by a D65 illuminant. In step 1220, the color of each image of a sample face is determined. In one embodiment, functionality on board the electronic camera performs step 1220. For example, the processor 330 (fig. 3) of the electronic camera 300, having the memory 400 (fig. 4) implemented as the memory 340 (fig. 3), processes the captured images according to the face detection instructions 455 (fig. 4) to locate a face in each image, and then processes the portions of the image associated with the face according to the color value extraction instructions 451 (fig. 4). In another embodiment, step 1220 is performed by functionality external to the electronic camera (e.g., equipment at a manufacturing facility), such as electronic camera 300 (fig. 3), in which case step 1220 may be performed before complete assembly of the electronic camera. In step 1230, the colors obtained in step 1220 are averaged to determine the average color of the faces in the images captured under the reference illuminant. Step 1230 may be performed externally to the electronic camera, such as electronic camera 300 (fig. 3), or on board the electronic camera, for example by processor 330 (fig. 3) of electronic camera 300 in accordance with instructions 350 (fig. 3).
The average color obtained in step 1230 may represent a hue different from the universal face hue; for example, the hue may be shifted toward red or blue relative to that of a typical face. In step 1240, the AWB parameter for the reference illuminant is calibrated such that the calibrated AWB parameter, when applied to the average color determined in step 1230, produces a color exhibiting the universal face hue. In one embodiment, step 1240 is performed on board the electronic camera; for example, the processor 330 (fig. 3) of the electronic camera 300 executes step 1240 according to the instructions 350 (fig. 3). In another embodiment, step 1240 is performed externally to the electronic camera.
The method 1200 illustrates batch processing of the images in steps 1210 and 1220: all images are processed by step 1210, followed by all images by step 1220. The images may instead be processed sequentially through steps 1210 and 1220 without departing from the scope hereof.
Combinations of features
The features described above may be combined in various ways with those claimed below without departing from the scope hereof. For example, implementations of one device or method for automated self-training of auto white balance in an electronic camera described herein may incorporate or exchange features of another device or method for automated self-training of auto white balance in an electronic camera described herein. The following examples illustrate possible, non-limiting combinations of the embodiments above. It should be apparent that numerous other changes and modifications may be made to the methods and devices herein without departing from the spirit and scope of the invention.
(A) A method of calibrating automatic white balance in an electronic camera, comprising: (i) obtaining a plurality of first color values from a plurality of first images of a plurality of real scenes captured by the electronic camera under a first illuminant; (ii) invoking an assumption about true color values of at least portions of the real scenes; and (iii) determining a plurality of final automatic white balance parameters based on a difference between the true color values and the average of the first color values.
(B) The method of (A), the plurality of final automatic white balance parameters being associated with a respective plurality of illuminants including the first illuminant.
(C) The method of (A) and (B), the plurality of final automatic white balance parameters comprising a final first automatic white balance parameter for the first illuminant.
(D) The method of (C), the determining step comprising determining the final first automatic white balance parameter based on the difference between the true color value and the average of the first color values.
(E) The method of (C) and (D), further comprising converting a plurality of initial automatic white balance parameters, including an initial first automatic white balance parameter for the first illuminant, to generate the plurality of final automatic white balance parameters, the initial first automatic white balance parameter being converted into the final first automatic white balance parameter.
(F) The method of (A) through (E), the obtaining step comprising selecting the plurality of first images from a superset of images of a plurality of real scenes captured by the electronic camera, each image of the plurality of first images being captured under the first illuminant.
(G) The method of (A) through (F), each of the first color values being an average color of the respective image.
(H) The method of (G), the true color value being an average color of the plurality of real scenes, the average color being gray.
(I) The method of (A) through (F), each image of the plurality of first images including at least one face, and each of the first color values defining an average hue of the at least one face.
(J) The method of (I), wherein the true color value is an average hue of the faces in the plurality of real scenes, and the average hue is a universal face hue.
(K) The method of (I) and (J), the obtaining step comprising selecting the plurality of first images from a superset of images of a plurality of real scenes captured by the electronic camera, each image of the plurality of first images being captured under the first illuminant and including at least one human face.
(L) The method of (K), the obtaining step further comprising applying a face detection routine to the images of the superset.
(M) The method of (E) through (L), each of the first images having a color defined by first, second, and third primary colors; and the converting step being performed in a two-dimensional space spanned by first and second color ratios of an ordered pair, the first and second color ratios together defining the relative values of the first, second, and third primary colors.
(N) The method of (M), the converting step comprising rotating and scaling the initial set of automatic white balance parameters within the two-dimensional space.
(O) The method of (M) and (N), the ordered pair being [second primary color/third primary color, second primary color/first primary color], [(first primary color × third primary color)/(second primary color)², third primary color/first primary color], [log(second primary color/third primary color), log(second primary color/first primary color)], [log((first primary color × third primary color)/(second primary color)²), log(third primary color/first primary color)], or a derivative thereof.
(P) The method of (C) to (O), the plurality of initial automatic white balance parameters including an initial second automatic white balance parameter for a second illuminant, the method further comprising determining the plurality of initial automatic white balance parameters by: (i) obtaining a plurality of base automatic white balance parameters including a base second automatic white balance parameter for the second illuminant; (ii) calibrating the base second automatic white balance parameter to generate a calibrated value thereof; and (iii) converting the plurality of base automatic white balance parameters to generate the plurality of initial automatic white balance parameters, the initial second automatic white balance parameter being the calibrated value.
(Q) The method of (P), the calibrating step comprising capturing, by the electronic camera, a second plurality of images of one or more scenes under the second illuminant, such that the calibrated value, when applied to white balance the second plurality of images, produces an average color of the second plurality of images that is gray.
(R) The method of (P), the calibrating step comprising capturing, by the electronic camera, a second plurality of images of one or more scenes under the second illuminant, each of the one or more scenes comprising a human face, such that the calibrated value, when applied to white balance the second plurality of images, produces an average hue of the faces that is a universal face hue.
(S) The method of (P) to (R), wherein the base automatic white balance parameters are determined from images captured by a second electronic camera.
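One hedged reading of the three steps in (P) through (S), reusing gray_world_gains from the earlier sketch: base parameters measured on a second (golden) camera are shifted so that the second illuminant's entry matches the value calibrated on this unit. The dictionary layout and the purely multiplicative shift are simplifying assumptions.

```python
def initial_parameters(base_params, second_illuminant_images, second_illuminant):
    """base_params: dict mapping illuminant name -> per-channel gain
    vector (numpy array), measured on a second (golden) camera.
    Step (ii) calibrates the second illuminant's entry on this unit;
    step (iii) shifts the whole set so that entry equals the
    calibrated value."""
    calibrated = gray_world_gains(second_illuminant_images)    # step (ii)
    shift = calibrated / base_params[second_illuminant]
    return {ill: p * shift for ill, p in base_params.items()}  # step (iii)
```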
(T) An electronic camera device, comprising: (i) an image sensor for capturing a plurality of real images of a plurality of real scenes; (ii) a non-volatile memory comprising a plurality of machine-readable instructions, the instructions including a partially calibrated set of automatic white balance parameters and a plurality of automatic white balance self-training instructions; and (iii) a processor for processing the real images in accordance with the self-training instructions to generate a fully calibrated set of automatic white balance parameters, the fully calibrated set being specific to the electronic camera device.
(U) The apparatus of (T), the self-training instructions comprising assumptions about the real scenes.
(V) The apparatus of (U), the assumptions comprising an assumption that the average color of the real scenes is gray.
(W) The apparatus of (U), the assumptions comprising an assumption that the hue of faces in the real scenes is a universal face hue.
(X) The apparatus of (T) to (W), the self-training instructions comprising a plurality of illuminant identification instructions that, when executed by the processor, identify a subset of the real images captured under a first illuminant.
(Y) The apparatus of (X), the self-training instructions further comprising a plurality of automatic white balance parameter conversion instructions that, when executed by the processor, convert the partially calibrated set of automatic white balance parameters to the fully calibrated set of automatic white balance parameters based on an analysis of the images identified using the illuminant identification instructions.
(Z) The apparatus of (T) to (Y), the self-training instructions further comprising face detection instructions that, when executed by the processor, identify faces in the real images.
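Tying the apparatus embodiments together, a sketch of the end-to-end self-training flow of (T) through (Z), again reusing gray_world_gains from the first sketch: the illuminant_of classifier and the global multiplicative transfer stand in for the patent's illuminant identification and ratio-space conversion instructions, and are assumptions of this sketch.

```python
def self_train(images, illuminant_of, partial_params, first_illuminant):
    # Select the real images identified as captured under the first
    # illuminant (the illuminant identification step).
    subset = [img for img in images if illuminant_of(img) == first_illuminant]
    if not subset:
        return partial_params  # nothing to learn from yet
    gains = gray_world_gains(subset)
    # Transfer this unit's correction to every illuminant's parameters,
    # turning the partially calibrated set into a fully calibrated one.
    return {ill: p * gains for ill, p in partial_params.items()}
```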
Changes may be made in the above methods and systems without departing from the spirit and scope hereof. It should thus be noted that the matter contained in the above description and shown in the accompanying drawings is offered by way of example only and is not to be taken in a limiting sense. The following claims are intended to cover the generic and specific features described herein, as well as all statements of the scope of the present method and system which, as a matter of language, might be said to fall therebetween.

Claims (20)

1. A method of calibrating automatic white balance in an electronic camera, comprising:
obtaining a plurality of first color values from a respective first plurality of images of a plurality of real scenes captured by the electronic camera under a first illuminant, the electronic camera including a plurality of initial automatic white balance parameters including pre-calibrated automatic white balance parameters for a reference illuminant different from the first illuminant;
invoking an assumption regarding a true color value of at least portions of the real scenes; and
determining a plurality of final automatic white balance parameters for a respective plurality of illuminants including the first illuminant and the reference illuminant, based on the plurality of initial automatic white balance parameters and a difference between the true color value and an average of the first color values.
2. The method of claim 1, wherein the plurality of final automatic white balance parameters includes a final first automatic white balance parameter for the first illuminant, and the determining step comprises:
determining the final first automatic white balance parameter based on a difference between the true color value and an average of the first color values; and
converting the plurality of initial automatic white balance parameters including an initial first automatic white balance parameter for the first illuminant to generate the plurality of final automatic white balance parameters, the initial first automatic white balance parameter being converted into the final first automatic white balance parameter.
3. The method of claim 1, wherein the obtaining step comprises: selecting the first plurality of images from a superset of images of a plurality of real scenes captured by the electronic camera, each image of the first plurality of images captured under the first illuminant.
4. The method of claim 1, wherein:
each of the first color values is an average color of the respective image; and
the true color value is an average color of the plurality of real scenes, and the assumption is that the average color of the plurality of real scenes is gray.
5. The method of claim 1, wherein:
each of the first plurality of images comprises at least one face;
each of the first color values defines an average hue of the at least one human face; and
the true color value is an average hue of the faces in the plurality of real scenes, and the assumption is that the average hue of the faces in the plurality of real scenes is a universal face hue.
6. The method of claim 5, wherein:
the obtaining step includes selecting the first plurality of images from a superset of images of a plurality of real scenes captured by the electronic camera, each image of the first plurality of images captured under the first illuminant and including at least one human face.
7. The method of claim 6, wherein the obtaining step further comprises applying a face detection routine to the images of the superset.
8. The method of claim 2, wherein:
each of the first plurality of images has a color defined by a first, second, and third primary color; and
the converting step is performed in a two-dimensional space spanned by first and second color ratios of an ordered pair, which together define the relative intensities of the first, second and third primary colors.
9. The method of claim 8, wherein the converting step comprises: rotating and scaling the plurality of initial automatic white balance parameters within the two-dimensional space.
10. The method of claim 8, wherein the ordered pair is [second primary color/third primary color, second primary color/first primary color], [(first primary color/third primary color)/(second primary color^2), third primary color/first primary color], [Log(second primary color/third primary color), Log(second primary color/first primary color)] or [Log((first primary color/third primary color)/(second primary color^2)), Log(third primary color/first primary color)].
11. The method of claim 2, further comprising determining the plurality of initial automatic white balance parameters by:
obtaining a plurality of base automatic white balance parameters including a base second automatic white balance parameter for the reference illuminant;
calibrating the base second automatic white balance parameter to produce the pre-calibrated automatic white balance parameters; and
converting the plurality of base automatic white balance parameters to generate the plurality of initial automatic white balance parameters.
12. The method of claim 11, wherein:
the calibrating step includes capturing, by the electronic camera, a second plurality of images of one or more scenes under the reference illuminant; and
the pre-calibrated automatic white balance parameters, when applied to white balance the second plurality of images, produce an average color of the second plurality of images that is gray.
13. The method of claim 11, wherein:
the calibrating step comprises: capturing, by the electronic camera, a second plurality of images of one or more scenes under the reference illuminant, each of the one or more scenes comprising a human face; and
the pre-calibrated automatic white balance parameters, when applied to white balance the second plurality of images, produce an average hue of the faces that is a universal face hue.
14. The method of claim 11, wherein the plurality of base automatic white balance parameters are determined from a plurality of images captured by a second electronic camera.
15. An electronic camera device comprising:
an image sensor for capturing a plurality of real images of a plurality of real scenes;
a non-volatile memory comprising machine-readable instructions, the instructions including a partially calibrated set of automatic white balance parameters and automatic white balance self-training instructions, the partially calibrated set including initial automatic white balance parameters having a pre-calibrated automatic white balance parameter for a reference illuminant; and
a processor for processing a subset of the real images captured under a first illuminant different from the reference illuminant, according to the automatic white balance self-training instructions, to generate a fully calibrated set of automatic white balance parameters specific to the electronic camera device.
16. The apparatus of claim 15, wherein the automatic white balance self-training instructions comprise assumptions about the real scenes.
17. The apparatus of claim 16, wherein the assumptions comprise an assumption that an average color of the real scenes is gray.
18. The apparatus of claim 16, wherein the assumptions comprise an assumption that a hue of faces in the real scenes is a universal face hue.
19. The apparatus of claim 15, wherein the automatic white balance self-training instructions comprise:
a plurality of illuminant identification instructions that, when executed by the processor, identify the subset of the real images captured under the first illuminant; and
a plurality of automatic white balance parameter conversion instructions that, when executed by the processor, convert the partially calibrated set of automatic white balance parameters to the fully calibrated set of automatic white balance parameters based on an analysis of the images identified using the illuminant identification instructions.
20. The apparatus of claim 19, wherein the automatic white balance self-training instructions further comprise face detection instructions that, when executed by the processor, identify faces in the plurality of real images.
HK15103354.7A 2013-03-13 2015-04-02 Devices and methods for automated self-training of auto white balance in electronic cameras HK1203011B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361780898P 2013-03-13 2013-03-13
US61/780,898 2013-03-13

Publications (2)

Publication Number Publication Date
HK1203011A1 HK1203011A1 (en) 2015-10-09
HK1203011B true HK1203011B (en) 2018-03-02
