HK1197709B - Spectral image processing - Google Patents
- Publication number: HK1197709B (application HK14111335.5A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- color values
- wavelength
- input color
- representation
- spectral
- Prior art date
Description
Cross reference to related applications
This application claims priority to U.S. provisional patent application Serial Nos. 61/581,051 and 61/581,048, filed December 28, 2011, and Serial No. 61/733,551, filed December 5, 2012, all of which are hereby incorporated by reference in their entirety.
Technical Field
The present disclosure relates to color processing for digital images. More particularly, embodiments of the present invention relate to methods for transforming between tri-color and spectral representations of a digital image and for processing such spectral representations.
Background
As used herein, the phrases "spectral synthesis" and "spectral synthesis for image capture device processing" may relate to processing methods that may be performed or calculated to achieve, for example, accurate color output from an image capture device. Tri-color processing models such as RGB (red, green, blue) are common. While RGB and other tri-color models are sufficient for color recognition, matching, and classification, such models may be inherently limited in color processing. By its nature, light comprises the electromagnetic spectrum, which generally cannot be represented completely by three color values such as red, green, and blue. From tristimulus values corresponding to the cones receiving short, medium, and long wavelength light (e.g., blue, green, and red), the human visual system (HVS) attempts to infer the original, natural stimulus.
Multispectral systems typically capture, process, and display multispectral images. A multispectral camera may output more than three channels, for example, and the output channels may be presented with a multi-primary printer or display. Some multispectral systems are designed to produce a printout with a reflectance spectrum that is nearly equal to the reflectance spectrum of the original object. Multispectral representations of images generally fall into two categories. The more common type measures intensity or reflectance over small intervals in wavelength, which typically requires the use of more than three channels (e.g., more than the channels R, G, and B) (see reference [1], which is incorporated herein by reference in its entirety). A less common type uses the Wyszecki hypothesis (see reference [2], which is incorporated herein by reference in its entirety), which characterizes the reflectance spectrum as being composed of two components: a base component capturing the perceptually relevant three-color representation, plus a residual component representing the remaining features of the entire reflectance spectrum. Wyszecki termed this residual component the metameric black. An example of this second type is the LabPQR color space. In the LabPQR notation, the tristimulus portion is the Lab color space, while PQR denotes the metameric black residual. For reproducing and presenting images via electronic display emission, the reflectance spectral characteristics are not critical.
The pictures produced by a camera or other image capture device are generally not exactly as perceived by the human eye.
The processing internal to the image capture device typically includes transforming the sensor output via a 3x3 matrix to the color space of the output image. The result of applying this matrix transformation will generally not reproduce what is perceived by the human eye unless the spectral sensitivities of the sensor of the image capture device can be expressed as a linear combination of the color matching functions. In many cases, the magnitude of the resulting errors is not trivial.
For example, an existing DSLR (digital single lens reflex camera) camera may have knobs to select different 3x3 matrices for different types of scenes (e.g., night, motion, clouds, portrait, etc.). In practice, however, it may be problematic to make the colors correct in general and for certain memory colors such as facial (skin) tones, for example.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate one or more embodiments of the present invention and, together with the detailed description, serve to explain the principles and implementations of the disclosure.
FIG. 1A depicts a spectral image processing method according to an embodiment of the present disclosure.
FIG. 1B depicts a spectrum expansion module.
FIG. 1C depicts operation of an image capture device color sensor output according to an embodiment of the present disclosure.
Fig. 2 depicts an (x, y) chromaticity space divided into rectangular bandpass and bandgap spectra.
Fig. 3 depicts three-parameter and four-parameter rectangular spectral representations.
Fig. 4 depicts a rectangular spectrum in the λ domain and the colors represented in the chromaticity space.
Figs. 5A-5C depict circular representations of the λ domain.
Fig. 6A depicts a spectral synthesis module according to the present disclosure.
Fig. 6B depicts a wavelength determination module according to the present disclosure.
Fig. 7 depicts a rectangular representation of RGB (red, green, blue) primaries, white, magenta, and RYGCBV (red, yellow, green, cyan, blue, violet) filter banks.
Fig. 8 depicts an S-shaped mapping curve relating input and output color intensities.
Fig. 9 describes a digital camera processing method.
Fig. 10 depicts a simplified digital camera processing method.
Fig. 11 depicts the comparison of the output color from the camera and the actual color.
Fig. 12 depicts CIE (commission internationale de l' eclairage) 1931 color matching functions.
Fig. 13 depicts camera spectral sensitivity for a digital camera.
Fig. 14A depicts a method for synthesizing correct color output based on image capture device sensor output in accordance with an embodiment of the present disclosure.
FIGS. 14B-14C depict examples of substantially rectangular spectral representations.
Figs. 15A-15D depict an image capture device processing method that utilizes a synthesized substantially rectangular spectrum according to an embodiment of the present disclosure.
Fig. 16 depicts an example of an image capture device process that is equivalent to mapping from captured RGB (red, green, blue) values to correct color XYZ (corresponding to tristimulus values of the CIE1931 color space) values.
FIG. 17 depicts a method for determining an image capture device gamut in accordance with an embodiment of the present disclosure.
Figure 18 depicts bandpass and bandgap spectra in the linear and circular wavelength domains.
Fig. 19 depicts behavior near the boundary between the bandpass and bandgap spectra.
Figs. 20A and 20B depict colors of rectangular spectra represented in the [λ↑, λ↓] plane.
Detailed Description
In an example embodiment of the present disclosure, a method is presented for synthesizing a substantially rectangular spectral representation based on input color values, the input color values being based on an input spectrum of an image, the method comprising: providing input color values, wherein each input color value is associated with a corresponding analysis function; determining a first wavelength and a second wavelength of the substantially rectangular spectral representation based on the input color values; and calculating a scale factor based on the first wavelength, the second wavelength, and any one of the input color values and its corresponding analysis function, to synthesize the substantially rectangular spectral representation, wherein: if the synthesized substantially rectangular spectral representation is applied to the analysis functions corresponding to the input color values, the synthesized substantially rectangular spectral representation is adapted to produce the input color values; the first wavelength is the wavelength at which the substantially rectangular spectral representation transitions from a low value to the scale factor; and the second wavelength is the wavelength at which the substantially rectangular spectral representation transitions from the scale factor back to the low value.
In an example embodiment of the present disclosure, a system is presented that is configured to synthesize a substantially rectangular spectral representation based on input color values, the input color values being based on an input spectrum of an image, wherein each input color value is associated with a corresponding analysis function, the system comprising: a wavelength determination module configured to determine a first wavelength and a second wavelength of the substantially rectangular spectral representation based on the input color values; and a scale factor calculation module configured to calculate a scale factor based on the first wavelength, the second wavelength, and any one of the input color values and its corresponding analysis function, to synthesize the substantially rectangular spectral representation, wherein: if the synthesized substantially rectangular spectral representation is applied to the analysis functions corresponding to the input color values, the synthesized substantially rectangular spectral representation is adapted to produce the input color values; the first wavelength is the wavelength at which the substantially rectangular spectral representation transitions from a low value to the scale factor; and the second wavelength is the wavelength at which the substantially rectangular spectral representation transitions from the scale factor back to the low value.
Tristimulus-based systems and spectral or multispectral systems may be largely incompatible and practiced by individual enterprises. The present disclosure bridges the gap between tristimulus-based systems and spectral or multispectral systems and describes methods for transforming from the tristimulus domain to the spectral domain. These transformations enable spectral and multispectral image processing methods to be applied to the tristimulus image data.
As used herein, the term "image capture device" may refer to any device adapted to form an image. The image capture device captures visual information in the form of still or moving pictures. Image information associated with such images (e.g., image size, image resolution, file format, etc.) may also be stored. Processing of the stored information may also be performed. Such image capture devices may include cameras and/or line scan cameras, flat panel scanners, and other such devices.
As used herein, the term "synthesizing" may refer to generating a signal based on a combination of entities including parameters and/or functions. According to the present disclosure, synthesis based on color values or spectral representations of image capture device sensor outputs is provided.
As used herein, the terms "actual color" and "correct color" are used interchangeably and are defined herein to mean the color perceived by the human visual system.
As used herein, the term "module" may refer to a unit configured to perform certain functions. A module may be implemented in hardware, software, firmware, or a combination thereof.
Part 1
Fig. 1A depicts an image processing method according to an embodiment of the present disclosure. Fig. 1B depicts the expansion module of fig. 1A, comprising a spectral synthesis module (600A) followed by a filter bank (700). As depicted in fig. 1C, the processing need not occur within the image capture device. In that case, there may simply be an expansion to the substantially rectangular representation, followed by projection back into the original domain in order to render the correct color (further explained in Part 2).
Referring now to fig. 1A, a tristimulus representation of the color values produced by an input spectrum, e.g., RGB (red, green, blue), is first expanded to a six-primary representation RYGCBV (red, yellow, green, cyan, blue, violet). This choice of the number of primaries is supported in part by statistical analyses of large databases of real-world object reflectances: it can be shown that 99% of the statistical variation of such a reflectance data set can be described by as few as 6 basis vectors (see reference [3], which is incorporated herein by reference in its entirety). The image may then be processed in the RYGCBV domain. After processing, the RYGCBV representation of the processed image may be projected back onto the original RGB domain. It should be noted that although the RGB and RYGCBV color spaces are considered in this discussion, color spaces such as YUV, YCbCr, HSV, CMYK and other color spaces known to those skilled in the art may also be considered.
By way of example and not limitation, the input (102A) and output (118A) may also be XYZ tristimulus values. By way of example and not limitation, the expanded representation (105A, 115A) may include more than 6 values, such as 31 if the visible spectrum is sampled in 10 nm increments from 400 nm to 700 nm. Other possibilities include 7-color (ROYGCBV) or 10-color representations. The 6-color representation is useful because it provides a balance between accuracy and computational complexity.
Expanding RGB or XYZ values may not produce a unique spectral expansion due to metamerism (a phenomenon in which two different input spectra may produce the same RGB color values). However, as long as the selected spectral representation, when applied to the analysis functions corresponding to the RGB or XYZ values (discussed in more detail below), reproduces the RGB or XYZ values produced by the input spectrum, an accurate color representation relative to the actual color is maintained.
Any given color corresponds to a spectrum of light. Such spectra can be approximated according to the equation given below:

  Ŝ[λ] = Σ_{i=1..N} C_i · B_i[λ]

where S[λ] represents the input light spectrum, Ŝ[λ] represents its approximation, C_i represents the ith color output value, B_i[λ] denotes the ith basis function, and N denotes the number of basis functions. For example, the approximate RGB representation may be written as:

  Ŝ[λ] = R · B_R[λ] + G · B_G[λ] + B · B_B[λ]
The basis functions generally have physical significance. For the CIE (International Commission on Illumination) 1931 color space (a mathematically defined color space created by the CIE in 1931; see reference [4], which is incorporated herein by reference in its entirety), the basis functions are narrow bands with peaks at 435.8, 546.1, and 700 nanometers. For displays, the basis functions are the spectral emissions of the primaries.
Each basis function is associated with a matching analysis function A_i[λ], which may be used to determine the color output value C_i according to the following equation:

  C_i = ∫ A_i[λ] · S[λ] dλ

wherein the matching analysis functions and the basis functions are related according to the following equation:

  ∫ A_i[λ] · B_j[λ] dλ = δ_ij

and the integration limits at 360 nm and 810 nm represent the lower limit (λmin) and upper limit (λmax) of the visible wavelengths.
The foregoing equations may also be generalized for use with other analysis functions (e.g., extended to include infrared and/or ultraviolet). Although the above equations indicate orthonormal basis functions, other basis functions may be used; by way of example and not limitation, the basis functions may be orthogonal with respect to a metric. Although generally unlikely, it should be noted that an analysis function may be equal to its corresponding basis function. The analysis functions also have significance: for the CIE 1931 color space, they are the color matching functions. An analysis function may also be a spectral sensitivity of an image capture device or of the eye.
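The basis/analysis duality above can be illustrated discretely. The following is a minimal sketch, assuming simple box-shaped basis functions on a 10 nm grid (real bases would be display primaries or CMF-derived curves); the integrals become sums weighted by the wavelength step:

```python
import numpy as np

# Wavelength grid: 400-700 nm in 10 nm steps (31 samples); dlam is the step.
lam = np.arange(400, 701, 10)
dlam = 10.0

# Three box-shaped basis functions B_i covering thirds of the visible range
# (illustrative assumption only, not the patent's bases).
bands = [(400, 500), (500, 600), (600, 701)]
B = np.array([((lam >= lo) & (lam < hi)).astype(float) for lo, hi in bands])

# Matching analysis functions A_i chosen so that sum_lam A_i*B_j*dlam = delta_ij
A = B / (B.sum(axis=1, keepdims=True) * dlam)

# Duality check: discrete form of the integral of A_i * B_j equals delta_ij
G = (A * dlam) @ B.T
assert np.allclose(G, np.eye(3))

# Analyze an input spectrum S[lam] into color values C_i, then re-synthesize
S = np.exp(-((lam - 550.0) / 60.0) ** 2)   # smooth test spectrum
C = (A * dlam) @ S                          # C_i = integral of A_i * S dlam
S_hat = C @ B                               # S^ = sum_i C_i * B_i
```

Note that S_hat is only an approximation of S, exactly as the text's Ŝ[λ] is an approximation of the input spectrum; only the projections C_i are preserved.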
Embodiments of the present disclosure use a substantially rectangular spectrum similar to that proposed by MacAdam to generate a new representation of a given color using, for example, the six primary colors RYGCBV.
MacAdam formalized the spectral representation for "maximum efficiency" reflectance spectra (see reference [5], which is incorporated herein by reference in its entirety). These spectra have the property that, for any desired chromaticity and saturation, the efficiency (i.e., the brightness of the reflection) is maximized. Such spectra may be interpreted as "optimal inks." This family of spectra is complete: any possible chromaticity can be represented. MacAdam characterized these reflectance spectra as having binary values 0 and 1 and two transition wavelengths, λ↑ for the 0 → 1 transition and λ↓ for the 1 → 0 transition. This results in band-pass and band-gap spectra that occupy the (x, y) chromaticity domain, as depicted in fig. 2.
Although MacAdam treated these rectangular spectra as pure reflectance spectra, for a general spectral representation of light they can be extended by introducing a scale factor I:

  S[λ] = I for λ↑ ≤ λ ≤ λ↓, and S[λ] = 0 otherwise.

A rectangular spectrum of three parameters [λ↑, λ↓, I] is sufficient to represent all possible perceivable colors. However, real objects generally cannot be represented completely by rectangular spectra. Real objects tend to reflect or transmit some light at all wavelengths, even though reflection may be dominant over a more limited range of wavelengths. This can be addressed in large part by adding an additional parameter representing a low value for the rectangular spectrum, which can be written as a rectangular spectrum of four parameters:

  S[λ] = I for λ↑ ≤ λ ≤ λ↓, and S[λ] = I_low otherwise.
it should be noted that a rectangular spectrum of three parameters can be represented as a rectangular spectrum of four parameters with a low value Il set to 0. The spectral plots for the rectangular spectral representation of the three and four parameters are depicted in fig. 3. A further discussion of the properties of the rectangular spectrum, which forms an integral part of the present disclosure, is given in table a.
A given color represented by (x, y) coordinates in chromaticity space can be represented as a rectangular band-pass or band-gap spectrum (see fig. 4). Alternatively, the [λmin, λmax] domain (the wavelengths of visible light) can itself be interpreted as a circle, allowing all spectra to have band-pass form. In this case, an original band-gap spectrum becomes a band-pass spectrum that crosses the λmin/λmax point (see figs. 5A-5C). The highlighted portions of the circular domains depicted in figs. 5B and 5C indicate the wavelengths at which the value of the rectangular spectrum equals the scale factor I. This interpretation of the [λmin, λmax] domain as a circle, under which all spectra have band-pass form, is also depicted in fig. 18 and further illustrated in Table A.
To perform spectral expansion on the original tristimulus values (which may be from any color space) of the image to be processed, the chromaticity coordinates are first used to calculate λ↑ and λ↓. The scale factor I is then derived from λ↑, λ↓, and the original tristimulus values.
Fig. 6A depicts a spectral synthesis module (600A) that may be used in the image processing method depicted in fig. 1A (specifically, as part of the expansion depicted in fig. 1B). The XYZ tristimulus values (610A) are input to a wavelength determination module (620A, described in detail in fig. 6B) to determine the transition wavelengths (615B) λ↑ and λ↓. The calculation may include computing (600B) the chromaticity values x and y (605B) from the XYZ tristimulus values (610A):

  x = X / (X + Y + Z),  y = Y / (X + Y + Z)
a two-dimensional look-up table (2D-LUT, 610B) may then be used to map values x and y to determine a transition wavelength (615B) λ↑And λ↓. Based on these transition wavelengths lambda↑And λ↓The scale factor I may be determined by a scale factor calculation module (650A) consisting of a cyclic integration module (640A) and a division module (660A). The cyclic integration module (640A) may be configured to convert the wavelength λ↑And λ↓Performing a spectral analysis function within a defined interval(630A) By way of example, and not limitation, FIG. 6A depicts integration of any of the cycles inThe cycle integration of (2). As will be appreciated by those skilled in the art, the functions of the spectral analysisOrAny combination may also be used to perform the loop integration. The result of the cyclic integration module (640A) is then provided to a division module (660A) to generate the scaling factor I. Specifically, the scale factor I can be given by:
these integrals can be calculated, for example, using two table lookups and addition/subtraction operations. In this way, the three parameters [ λ ] of the rectangular spectrum↑,λ↓,I](670) Are determined to synthesize a rectangular spectrum.
As described in fig. 7, the synthesized rectangular spectrum may be filtered (the filter banks may, but need not, be equal in bandwidth) to generate RYGCBV color values. At this point, image processing may be performed on the RYGCBV color values.
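Projecting the synthesized rectangular spectrum through a six-band filter bank, as in fig. 7, can be sketched as follows. The band edges and ordering below are illustrative assumptions (the text notes the filters need not have equal bandwidths):

```python
import numpy as np

lam = np.arange(400, 701, 1.0)

# Six contiguous box filters spanning the visible range, ordered from
# violet to red (edges are an assumption, not the patent's filter bank).
edges = [400, 450, 490, 530, 570, 620, 701]
names = ["V", "B", "C", "G", "Y", "R"]
filters = np.array([((lam >= lo) & (lam < hi)).astype(float)
                    for lo, hi in zip(edges[:-1], edges[1:])])

# A band-pass rectangular spectrum, high between 500 and 600 nm
spectrum = np.where((lam >= 500) & (lam <= 600), 1.0, 0.0)

# Filter-bank outputs: integral of spectrum * filter (d lam = 1 nm)
rygcbv = filters @ spectrum
```

Image processing operations (Part 1's matrix and tone transforms) would then act on the six `rygcbv` values rather than on three RGB values.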
According to several embodiments of the present disclosure, types of processing operations are described that are applicable to both tri-color and spectral or multispectral representations. For conventional RGB data associated with an image, there are two common mathematical operations: 3x3 matrix multiplication and an independent nonlinear transformation of each RGB channel. Matrix multiplication is commonly used to adjust color and/or the effect of the illuminant. The nonlinear transformation is often referred to as tone mapping because it changes the visual tone (brightness and contrast) of the image. Note that any number of 3x3 matrix multiplications may be collapsed into a single 3x3 matrix multiplication, and similarly any number of one-dimensional (1D) transforms for each of the RGB channels may be collapsed into a single nonlinear transform for each of those three channels. Thus, the image may be processed via a set of three nonlinear transformations, one for each channel associated with the image, followed by a matrix multiplication.
For example, even with only the traditional three RGB channels, the above transformations, a nonlinear transformation followed by a matrix multiplication, can accurately capture the differences between today's common image formats, with their limited intensity range and color gamut, and potential future formats with larger intensity ranges (often called dynamic ranges) and color gamuts.
Such transformations can be generalized to the multispectral case, for any number of color channels, via the following relationship:

  O_i = Σ_j M_ij · T_j[I_j],  i, j = 1, ..., N

where I_j represents the jth input color channel value (e.g., R, G, or B in an RGB representation, or R, Y, G, C, B, or V in a RYGCBV representation), T_j represents the nonlinear transformation applied to I_j, M_ij denotes the N × N matrix, and O_i represents the ith output color channel value.
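The generalized relationship O_i = Σ_j M_ij T_j[I_j] is straightforward to sketch for any N. The gamma value and mixing matrix below are arbitrary illustrations:

```python
import numpy as np

# Generalized processing: per-channel nonlinear transforms T_j followed
# by an N x N matrix M, for any number N of color channels.
def process(I_vals, M, T):
    """Apply T[j] to channel j, then multiply by M."""
    t = np.array([T[j](I_vals[j]) for j in range(len(I_vals))])
    return M @ t

N = 6                                            # e.g. an RYGCBV representation
gamma = 2.2                                      # illustrative tone curve
T = [lambda v, g=gamma: v ** (1.0 / g)] * N      # same transform per channel
M = np.eye(N) * 0.9 + 0.1 / N                    # mild cross-channel mix

I_vals = np.linspace(0.1, 0.6, N)
O = process(I_vals, M, T)
```

With M the identity and T_j the identity function, the processing reduces to a pass-through, matching the collapsed-transform argument in the text.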
The spectral image processing methods discussed above may be applied to spectral color correction. In practice, primary color correction may be applied via modifications directly and independently to the RGB channels. While such modifications address the primary color operations that need to be performed, it is difficult to operate on only a particular hue. For example, it is difficult to make yellows stronger without modifying R and G (adjacent colors) or B (lowering blue has the effect of directly increasing yellow). Fundamentally, this is because three color channels are sufficient to match a given color but insufficient for hue control, because there are perceptually four fundamental hues, red vs. green and blue vs. yellow, as described by opponent color theory.
In practice, this can be handled by secondary color correction. Secondary color correction transforms the RGB data to an HSL (hue-saturation-luma) representation and conditionally modifies the HSL values within a certain range of hue-saturation-luma values. Because yellow is halfway between red and green, cyan is halfway between green and blue, and magenta is halfway between blue and red, secondary color correction is typically implemented with 6 hues, RYGCBM, which may be referred to as 6-axis secondary color correction. Primary and global secondary color correction can be integrated into the spectral image processing method according to the present disclosure using, for example, RYGCBV color values (note that M is not necessary because it is always a B-R mixture, unlike C and Y). The sigmoid tone curve depicted in fig. 8 can be modified for amplitude, gamma, toe, and shoulder adjustments. Re-saturation may be performed by adding white or by spectral broadening.
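An S-shaped tone curve like the one in fig. 8, with knobs for amplitude, gamma, toe, and shoulder, can be sketched as follows. The exact parameterization here (a smoothstep S-shape over a gamma-mapped input) is an assumption for illustration, not the patent's curve:

```python
import numpy as np

def sigmoid_tone(x, amplitude=1.0, gamma=1.0, toe=0.0, shoulder=1.0):
    """Map input intensity x in [0, 1] through an S-curve.
    amplitude scales the output; gamma sets the mid-slope; toe and
    shoulder set the output levels reached at x = 0 and x = 1."""
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    s = x ** gamma
    s = s * s * (3.0 - 2.0 * s)      # smoothstep gives the S shape
    return amplitude * (toe + (shoulder - toe) * s)

x = np.linspace(0.0, 1.0, 11)
y = sigmoid_tone(x, amplitude=1.0, gamma=1.1, toe=0.02, shoulder=0.98)
```

In a spectral pipeline, the same curve would be applied per channel to RYGCBV values as the T_j transforms of the generalized relationship above.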
Part 2
According to additional embodiments of the present disclosure, the spectral synthesis methods may be applied to image capture device processing. Fig. 9 depicts the processing method performed in a modern digital image capture device (see reference [6], which is incorporated herein by reference in its entirety). The raw image capture device sensor output (905) (e.g., R, G1, G2, and B values) produced from the input spectrum S(λ) is first processed (910) according to a method that includes one or more of linearization (if necessary), de-Bayering (e.g., for a single-chip sensor), and white balancing. A Bayer filter mosaic is a color filter array arranging RGB color filters on a square grid of photosensors in an image capture device. Obtaining a full-color image includes inverting this process in a step called de-Bayering (see reference [7], which is incorporated herein by reference in its entirety). White balance is the adjustment of the intensities of the primary colors (e.g., red, green, and blue) to render particular colors properly (see reference [8], which is incorporated herein by reference in its entirety).
The data obtained from such processing (910) of the raw image capture device sensor output (905) is then typically transformed into an output color space (e.g., the RGB color space in fig. 9) through a 3x3 matrix (915). The transformed output may be tonal processed and/or clipped (920) to a particular range. The processing performed in each of blocks (910), (915), and (920) derives an output color space (925) from the raw image capture device sensor output (905).
According to several embodiments of the present disclosure, image capture device processing methods alternative to the 3x3 matrix are presented. A 3x3 matrix alone may not be sufficient to describe an accurate transformation from the image capture device sensor output to the output color (925).
To illustrate this possible problem with the 3x3 matrix (915), consider the simplified configuration depicted in fig. 10.
For the present discussion, consider an image capture device that includes red (R), green (G), and blue (B) channels (e.g., RGB is considered the input color space) and consider CIE [ X, Y, Z ] as the output color space (1020). It should be noted that although RGB and CIE color spaces are considered in the present discussion, color spaces such as YUV, YCbCr, HSV, CMYK and other color spaces known to those skilled in the art may also be considered.
The tristimulus values [X, Y, Z] can be determined from the input spectrum S[λ] and the color matching functions x̄[λ], ȳ[λ], z̄[λ] defined on the interval [λmin, λmax], by the following equations:

  X = ∫ x̄[λ] S[λ] dλ,  Y = ∫ ȳ[λ] S[λ] dλ,  Z = ∫ z̄[λ] S[λ] dλ

where each integral is taken over the interval [λmin, λmax], which covers the wavelengths of light typically perceivable by the human visual system. Fig. 12 depicts an example of such functions, determined by the CIE in 1931.
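The tristimulus equations above discretize directly. A hedged sketch, in which two-lobed and single-lobed Gaussians stand in for the CIE 1931 color matching functions of fig. 12 (real applications would use tabulated CMF data):

```python
import numpy as np

lam = np.arange(360, 831, 1.0)   # visible range, 1 nm steps

def gauss(mu, sigma):
    return np.exp(-0.5 * ((lam - mu) / sigma) ** 2)

# Stand-ins for the CIE 1931 color matching functions (illustrative only)
xbar = 1.06 * gauss(599, 37) + 0.36 * gauss(446, 20)   # two-lobed x-bar
ybar = gauss(556, 46)
zbar = 1.78 * gauss(449, 23)

S = gauss(550, 40)               # a greenish input spectrum

# [X, Y, Z] = integral of S[lam] * cmf[lam] d lam (d lam = 1 nm)
X, Y, Z = (S * xbar).sum(), (S * ybar).sum(), (S * zbar).sum()
```

For this greenish test spectrum, Y dominates Z because the z̄ stand-in has almost no overlap with S.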
Similarly, the image capture device sensor outputs [R_S, G_S, B_S] (1010) (where the subscript S indicates sensor) are determined by the image capture device spectral sensitivities r̄_S[λ], ḡ_S[λ], b̄_S[λ] using the following analogous equations:

  R_S = ∫ r̄_S[λ] S[λ] dλ,  G_S = ∫ ḡ_S[λ] S[λ] dλ,  B_S = ∫ b̄_S[λ] S[λ] dλ

where the image capture device spectral sensitivities represent the wavelength responses of the image capture device color channels. Fig. 13 depicts an example of such functions for a modern digital camera.
Fig. 11 depicts that the output color [X_S, Y_S, Z_S] (1135) obtained by applying a 3x3 matrix transformation (1130) to the image capture device sensor output [R_S, G_S, B_S] (1125) may not match the actual color [X, Y, Z] (1105) perceived by the human visual system. Only when the image capture device spectral sensitivities (1120) are an invertible linear combination of the color matching functions (1110) can color accuracy between the image capture device output color [X_S, Y_S, Z_S] (1135) and the actual color [X, Y, Z] (1105) generally be guaranteed. Mathematically, this occurs when there is a 3x3 matrix Q such that the following holds:

  [r̄_S[λ], ḡ_S[λ], b̄_S[λ]]ᵀ = Q · [x̄[λ], ȳ[λ], z̄[λ]]ᵀ

where Q⁻¹ exists.
Multiplying both sides of the above equation by the input spectrum S[λ] (1115) and integrating both sides yields:

  [R_S, G_S, B_S]ᵀ = Q · [X, Y, Z]ᵀ

Thus,

  [X, Y, Z]ᵀ = Q⁻¹ · [R_S, G_S, B_S]ᵀ
Referring back to fig. 11, to obtain the actual colors [X, Y, Z] (1105) directly, the image capture device spectral sensitivities (1120) should ideally equal the color matching functions (1110). If these ideal spectral sensitivities (which equal the color matching functions) are modified by the matrix Q to give the actual image capture device spectral sensitivities, such modification may be cancelled (e.g., by transforming the image capture device sensor output (1125) with Q⁻¹) to generate the actual colors [X, Y, Z] (1105).
If the relationship between the image capture device spectral sensitivities (1120) and the color matching functions is more complex than a linear transformation, a 3x3 matrix may be insufficient.
The difference between the color matching functions and the image capture device spectral sensitivities may be significant. Fig. 12 depicts the CIE (Commission Internationale de l'Eclairage) 1931 color matching functions, while fig. 13 depicts exemplary spectral sensitivities for a modern digital cinema camera (Arri D-21). In 1931, the CIE created a mathematically defined color space called CIE 1931; the color matching functions depicted in fig. 12 correspond to this color space (see reference [4], which is incorporated herein by reference in its entirety). There is no linear combination of the color matching functions that yields the spectral sensitivities of this camera. In particular, the CIE 1931 color matching functions are smooth, while the camera spectral sensitivities are not. The camera response given by the camera spectral sensitivities generally does not account for subsequent processing of the sensor output.
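The claim that no linear combination of the CMFs reproduces the camera sensitivities can be tested numerically: fit the best 3x3 matrix Q by least squares and inspect the residual. The curves below are synthetic stand-ins, so this is a sketch of the check rather than a result about any real camera:

```python
import numpy as np

lam = np.arange(400, 701, 1.0)

def gauss(mu, sigma):
    return np.exp(-0.5 * ((lam - mu) / sigma) ** 2)

# Smooth stand-ins for color matching functions (illustrative only)
cmf = np.stack([gauss(600, 35), gauss(550, 45), gauss(450, 25)])

# Case 1: camera sensitivities built as an exact linear mix of the CMFs
Q_true = np.array([[0.9, 0.2, 0.0], [0.1, 1.0, 0.1], [0.0, 0.1, 1.1]])
cam_good = Q_true @ cmf

# Case 2: camera with a narrow bump that no CMF mix can reproduce
cam_bad = cam_good + 0.3 * gauss(500, 8)

def fit_residual(cam):
    # Solve cam ~= Q @ cmf for Q in the least-squares sense
    Q, *_ = np.linalg.lstsq(cmf.T, cam.T, rcond=None)
    resid = cam - (Q.T @ cmf)
    return np.abs(resid).max()

assert fit_residual(cam_good) < 1e-8   # exactly representable: 3x3 works
assert fit_residual(cam_bad) > 0.05    # no 3x3 matrix can be exact
```

A large residual in the second case is precisely the situation where, per the text, the 3x3 matrix (915) cannot transform the sensor output to the actual color.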
While applying a matrix transformation may not be sufficient to transform the image capture device sensor output to actual color, the exact processing from the image capture device sensor output to the output color space may exhibit overall linearity. That is, if v denotes the sensor output [R_S, G_S, B_S], w denotes the output color [X_S, Y_S, Z_S], and P[·] denotes the processing between the image capture device sensor output and the output color, i.e.,

  w = P[v]

then multiplying the input v by some constant value α changes the output by the same factor:

  P[α·v] = α·P[v]
however, even when such linear properties are exhibited, the processing may not be necessary by applying the matrix exclusively. Such matrices may be determined by minimizing a particular error metric within a training set of color stimuli. Some recommended methods for determining the matrix in this manner are described in reference [6], which is incorporated herein by reference in its entirety.
According to several embodiments of the present disclosure, methods and systems are described for generating actual colors perceived by the human visual system from a given image capture device spectral sensitivity.
Fig. 14A depicts an embodiment of a method and system for synthesizing a spectral representation Ŝ[λ] (1435A) that, if applied to the image capture device, would generate the observed image capture device sensor output [R_S, G_S, B_S] (1425A). This method can be used for negative tristimulus values.

The synthesized spectrum Ŝ[λ] (1435A) can produce the observed image capture device sensor output [R_S, G_S, B_S] (1425A) regardless of whether the synthesized spectrum Ŝ[λ] (1435A) is the actual spectrum S[λ] (1415A). In particular, the synthesized spectrum (1435A) can generate the correct [X, Y, Z] (1405A, 1445A).

Once the synthesized spectrum (1435A) has been determined, the correct output color [X, Y, Z] (1405A, 1445A) can be obtained.
According to several embodiments of the present disclosure, a substantially rectangular spectrum may be used as the synthesized spectrum (1435A). As used in this disclosure, the term "substantially rectangular" may refer to a spectral shape that closely approximates a rectangular shape but is not necessarily perfectly rectangular. By way of example and not limitation, a spectrum whose sidewalls are not perfectly perpendicular to the horizontal (λ) axis may be considered substantially rectangular. By way of further example and not limitation, a spectrum characterized by a small range of maxima rather than a single maximum may also be considered substantially rectangular. A substantially rectangular spectrum may be a continuous function or a discrete-wavelength (sampled) function. Examples of continuous and discrete-wavelength substantially rectangular spectra are depicted in Figs. 14B and 14C, respectively.
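A discrete-wavelength rectangular spectrum of the kind described above can be sketched as follows; treating the wavelength domain as circular (as discussed later in this disclosure) lets the same function produce band-pass spectra (λ↑ < λ↓) and band-gap spectra (λ↑ > λ↓). The sampling grid and wavelengths are illustrative assumptions.

```python
import numpy as np

def rectangular_spectrum(lam, lam_up, lam_down, scale):
    """Discrete rectangular spectrum sampled at wavelengths `lam`.

    Transitions from zero to `scale` at lam_up and back to zero at lam_down.
    When lam_up > lam_down the domain is treated as circular, yielding a
    band-gap spectrum (nonzero at both ends, zero in the middle).
    """
    lam = np.asarray(lam, dtype=float)
    if lam_up <= lam_down:                                   # band-pass
        return np.where((lam >= lam_up) & (lam < lam_down), scale, 0.0)
    return np.where((lam >= lam_up) | (lam < lam_down), scale, 0.0)  # band-gap

lam = np.arange(400.0, 700.0, 10.0)        # visible range, 10 nm samples
bandpass = rectangular_spectrum(lam, 500.0, 600.0, 1.0)
bandgap = rectangular_spectrum(lam, 600.0, 500.0, 1.0)
```

Note that the band-pass and band-gap spectra for the same wavelength pair are complementary: together they cover the entire sampled domain.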
FIGS. 15A-15D describe an image capture device processing method using synthesized rectangular spectra according to an embodiment of the disclosure.
FIG. 15A depicts an actual image spectrum S[λ] (1505A) captured by an image capture device. The actual image spectrum S[λ] (1505A) is applied to the image capture device spectral sensitivities (1510A) to produce the image capture device sensor output [R_S, G_S, B_S] (1515A). The image capture device sensor output [R_S, G_S, B_S] (1515A) can be input to a spectral synthesis module (1520A) to generate a spectral representation characterized by the rectangular spectral parameters [λ↑, λ↓, I] (1525A). The synthesized rectangular spectrum (1525A), if applied to the image capture device, is adapted to produce the observed image capture device sensor output [R_S, G_S, B_S] (1515A):

R_S = I ∮ r̄_S[λ] dλ,  G_S = I ∮ ḡ_S[λ] dλ,  B_S = I ∮ b̄_S[λ] dλ,

with each integral taken from λ↑ to λ↓,
where the sign of integration represents cyclic integration over the λ domain. The resultant rectangular spectrum (1525A) may be applied to a spectrum application module (1530A) to generate the correct color output [X, Y, Z] (1535A).
Alternatively, the [λ_min, λ_max] domain itself can be interpreted as a circle, allowing all spectra to be treated as band-pass. In this case, a band-gap spectrum simply becomes a band-pass spectrum that crosses the λ_min/λ_max point.
Thus, mathematically, the first step is to find [λ↑, λ↓, I] (1525A) given the image capture device sensor output [R_S, G_S, B_S] (1515A). Once the parameters of the rectangular spectrum have been determined, the calculation of the correct output [X, Y, Z] (1535A) follows directly:

X = I ∮ x̄[λ] dλ,  Y = I ∮ ȳ[λ] dλ,  Z = I ∮ z̄[λ] dλ,

with each cyclic integral taken from λ↑ to λ↓.
this process is described in fig. 15D. In particular, in addition to the color matching functionExternal wavelength lambda↑And λ↓Is sent to the loop integration module (1505D). The integrated results are separately multiplied by a scaling factor I via a multiplication module (1510D) to generate correct color output values [ X, Y, Z](1535A)。
This process of obtaining the correct color output for the image capture device, using a rectangular spectrum whose spectral representation is equivalent to that of the original image spectrum S[λ], may take any of several forms, each with different complexity and performance.
In one embodiment, referring back to FIG. 15A, a generally computationally intensive implementation may first determine the rectangular parameters [λ↑, λ↓, I] (1525A) and then produce the output colors [X, Y, Z] (1535A) for the rectangular spectrum characterized by [λ↑, λ↓, I] (1525A). Since the number of degrees of freedom (e.g., three in this case) is preserved, this process provides a direct mapping from [R_S, G_S, B_S] to [X, Y, Z], as depicted in FIG. 16.
A three-dimensional look-up table (3D-LUT) may be used to perform this mapping. It is also possible to perform calculations (provided below) and then use a 2D-LUT based on those calculations to determine the transition wavelengths λ↑ and λ↓. The transition wavelengths may then be used to determine the scale factor I. This process is depicted in FIG. 15B.
Fig. 15B depicts a spectral synthesis module (1520A) which may be used in the image capture device processing method depicted in FIG. 15A. The image capture device sensor output [R_S, G_S, B_S] (1515A) is input to a wavelength determination module (1505B, described in further detail in FIG. 15C) to determine λ↑ and λ↓. The calculation may include computing p (1500C) and q (1505C) based on the image capture device sensor output [R_S, G_S, B_S] (1515A), for example:

p = R_S / (R_S + G_S + B_S),  q = G_S / (R_S + G_S + B_S).
A two-dimensional look-up table (2D-LUT, 1510C) may then be used to map the values p and q (1505C) to the transition wavelengths λ↑ and λ↓ (1515C). Based on these transition wavelengths λ↑ and λ↓, the scale factor I may be determined by a scale factor calculation module (1520B) consisting of a cyclic integration module (1510B) and a division module (1515B). The cyclic integration module (1510B) may be configured to perform cyclic integration of any of the image capture device spectral sensitivities (1510A) over the interval defined by the wavelengths λ↑ and λ↓. The result of the cyclic integration module (1510B) is then provided to the division module (1515B) to produce the scale factor I. Specifically, the scale factor I can be expressed as:

I = R_S / ∮ r̄_S[λ] dλ,

where the cyclic integral is taken from λ↑ to λ↓ (G_S or B_S with its corresponding sensitivity may be used equivalently).
these integers may be computed, for example, using two table lookups and addition/subtraction operations.
The previously discussed methods, including the synthesis of a rectangular spectral representation equivalent to the spectral representation of an image captured by an image capture device, may be used to check and characterize image capture device accuracy. For some sets of stimuli, this approach produces an output rectangular spectrum equal to the input rectangular spectrum.
According to several embodiments of the present disclosure, a method for determining this set of stimuli is provided as follows. The method may first comprise applying a simulated cube (1705) of rectangular spectra [λ↑, λ↓, I] to the spectral sensitivities of the image capture device (1710) to produce simulated image capture device sensor outputs [R_S, G_S, B_S] (1715). The cube (1705) is a representation of rectangular spectra in the rectangular space defined by the three dimensions [λ↑, λ↓, I]. Specifically, each point in the cube corresponds to a rectangular spectrum characterized by a scale factor I, a first wavelength λ↑ at which the input spectrum transitions from zero to the scale factor, and a second wavelength λ↓ at which the input spectrum transitions from the scale factor back to zero. The next step is to determine, by comparison (1730), the set of spectra for which the output (1725) of the previously described spectral synthesis method is the same as the input (1705) depicted in FIG. 17. The set of rectangular spectra that can be accurately recovered with this method constitutes the image capture device gamut. Rectangular spectra that cannot be accurately recovered give rise to image capture device metamerism.
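This gamut check can be sketched as follows, again with hypothetical Gaussian sensitivities: a point of the [λ↑, λ↓, I] cube is simulated through the sensitivities, and it counts as accurately recoverable only if no other rectangular spectrum (restricted here, for brevity, to spectra with the same scale factor) produces the same sensor output.

```python
import numpy as np

lam = np.arange(400.0, 700.0, 10.0)   # 10 nm grid (illustrative)
dlam = 10.0

def gauss(mu, sigma):
    return np.exp(-0.5 * ((lam - mu) / sigma) ** 2)

# Hypothetical camera spectral sensitivities.
sens = np.stack([gauss(600.0, 40.0), gauss(540.0, 40.0), gauss(460.0, 25.0)])

def sensor_output(lam_up, lam_down, scale):
    """Simulated [Rs, Gs, Bs] for one rectangular spectrum from the cube."""
    if lam_up <= lam_down:
        mask = (lam >= lam_up) & (lam < lam_down)
    else:
        mask = (lam >= lam_up) | (lam < lam_down)
    return scale * sens[:, mask].sum(axis=1) * dlam

def in_gamut(lam_up, lam_down, scale, tol=1e-6):
    """True if no other rectangular spectrum (same scale) is a metamer of
    (lam_up, lam_down, scale) under these sensitivities."""
    target = sensor_output(lam_up, lam_down, scale)
    for lu in lam:
        for ld in lam:
            if lu == lam_up and ld == lam_down:
                continue
            out = sensor_output(lu, ld, scale)
            if out.sum() > 0.0 and np.allclose(out, target, atol=tol):
                return False   # a distinct metamer exists
    return True
```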
The examples set forth above provide those of ordinary skill in the art with a complete disclosure and description of how to make and use embodiments of the spectral image processing of the present disclosure, and are not intended to limit the scope of what the inventors regard as their disclosure.
Modifications of the above-described modes for carrying out the methods and systems disclosed herein that are obvious to those of ordinary skill in the art are intended to be within the scope of the following claims. All patents and publications mentioned in the specification are indicative of the levels of skill of those skilled in the art to which the disclosure pertains. All references cited in this disclosure are incorporated by reference as if each reference had been individually and fully incorporated by reference.
It is to be understood that the disclosure is not limited to a particular method or system, which can, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting. As used in this specification and the appended claims, the singular forms "a", "an", and "the" include plural referents unless the content clearly dictates otherwise. The term "plurality" includes two or more of the indicated item unless the content clearly dictates otherwise. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
The methods and systems described in this disclosure may be implemented in hardware, software, firmware, or a combination thereof. Features described as blocks, modules, or elements may be implemented together (e.g., in a logic device such as an integrated logic device) or separately (e.g., as separately connected logic devices). The software portion of the methods of the present disclosure may include a computer-readable medium comprising instructions that, when executed, at least partially perform the described methods. The computer-readable medium may include, for example, Random Access Memory (RAM) and/or Read Only Memory (ROM). The instructions may be executed by a processor, such as a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or a Field Programmable Gate Array (FPGA).
A number of embodiments of the present disclosure have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other embodiments are within the scope of the following claims.
List of references
[1] en.wikipedia.org/wiki/Multispectral_image, retrieved December 6, 2011.
[2] Wyszecki, G. and Stiles, W.S., Color Science: Concepts and Methods, Quantitative Data and Formulae, Wiley-Interscience, 2002, pp. 187 ff.
[3] M. Parmar et al., "A Database of High Dynamic Range Visible and Near-infrared Multispectral Images", Proc. IS&T/SPIE Electronic Imaging 2008: Digital Photography IV.
[4] en.wikipedia.org/wiki/CIE_1931_color_space, retrieved November 29, 2011.
[5] MacAdam, David L., "The Theory of the Maximum Visual Efficiency of Color Materials," J. Optical Soc. Am., Vol. 25, pp. 249-252, 1935.
[6] "Color Image Processing Pipeline. A General Survey of Digital Still Camera Processing", IEEE Signal Processing Magazine, vol. 22, no. 1, pp. 34-43, 2005.
[7] en.wikipedia.org/wiki/Bayer_filter, retrieved December 6, 2011.
[8] en.wikipedia.org/wiki/White_balance, retrieved December 6, 2011.
[9] Logvinenko, Alexander D., "An Object Color Space", J. Vision, 9(11):5, pp. 1-23, 2009.
Table A: Rectangular spectra
Rectangular spectra have a number of useful properties:
a) MacAdam efficiency
b) Completeness (all possible chromaticities)
c) Three degrees of freedom
d) Maximally compact spectra
The last property, compactness, exists because there are essentially two ways to desaturate any chromaticity: broaden the spectrum or add white. Logvinenko (see reference [9], which is incorporated herein by reference in its entirety) exploits this property to produce a more general form for object reflectance spectra (and also replaces λ↑ and λ↓ with an equivalent central wavelength and a signed bandwidth, with negative bandwidths used to represent band-gap spectra).
Figure 2 depicts how band-pass and band-gap behavior arises in chromaticity space. Note that the boundary between the band-pass and band-gap regions has two segments: one from the equal-energy white point to the maximum-wavelength red, and the other from the white point to the minimum-wavelength blue. These are called the white-red and white-blue boundaries, respectively.
Cyclic integration may be used to remove the distinction between band-pass and band-gap spectra. This is achieved by considering the wavelength domain to be circular, which eliminates the distinction between band-pass and band-gap spectra. FIGS. 5A-5C depict this circular domain over which the integrals are evaluated. Specifically, FIG. 5A depicts the circular wavelength domain, FIG. 5B depicts a band-pass rectangular spectrum, and FIG. 5C depicts a band-gap rectangular spectrum.
Note that λ_min and λ_max are the same point in the circular representation, and that λ increases in the counterclockwise direction. In the circular representation, the integral always runs from λ↑ to λ↓, and the solution is always band-pass, as depicted in FIG. 18. An alternative visualization would wrap the spectra around the outside of a tube such that λ_min and λ_max meet at the same point.
Referring now to FIG. 19, the white-red and white-blue boundary segments depicted in FIG. 2 are the points at which λ↑ or λ↓ falls on the λ_min/λ_max boundary. The white-red boundary consists of the points where λ↓ falls on λ_max, and the white-blue boundary of the points where λ↑ falls on λ_min.
It is illustrative to see how the (x, y) solution space for the band-pass and band-gap spectra depicted in FIG. 2 maps to the domains in [λ↑, λ↓] space depicted in FIGS. 20A and 20B.
Here the equal-energy white point E lies at λ↑ = λ_min and λ↓ = λ_max. The diagonal is the spectral locus if approached from the band-pass side, and is near the white point if approached from the band-gap side. The upper triangular region for band-pass spectra is a closed set (it includes its boundary), while the triangular region for band-gap spectra is open (it does not include its boundary). FIG. 15B depicts the trajectory in two-dimensional λ-space of a relatively narrow-band spectrum of fixed bandwidth whose central wavelength passes through the circular λ domain. As the central wavelength goes from λ_min to λ_max, the hue passes through violet (V), blue (B), cyan (C), green (G), yellow (Y), and red (R). When the central wavelength wraps around the λ_min/λ_max point, the hue passes through magenta.
Claims (45)
1. A method for synthesizing a rectangular spectral representation of an input color value in a tristimulus representation, the input color value generated from an input spectrum of an image, wherein each input color value corresponds to a weight of a corresponding basis function of the tristimulus representation, wherein the basis functions of the tristimulus representation are defined functions, the method comprising:
providing the input color values, wherein each input color value is associated with a corresponding analysis function; wherein an analysis function is associated with basis functions of the tri-chromatic representation;
determining a first wavelength and a second wavelength of a spectral representation of a rectangle based on the input color values; wherein the determining of the first and second wavelengths comprises: dividing two of the input color values by the sum of all input color values; and mapping the result of the division with a two-dimensional look-up table to determine the first and second wavelengths; and
calculating a scale factor based on the first wavelength, the second wavelength, and any one of the input color values together with its corresponding analysis function; wherein the calculation of the scale factor comprises: performing a cyclic integration of the corresponding analysis function of one of the input color values over the interval defined by the first and second wavelengths; and dividing the one input color value by the result of the cyclic integration to calculate the scale factor;
wherein:
if the spectral representation of the synthesized rectangle is applied to the analysis function corresponding to the input color values, the spectral representation of the synthesized rectangle is adapted to produce the input color values,
at the first wavelength, the spectral representation of the rectangle transitions from a low value to the scale factor, and at the second wavelength, the spectral representation of the rectangle transitions from the scale factor to the low value, and
If the first wavelength is less than the second wavelength, the spectral representation of the rectangle exhibits the scale factor between the first wavelength and the second wavelength, and if the first wavelength is greater than the second wavelength, the spectral representation of the rectangle exhibits the low value between the first wavelength and the second wavelength; and
the low value of the spectral representation of the rectangle is equal to zero.
2. The method of claim 1, wherein the analysis function comprises spectral sensitivity of an image capture device or eye spectral sensitivity.
3. A method of spectral spreading, the method comprising:
synthesizing a spectral representation of a rectangle according to the method of any one of claims 1-2; and
applying a plurality of filter banks to the synthesized rectangular spectral representation to generate expanded color values in an expanded representation to expand the spectrum represented by the input color values, wherein the number of expanded color values in the expanded representation is greater than the number of input color values in the tristimulus representation.
4. A method of image processing, the method comprising:
performing spectral spreading according to the method of claim 3;
processing the expanded color values; and
projecting the processed expanded color values into a color space corresponding to the input color values.
5. The method of claim 4, wherein the processing comprises spectral color correction.
6. The method of claim 4, wherein the processing comprises a non-linear transformation followed by a matrix multiplication.
7. The method of any one of claims 1-6, wherein the input color values are defined for an RGB (red, green, blue) color space.
8. The method of any one of claims 1-6, wherein the input color values are defined for XYZ (CIE1931) color space.
9. The method of any one of claims 3-6, wherein the expanded color values are defined for RYGCBV (red, yellow, green, cyan, blue, violet) color space.
10. The method of any one of claims 3-6, wherein the expanded color values are defined for a ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
11. The method of any one of claims 3-6, wherein the input color values are defined for an RGB (red, green, blue) color space and the expanded color values are defined for a RYGCBV (red, yellow, green, cyan, blue, violet) color space.
12. The method of any one of claims 3-6, wherein the input color values are defined for an RGB (red, green, blue) color space and the expanded color values are defined for a ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
13. The method of any one of claims 3-6, wherein the input color values are defined for XYZ (CIE1931) color space and the expanded color values are defined for RYGCBV (red, yellow, green, cyan, blue, violet) color space.
14. The method of any one of claims 3-6, wherein the input color values are defined for XYZ (CIE1931) color space and the expanded color values are defined for ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
15. The method of any of claims 1-6, wherein the input color values can contain negative values.
16. A system configured to synthesize a rectangular spectral representation of input color values in a tristimulus representation, the input color values generated from an input spectrum of an image, wherein each input color value corresponds to a weight of a corresponding basis function of the tristimulus representation, wherein the basis functions of the tristimulus representation are defined functions in which each input color value is associated with a corresponding analysis function, wherein an analysis function is associated with a basis function of the tristimulus representation, the system comprising:
a wavelength determination module configured to determine a first wavelength and a second wavelength of a spectral representation of a rectangle based on an input color value; wherein the first and second wavelengths are determined by dividing two of the input color values by a sum of all input color values, and mapping the result of the division with a two-dimensional lookup table to determine the first and second wavelengths; and
a scale factor calculation module configured to calculate a scale factor based on the first wavelength, the second wavelength, and any one of the input color values together with its corresponding analysis function; wherein the scale factor is calculated by performing a cyclic integration of the corresponding analysis function of one of the input color values over the interval defined by the first and second wavelengths, and by dividing the one input color value by the result of the cyclic integration;
wherein:
the spectral representation of the synthesized rectangle is adapted to produce the input color values if it is applied to the analysis functions corresponding to the input color values, and
At the first wavelength, the spectral representation of the rectangle transitions from a low value to a scale factor, and at the second wavelength, the spectral representation of the rectangle transitions from a scale factor to a low value;
if the first wavelength is less than the second wavelength, the spectral representation of the rectangle exhibits the scale factor between the first wavelength and the second wavelength, and if the first wavelength is greater than the second wavelength, the spectral representation of the rectangle exhibits the low value between the first wavelength and the second wavelength;
the low value of the spectral representation of the rectangle is equal to zero.
17. The system of claim 16, wherein the analysis function comprises a spectral sensitivity of an image capture device or an eye spectral sensitivity.
18. A spectrum spreading system, the system comprising:
a spectral synthesis module comprising a system according to any one of claims 16-17, configured to synthesize a rectangular spectral representation; and
a plurality of filter banks configured to generate expanded color values in an expanded representation from the synthesized spectral representation of the rectangle to expand the spectrum represented by the input color values, wherein a number of the expanded color values in the expanded representation is greater than a number of the input color values in the tristimulus representation.
19. An image processing system, the system comprising:
the spectrum spreading system of claim 18;
a processing module configured to process the expanded color values; and
a projection module configured to project the processed expanded color values to a color space corresponding to the input color values.
20. The system of claim 19, wherein the processing module comprises a spectral color correction module.
21. The system of claim 19, wherein the processing module is configured to perform a non-linear transformation followed by a matrix multiplication.
22. The system of any of claims 16-21, wherein the input color values are defined for an RGB (red, green, blue) color space.
23. The system of any one of claims 16-21, wherein the input color values are defined for XYZ (CIE1931) color space.
24. The system of any one of claims 18-21, wherein the expanded color values are defined for RYGCBV (red, yellow, green, cyan, blue, violet) color space.
25. The system of any one of claims 18-21, wherein the expanded color values are defined for ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
26. The system of any one of claims 18-21, wherein the input color values are defined for an RGB (red, green, blue) color space and the expanded color values are defined for an RYGCBV (red, yellow, green, cyan, blue, violet) color space.
27. The system of any one of claims 18-21, wherein the input color values are defined for an RGB (red, green, blue) color space and the expanded color values are defined for a ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
28. The system of any one of claims 18-21, wherein the input color values are defined for XYZ (CIE1931) color space and the expanded color values are defined for RYGCBV (red, yellow, green, cyan, blue, violet) color space.
29. The system of any one of claims 18-21, wherein the input color values are defined for XYZ (CIE1931) color space and the expanded color values are defined for ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
30. The system of any of claims 16-21, wherein the input color values can contain negative values.
31. A system for synthesizing a rectangular spectral representation of an input color value in a tristimulus representation, the input color value generated from an input spectrum of an image, wherein each input color value corresponds to a weight of a corresponding basis function of the tristimulus representation, wherein the basis functions of the tristimulus representation are defined functions, the system comprising:
means for providing the input color values, wherein each input color value is associated with a corresponding analytic function; wherein an analysis function is associated with basis functions of the tri-chromatic representation;
means for determining a first wavelength and a second wavelength of a spectral representation of a rectangle based on the input color values; wherein the determining of the first and second wavelengths comprises: dividing two of the input color values by the sum of all input color values; and mapping the result of the division with a two-dimensional look-up table to determine the first and second wavelengths; and
means for calculating a scale factor based on the first wavelength, the second wavelength, and any one of the input color values together with its corresponding analysis function; wherein the calculation of the scale factor comprises: performing a cyclic integration of the corresponding analysis function of one of the input color values over the interval defined by the first and second wavelengths; and dividing the one input color value by the result of the cyclic integration to calculate the scale factor;
wherein:
if the spectral representation of the synthesized rectangle is applied to the analysis function corresponding to the input color values, the spectral representation of the synthesized rectangle is adapted to produce the input color values,
at the first wavelength, the spectral representation of the rectangle transitions from a low value to the scale factor, and at the second wavelength, the spectral representation of the rectangle transitions from the scale factor to the low value, and
If the first wavelength is less than the second wavelength, the spectral representation of the rectangle exhibits the scale factor between the first wavelength and the second wavelength, and if the first wavelength is greater than the second wavelength, the spectral representation of the rectangle exhibits the low value between the first wavelength and the second wavelength; and
the low value of the spectral representation of the rectangle is equal to zero.
32. The system of claim 31, wherein the analysis function comprises a spectral sensitivity of an image capture device or an eye spectral sensitivity.
33. A system for spectral spreading, the system comprising:
means for synthesizing a spectral representation of a rectangle from the system of any of claims 31-32; and
means for applying a plurality of filter banks to the synthesized rectangular spectral representation to generate expanded color values in an expanded representation to expand the spectrum represented by the input color values, wherein the number of expanded color values in the expanded representation is greater than the number of input color values in the tristimulus representation.
34. A system for image processing, the system comprising:
means for performing spectral spreading by the system of claim 33;
means for processing the expanded color values; and
means for projecting the processed expanded color values into a color space corresponding to the input color values.
35. The system of claim 34, wherein the processing comprises spectral color correction.
36. The system of claim 34, wherein the processing comprises a non-linear transformation followed by a matrix multiplication.
37. The system of any one of claims 31-36, wherein the input color values are defined for an RGB (red, green, blue) color space.
38. The system of any one of claims 31-36, wherein the input color values are defined for XYZ (CIE1931) color space.
39. The system of any one of claims 33-36, wherein the expanded color values are defined for RYGCBV (red, yellow, green, cyan, blue, violet) color space.
40. The system of any one of claims 33-36, wherein the expanded color values are defined for ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
41. The system of any one of claims 33-36, wherein the input color values are defined for an RGB (red, green, blue) color space and the expanded color values are defined for an RYGCBV (red, yellow, green, cyan, blue, violet) color space.
42. The system of any one of claims 33-36, wherein the input color values are defined for an RGB (red, green, blue) color space and the expanded color values are defined for a ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
43. The system of any one of claims 33-36, wherein the input color values are defined for XYZ (CIE1931) color space and the expanded color values are defined for RYGCBV (red, yellow, green, cyan, blue, violet) color space.
44. The system of any one of claims 33-36, wherein the input color values are defined for XYZ (CIE1931) color space and the expanded color values are defined for ROYGCBV (red, orange, yellow, green, cyan, blue, violet) color space.
45. The system of any of claims 31-36, wherein the input color values can contain negative values.
Applications Claiming Priority (7)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201161581051P | 2011-12-28 | 2011-12-28 | |
| US201161581048P | 2011-12-28 | 2011-12-28 | |
| US61/581,048 | 2011-12-28 | ||
| US61/581,051 | 2011-12-28 | ||
| US201261733551P | 2012-12-05 | 2012-12-05 | |
| US61/733,551 | 2012-12-05 | ||
| PCT/US2012/070855 WO2013101642A1 (en) | 2011-12-28 | 2012-12-20 | Spectral image processing |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1197709A1 HK1197709A1 (en) | 2015-02-06 |
| HK1197709B true HK1197709B (en) | 2017-04-21 |