
HK1059958A - Method for analyzing hair and predicting achievable hair dyeing ending colors - Google Patents


Info

Publication number
HK1059958A
HK1059958A
Authority
HK
Hong Kong
Prior art keywords
color
hair
recipient
values
value
Prior art date
Application number
HK04101399.1A
Other languages
Chinese (zh)
Inventor
苏雷什‧B‧马拉潘恩
权克明
戴维‧沃尔特
Original Assignee
The Procter & Gamble Company
Application filed by The Procter & Gamble Company
Publication of HK1059958A


Description

Method for analyzing hair and predicting achievable final hair colors
CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation-in-part (CIP) of application Serial No. 09/570,292, filed May 12, 2000.
Technical Field
The present invention relates generally to a method and system for analyzing hair, predicting an achievable final color, outputting the predicted achievable color, and recommending hair dyes based on recipient-specific input values.
Background
Countless people worldwide seek to improve their appearance through the use of hair dyes, and a very large number of products are now available for consumers to choose from. As a general matter, individual consumers find it difficult both to determine which hair colorant to select and to predict, from their current hair color, how the resulting hair color will look.
Certain hair colorants are sold in packages on which a hair coloring chart is typically depicted, containing three starting colors and the three corresponding final colors after dyeing. Some systems have been developed that attempt to overlay a hair color picture over a picture of the consumer. There are also systems that attempt to match a product's starting color to the consumer using an index system. However, none of these systems provides personalized color prediction based on the recipient's original hair values (such as color) in a manner that is not limited to a small set of predefined inputs. Thus, the consumer may not achieve a final hair color that reasonably approximates the promised color. Moreover, none of these systems is a complete system that combines a hair analysis system with a product line of hair colorants in a coordinated manner during the purchase process.
Therefore, what is needed are methods and systems for analyzing hair, predicting an achievable final color, outputting the predicted achievable color, and recommending hair colorants based on recipient-specific input values.
Summary of the Invention
The present invention relates to a method and system for analyzing hair, predicting an achievable final color, outputting the predicted achievable color, and recommending hair dyes based on recipient-specific input values.
In one aspect, the present invention relates to a method of identifying an available final hair color based on at least one original hair value of a recipient. The method comprises the steps of inputting at least one original hair value of the recipient and then determining at least one available final hair color based on that original hair value.
In another aspect, the present invention is directed to a method for determining hair color based on at least one original hair value of a recipient. The method comprises the steps of: inputting at least one original hair value of the recipient; inputting a color family selection; and outputting, based on the original hair values and the selected color family, at least one processed image (working image) depicting the available final hair color. The method may additionally include one or more steps to adjust the appearance of the output processed image to account for any differences between the illumination characteristics under which the original hair values were obtained and those under which the processed image is viewed.
In another aspect, the present invention relates to a method for outputting an image for a hair color analysis system, the method comprising the steps of: inputting a processed image of a person; inputting an available final hair color; converting the RGB color values of the processed image into L, a, and b color values; calculating the average L, a, and b color values of the processed image; calculating new L, a, and b color values based on the L, a, and b color values of the processed image, the average L, a, and b color values of the processed image, and the available final hair color; and converting the new L, a, and b color values back into RGB values.
In yet another aspect, the present invention relates to a method of providing a hair coloring product to a consumer, the method comprising the steps of: determining the available final hair colors for the consumer; depicting the available final colors to the consumer; allowing the consumer to select a desired final hair color; and recommending a hair colorant for the consumer to achieve the desired final hair color.
Brief Description of Drawings
FIG. 1 is a block diagram of a color prediction method to which the present invention can be applied;
FIG. 2 is a screen display diagram of a welcome screen that may be used in block 110 of FIG. 1 and is also an example of block 110;
FIG. 3 is a screen display diagram of an input screen that may be used in block 120 of FIG. 1 and is also one example of block 120;
FIG. 4 is a screen display diagram of an input screen that may be used in block 130 of FIG. 1 and is also one example of block 130;
FIG. 5 is a screen display diagram of an input screen that may be used in block 140 of FIG. 1 and is also one example of block 140;
FIG. 6 is a screen display diagram of an input screen that may be used in block 150 of FIG. 1 and is also one example of block 150;
FIG. 7 is a screen display diagram of an input screen that may be used in block 160 of FIG. 1 and is also one example of block 160;
FIG. 8 is a screen display diagram of an input screen that may be used in block 170 of FIG. 1 and is also an example of block 170;
FIG. 9 is a screen display diagram of an input screen that may be used in block 190 of FIG. 1 and is also one example of block 190;
FIG. 10 is a table of series colors and corresponding color hues;
FIG. 11 is an example of a prior art hair coloring chart;
FIG. 12 is a screen display diagram of an input screen that may be used in block 200 of FIG. 1 and is also one example of block 200;
FIG. 13 is an example of pre- and post-dye data generated for a consumer using a hazel shade;
FIG. 14 is a block diagram of a method of providing color using the present invention;
FIG. 15 is an example of (a) a processed image before color rendering, (b) a masked image of the processed image, and (c) a rendered image produced using the present invention;
FIG. 16 is a block diagram of a color prediction method using the present invention, including the step of evaluating hair damage;
FIG. 17 is a screen display diagram of an input screen that may be used in block 210 of FIG. 16 and is also an example of block 210.
Detailed Description
A block diagram of a method 100 for determining a hair dye based on a recipient's original hair condition is illustrated in fig. 1. In one embodiment, method 100 is implemented by a computer system located at a retail location for the purpose of analyzing hair and recommending hair coloring products. However, it will be readily apparent to those of ordinary skill in the art that such apparatus may be located anywhere without departing from the scope of the present invention; the apparatus may be used, for example, in a salon.
Referring to fig. 1, a hair color analysis method 100 comprises a series of steps, including screen images (screens) for providing information to and receiving information from a consumer or user for the purpose of implementing the method 100. These screen images may be shown on a display screen, which may include, but is not limited to, a computer screen, a cathode ray tube (CRT) device, and a liquid crystal display (LCD). Block 110 includes a welcome screen (see fig. 2) identifying the system and the associated hair colorants (e.g., VS Sassoon products). Block 120 contains a screen asking the recipient why he/she is dyeing his/her hair (see fig. 3). For example, one recipient may require coverage of gray hair, which calls for a stronger hair dye, while another recipient may only want to enhance her hair color, which calls for a softer hair dye. Block 130 contains a screen asking the recipient how often he/she has colored his/her hair in the past year (see fig. 4). The frequency with which a person dyes their hair can affect the condition of the hair and its ability to receive another hair dye. Block 140 includes a screen asking the recipient for the average length of his/her hair (see fig. 5). The length of the recipient's hair affects the condition of the hair and its ability to receive another hair dye, because the longer the hair, the greater the degree of natural damage. It may also affect the color measurement method to be used (e.g., longer hair makes it more difficult to account for differences between the roots and the ends of the hair). Block 150 includes a screen (see fig. 6) that asks the recipient for the color of his/her eyes (and/or skin). The eye and skin information can then be depicted in the rendered image viewed by the recipient when determining the desired final color, thereby assisting the recipient in the selection step.
Still referring to fig. 1, block 160 prompts the recipient to take multiple color measurements using a colorimeter or spectrophotometer (see fig. 7). In one embodiment, the recipient is prompted to take six color readings at different locations on the head (e.g., front, front left, front right, back left, and back right). This step may be performed by the recipient or by another person, such as a beauty counselor, on the recipient's behalf. Each reading obtained from the colorimeter is in a usable format, including, but not limited to, the L, a, and b color values of the Commission Internationale de l'Eclairage (CIE) and the Hunter L, a, and b color values. It will be readily appreciated by those of ordinary skill in the art that other color formats may be used without departing from the scope of the present invention. The L, a, and b color values are then averaged and stored for later use in a color prediction model. It will also be apparent that the multiple L, a, and b color values can be statistically analyzed to correct for outliers, inaccurate values, and errors that may occur during measurement. Block 170 prompts the recipient to select from a set of color families (e.g., flaxen, light brown, medium brown, dark brown, and red/copper) (see fig. 8). The selected color family is also used in the color prediction model. Although the preferred embodiment depicts five different color families, it will be readily understood by those of ordinary skill in the art that any number of color families may be used without departing from the scope of the present invention. Furthermore, one of ordinary skill in the art will readily appreciate that the order of the steps listed in blocks 120-170 represents only one embodiment of the present invention, and that the steps may be performed in any order relative to one another.
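The averaging and statistical screening of colorimeter readings described above can be sketched as follows (a minimal illustration only: the function name, the tolerance, and the median-based outlier screen are assumptions, not part of the method as claimed):

```python
from statistics import mean, median

def average_lab(readings, tolerance=10.0):
    """Average several (L, a, b) colorimeter readings taken at different
    head positions, discarding any reading whose channels deviate from
    the per-channel median by more than `tolerance` (a simple stand-in
    for the statistical screening the method mentions)."""
    medians = [median(channel) for channel in zip(*readings)]
    kept = [r for r in readings
            if all(abs(v - m) <= tolerance for v, m in zip(r, medians))]
    return tuple(mean(channel) for channel in zip(*kept))
```

A reading that is wildly inconsistent with the others (for example, one taken with the probe off the hair) is dropped before the average is stored for the prediction model.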
Block 180 displays the available final hair colors determined from the input values by the steps described in blocks 120-170 (see FIG. 9). Block 190 prompts the recipient to select from the set of available final hair colors, which represent the different color shades contained in the color family selected in the step described in block 170 (see FIG. 9). It is contemplated that a prompt may be presented so that the recipient can return to an earlier point in the method and cause a different set of available final hair colors to be displayed in block 180. By way of non-limiting example, the recipient may change the input values, including answers to the questions, or may simply indicate that they wish to see a set of final hair colors that is lighter, darker, redder, or blonder. The available final hair colors can be divided into a number of categories, including, but not limited to, "change" versus "enhancement" categories, where colors in the "change" category are considered more extreme color changes requiring higher-pH hair colorants, while colors in the "enhancement" category are considered milder color changes requiring lower-pH hair colorants. Such information about the category (e.g., the pH level of the hair dye) can help the recipient and/or beauty counselor make a more informed decision in using the hair dye. Although the preferred embodiment contains 16 different color shades and 5 different color families (see fig. 10), it will be readily understood by those of ordinary skill in the art that any number of shades may be used without departing from the scope of the present invention; for example, 25 color shades may be used.
These available final hair colors are derived from a color prediction model and therefore represent the recipient's after-coloring color more accurately than the generic color chart typically found on the back of most hair dye packages (see fig. 11). Block 200 recommends a particular hair dye based on the selected available final hair color, also referred to as the desired hair color (see fig. 12).
Hair damage
Referring to fig. 16, the hair color analysis method 100 may further comprise one or more steps in which the recipient's original hair value is a measure of damage to the recipient's hair. Block 210 includes querying the recipient about the extent of hair damage (see fig. 17). The amount of hair damage can affect the hair's ability to take color because, in general, the greater the damage, the less color the hair will retain, owing to changes in cuticle structure. Furthermore, one of ordinary skill in the art will appreciate that the order of the steps listed in blocks 120-170 and 210 represents only one embodiment of the present invention, and that the steps may be performed in any order relative to one another.
The degree of damage may be represented by terms such as those shown in fig. 17, i.e., "mild," "moderate," and "severe"; however, any terms, number of levels, or continuous scale depicting increasing or decreasing amounts of damage may be used with the present invention. Methods for determining the degree of damage may include, but are not limited to: intuitive self-assessment by the recipient; visual or physical assessment by the recipient or another person, such as a beauty counselor; evaluation using a device for measuring hair damage; chemical evaluation of the hair; and combinations thereof. The determination of hair damage may be performed during the performance of method 100, or it may be performed in advance and the results simply entered during the performance of method 100.
Suitable hair damage determination methods for use herein include, but are not limited to, methods using devices for evaluating roughness and methods for inferring damage from the degree of friction the hair exhibits under certain conditions. For example, ease of combing is often used as a measure of smoothness. In one combing test, the force required to pull a comb through a bundle of hair fibers to disentangle it is used to evaluate friction, roughness, and damage. EP-A-965,834 describes a test device for evaluating the friction of cosmetic products on skin, hair, membranes, and eyes; the apparatus evaluates the friction force via the deformation of a deformable member on a probe. In JP 63/163143, the degree of damage to hair is determined by comparing the friction forces in the forward and backward directions, measured with a torsion meter. JP 62/273433 describes measuring the friction between hairs by passing a liquid in turbulent flow through a bundle of hair and evaluating the friction by detecting the pressure loss in the liquid.
Also suitable for use herein is a method using a device for measuring the friction generated by a bundle of hair fibers, the method comprising the steps of: providing a friction member that is pulled through a bundle of hair, thereby generating a frictional noise signal; and capturing said frictional noise signal with a noise sensor. The captured frictional noise signal may additionally be converted into a displayable form and then displayed using a display device. In such methods, the friction member and noise sensor are used to measure the friction generated by a bundle of the recipient's hair. The friction member may be pulled through the hair such that it contacts and passes over the surface of each hair, creating friction between the friction member and the hair. The frictional noise generated depends on the level of friction between the friction member and the hair surface. The friction member may be pulled through the recipient's hair one or more times, and the results may be statistically analyzed to determine an average or mean value.
Such a friction member may generally be made of a rigid material and may take the form of a combing tool having multiple teeth. The combing device is typically pulled through a bundle of hair in the manner typically used for combing. The resulting frictional noise signal is captured by a frictional noise sensor, such as a microphone (for example, a standard or noise-canceling microphone). Once the frictional noise is captured, it can be displayed and analyzed in any suitable manner. It may be displayed in a manner visible to the recipient, who then selects a damage level based on what is displayed, or it may simply be entered directly into method 100 while also being displayed to the recipient. The frictional noise detected by the sensor can be converted into a signal that is then relayed to a visual display device and displayed, for example, in the form of a trace of sound amplitude versus time. These conversions may be carried out using known apparatus.
The degree of damage assessed by such a method may be reported as: a value assigned to a predetermined category of standard hair damage levels that have been measured and tabulated; a damage value compared to a standard value; a damage number measured on a decibel scale corresponding to a particular degree of damage; an absolute measured value; or the like.
Suitable hair damage determination methods for use herein also include, but are not limited to, methods known in the art for chemically assessing the number of broken and unbroken cystine disulfide bonds in the recipient's hair.
Color prediction model
In developing the color prediction model, consumers were recruited to participate in a hair coloring process at a salon. Upon arrival at the salon, the initial color of each consumer's hair was measured using a colorimeter, generating color readings in the L, a, b color space. Multiple readings were taken at different locations on the head (e.g., front left, back right, and front right), and the overall average color reading was calculated from these individual readings. After these initial readings were taken, the consumer's hair was colored with a consumer-selected shade. After dyeing and drying, the hair color was measured again using the colorimeter as described above. This produced before- and after-dyeing data for a large number of consumers across different shades. An example of the data generated for a consumer using a hazel shade is given in fig. 13. Slight differences between the predicted hair color and the hair color observed after dyeing arise from measurement error, application error, and the natural variability of the data. The model accuracy, expressed as the root mean square error (standard error of prediction), is around 2.5 points for L and around 1 point for a and b, which is within the range of natural variability across a head of hair. Studies have shown that consumers may err by as much as 10 or 12 points in L when describing the color of their own hair from memory. On this basis, the prediction accuracy of the present invention is considered more than acceptable.
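The accuracy figure quoted above is a standard root-mean-square-error computation over paired predicted and observed post-dye values. A sketch (the function name and the sample data are invented for illustration, not values from the study):

```python
import math

def rmse(predicted, observed):
    """Root mean square error (the standard error of prediction) between
    model-predicted and salon-measured values of one color channel."""
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

# Hypothetical predicted vs. observed post-dye L values for four heads
# (placeholder numbers, for illustration only).
predicted_L = [20.7, 18.2, 25.1, 22.0]
observed_L = [22.0, 16.0, 27.5, 21.0]
error_L = rmse(predicted_L, observed_L)
```

The same computation, run per channel over the full panel of dyed heads, yields the roughly 2.5-point (L) and 1-point (a, b) figures cited.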
A color prediction model is then generated for each shade to predict the available hair color after dyeing from the initial color of the hair. Separate models were generated to predict L, a, and b for each shade (reddish brown, hazel, light brown, kumquat, dark brown, medium copper brown, light margarita, hazy brownish linen, burgundy wine red, lightest reddish brown, lightest golden brown, soft medium brown, pure red, warm dark brown, and soft copper linen). The models for a and b contain linear effects of the pre-dye L, a, and b values. The models were generated using partial least squares regression. Although the preferred embodiment incorporates the modeling strategies and techniques described above, those of ordinary skill in the art will readily appreciate that other modeling strategies and techniques may be used without departing from the scope of the present invention. Examples of the color prediction models generated for L, a, and b of the hazel (reddish-brown) shade are as follows:
Predicted L = 7.037 + (0.5948 × pre-dye L)
Predicted a = 0.0104 + (0.0868 × pre-dye L) + (0.580 × pre-dye a) + (0.2085 × pre-dye b)
Predicted b = 3.3911 + (0.2831 × pre-dye L) - (0.0190 × pre-dye a) + (0.3187 × pre-dye b)
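Applied in code, a per-shade regression of this form might look like the following sketch (the function name is an assumption; the coefficients are simply those quoted above for this example shade):

```python
def predict_shade(pre_L, pre_a, pre_b):
    """Apply the example per-shade linear prediction model to the averaged
    pre-dye CIE L, a, b values, returning predicted post-dye values.
    The coefficients are those quoted in the equations above."""
    post_L = 7.037 + 0.5948 * pre_L
    post_a = 0.0104 + 0.0868 * pre_L + 0.580 * pre_a + 0.2085 * pre_b
    post_b = 3.3911 + 0.2831 * pre_L - 0.0190 * pre_a + 0.3187 * pre_b
    return post_L, post_a, post_b
```

In a full system there would be one such coefficient set per shade, fitted from the salon before/after data.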
Referring again to fig. 1, in blocks 180 and 190 it is important to show the most accurate possible rendering of the available final hair colors; that is, the color displayed on the display screen should, in the recipient's perception, be substantially similar to the color predicted by the color prediction model. There may be differences between the illumination characteristics under which the original hair values were obtained in block 160 and the illumination characteristics under which the available final colors are shown in block 180. Furthermore, any CRT screen used to display the available final colors will itself introduce some difference between the color specified by the L, a, and b values and the recipient's perception of the displayed image. To correct for these potential differences, block 180 may include one or more steps to adjust the color of the displayed image to the recipient's perception (see FIG. 1).
Color adjustment
Referring to fig. 1, block 180 of the hair color analysis method 100 may further comprise one or more steps to correct for any differences between the illumination characteristics under which the original hair values were obtained in block 160 and the illumination characteristics under which the available final colors are displayed (see block 180), as well as for the differences that virtually any display screen itself introduces into the recipient's perception. Applicants have found that, as a result of these factors (both the lighting and the display screen), recipients may find the image presented to them darker (duller) than, or otherwise inconsistent with, the hair color they imagined, despite the fact that the L, a, b values of the displayed image are the same as those of their current hair color or of the available final hair color.
According to one method of determining correction factors for individual shades suitable for use herein, one may start with any given shade. The hair color of a number of people who have had their hair dyed that shade is then measured (see block 160). While an image is displayed to each person, the L, a, b values of the displayed image are adjusted until a discerning judge decides that the displayed color matches that person's hair color as perceived. The changes in the L, a, and b values are then averaged separately across the whole test population; these averages then constitute the correction factors.
Color providing method
Referring to fig. 14, a color providing method 300 comprises a series of steps for converting a processed image into an image that can be rendered on a display screen. Many display screens display their colors using the RGB color format, and each type of display may have a unique graphic profile. To address these issues, the technique in fig. 14 was developed. Method 300 begins at block 310 by reading a processed image 315 into the memory of a computer or any other similar electronic device (see fig. 15). Block 320 masks the processed image 315 to generate a masked image 325 (see fig. 15). The masked image 325 distinguishes areas that do not contain hair from areas that do. This masking step is useful in that only the areas that need to be changed (i.e., the areas containing hair) are identified, thereby reducing the computer's processing time. The masking step may be performed in an automated or manual manner using various masking techniques known in the industry. It may also be desirable to mask other areas, such as the eyes and skin, so that these areas can be varied according to the eye and skin information provided by the recipient in block 150. Although the preferred embodiment uses the same processed image 315 and the same masked image 325 for every recipient, those of ordinary skill in the art will readily appreciate that other techniques may be used without departing from the scope of the present invention, including, but not limited to, generating a processed image from a photograph of the recipient and then masking that image. Block 330 converts the masked portions of the processed image 315 from the RGB (red-green-blue) color format first into XYZ tristimulus values, which are then further converted into L, a, b colors.
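The mask restricts later computation to hair pixels; averaging over the masked region (as block 340 goes on to do) can be sketched in pure Python as follows (the data layout and names are assumptions):

```python
def average_hair_color(lab_image, hair_mask):
    """Average the (L, a, b) values of an image over the pixels whose
    mask entry is True (i.e., the pixels that contain hair)."""
    sums = [0.0, 0.0, 0.0]
    count = 0
    for lab_row, mask_row in zip(lab_image, hair_mask):
        for lab, is_hair in zip(lab_row, mask_row):
            if is_hair:
                for i in range(3):
                    sums[i] += lab[i]
                count += 1
    return tuple(s / count for s in sums)
```

Pixels outside the mask (background, skin, eyes) contribute nothing to the average and are never recolored.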
Block 340 calculates the averages of the L, a, b values of the hair portion of the processed image 315, hereinafter defined as "Average image color_L," "Average image color_a," and "Average image color_b," respectively. Block 350 then calculates new L, a, b values for each point (x, y) in the hair portion of the processed image 315, hereinafter defined as "New image color_L," "New image color_a," and "New image color_b," respectively, as follows:
New image color_L(x, y) = Current image color_L(x, y) + (Desired color_L - Average image color_L)
New image color_a(x, y) = Current image color_a(x, y) + (Desired color_a - Average image color_a)
New image color_b(x, y) = Current image color_b(x, y) + (Desired color_b - Average image color_b)
Wherein:
Average image color_L, Average image color_a, and Average image color_b are the averages calculated in block 340.
New image color_L(x, y), New image color_a(x, y), and New image color_b(x, y) are the newly calculated values for the edited point (x, y).
Current image color_L(x, y), Current image color_a(x, y), and Current image color_b(x, y) are the original color values of the edited point (x, y).
Desired color_L, Desired color_a, and Desired color_b are the color values of the desired available color selected in block 190.
Once each point (x, y) within the hair portion of the processed image has been transformed as discussed above, the resulting image will have average L, a, and b values equal to Desired color_L, Desired color_a, and Desired color_b, respectively.
This result can be demonstrated by summing over all points (x, y) in the hair portion and dividing by the total number of points in the hair portion, as shown for L below:
SUM[New image color_L(x, y)] = SUM[Current image color_L(x, y)] + SUM[Desired color_L - Average image color_L]
However, the sum over all (x, y) points equals the average multiplied by the total number of points. If the total number of points in the hair portion is N, then:
Average new image color_L × N = Average current image color_L × N + SUM(Desired color_L) - SUM(Average image color_L)
Average new image color_L × N = Average current image color_L × N + Desired color_L × N - Average image color_L × N
Now, Average current image color_L × N and Average image color_L × N cancel (they are equal), leaving Average new image color_L × N = Desired color_L × N
That is: Average new image color_L = Desired color_L
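The per-point shift and the mean-preservation property derived above can be checked numerically with a short sketch (the names are assumptions):

```python
def recolor(hair_pixels, desired, average):
    """Shift every hair pixel's (L, a, b) by (desired - average); the
    recolored pixels then average exactly to the desired color."""
    offsets = [d - a for d, a in zip(desired, average)]
    return [tuple(v + o for v, o in zip(px, offsets)) for px in hair_pixels]
```

Because a single constant offset is added per channel, the highlights and shadows of the original hair image are preserved; only the mean color moves to the desired value.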
Block 360 converts the New image color L, a, b values back to RGB so that they can be displayed on a display screen using the RGB format (they may also be converted to sRGB values). This conversion is done by first converting the New image color L, a, b values into XYZ tristimulus values and then further converting those into RGB colors. Block 370 then applies a monitor profile specific to the particular display screen so that the New image color L, a, b values appear as they are intended to be seen. Steps such as applying display-specific requirements are described in ICC Profile API Suite Version 2.0 - Professional Color Management Tools, Eastman Kodak Company, 1997, incorporated herein by reference. Block 380 outputs the rendered image 335 (see fig. 15). Outputting includes, but is not limited to, displaying on a display screen, printing, saving, or transmitting the image.
Color providing method 300 and color adjustment embodiments
This example illustrates how the color providing method 300 can be used to show an individual the available color in the hazel shade. The color adjustment method is also illustrated.
For the hazel shade, the color adjustment factors were determined to be: L: +5; a: +5; and b: +2. Therefore, the predicted color must be modified as follows:
Desired L = Predicted L + 5
Desired a = Predicted a + 5
Desired b = Predicted b + 2
Assume that six CIE Lab color readings have been obtained for an individual's hair and that the average CIE Lab color values are (14.16, -5.06, 4.54). According to the prediction model for the hazel shade, the achievable output color should be:
Predicted L = 7.037 + (0.5948 × 14.16) = 15.73
Predicted a = 0.0104 + (0.0868 × 14.16) + (0.580 × -5.06) + (0.2085 × 4.54) = -0.75
Predicted b = 3.3911 + (0.2831 × 14.16) - (0.0190 × -5.06) + (0.3187 × 4.54) = 2.16
The prediction is then modified according to the color adjustment values for the hazel shade, as shown below:
Desired L = 15.73 + 5 = 20.73
Desired a = -0.75 + 5 = 4.25
Desired b = 2.16 + 2 = 4.16
The task of the color providing method 300 is therefore to read the image and regenerate a new image whose average L, a, b hair color values are (20.73, 4.25, 4.16).
The processed image is then read and masked, as required by blocks 310 and 320; these images are representative of those shown in fig. 15.
The image is next converted from red-green-blue (RGB) to CIE Lab colors, as described in block 330. Assume that the image consists of 640 × 480 (= 307,200) points, where each point has three values: red, green, and blue. In this step, for each point in the image, the red, green, and blue values are first converted to the three XYZ tristimulus values according to the following equations:
r = red/255.0
g = green/255.0
b = blue/255.0
If (r <= 0.03928), then r = r/12.92; otherwise r = ((r + 0.055)/1.055)^2.4
If (g <= 0.03928), then g = g/12.92; otherwise g = ((g + 0.055)/1.055)^2.4
If (b <= 0.03928), then b = b/12.92; otherwise b = ((b + 0.055)/1.055)^2.4
X=(0.4124*r+0.3576*g+0.1805*b)
Y=(0.2126*r+0.7152*g+0.0722*b)
Z=(0.0193*r+0.1192*g+0.9505*b)
The three XYZ tristimulus values are then converted to CIE Lab values according to the equation given below:
Xnorm=X/0.95
Ynorm=Y/1.0
Znorm=Z/1.09
If (Ynorm > 0.008856), then fY = Ynorm^(1/3) and L = 116.0 * fY - 16.0; otherwise fY = 7.787 * Ynorm + 16/116 and L = 903.3 * Ynorm
If (Xnorm > 0.008856), then fX = Xnorm^(1/3); otherwise fX = 7.787 * Xnorm + 16/116
If (Znorm > 0.008856), then fZ = Znorm^(1/3); otherwise fZ = 7.787 * Znorm + 16/116
a=500*(fX-fY)
b=200*(fY-fZ)
For example, for an image point having red, green, and blue values of (198, 130, 111), these equations yield CIE Lab values of (60.89, 24.08, 21.12).
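The forward conversion just described can be sketched as a short routine. The constants (the 255 scaling, the 0.03928 threshold, the RGB-to-XYZ matrix, the 0.95/1.0/1.09 white normalization, and the 0.008856 cutoff) are those given in the text; the helper names are ours.

```python
def _linearize(c):
    """Gamma-expand one 0-255 channel using the 0.03928 threshold from the text."""
    c = c / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def rgb_to_lab(red, green, blue):
    """Convert one RGB image point to CIE Lab per the equations of block 330."""
    r, g, b = _linearize(red), _linearize(green), _linearize(blue)
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b

    def f(t):
        # Cube root above the 0.008856 cutoff, linear branch below it
        return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0

    fX, fY, fZ = f(X / 0.95), f(Y / 1.0), f(Z / 1.09)
    L = 116.0 * fY - 16.0 if Y > 0.008856 else 903.3 * Y
    a = 500.0 * (fX - fY)
    b_out = 200.0 * (fY - fZ)
    return (L, a, b_out)

# Example point from the text:
# rgb_to_lab(198, 130, 111) gives approximately (60.89, 24.08, 21.12)
```

In the method this routine would be applied to every point of the 640 × 480 image before the hair-region averages are taken.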
Next, as depicted at block 340, the averages of L, a, and b are calculated. At this point, all of the image's points (i.e., 640 × 480 = 307,200) have been converted from red, green, and blue values to CIE L, a, and b values. The averages of L, a, and b are calculated by conventional mathematics (i.e., sum of values divided by number of points) over all points in the area containing hair. For the purposes of this example, the averages of L, a, and b are assumed to be (26.7, 10.1, 15.2).
New L, a, and b values are then calculated as described in block 350. The points within the image area containing hair are changed according to the equation described above in block 350. Given that the average L, a, b values are (26.7, 10.1, 15.2) and the color value of the current point is (59.3, 30.4, 18.4), then to get a new image whose average L, a, b color values are the (20.73, 4.25, 4.16) predicted for reddish brown, the new L, a, and b color values for this particular point become:
new L = 59.3 + (20.73 - 26.7) = 53.33
new a = 30.4 + (4.25 - 10.1) = 24.55
new b = 18.4 + (4.16 - 15.2) = 7.36
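The per-point transformation of block 350 amounts to adding, to every hair point, the difference between the desired mean and the current mean. A minimal sketch (function name ours):

```python
def shift_point(point_lab, current_mean, desired_mean):
    """Shift one hair point's Lab values so the region mean moves to desired_mean."""
    return tuple(c + (d - m)
                 for c, d, m in zip(point_lab, desired_mean, current_mean))

# Numbers from the example: current mean (26.7, 10.1, 15.2),
# desired mean (20.73, 4.25, 4.16), current point (59.3, 30.4, 18.4)
new_point = shift_point((59.3, 30.4, 18.4), (26.7, 10.1, 15.2),
                        (20.73, 4.25, 4.16))
# new_point is approximately (53.33, 24.55, 7.36)
```

Because the same offset is added to every point, the region's average lands exactly on the desired color while the point-to-point variation (highlights, shadows) of the original hair is preserved.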
Once all points in the image within the area containing hair have been transformed according to the above technique, the new image (i.e., the rendered image) has average L, a, and b color values of (20.73, 4.25, 4.16), which is the predicted (i.e., desired) color for reddish brown.
Next, the L, a, and b values are converted to RGB values as described in block 360. In this step, the new image in L, a, b format (e.g., CIE Lab) must be converted back to RGB (red, green, and blue) format. This is accomplished by first converting the L, a, and b values to three XYZ tristimulus values, and then converting those tristimulus values to RGB values, using the following technique:
LPlus16By25 = (L + 16.0) / 25.0
(1/100)^(1/3) = 0.21544
fX = a / 500.0 + (1/100)^(1/3) * LPlus16By25
fY = LPlus16By25
fZ = (1/100)^(1/3) * LPlus16By25 - (b / 200.0)
Xnorm = 0.95 * fX^3.0
Ynorm = 1.0 * fY^3.0 / 100.0
Znorm = 1.09 * fZ^3.0
r=(3.2410*Xnorm-1.5374*Ynorm-0.4986*Znorm)
g=(-0.9692*Xnorm+1.8760*Ynorm+0.0416*Znorm)
b=(0.0556*Xnorm-0.2040*Ynorm+1.0570*Znorm)
If (r <= 0.00304), then r = 12.92 * r; otherwise r = 1.055 * r^(1.0/2.4) - 0.055
If (g <= 0.00304), then g = 12.92 * g; otherwise g = 1.055 * g^(1.0/2.4) - 0.055
If (b <= 0.00304), then b = 12.92 * b; otherwise b = 1.055 * b^(1.0/2.4) - 0.055
red = 255.0 * r
green = 255.0 * g
blue = 255.0 * b
For example, for an image point having L, a, b values of (60.89, 24.08, 21.12), these equations produce red, green, and blue values of (198, 130, 111).
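The inverse conversion can likewise be sketched. The (L + 16)/25 intermediate and the 0.21544 constant are from the text; the sign of the 0.2040 Y term follows the standard sRGB inverse matrix, and the function name is ours.

```python
def lab_to_rgb(L, a, b):
    """Convert one CIE Lab point back to rounded RGB per block 360's equations."""
    l25 = (L + 16.0) / 25.0
    k = (1.0 / 100.0) ** (1.0 / 3.0)   # = 0.21544, folds the /100 into fX and fZ
    fX = a / 500.0 + k * l25
    fY = l25
    fZ = k * l25 - b / 200.0
    X = 0.95 * fX ** 3.0
    Y = 1.0 * fY ** 3.0 / 100.0
    Z = 1.09 * fZ ** 3.0
    r = 3.2410 * X - 1.5374 * Y - 0.4986 * Z
    g = -0.9692 * X + 1.8760 * Y + 0.0416 * Z
    bl = 0.0556 * X - 0.2040 * Y + 1.0570 * Z   # minus sign per the standard matrix

    def gamma(c):
        # Gamma compression with the 0.00304 threshold from the text
        return 12.92 * c if c <= 0.00304 else 1.055 * c ** (1.0 / 2.4) - 0.055

    return tuple(round(255.0 * gamma(c)) for c in (r, g, bl))

# Example from the text: lab_to_rgb(60.89, 24.08, 21.12) rounds to (198, 130, 111)
```

Run on the example point, this routine recovers the original red, green, and blue values, closing the round trip with the forward conversion of block 330.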
End of example.
The invention can be implemented, for example, by using a Minolta CR300 colorimeter and an NCR touch screen Kiosk 7401 including a computer system and a touch screen.
The invention may be implemented, for example, by operating a computer system to execute a series of machine-readable commands. These commands may be stored in various types of signal-bearing media, such as hard disk drives and main memory. In this regard, another aspect of the invention relates to a program product comprising a signal-bearing medium carrying a program of machine-readable commands executable by a digital data processor, such as a Central Processing Unit (CPU), to perform the method steps. The machine-readable commands may be written in any of a number of programming languages (e.g., Visual Basic, C++, etc.) known in the art.
It should be appreciated that the present invention may be implemented on any type of computer system. One acceptable type of computer system includes a main or Central Processing Unit (CPU) that interfaces with main memory (e.g., Random Access Memory (RAM)), a display adapter, a secondary storage adapter, and a network adapter. The components of these systems may be interconnected by a system bus.
For example, the CPU may be a Pentium processor manufactured by Intel Corporation. However, it is to be appreciated that the present invention is not limited to processors produced by any one manufacturer and that the present invention may be implemented using some other type of processor, such as a co-processor or an auxiliary processor. A secondary storage adapter may be used to connect mass storage (such as a hard drive) to the computer system. The programs need not all be stored on the computer system at the same time. In fact, this may be the case if the computer system is a network computer that relies on an on-demand transport mechanism to deliver the program, or portions of the program, maintained on a server. A display adapter may be used to directly connect the display to the computer system. A network adapter may be used to connect the computer system to other computer systems.
It is important to note that while the present invention has been described in the context of a fully functioning computer system, those of ordinary skill in the art will appreciate that the mechanisms and processes of the present invention are capable of being distributed as a program product in a variety of forms, and that the present invention applies equally regardless of the particular type of signal bearing media used to actually carry out the distribution. Examples of signal bearing media include recordable type media such as floppy disks, hard disk drives, and CD ROMs and transmission type media such as digital and analog communications connectors, and wireless.
While particular embodiments of the present invention have been illustrated and described, it would be obvious to those skilled in the art that changes and modifications can be made without departing from the scope of the invention.

Claims (18)

1. A method for determining an obtainable final hair color based on at least one raw hair value of a recipient, said method comprising the steps of:
a) inputting at least one raw hair value for the recipient; and
b) determining at least one available final hair color based on the raw hair value.
2. The method of claim 1, wherein the raw hair value is selected from the group consisting of values representing the recipient's reason for coloring, the degree of damage to the recipient's hair, the color of the recipient's eyes and skin, the color of the recipient's hair, and combinations thereof.
3. The method of claim 2, wherein the color of the recipient's hair is represented using a value selected from the group consisting of Hunter L, a, b color values and Commission Internationale de l'Eclairage L, a, b color values.
4. The method of claim 2, wherein the color of the recipient's hair is determined by a colorimeter or spectrophotometer.
5. The method of claim 2, wherein the recipient's hair damage is represented using a value selected from the group consisting of:
a) average hair length of recipient hair;
b) the frequency at which the recipient dyes hair;
c) a degree of damage determined by a method selected from the group consisting of:
i. the recipient's own visual evaluation;
ii. a visual or physical evaluation by the recipient or another person;
evaluation using a device for measuring hair damage;
chemical evaluation of the recipient's hair; and
combinations thereof; and
d) and combinations of the foregoing.
6. The method of claim 1, wherein said achievable final hair color is determined by a color prediction algorithm.
7. The method of claim 6, wherein said color prediction algorithm calculates said obtainable final hair color using a color model generated by applying a partial least squares regression technique to said at least one raw hair value.
8. The method of claim 1, further comprising the step of outputting at least one processed image depicting an available final hair color, wherein said recipient can select said processed image as the desired final hair color.
9. The method of claim 8, wherein the L, a, b color values of the available final hair color are adjusted to correct for differences between the actual available final hair color and the recipient's perception of the displayed available final hair color.
10. The method of claim 8, further comprising the step of recommending a hair coloring agent to achieve said desired final hair color.
11. The method of claim 8, wherein the processed image is output to a display screen selected from the group consisting of a computer screen, a cathode ray tube device, and a liquid crystal display.
12. The method of claim 1, wherein the steps are embodied in a computer system or in a computer-readable medium or in a computer data signal embodied in a carrier wave.
13. A method for determining a hair coloring agent based on at least one raw hair value of a recipient, said method comprising the steps of:
a) inputting at least one raw hair value for the recipient;
b) inputting family color selection; and
c) selecting and outputting at least one processed image depicting an obtainable final hair color based on the raw hair value and the family color selection.
14. The method of claim 13, further comprising the steps of inputting a selection of said obtainable final hair color and recommending a hair coloring agent to achieve said selected hair color.
15. A method of outputting an image for a hair color analysis system, said method comprising the steps of:
a) inputting a processed image of a person;
b) inputting the final hair color that can be obtained;
c) converting the RGB color values of the processed image to L, a and b color values;
d) calculating an average L, a and b color value using said L, a and b color values;
e) calculating new L, a and b color values based on said L, a and b color values, said average L, a and b color values, and said available final hair color; and
f) the new L, a and b color values are converted to converted RGB values.
16. The method of claim 15, wherein the output image is displayed on a screen selected from the group consisting of a computer screen, a cathode ray tube device, and a liquid crystal display.
17. A method of providing a hair coloring product to a consumer, the method comprising the steps of:
a) determining an available final hair color for the consumer;
b) depicting the final color available to the consumer;
c) allowing the consumer to select the desired final hair color; and
d) recommending to the consumer a hair dye for achieving said desired final hair color;
wherein the steps are performed according to the method of claim 10.
18. The method of claim 17, wherein the method is performed at a location selected from the group consisting of a beauty shop, a retail store, and a consumer's home.