WO2001087245A2 - Method for analyzing hair and predicting achievable hair dyeing ending colors - Google Patents

Method for analyzing hair and predicting achievable hair dyeing ending colors Download PDF

Info

Publication number
WO2001087245A2
WO2001087245A2 PCT/US2001/015165
Authority
WO
WIPO (PCT)
Prior art keywords
hair
color
recipient
values
achievable
Prior art date
Application number
PCT/US2001/015165
Other languages
French (fr)
Other versions
WO2001087245A3 (en)
Inventor
Suresh Bandara Marapane
Ke-Ming Quan
David Walter
Original Assignee
The Procter & Gamble Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. https://patents.darts-ip.com/?family=27075299&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=WO2001087245(A2) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Priority claimed from US09/844,585 external-priority patent/US6707929B2/en
Application filed by The Procter & Gamble Company filed Critical The Procter & Gamble Company
Priority to JP2001583714A priority Critical patent/JP2003533283A/en
Priority to CA002407457A priority patent/CA2407457C/en
Priority to AU2001261404A priority patent/AU2001261404B2/en
Priority to DE60132192T priority patent/DE60132192T2/en
Priority to MXPA02011157A priority patent/MXPA02011157A/en
Priority to EP01935297A priority patent/EP1281054B1/en
Priority to AU6140401A priority patent/AU6140401A/en
Publication of WO2001087245A2 publication Critical patent/WO2001087245A2/en
Publication of WO2001087245A3 publication Critical patent/WO2001087245A3/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/52Measurement of colour; Colour measuring devices, e.g. colorimeters using colour charts
    • G01J3/526Measurement of colour; Colour measuring devices, e.g. colorimeters using colour charts for choosing a combination of different colours, e.g. to produce a pleasing effect for an observer
    • G01J3/528Measurement of colour; Colour measuring devices, e.g. colorimeters using colour charts for choosing a combination of different colours, e.g. to produce a pleasing effect for an observer using colour harmony theory
    • AHUMAN NECESSITIES
    • A45HAND OR TRAVELLING ARTICLES
    • A45DHAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D19/00Devices for washing the hair or the scalp; Similar devices for colouring the hair
    • A45D19/0041Processes for treating the hair of the scalp
    • A45D19/0066Coloring or bleaching
    • AHUMAN NECESSITIES
    • A45HAND OR TRAVELLING ARTICLES
    • A45DHAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D44/005Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/462Computing operations in or between colour spaces; Colour management systems

Definitions

  • the present invention relates in general to a method and system for analyzing hair, predicting achievable end colors, outputting the predicted achievable colors and recommending hair color agents based on recipient specific input.
  • Some hair coloring agents are sold in packaging which depicts a coloring chart usually containing three starting colors and three corresponding ending colors after dyeing.
  • Some systems have been developed that attempt to overlay pictures of hair colors over a picture of the consumer. Other systems attempt to match products to starting colors using an indexing system.
  • none of these systems provide a personalized color prediction based on a starting hair value (such as color) of the recipient in a manner which is not limited by a specific number of inputs. As such, consumers may not achieve an ending hair color which is reasonably close to the promised hair color.
  • none of the systems are made integral in the purchasing experience by synergistically combining a hair color analyzing system with a product line of hair coloring agents.
  • What is needed is a method and system to analyze hair, predict achievable end colors, output the predicted achievable colors and recommend hair color agents based on recipient specific input.
  • the present invention is directed to a method and system for analyzing hair, predicting achievable end colors and recommending hair color agents based on recipient specific input.
  • the invention is directed to a method for identifying an achievable end hair color based upon at least one starting hair value of a recipient.
  • This method includes the steps of inputting at least one starting hair value of a recipient and then identifying at least one achievable end hair color based upon the starting hair value.
  • the invention is directed to a method for identifying a hair coloring agent based upon at least one starting hair value of a recipient.
  • This method includes the steps of inputting at least one starting hair value of a recipient, inputting a family color selection, and outputting at least one working image depicting an achievable end hair color based upon the starting hair value and the family color selection.
  • This method may also include one or more steps to enhance the appearance of the outputted working image to account for any differences between the characteristics of the lighting under which the starting hair value is obtained versus the lighting under which the working image is viewed.
  • the invention is directed to a method for outputting an image for a hair color analyzing system including the steps of inputting a working image of a person; inputting an achievable end hair color; converting RGB color values of the working image to L, a, and b color values; computing averaged L, a, and b color values of the working image; computing new L, a, and b color values based on the working image L, a, and b color values, the working image averaged L, a, and b color values and the achievable end hair color; and then converting the new L, a, and b color values to RGB values.
  • the invention is directed to a method for providing a hair coloring product to a consumer comprising the steps of identifying achievable end hair colors for the consumer, depicting the achievable end colors to the consumer, allowing the consumer to select a desired end hair color, and recommending to the consumer a hair coloring agent to achieve said desired end hair color.
  • FIG. 1 is a block diagram of a color prediction method capable of utilizing the present invention
  • FIG. 2 is a screen shot of a welcoming screen which may be used and is an example of block 110 within Fig. 1;
  • FIG. 3 is a screen shot of an input screen which may be used and is an example of block 120 within Fig. 1;
  • FIG. 4 is a screen shot of an input screen which may be used and is an example of block 130 within Fig. 1;
  • FIG. 5 is a screen shot of an input screen which may be used and is an example of block 140 within Fig. 1;
  • FIG. 6 is a screen shot of an input screen which may be used and is an example of block 150 within Fig. 1;
  • FIG. 7 is a screen shot of an input screen which may be used and is an example of block 160 within Fig. 1;
  • FIG. 8 is a screen shot of an input screen which may be used and is an example of block 170 within Fig. 1;
  • FIG. 9 is a screen shot of an input screen which may be used and is an example of block 180 and block 190 within Fig. 1;
  • FIG. 10 is a table of family colors and corresponding shades of colors;
  • FIG. 11 is an example of a hair coloring chart within the prior art;
  • FIG. 12 is a screen shot of an output screen which may be used and is an example of block 200 within Fig. 1;
  • FIG. 13 is an example of a before and after data generated for a consumer using the Light Mahogany Brown shade
  • FIG. 14 is a block diagram of a color rendering method capable of utilizing the present invention.
  • FIG. 15 is an example of (a) a working image before rendering, (b) a masking image of the working image, and (c) a rendered image created using the present invention.
  • FIG. 16 is a block diagram of a color prediction method capable of utilizing the present invention, which includes a step for assessing hair damage.
  • FIG. 17 is a screen shot of an input screen which may be used and is an example of block 210 within Fig. 16.
  • A block diagram of a method 100 for identifying a hair coloring agent based upon starting hair conditions of a recipient is illustrated in Fig. 1.
  • method 100 is embodied in a computer system and is located at a retail counter for the purpose of analyzing and recommending hair coloring products.
  • the apparatus may be used anywhere without departing from the scope of the present invention. For example, the apparatus could be used in a salon.
  • hair color analyzing method 100 contains a series of steps which contain screen shots that are intended to give and receive information to a recipient or user for purposes of executing this method 100. These screen shots may be displayed on a display screen, which may include, but is not limited to: a computer screen, a cathode ray tube (CRT) device, and a liquid crystal display (LCD) device.
  • Block 110 contains a welcoming screen (see Fig. 2) which identifies the system and the associated hair color agents (e.g. NS Sassoon products).
  • Block 120 contains a screen which asks the recipient why he/she is coloring his/her hair (see Fig. 3).
  • Block 130 contains a screen which asks the recipient how often he/she has colored their hair over the past year (see Fig. 4). The frequency with which someone colors their hair may affect their hair's condition and ability to receive another coloring.
  • Block 140 contains a screen which asks the recipient the average length of their hair (see Fig. 5). The length of a recipient's hair may affect the hair's condition and ability to receive another coloring, as the longer the hair, the more weathered it becomes. It may also affect how the color measurements are to be taken (e.g. skew the modeling to more heavily consider the roots versus the ends).
  • Block 150 contains a screen which asks the recipient their eye (and/or skin) color (see Fig. 6).
  • the eye and skin color information may be depicted within the rendered images that the recipient looks at in determining their desired end color and therefore assist in the recipients selection process.
  • block 160 prompts the recipient to take a plurality of color measurements of their hair using a colorimeter or spectrophotometer (see Fig. 7).
  • the recipient is prompted to take six color readings (front, front left, back left, back, back right and front right). This may be performed by the recipient themselves, or it may be performed on the recipient by another, for example a beauty counselor.
  • Each reading from the colorimeter will be in a useable format, including, but not limited to, Commission Internationale de l'Eclairage (CIE) L, a, b color values, and Hunter L, a, b color values. Persons skilled in the art would readily appreciate that other color formats may be used without departing from the scope of the present invention.
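How the six positional readings are combined into a single starting hair value is not spelled out in the text; a simple component-wise average is one plausible approach, sketched below. The function name and data layout are illustrative assumptions.

```python
# Hypothetical sketch: combining the six CIE L,a,b colorimeter readings
# (front, front left, back left, back, back right, front right) into one
# starting hair value by component-wise averaging.  The averaging strategy
# itself is an assumption; the text only says a plurality of readings is taken.

def average_lab(readings):
    """Average a list of (L, a, b) tuples component-wise."""
    n = len(readings)
    totals = [0.0, 0.0, 0.0]
    for L, a, b in readings:
        totals[0] += L
        totals[1] += a
        totals[2] += b
    return tuple(t / n for t in totals)

# Six illustrative readings taken around the head.
readings = [
    (27.0, 10.0, 15.0),  # front
    (26.5, 10.2, 15.4),  # front left
    (26.2, 10.3, 15.1),  # back left
    (26.8,  9.9, 15.3),  # back
    (26.9, 10.1, 15.2),  # back right
    (26.8, 10.1, 15.2),  # front right
]
starting_value = average_lab(readings)
```

Whatever combination rule is used, the result is a single L, a, b triple that feeds the color prediction model described below.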
  • Block 170 prompts the recipient to choose from a family of colors (e.g. Blonde, Light Brown, Medium Brown, Dark Brown and Red/Copper) (see Fig. 8).
  • the selected color family is also used in the color prediction model. While the preferred embodiment depicts five different color families, persons skilled in the art would readily appreciate that any number of color families may be used without departing from the scope of the present invention. Further, it will be appreciated by those skilled in the art that the order in which the steps set forth in blocks 120 through 170 are shown herein represents but one embodiment of the present invention, and that these steps may actually be performed in any order relative to one another.
  • Block 180 displays achievable end hair colors which have been determined based on the input from the steps described in blocks 120 through 170 (see Fig. 9).
  • Block 190 prompts the recipient to select from a group of achievable end hair colors which represent various color shades contained within the family color selected in the step described in block 170 (see Fig. 9). It is contemplated that prompts may be presented to the recipient which allow the recipient to go backwards in the flow of the method to arrive at a different combination of achievable end hair colors to be displayed in block 180. As non-limiting examples, this may allow the recipient to change inputted values, including answers to questions, or it may allow the recipient to simply indicate that they want to look at a lighter, darker, redder, less red, blonder, or less blonde set of end hair colors.
  • the achievable end hair colors may be separated into multiple categories, including but not limited to, "changes" versus "enhances," wherein colors within the changes category are considered a more extreme change in color, requiring a higher pH hair coloring agent, and colors in the enhances category are considered a milder change in color, requiring a lower pH hair coloring agent.
  • Such information regarding the category (e.g. the pH level of the hair coloring agent) may also be conveyed to the recipient.
  • While the preferred embodiment contains sixteen different color shades within the five different color families (see Fig. 10), persons skilled in the art would readily appreciate that any number of color shades may be used without departing from the scope of the present invention. For example, twenty-five color shades may be used.
  • Block 200 recommends a particular hair coloring agent based on the selected achievable end hair colors, also referred to as desired hair color (see Fig. 12).
  • hair color analyzing method 100 may further contain one or more steps in which a starting hair value of a recipient may be a measure of damage to the hair of the recipient.
  • Block 210 contains a screen which asks the recipient the degree of damage to their hair (see Fig. 17).
  • the amount of damage to the hair may affect the hair's ability to be colored, as typically the more damaged the hair is, the lesser its ability to hold color due to changes in the structure of the cuticle.
  • the degree of damage may be represented by such terms as shown in Fig. 17, i.e. "slightly, moderately, substantially, heavily"; however, any words, numbered grade scale, or continuum points which depict increasing or decreasing quantities of damage may be used in the invention.
  • the method of determining how much damage may include, but is not limited to: intuitive self-assessment by the recipient, visual or physical assessment by the recipient or another, such as a beauty counselor, assessment using a device which measures hair damage, chemical assessment of the hair, and combinations thereof.
  • the method of determining hair damage may be conducted during the execution of method 100, or it may be conducted previously, and the result simply inputted during the execution of method 100.
  • Suitable hair damage measuring methods for use herein include, but are not limited to, methods which employ devices that assess roughness (and, by implication, damage) by measuring the degree of friction generated by subjecting the hair to certain conditions. For instance, ease of combing is commonly used as a measure of smoothness. In one combing test, the force required to detangle a bundle of hair fibres by drawing a comb through it is used to assess friction, roughness and damage.
  • EP-A-965,834 describes friction-measuring equipment for evaluating the effects of cosmetics on skin, hair, membranes and eyes. This equipment assesses friction by means of deformation of a deformable assembly on a probe. JP 63/163143 measures the degree of damage to hair by comparing forward and reverse friction forces.
  • JP 62/273433 measures friction between hairs by passing a fluid in turbulent flow over a bundle of hair and measuring friction by detecting pressure loss in the fluid.
  • a device which utilizes a method for measuring the friction generated by a bundle of hair fibres, comprising providing a friction member, drawing the friction member through the bundle of hair, whereby a frictional noise signal is generated, and capturing the frictional noise signal by a noise sensor. Further, the captured noise signal may be converted to a form which can be displayed. The converted signal may then be displayed using display means. In such a method, the friction generated by a bundle of the recipient's hair is measured using both a friction member and a noise sensor.
  • the friction member may be drawn through the hair such that it contacts and passes over the surfaces of the individual hairs. This creates friction between the friction member and the hairs.
  • the frictional noise generated depends upon the level of friction between the friction member and the hair surfaces.
  • the friction member may be drawn through the hair of the recipient one or more times, and a statistical analysis may be performed to determine an average or mean value.
  • Such a friction member may generally be formed from a rigid material, and may be in the form of a comb means having a plurality of tines.
  • the comb means is usually drawn through the bundle of hair in the manner usual for a comb.
  • the frictional noise signal generated is captured by means of a frictional noise sensor, for example a microphone, which is, for instance, a standard electronic microphone or a noise cancelling microphone.
  • the frictional noise can be displayed and analysed in any suitable manner. It may be visually displayed to the recipient, who then selects a grade of damage based on the display, or it may be simply inputted directly into method 100, while also being displayed to the recipient.
  • the frictional noise sensed by the sensor may be converted to a signal which is then transferred to the visual display unit and displayed, or it may, for instance, be displayed in the form of a trace of sound amplitude versus time. These conversions may be achieved using known means.
  • the degree of damage assessed by this method may be reported as a value indexed against predetermined categories for which standard degrees of hair damage have been measured and tabulated, or as a value of damage relative to a standard, or as an amount of damage as measured on a scale of decibel levels which correspond to particular degrees of damage, or may be reported as an absolute value as measured, or the like.
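The indexing of a measured value against predetermined damage categories described above might be sketched as follows. The decibel thresholds are invented for illustration; only the grade terms come from Fig. 17, and the text leaves the actual calibration to tabulated standards.

```python
# Hypothetical sketch: reporting a frictional-noise measurement as a damage
# grade indexed against predetermined categories.  The dB thresholds below
# are illustrative assumptions, not values from the text.
import bisect

THRESHOLDS = [40.0, 50.0, 60.0]  # assumed upper dB bound of each category
GRADES = ["slightly", "moderately", "substantially", "heavily"]

def damage_grade(noise_db):
    """Map a measured noise level (dB) to a damage category."""
    return GRADES[bisect.bisect_left(THRESHOLDS, noise_db)]
```

The same lookup shape works whether the reported value is an absolute measurement, a value relative to a standard, or a decibel level, as the text contemplates.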
  • Suitable hair damage measuring methods for use herein also include, but are not limited to, methods known in the art which chemically assess the amount of broken versus unbroken disulfide bonds of cysteine in the recipient's hair.
  • An example of the data generated is provided in Fig. 13 for a consumer using the Light Mahogany Brown shade.
  • Model accuracy as indicated by root mean square error (standard deviation of prediction) is in the region of 2.5 points for L, and 1 point for both a and b. This is in the region of natural variability within a head.
  • Research has shown that in describing the color of their own hair empirically, consumers may be as much as 10 or 12 points out in terms of L. With this in mind, the prediction accuracy of the present invention is considered more than acceptable. Color prediction models were then generated for each shade to predict achievable hair color after coloring versus initial color of the hair.
  • Models for a and b contained linear effects for L, a and b. Models were generated using partial least squares regression. While the preferred embodiment contains the modeling strategy and technique discussed above, persons skilled in the art would readily appreciate that other modeling strategies and techniques may be used without departing from the scope of the present invention.
  • An example of color prediction models generated for L, a and b for the Light Mahogany Brown shade is given below:
  • Predicted L = 7.307 + (0.5948 x L before coloring)
  • Predicted a = 0.0104 + (0.0868 x L before coloring) + (0.580 x a before coloring) + (0.2085 x b before coloring)
  • Predicted b = -3.3911 + (0.2831 x L before coloring) - (0.0190 x a before coloring) + (0.3187 x b before coloring)
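Applied directly, the three equations amount to a small linear model. A minimal sketch follows; the coefficients are those given above, and only the function name and the sample starting color are illustrative.

```python
# Minimal sketch of the Light Mahogany Brown color prediction model,
# using the coefficients published in the text.

def predict_light_mahogany_brown(L0, a0, b0):
    """Predict achievable CIE L, a, b from the before-coloring color."""
    L = 7.307 + 0.5948 * L0
    a = 0.0104 + 0.0868 * L0 + 0.580 * a0 + 0.2085 * b0
    b = -3.3911 + 0.2831 * L0 - 0.0190 * a0 + 0.3187 * b0
    return L, a, b

# Illustrative starting hair color (CIE L, a, b).
predicted = predict_light_mahogany_brown(26.7, 10.1, 15.2)
```

A separate model of this form would be fit per shade, so shade selection in block 190 picks which coefficient set is applied.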
  • Because block 180 displays the achievable end colors from which the recipient will be prompted to select in block 190, it is important to display the achievable end hair colors in the truest fashion, meaning the color shown on the display screen should be substantially similar, in the recipient's perception, to the color predicted by the color prediction model. There may, and likely will, be differences between the characteristics of the lighting under which a starting hair value is obtained, such as that obtained in block 160, and the characteristics of the lighting under which the achievable end colors are displayed in block 180. Further, the use of any CRT screen to display the achievable end colors will inherently cause some discrepancy between the display of the colors according to their L, a, and b values and the recipient's perceived view of them. To correct for these potential differences, block 180 may include one or more steps to adjust the color of the displayed image, and hence, its perception by the recipient (see Fig. 1).
  • block 180 of hair color analyzing method 100 may further contain one or more steps to correct for any differences between the characteristics of the lighting under which a starting hair value is obtained, such as that obtained in block 160, and the characteristics of the lighting under which the achievable end colors are displayed (see block 180), and for the fact that use of any display screen inherently causes a difference in the recipient's perception.
  • Applicants have found that as a result of both of these factors, i.e. lighting and use of a display screen, the recipient may find that the image displayed to them is duller or otherwise not consonant with how they would perceive their hair color, despite the fact that the L, a, b values of the image being rendered are identical to their current hair color or achievable end hair color.
  • color rendering method 300 contains a series of steps that are intended to convert a working image into an image that may be rendered on a display screen. Many display screens display their colors using an RGB color format. Furthermore, each type of display screen may have a unique profile. To address these issues, the technique in Fig. 14 was developed. Method 300 begins with block 310, which reads a working image 315 (see Fig. 15) into the memory of a computer or any other similar electronic device. Block 320 then masks working image 315 to create masked image 325 (see Fig. 15). Masked image 325 separates the area containing hair from the area that does not contain hair.
  • This masking step is helpful in that it identifies the area that only requires changing (i.e.: area containing hair), thus reducing computer processing time.
  • the masking may be performed automatically or manually using a variety of masking techniques that are known throughout the industry. It may also be desired to mask other areas, such as eyes and skin, so that these areas may be changed in response to eye and skin information supplied by the recipient in block 150. While the preferred embodiment uses the same working image 315 and the same masking image 325 for each recipient, persons skilled in the art would readily appreciate that other techniques may be used, including but not limited to, creating a working image from a picture of the recipient and then masking said image, without departing from the scope of the present invention.
  • Block 330 converts the masked portion of working image 315 from RGB (Red-Green-Blue) color format to XYZ tristimulus values first and then converts them further into L, a, b colors.
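Block 330's two-stage conversion (RGB to XYZ tristimulus values, then XYZ to L, a, b) might be sketched as below. The text does not specify the matrices or white point, so the standard sRGB matrix and a D65 white point are assumptions.

```python
# Sketch of the block 330 conversion chain: RGB -> XYZ -> CIE L,a,b.
# The sRGB matrix and D65 white point are assumptions; the text only
# states that RGB is converted to XYZ first and then to L, a, b.

def rgb_to_xyz(r, g, b):
    """Convert sRGB components in [0, 1] to XYZ tristimulus values."""
    def linearize(c):  # undo the sRGB display gamma
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = linearize(r), linearize(g), linearize(b)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z

def xyz_to_lab(x, y, z, white=(0.9505, 1.0, 1.0890)):
    """Convert XYZ to CIE L,a,b relative to the (assumed D65) white point."""
    def f(t):
        return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = [f(v / w) for v, w in zip((x, y, z), white)]
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)
```

A quick sanity check on the chain: a display-white pixel (1.0, 1.0, 1.0) should come out near L = 100, a = b = 0.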
  • Block 340 computes the average L, a, and b color values of the hair portion of working image 315; said averaged values are defined below as "AverageImageColor_L", "AverageImageColor_a" and "AverageImageColor_b", respectively.
  • Block 350 then computes new L, a, b values for each point (x,y) within the hair portion of working image 315, said new values are defined below as “NewImageColor_L”, “NewImageColor_a” and “NewImageColor_b”, respectively and are calculated as:
  • NewImageColor_L(x,y) = CurrentImageColor_L(x,y) + (DesiredColor_L - AverageImageColor_L)
  • NewImageColor_a(x,y) = CurrentImageColor_a(x,y) + (DesiredColor_a - AverageImageColor_a)
  • NewImageColor_b(x,y) = CurrentImageColor_b(x,y) + (DesiredColor_b - AverageImageColor_b)
  • AverageImageColor_L, AverageImageColor_a and AverageImageColor_b are the computed averages just discussed.
  • NewImageColor_L(x,y), NewImageColor_a(x,y) and NewImageColor_b(x,y) are the new computed values for the point (x,y) being edited.
  • CurrentImageColor_L(x,y), CurrentImageColor_a(x,y) and CurrentImageColor_b(x,y) are the original color values for the point (x,y) being edited.
  • DesiredColor_L, DesiredColor_a and DesiredColor_b are the color values for the desired achievable color that was selected in Block 190.
  • Summing the new values over all N hair points shows that the averages of the new a and b values are DesiredColor_a and DesiredColor_b, respectively, and likewise for L:
  • AverageNewImageColor_L x N = AverageCurrentImageColor_L x N - AverageImageColor_L x N + DesiredColor_L x N
  • Since AverageCurrentImageColor_L equals AverageImageColor_L, that is,
  • AverageNewImageColor_L = DesiredColor_L
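The block 350 computation might be sketched as below: each hair pixel's L, a, b values are offset by the difference between the desired color and the image average, which makes the new image's average exactly the desired color while preserving per-pixel variation (highlights and shadows). The data layout, a list of (L, a, b) tuples for the masked hair pixels, is an illustrative assumption.

```python
# Sketch of the block 350 per-pixel shift: every hair pixel is offset by
# (DesiredColor - AverageImageColor), so the average of the new image
# equals the desired color exactly.  The pixel layout is an assumption.

def render_shift(pixels, desired, average):
    """Shift each (L, a, b) pixel so the image average becomes `desired`."""
    dL = desired[0] - average[0]
    da = desired[1] - average[1]
    db = desired[2] - average[2]
    return [(L + dL, a + da, b + db) for (L, a, b) in pixels]
```

Because the shift is a constant offset, the spread of values around the mean is unchanged, which is what keeps the rendered hair looking natural rather than flat.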
  • Block 360 converts the NewImageColor L, a, b values back to RGB so that they may be displayed on the display screen which uses RGB format. They may also be converted into sRGB values. This conversion is accomplished by first converting the NewImageColor L, a, b values to XYZ tristimulus values and then converting them further into RGB colors. Block 370 then applies the particular monitor profile for the particular display screen being used so that the NewImageColor L, a, b values are displayed as they were intended to look. This procedure for applying a particular monitor profile is described in ICC Profile API Suite Version 2.0 - Professional Color Management Tools, Eastman-Kodak Company, 1997, and as such is herein incorporated by reference.
  • Block 380 outputs rendered image 335 (see Fig. 15). The output includes, but is not limited to, display on a display screen, printing, saving on disk, or transmitting the image to another.
  • This example illustrates how to render a person's achievable color for Light Mahogany Brown shade using color rendering method 300.
  • a color adjustment is also illustrated.
  • the color adjustment factors were determined to be +5 for L, +2 for a, and +5 for b. Accordingly, the predicted colors must be modified as follows:
  • color rendering method 300 is to read an image and recreate a new image that will have an average L, a, b hair color value of (20.73, 4.25, 4.16).
  • average values for L, a, and b will be computed.
  • average L, a, b values are computed by routine math (i.e.: sum / # of points). For purposes of this example, it is assumed that the average L, a, b values are (26.7, 10.1, 15.2).
  • the new image (i.e.: rendered image) will have an average L, a, b color value of (20.73, 4.25, 4.16), which is the predicted (i.e.: desired) color for Light Mahogany Brown.
  • the L, a, and b values will be converted to RGB values.
  • the new image, which is in L, a, b format (e.g., CIE Lab), must be converted back into RGB (Red, Green, and Blue).
  • r = (3.2410*Xnorm - 1.5374*Ynorm - 0.4986*Znorm)
  • g = (-0.9692*Xnorm + 1.8760*Ynorm + 0.0416*Znorm)
  • b = (0.0556*Xnorm - 0.2040*Ynorm + 1.0570*Znorm)
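Applying that matrix might look like the sketch below. The coefficients follow the standard sRGB XYZ-to-RGB transform consistent with the rows quoted above; the clamp to [0, 1] is an added assumption about how out-of-gamut colors are handled before display.

```python
# Sketch of the XYZ -> RGB matrix step.  Coefficients match the standard
# sRGB transform; clamping to [0, 1] is an assumption about how
# out-of-gamut values are handled before quantizing for the screen.

def xyz_to_rgb(xnorm, ynorm, znorm):
    """Convert normalized XYZ tristimulus values to linear RGB in [0, 1]."""
    r =  3.2410 * xnorm - 1.5374 * ynorm - 0.4986 * znorm
    g = -0.9692 * xnorm + 1.8760 * ynorm + 0.0416 * znorm
    b =  0.0556 * xnorm - 0.2040 * ynorm + 1.0570 * znorm
    clamp = lambda c: min(max(c, 0.0), 1.0)
    return clamp(r), clamp(g), clamp(b)
```

Feeding in the white point (Xnorm, Ynorm, Znorm) = (0.9505, 1.0, 1.0890) returns approximately (1, 1, 1), which is a convenient check that the matrix and normalization agree.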
  • the present invention may be implemented, for example, by using the Minolta CR300 Colorimeter and the NCR touchscreen Kiosk 7401, comprising a computer system and touchscreen display.
  • the present invention may be implemented, for example, by operating a computer system to execute a sequence of machine-readable instructions. These instructions may reside in various types of signal bearing media, such as a hard disk drive and main memory.
  • another aspect of the present invention concerns a program product, comprising signal bearing media embodying a program of machine-readable instructions, executable by a digital data processor, such as a central processing unit (CPU), to perform method steps.
  • the machine-readable instructions may comprise any one of a number of programming languages known in the art (e.g., Visual Basic, C, C++, etc.).
  • One acceptable type of computer system comprises a main or central processing unit (CPU) which is connected to a main memory (e.g., random access memory (RAM)), an auxiliary storage adapter, a display adapter, and a network adapter.
  • The CPU may be, for example, a Pentium Processor made by Intel Corporation. However, it should be understood that the present invention is not limited to any one make of processor and the invention may be practiced using some other type of processor, such as a coprocessor or an auxiliary processor.
  • An auxiliary storage adapter may be used to connect mass storage devices (such as a hard disk drive) to a computer system. The program need not necessarily all simultaneously reside on the computer system. Indeed, this latter scenario would likely be the case if the computer system were a network computer, and would therefore be dependent upon an on-demand shipping mechanism for access to mechanisms or portions of mechanisms that reside on a server.
  • a display adapter may be used to directly connect a display device to the computer system.
  • a network adapter may be used to connect the computer system to other computer systems.
  • signal bearing media include: recordable type media, such as floppy disks, hard disk drives, and CD-ROMs, and transmission type media, such as digital and analog communications links and wireless links.

Abstract

Method and system for analyzing hair, predicting achievable end colors and recommending hair color agents based on recipient specific input. Includes methods for (a) identifying an achievable end hair color based upon at least one starting hair value of a recipient; (b) for identifying a hair coloring agent based upon at least one starting hair value of a recipient; and (c) for outputting an image for a hair color analyzing system. Also includes (d) a method for providing a hair coloring product to a consumer comprising the steps of: (i) identifying achievable end hair colors for the consumer, (ii) depicting the achievable end colors to the consumer, (iii) allowing the consumer to select a desired end hair color, and (iv) recommending to the consumer a hair coloring agent to achieve said desired end hair color.

Description

METHOD FOR ANALYZING HAIR AND PREDICTING ACHIEVABLE HAIR DYEING ENDING COLORS
Cross-reference to Related Applications This application is a continuation-in-part (CIP) of U.S. Serial No. 09/570,292, filed on 12
May, 2000.
Technical Field of the Invention
The present invention relates in general to a method and system for analyzing hair, predicting achievable end colors, outputting the predicted achievable colors and recommending hair color agents based on recipient specific input.
Background of the Invention
Countless individuals all over the world seek to improve their physical appearance through the use of hair coloring agents. As a result, there is an extremely large choice of available products for consumers to choose from. Often, the individual consumer finds it difficult to determine which hair coloring agent to choose, and also to predict what the result will look like given his or her current hair color.
Some hair coloring agents are sold in packaging which depicts a coloring chart usually containing three starting colors and three corresponding ending colors after dyeing. Some systems have been developed that attempt to overlay pictures of hair colors over a picture of the consumer. Other systems attempt to match products to starting colors using an indexing system. However, none of these systems provide a personalized color prediction based on a starting hair value (such as color) of the recipient in a manner which is not limited by a specific number of inputs. As such, consumers may not achieve an ending hair color which is reasonably close to the promised hair color. Furthermore, none of the systems are made integral in the purchasing experience by synergistically combining a hair color analyzing system with a product line of hair coloring agents.
What is needed is a method and system to analyze hair, predict achievable end colors, output the predicted achievable colors and recommend hair color agents based on recipient specific input.
Summary of the Invention The present invention is directed to a method and system for analyzing hair, predicting achievable end colors and recommending hair color agents based on recipient specific input.
In one aspect, the invention is directed to a method for identifying an achievable end hair color based upon at least one starting hair value of a recipient. This method includes the steps of inputting at least one starting hair value of a recipient and then identifying at least one achievable end hair color based upon the starting hair value.
In another aspect, the invention is directed to a method for identifying a hair coloring agent based upon at least one starting hair value of a recipient. This method includes the steps of inputting at least one starting hair value of a recipient, inputting a family color selection, and outputting at least one working image depicting an achievable end hair color based upon the starting hair value and the family color selection. This method may also include one or more steps to enhance the appearance of the outputted working image to account for any differences between the characteristics of the lighting under which the starting hair value is obtained versus the lighting under which the working image is viewed. In yet another aspect, the invention is directed to a method for outputting an image for a hair color analyzing system including the steps of inputting a working image of a person; inputting an achievable end hair color; converting RGB color values of the working image to L, a, and b color values; computing averaged L, a, and b color values of the working image; computing new L, a, and b color values based on the working image L, a, and b color values, the working image averaged L, a, and b color values and the achievable end hair color; and then converting the new L, a, and b color values to RGB values.
In still yet another aspect, the invention is directed to a method for providing a hair coloring product to a consumer comprising the steps of identifying achievable end hair colors for the consumer, depicting the achievable end colors to the consumer, allowing the consumer to select a desired end hair color, and recommending to the consumer a hair coloring agent to achieve said desired end hair color.
These and other features and advantages of the present invention will be apparent to those skilled in the art in view of the detailed description provided below.
Brief Description of the Drawings
FIG. 1 is a block diagram of a color prediction method capable of utilizing the present invention; FIG. 2 is a screen shot of a welcoming screen which may be used and is an example of block 110 within Fig 1;
FIG. 3 is a screen shot of an input screen which may be used and is an example of block 120 within Fig 1;
FIG. 4 is a screen shot of an input screen which may be used and is an example of block 130 within Fig 1;
FIG. 5 is a screen shot of an input screen which may be used and is an example of block 140 within Fig 1;
FIG. 6 is a screen shot of an input screen which may be used and is an example of block 150 within Fig 1;
FIG. 7 is a screen shot of an input screen which may be used and is an example of block 160 within Fig 1;
FIG. 8 is a screen shot of an input screen which may be used and is an example of block 170 within Fig 1;
FIG. 9 is a screen shot of an input screen which may be used and is an example of block 180 and block 190 within Fig 1;
FIG. 10 is a table of family colors and corresponding shades of colors;
FIG. 11 is an example of a hair coloring chart within the prior art;
FIG. 12 is a screen shot of an output screen which may be used and is an example of block 200 within Fig 1;
FIG. 13 is an example of a before and after data generated for a consumer using the Light Mahogany Brown shade;
FIG. 14 is a block diagram of a color rendering method capable of utilizing the present invention; and
FIG. 15 is an example of (a) a working image before rendering, (b) a masking image of the working image, and (c) a rendered image created using the present invention.
FIG. 16 is a block diagram of a color prediction method capable of utilizing the present invention, which includes a step for assessing hair damage.
FIG. 17 is a screen shot of an input screen which may be used and is an example of block 210 within Fig. 16.
Detailed Description of the Invention
A block diagram of a method 100 for identifying a hair coloring agent based upon starting hair conditions of a recipient is illustrated in Fig. 1. In one embodiment, method 100 is embodied in a computer system and is located at a retail counter for the purpose of analyzing and recommending hair coloring products. However, persons skilled in the art would readily appreciate that the apparatus may be used anywhere without departing from the scope of the present invention. For example, the apparatus could be used in a salon.
Referring to Fig. 1, hair color analyzing method 100 contains a series of steps which contain screen shots that are intended to give information to, and receive information from, a recipient or user for purposes of executing this method 100. These screen shots may be displayed on a display screen, which may include, but is not limited to: a computer screen, a cathode ray tube (CRT) device, and a liquid crystal display (LCD) device. Block 110 contains a welcoming screen (see Fig. 2) which identifies the system and the associated hair color agents (e.g. NS Sassoon products). Block 120 contains a screen which asks the recipient why he/she is coloring his/her hair (see Fig. 3). For instance, one recipient may want to cover his gray hair, which requires a stronger hair coloring agent, while another recipient may only want to enhance her hair color, which requires a milder hair coloring agent. Block 130 contains a screen which asks the recipient how often he/she has colored his/her hair over the past year (see Fig. 4). The frequency with which someone colors their hair may affect their hair's condition and ability to receive another coloring. Block 140 contains a screen which asks the recipient what the average length of their hair is (see Fig. 5). The length of a recipient's hair may affect the hair's condition and ability to receive another coloring, as the longer the hair, the more weathered it becomes. It may also affect how the color measurements are to be taken (e.g. skew the modeling to more heavily consider the roots versus the ends). Block 150 contains a screen which asks the recipient what their eye (and/or skin) color is (see Fig. 6). The eye and skin color information may be depicted within the rendered images that the recipient looks at in determining their desired end color and therefore assist in the recipient's selection process.
Still referring to Fig. 1, block 160 prompts the recipient to take a plurality of color measurements of their hair using a colorimeter or spectrophotometer (see Fig. 7). In one embodiment, the recipient is prompted to take six color readings (front, front left, back left, back, back right and front right). This may be performed by the recipient themselves, or it may be performed on the recipient by another, for example a beauty counselor. Each reading from the colorimeter will be in a useable format, including, but not limited to, Commission Internationale de l'Eclairage (CIE) L, a, b color values, and Hunter L, a, b color values. Persons skilled in the art would readily appreciate that other color formats may be used without departing from the scope of the present invention. Next, these L, a, b color values are averaged and stored for use later in the color prediction model. It will be apparent to those skilled in the art that a statistical analysis may be performed on the plurality of L, a, and b values, respectively, to correct for abnormalities, inaccuracies and errors which may occur during measurement. Block 170 prompts the recipient to choose from a family of colors (e.g. Blonde, Light Brown, Medium Brown, Dark Brown and Red/Copper) (see Fig. 8). The selected color family is also used in the color prediction model. While the preferred embodiment depicts five different color families, persons skilled in the art would readily appreciate that any number of color families may be used without departing from the scope of the present invention. Further, it will be appreciated by those skilled in the art that the order in which the steps set forth in blocks 120 through 170 are shown herein represents but one embodiment of the present invention, and that these steps may actually be performed in any order relative to one another.
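The averaging and statistical screening of the plurality of colorimeter readings can be sketched as follows. This is a minimal Python illustration; the six sample readings and the 2-sigma outlier screen are invented assumptions for demonstration, not values or a method specified herein.

```python
from statistics import mean, stdev

def average_lab_readings(readings, outlier_sigma=2.0):
    """Average a set of (L, a, b) colorimeter readings.

    Readings lying more than `outlier_sigma` standard deviations from the
    channel mean are discarded first, as one possible way to correct for
    measurement abnormalities (the threshold is an illustrative choice).
    """
    channels = list(zip(*readings))          # ([L...], [a...], [b...])
    if len(readings) > 2:
        means = [mean(c) for c in channels]
        sds = [stdev(c) for c in channels]
        readings = [
            r for r in readings
            if all(sd == 0 or abs(v - m) <= outlier_sigma * sd
                   for v, m, sd in zip(r, means, sds))
        ]
        channels = list(zip(*readings))
    return tuple(mean(c) for c in channels)

# Six readings: front, front left, back left, back, back right, front right.
readings = [(14.5, -5.0, 4.6), (14.0, -5.2, 4.5), (13.9, -5.1, 4.4),
            (14.2, -5.0, 4.6), (14.3, -5.1, 4.5), (14.1, -5.0, 4.6)]
avg = average_lab_readings(readings)
```

The resulting averaged (L, a, b) tuple is what would be stored for later use in the color prediction model.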
Block 180 displays achievable end hair colors which have been determined based on the input from the steps described in blocks 120 through 170 (see Fig. 9). Block 190 prompts the recipient to select from a group of achievable end hair colors which represent various color shades contained within the family color selected in the step described in block 170 (see Fig. 9). It is contemplated that prompts may be presented to the recipient which allow the recipient to go backwards in the flow of the method to arrive at a different combination of achievable end hair colors to be displayed in block 180. As non-limiting examples, this may allow the recipient to change inputted values, including answers to questions, or it may allow the recipient to simply indicate that they want to look at a lighter, darker, redder, less red, blonder or less blonde set of end hair colors. The achievable end hair colors may be separated into multiple categories, including but not limited to, changes versus enhancements, wherein those colors within the changes category are considered a more extreme change in color requiring a higher pH hair coloring agent, and those colors in the enhance category are considered a milder change in color requiring a lower pH hair coloring agent. Such information regarding the category (e.g. pH level of hair coloring agent) may assist the recipient and/or beauty counselor in making more informed decisions as to which hair coloring agent to use. While the preferred embodiment contains sixteen different color shades within the five different color families (see Fig. 10), persons skilled in the art would readily appreciate that any number of color shades may be used without departing from the scope of the present invention. For example, twenty-five color shades may be used.
These achievable end hair colors are generated from the color prediction model and therefore more accurately represent the post-dyeing color of the recipient than the ordinary color charts frequently found on the back side of most hair coloring agent packaging (see Fig 11). Block 200 recommends a particular hair coloring agent based on the selected achievable end hair colors, also referred to as desired hair color (see Fig. 12).
Hair Damage
Referring to Fig. 16, hair color analyzing method 100 may further contain one or more steps in which a starting hair value of a recipient may be a measure of damage to the hair of the recipient. Block 210 contains a screen which asks the recipient the degree of damage to their hair (see Fig. 17). The amount of damage to the hair may affect the hair's ability to be colored, as typically the more damaged the hair is, the lesser its ability to hold color due to changes in the structure of the cuticle. Again, it will be appreciated by those skilled in the art that the order in which the steps set forth in blocks 120 through 170, and block 210 are shown herein represent but one embodiment of the present invention, and that these steps may actually be performed in any order relative to one another.
The degree of damage may be represented by such terms as shown in Fig. 17, i.e. "slightly, moderately, substantially, heavily," however, any words or a numbered grade scale or continuum points which depict increasing or decreasing quantities of damage may be used in the invention. The method of determining how much damage may include, but is not limited to: intuitive self-assessment by the recipient, visual or physical assessment by the recipient or another, such as a beauty counselor, assessment using a device which measures hair damage, chemical assessment of the hair, and combinations thereof. The method of determining hair damage may be conducted during the execution of method 100, or it may be conducted previously, and the result simply inputted during the execution of method 100. Suitable hair damage measuring methods for use herein include, but are not limited to methods which employ devices that assess roughness, and by implication damage by measuring the degree of friction generated by subjecting the hair to certain conditions. For instance, ease of combing is commonly used as a measure of smoothness. In one combing test the force required to detangle, by drawing a comb through, a bundle of hair fibres is used to assess friction, roughness and damage. EP-A-965,834 describes friction-measuring equipment for evaluating the effects of cosmetics on skin, hair, membranes and eyes. This equipment assesses friction by means of deformation of a deformable assembly on a probe. JP 63/163143 measures the degree of damage to hair by comparing forward and reverse friction forces. These forces are measured by means of a torque meter. JP 62/273433 measures friction between hairs by passing a fluid in turbulent flow over a bundle of hair and measuring friction by detecting pressure loss in the fluid. 
Also suitable for use herein is a device which utilizes a method for measuring the friction generated by a bundle of hair fibres, comprising providing a friction member, drawing the friction member through the bundle of hair, whereby a frictional noise signal is generated, and capturing the frictional noise signal by a noise sensor. Further, the captured noise signal may be converted to a form which can be displayed. The converted signal may then be displayed using display means. In such a method, the friction generated by a bundle of the recipient's hair is measured using both a friction member and a noise sensor. The friction member may be drawn through the hair such that it contacts and passes over the surfaces of the individual hairs. This creates friction between the friction member and the hairs. The frictional noise generated depends upon the level of friction between the friction member and the hair surfaces. The friction member may be drawn through the hair of the recipient one or more times, and a statistical analysis may be performed to determine an average or mean value. Such a friction member may generally be formed from rigid material, and may be in the form of a comb means having a plurality of tines. The comb means is usually drawn through the bundle of hair in the manner usual for a comb. The frictional noise signal generated is captured by means of a frictional noise sensor, for example a microphone, which is, for instance, a standard electronic microphone or a noise cancelling microphone. Once the frictional noise has been captured it can be displayed and analysed in any suitable manner. It may be visually displayed to the recipient, who then selects a grade of damage based on the display, or it may be simply inputted directly into method 100, while also being displayed to the recipient.
The frictional noise sensed by the sensor may be converted to a signal which is then transferred to the visual display unit and displayed, or it may, for instance, be displayed in the form of a trace of sound amplitude versus time. These conversions may be achieved using known means.
The degree of damage assessed by this method may be reported as a value indexed against predetermined categories for which standard degrees of hair damage have been measured and tabulated, or as a value of damage relative to a standard, or as an amount of damage as measured on a scale of decibel levels which correspond to particular degrees of damage, or may be reported as an absolute value as measured, or the like.
Suitable hair damage measuring methods for use herein also include, but are not limited to, methods known in the art which chemically assess the amount of broken versus unbroken disulfide bonds of cysteine in the recipient's hair.
Color Prediction Models
In developing the color prediction models, consumers were recruited to participate in a hair colorant program at a salon. On arrival at the salon, a colorimeter was used to measure the initial color of the consumer's hair. Data generated were color readings expressed in the L, a, b color space. Multiple readings were taken at different parts of the head (e.g., front, front left, back left, back, back right and front right). An overall average color reading was calculated from these individual readings. After these initial readings were taken, the consumer's hair was colored using a shade selected by the consumer. After coloring and drying, the color of the hair was measured again using the colorimeter as outlined above. Thus data were generated before and after coloring for a variety of shades on a number of consumers. An example of the data generated is provided in Fig. 13 for a consumer using the Light Mahogany Brown shade. There are slight differences between predicted hair color and that observed after coloring due to measurement error, modeling error and the natural variability in the data. Model accuracy as indicated by root mean square error (standard deviation of prediction) is in the region of 2.5 points for L, and 1 point for both a and b. This is in the region of natural variability within a head. Research has shown that in describing the color of their own hair empirically, consumers may be as much as 10 or 12 points out in terms of L. With this in mind, the prediction accuracy of the present invention is considered more than acceptable. Color prediction models were then generated for each shade to predict achievable hair color after coloring versus initial color of the hair.
Separate models were generated to predict L, a and b for each shade (Auburn, Light Mahogany Brown, Light Brown, Golden Chestnut, Dark Brown, Mid Copper Brown, Light Pearl Ash Blonde, Mist Beige Blonde, Rich Burgundy Red, Lightest Auburn Brown, Lightest Brown, Lightest Golden Brown, Soft Medium Brown, True Red, Warm Dark Brown and Soft Copper Blonde). Models for a and b contained linear effects for L, a and b. Models were generated using partial least squares regression. While the preferred embodiment contains the modeling strategy and technique discussed above, persons skilled in the art would readily appreciate that other modeling strategies and techniques may be used without departing from the scope of the present invention. An example of the color prediction models generated for L, a and b for the Light Mahogany Brown shade is given below:
Predicted L = 7.307 + (0.5948 x L before coloring)
Predicted a = 0.0104 + (0.0868 x L before coloring) + (0.580 x a before coloring) + (0.2085 x b before coloring)
Predicted b = -3.3911 + (0.2831 x L before coloring) - (0.0190 x a before coloring) + (0.3187 x b before coloring)
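The Light Mahogany Brown model can be evaluated directly. The Python sketch below simply transcribes the published regression coefficients; the function name is ours, not part of the specification.

```python
def predict_light_mahogany_brown(L, a, b):
    """Predicted end color (L, a, b) for the Light Mahogany Brown shade,
    using the regression coefficients given above."""
    predicted_L = 7.307 + 0.5948 * L
    predicted_a = 0.0104 + 0.0868 * L + 0.580 * a + 0.2085 * b
    predicted_b = -3.3911 + 0.2831 * L - 0.0190 * a + 0.3187 * b
    return predicted_L, predicted_a, predicted_b

# Average starting color as measured by the colorimeter.
pL, pa, pb = predict_light_mahogany_brown(14.16, -5.06, 4.54)
```

With the starting color (14.16, -5.06, 4.54), this yields approximately (15.73, -0.75, 2.16), matching the worked example later in this description.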
Referring again to Fig. 1, block 180 displays the achievable end colors from which the recipient will be prompted to select in block 190. It is important to display the achievable end hair colors in the truest fashion, meaning the color shown on the display screen should be substantially similar, in the recipient's perception, to the color predicted by the color prediction model. There may, and likely will, be differences between the characteristics of the lighting under which a starting hair value is obtained, such as that obtained in block 160, and the characteristics of the lighting under which the achievable end colors are displayed in block 180. Further, the use of any CRT screen to display the achievable end colors will inherently cause some discrepancy between the display of the colors according to their L, a, and b values and the recipient's perceived view of them. To correct for these potential differences, block 180 may include one or more steps to adjust the color of the displayed image, and hence its perception by the recipient (see Fig. 1).
Color adjustment
Referring to Fig. 1, block 180 of hair color analyzing method 100 may further contain one or more steps to correct for any differences between the characteristics of the lighting under which a starting hair value is obtained, such as that obtained in block 160, and the characteristics of the lighting under which the achievable end colors are displayed (see block 180), and for the fact that use of any display screen inherently causes a difference in the recipient's perception. Applicants have found that as a result of both of these factors, i.e. lighting and use of a display screen, the recipient may find that the image displayed to them is duller or otherwise not consonant with how they would perceive their hair color, despite the fact that the L, a, b values of the image being rendered are identical to their current hair color or achievable end hair color.
According to one method for determining the correcting factors for each shade that is suitable for use herein, one starts with any given shade. Then, a plurality of persons who have their hair colored to that shade may have their color measured (see block 160). While the image is displayed, one may adjust the L, a, b values of the displayed image until a referee discerns that the color of the displayed image matches, in perception, the color of the person's hair. The changes in L, a, b, respectively, may then be averaged across all persons tested. These averaged values may then represent the correcting factors.
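The averaging of the perceptual deltas can be sketched as follows; a minimal Python illustration in which the measured and referee-matched values are invented for demonstration, not data from actual tests.

```python
from statistics import mean

def correction_factors(measured, perceived_matches):
    """Per-channel correcting factors for one shade.

    `measured` holds the colorimeter (L, a, b) readings of each test
    person's dyed hair; `perceived_matches` holds the on-screen (L, a, b)
    values a referee adjusted until the displayed color matched the hair.
    The factor for each channel is the average displayed-minus-measured
    difference across all persons tested.
    """
    deltas = [
        [disp[i] - meas[i] for meas, disp in zip(measured, perceived_matches)]
        for i in range(3)
    ]
    return tuple(mean(d) for d in deltas)

# Illustrative data for two test persons (invented values).
measured = [(15.7, -0.8, 2.2), (16.1, -0.5, 2.0)]
displayed = [(20.9, 4.0, 4.1), (20.9, 4.5, 4.1)]
factors = correction_factors(measured, displayed)
```

With these invented inputs the factors come out near (+5, +5, +2), the order of magnitude used for Light Mahogany Brown in the example later in this description.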
Color Rendering Method
Referring to Fig. 14, color rendering method 300 contains a series of steps that are intended to convert a working image into an image that may be rendered on a display screen. Many display screens display their colors using an RGB color format. Furthermore, each type of display screen may have a unique profile. To address these issues, the technique in Fig. 14 was developed. Method 300 begins with block 310, which reads a working image 315 (see Fig. 15) into the memory of a computer or any other similar electronic device. Block 320 then masks working image 315 to create masked image 325 (see Fig. 15). Masked image 325 separates the area containing hair from the area that does not contain hair. This masking step is helpful in that it identifies the only area that requires changing (i.e. the area containing hair), thus reducing computer processing time. The masking may be performed automatically or manually using a variety of masking techniques that are known throughout the industry. It may also be desired to mask other areas, such as the eyes and skin, so that these areas may be changed in response to eye and skin information supplied by the recipient in block 150. While the preferred embodiment uses the same working image 315 and the same masking image 325 for each recipient, persons skilled in the art would readily appreciate that other techniques may be used, including but not limited to, creating a working image from a picture of the recipient and then masking said image, without departing from the scope of the present invention. Block 330 converts the masked portion of working image 315 from RGB (Red-Green-Blue) color format to XYZ tristimulus values first and then converts them further into L, a, b colors. Block 340 computes the average L, a, and b color values of the hair portion of working image 315, said averaged values are defined below as "AverageImageColor_L", "AverageImageColor_a" and "AverageImageColor_b", respectively.
Block 350 then computes new L, a, b values for each point (x,y) within the hair portion of working image 315, said new values are defined below as "NewImageColor_L", "NewImageColor_a" and "NewImageColor_b", respectively, and are calculated as:

NewImageColor_L(x,y) = CurrentImageColor_L(x,y) + (DesiredColor_L - AverageImageColor_L)
NewImageColor_a(x,y) = CurrentImageColor_a(x,y) + (DesiredColor_a - AverageImageColor_a)
NewImageColor_b(x,y) = CurrentImageColor_b(x,y) + (DesiredColor_b - AverageImageColor_b)

wherein AverageImageColor_L, AverageImageColor_a and AverageImageColor_b are the computed averages just discussed;
NewImageColor_L(x,y), NewImageColor_a(x,y) and NewImageColor_b(x,y) are the new computed values for the point (x,y) being edited;
CurrentImageColor_L(x,y), CurrentImageColor_a(x,y) and CurrentImageColor_b(x,y) are the original color values for the point (x,y) being edited; and
DesiredColor_L, DesiredColor_a and DesiredColor_b are the color values for the desired achievable color that was selected in block 190.

Once each point (x,y) within the hair portion of the working image is converted as described above, the resulting image will have an average L, a, and b of DesiredColor_L, DesiredColor_a and DesiredColor_b, respectively.
This can be easily demonstrated by summing over all (x,y) points within the hair portion and dividing by the total number of points within the hair portion, as shown below for L:

SUM[NewImageColor_L(x,y)] = SUM[CurrentImageColor_L(x,y)] + SUM[(DesiredColor_L - AverageImageColor_L)]

But the SUM over all the (x,y) points is equal to the average multiplied by the total number of points. If the total number of points within the hair portion is N, then

AverageNewImageColor_L*N = AverageCurrentImageColor_L*N + SUM(DesiredColor_L) - SUM(AverageImageColor_L)
AverageNewImageColor_L*N = AverageCurrentImageColor_L*N + DesiredColor_L*N - AverageImageColor_L*N

Now, since AverageCurrentImageColor_L is by definition equal to AverageImageColor_L, these two terms cancel out, leaving

AverageNewImageColor_L*N = DesiredColor_L*N

That is,

AverageNewImageColor_L = DesiredColor_L
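The block 350 computation can be sketched compactly in Python. The function shifts every hair pixel by (DesiredColor - AverageImageColor), and the result has exactly the desired average, as demonstrated above; the three sample pixels are invented for illustration.

```python
def recolor_hair(pixels, desired):
    """Shift each hair pixel's (L, a, b) by (desired - average).

    `pixels` is a list of (L, a, b) tuples for the points inside the hair
    mask; the returned list of pixels has exactly the desired average color.
    """
    n = len(pixels)
    avg = [sum(p[i] for p in pixels) / n for i in range(3)]
    shift = [d - m for d, m in zip(desired, avg)]
    return [tuple(v + s for v, s in zip(p, shift)) for p in pixels]

# Illustrative hair pixels and the desired color from the prediction step.
hair = [(59.3, 30.4, 18.4), (26.7, 10.1, 15.2), (20.0, 5.0, 9.0)]
recolored = recolor_hair(hair, (20.73, 4.25, 4.16))
```

Because the same per-channel constant is added to every pixel, local contrast (highlights and shadows in the hair) is preserved while the mean color moves to the target.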
Block 360 converts the NewImageColor L, a, b values back to RGB so that they may be displayed on the display screen which uses RGB format. They may also be converted into sRGB values. This conversion is accomplished by first converting the NewImageColor L, a, b values to XYZ tristimulus values and then converting them further into RGB colors. Block 370 then applies the particular monitor profile for the particular display screen being used so that the NewImageColor L, a, b values are displayed as they were intended to look. This procedure for applying a particular monitor profile is described in ICC Profile API Suite Version 2.0 - Professional Color Management Tools, Eastman-Kodak Company, 1997, and as such is herein incorporated by reference. Block 380 outputs rendered image 335 (see Fig. 15). The output includes, but is not limited to, display on a display screen, printing, saving on disk, or transmitting the image to another.
Example of color rendering method 300 and color adjustment
This example illustrates how to render a person's achievable color for the Light Mahogany Brown shade using color rendering method 300. A color adjustment is also illustrated. For the Light Mahogany Brown shade, the color adjustment factors were determined to be +5 for L, +5 for a, and +2 for b. Accordingly, the predicted colors must be modified as follows:

Desired L = Predicted L + 5
Desired a = Predicted a + 5
Desired b = Predicted b + 2
Assume six CIE Lab color readings of the person's hair have been made and that the average CIE Lab color is (14.16, -5.06, 4.54). According to the prediction model for Light Mahogany Brown color, the achievable output color will be:
Predicted L = 7.307 + (0.5948 x 14.16) = 15.73
Predicted a = 0.0104 + (0.0868 x 14.16) + (0.580 x -5.06) + (0.2085 x 4.54) = -0.75
Predicted b = -3.3911 + (0.2831 x 14.16) - (0.0190 x -5.06) + (0.3187 x 4.54) = 2.16 Next, the predicted values are modified according to the color adjustment values for Light Mahogany Brown, as shown:
Desired L = 15.73 + 5 = 20.73
Desired a = -0.75 + 5 = 4.25
Desired b = 2.16 + 2 = 4.16
So, the function of color rendering method 300 is to read an image and recreate a new image that will have an average L, a, b hair color value of (20.73, 4.25, 4.16).
Next, according to blocks 310 and 320, the working image is read, and then masked, wherein these images are typical to the ones shown in Fig. 15.
Next, according to block 330, the image will then be converted from Red-Green-Blue to
CIE Lab color. Assume the image consists of 640 x 480 (= 307200) points, with each point having three values - one each for Red, Green, and Blue. At this step, for every point in the image, the Red, Green, Blue values will first be converted to three XYZ tristimulus values according to the following equations:

r = Red/255.0
g = Green/255.0
b = Blue/255.0

if (r <= 0.03928), then r = r/12.92, else r = ((r+0.055)/1.055)^2.4
if (g <= 0.03928), then g = g/12.92, else g = ((g+0.055)/1.055)^2.4
if (b <= 0.03928), then b = b/12.92, else b = ((b+0.055)/1.055)^2.4

X = (0.4124*r + 0.3576*g + 0.1805*b)
Y = (0.2126*r + 0.7152*g + 0.0722*b)
Z = (0.0193*r + 0.1192*g + 0.9505*b)

Then the three XYZ tristimulus values are converted to the CIE Lab values according to the equations given below.

Xnorm = X/0.95
Ynorm = Y/1.0
Znorm = Z/1.09

if (Ynorm > 0.008856), then fY = Ynorm^(1/3) and L = 116.0*fY - 16.0, else L = 903.3*Ynorm
if (Xnorm > 0.008856), then fX = Xnorm^(1/3), else fX = 7.787*Xnorm + 16/116
if (Znorm > 0.008856), then fZ = Znorm^(1/3), else fZ = 7.787*Znorm + 16/116

a = 500 * (fX - fY)
b = 200 * (fY - fZ)

For example, for an image point with Red, Green, Blue values of (198, 130, 111), these equations will yield CIE Lab values of (60.9, 24.1, 21.1).
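The block 330 conversion can be written as a small Python function. This is a direct transcription of the equations above (with the D65-like normalizers 0.95, 1.0, 1.09); the low-lightness branch is included for completeness.

```python
def rgb_to_lab(red, green, blue):
    """Convert RGB values (0-255) to CIE Lab via XYZ tristimulus values,
    following the equations in the description above."""
    def lin(c):
        # Undo the display gamma to get a linear component.
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = lin(red), lin(green), lin(blue)
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    Xn, Yn, Zn = X / 0.95, Y / 1.0, Z / 1.09

    def f(t):
        return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0

    fX, fY, fZ = f(Xn), f(Yn), f(Zn)
    L = 116.0 * fY - 16.0 if Yn > 0.008856 else 903.3 * Yn
    a = 500.0 * (fX - fY)
    b_out = 200.0 * (fY - fZ)
    return L, a, b_out
```

For the image point (198, 130, 111) this returns approximately (60.9, 24.1, 21.1), consistent with the example above.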
Next, according to block 340, average values for L, a, and b will be computed. At this stage, the image will have all (i.e. 640 x 480 = 307200) of its points converted from Red, Green, Blue values to CIE L, a, b values. For all the points within the area containing hair, average L, a, b values are computed by routine math (i.e. sum / # of points). For purposes of this example, it is assumed that the average L, a, b values are (26.7, 10.1, 15.2).
Next, according to block 350, new L, a, and b values will be computed. Each point within the area of the image containing hair is changed according to the equations discussed above in block 350. Assuming average L, a, b values of (26.7, 10.1, 15.2) and current point color values of (59.3, 30.4, 18.4), to obtain a new image with average L, a, b color values of (20.73, 4.25, 4.16) as predicted for Light Mahogany Brown, the new L, a, b color values for this specific point will be changed to:
New L = 59.3 + (20.73 - 26.7) = 53.33
New a = 30.4 + (4.25 - 10.1) = 24.55
New b = 18.4 + (4.16 - 15.2) = 7.36
Once all the points in the image within the area containing hair have been converted according to the above technique, the new image (i.e., the rendered image) will have an average L, a, b color value of (20.73, 4.25, 4.16), which is the predicted (i.e., desired) color for Light Mahogany Brown.
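The averaging step of block 340 and the per-point shift of block 350 can be sketched as follows (illustrative only; the function names average_lab and shift_point are hypothetical):

```python
def average_lab(points):
    """Block 340: average the L, a, b values over the hair region
    by routine math (sum / number of points)."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def shift_point(point, current_avg, target_avg):
    """Block 350: offset one point's L, a, b values by the difference
    between the target average color and the current average color."""
    return tuple(p + (t - c) for p, c, t in zip(point, current_avg, target_avg))
```

With a current average of (26.7, 10.1, 15.2) and the Light Mahogany Brown target of (20.73, 4.25, 4.16), the point (59.3, 30.4, 18.4) shifts to (53.33, 24.55, 7.36), matching the worked example above; applying the same shift to every hair point moves the region's average to the target while preserving each point's offset from the mean.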
Next, according to block 360, the L, a, and b values will be converted to RGB values. At this stage, the new image, which is in L, a, b format (e.g., CIE Lab), must be converted back into RGB (Red, Green, and Blue) format. This is accomplished by first converting the L, a, b values into three XYZ tristimulus values, and then converting the three XYZ tristimulus values into RGB values. This procedure may be accomplished by the following technique:
Lplus16By25 = (L+16.0)/25.0
CubeRootOf1By100 = 0.21544

fX = a/500.0 + CubeRootOf1By100*Lplus16By25
fY = Lplus16By25
fZ = CubeRootOf1By100*Lplus16By25 - (b/200.0)

Xnorm = 0.95 * fX^3.0
Ynorm = 1.0 * fY^3.0 / 100.0
Znorm = 1.09 * fZ^3.0

r = ( 3.2410*Xnorm - 1.5374*Ynorm - 0.4986*Znorm)
g = (-0.9692*Xnorm + 1.8760*Ynorm + 0.0416*Znorm)
b = ( 0.0556*Xnorm - 0.2040*Ynorm + 1.0570*Znorm)

if (r <= 0.00304), then r = 12.92 * r, else r = 1.055 * r^(1/2.4) - 0.055
if (g <= 0.00304), then g = 12.92 * g, else g = 1.055 * g^(1/2.4) - 0.055
if (b <= 0.00304), then b = 12.92 * b, else b = 1.055 * b^(1/2.4) - 0.055
Red = 255.0*r
Green = 255.0*g
Blue = 255.0*b
For example, for an image point with L, a, b values of (60.89, 24.08, 21.12), these equations will yield Red, Green, Blue values of (198, 130, 111).
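The inverse conversion can likewise be sketched in Python (again illustrative, not the patent's own code; note that 0.21544 is the cube root of 1/100, so the product 0.21544*(L+16)/25 approximates the standard (L+16)/116 cube-root term):

```python
def lab_to_rgb(L, a, b):
    """Convert CIE Lab values back to 8-bit Red, Green, Blue
    using the inverse equations above."""
    lp = (L + 16.0) / 25.0
    k = 0.21544  # cube root of 1/100, the patent's constant

    fX = a / 500.0 + k * lp
    fY = lp
    fZ = k * lp - b / 200.0

    # Back to XYZ tristimulus values
    Xn = 0.95 * fX ** 3.0
    Yn = 1.0 * fY ** 3.0 / 100.0
    Zn = 1.09 * fZ ** 3.0

    # XYZ -> linear RGB
    r = 3.2410 * Xn - 1.5374 * Yn - 0.4986 * Zn
    g = -0.9692 * Xn + 1.8760 * Yn + 0.0416 * Zn
    bl = 0.0556 * Xn - 0.2040 * Yn + 1.0570 * Zn

    def delinearize(c):
        # Re-apply the display gamma
        return 12.92 * c if c <= 0.00304 else 1.055 * c ** (1.0 / 2.4) - 0.055

    return tuple(round(255.0 * delinearize(c)) for c in (r, g, bl))
```

Round-tripping the example: lab_to_rgb(60.89, 24.08, 21.12) returns (198, 130, 111).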
- END OF EXAMPLE -
The present invention may be implemented, for example, by using the Minolta CR300 Colorimeter and the NCR touchscreen Kiosk 7401, comprising a computer system and touchscreen display.
The present invention may be implemented, for example, by operating a computer system to execute a sequence of machine-readable instructions. These instructions may reside in various types of signal bearing media, such as a hard disk drive and main memory. In this respect, another aspect of the present invention concerns a program product, comprising signal bearing media embodying a program of machine-readable instructions, executable by a digital data processor, such as a central processing unit (CPU), to perform method steps. The machine-readable instructions may be written in any one of a number of programming languages known in the art (e.g., Visual Basic, C, C++, etc.).
It should be understood that the present invention may be implemented on any type of computer system. One acceptable type of computer system comprises a main or central processing unit (CPU) which is connected to a main memory (e.g., random access memory (RAM)), a display adapter, an auxiliary storage adapter, and a network adapter. These system components may be interconnected through the use of a system bus.
The CPU may be, for example, a Pentium Processor made by Intel Corporation. However, it should be understood that the present invention is not limited to any one make of processor, and the invention may be practiced using some other type of processor, such as a co-processor or an auxiliary processor. An auxiliary storage adapter may be used to connect mass storage devices (such as a hard disk drive) to the computer system. The program need not reside on the computer system in its entirety at any one time. Indeed, this latter scenario would likely be the case if the computer system were a network computer, which would therefore depend upon an on-demand shipping mechanism for access to mechanisms, or portions of mechanisms, residing on a server. A display adapter may be used to directly connect a display device to the computer system. A network adapter may be used to connect the computer system to other computer systems.
It is important to note that while the present invention has been described in the context of a fully functional computer system, those skilled in the art will appreciate that the mechanisms of the present invention are capable of being distributed as a program product in a variety of forms, and that the present invention applies equally regardless of the particular type of signal bearing media used to actually carry out the distribution. Examples of signal bearing media include recordable type media, such as floppy disks, hard disk drives, and CD-ROMs, and transmission type media, such as digital and analog communications links and wireless links.
While particular embodiments of the present invention have been illustrated and described, it will be obvious to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention.

Claims

WHAT IS CLAIMED IS:
1. A method for identifying an achievable end hair color based upon at least one starting hair value of a recipient, said method comprising the steps of: a) inputting at least one starting hair value of a recipient; and b) identifying at least one achievable end hair color based upon said starting hair value.
2. The method of Claim 1, wherein said starting hair value is selected from the group consisting of values representing the reason the recipient is coloring their hair, how damaged the recipient's hair is, what the eye and skin color of the recipient are, what color the recipient's hair is, and combinations thereof.
3. The method of Claim 2, wherein the color of the recipient's hair is indicated using values selected from the group consisting of Hunter L, a, b color values, and Commission Internationale de l'Eclairage L, a, b color values.
4. The method of Claim 2, wherein the color of recipient's hair is measured by a colorimeter or a spectrophotometer.
5. The method of Claim 2, wherein the recipient's hair damage is indicated using values selected from the group consisting of values representing: a) the recipient's hair average hair length, b) the frequency with which the recipient colors their hair, c) the amount of damage determined by a method selected from the group consisting of: i. intuitive self-assessment by the recipient, ii. visual or physical assessment by the recipient or another, iii. assessment using a device which measures hair damage, iv. chemical assessment of the recipient's hair, and v. combinations thereof; and d) combinations thereof.
6. The method of Claim 1, wherein said identifying of an achievable end hair color is performed by a color prediction algorithm.
7. The method of Claim 6, wherein said color prediction algorithm calculates said achievable end hair color using a color model generated by using a partial least squares regression technique of said at least one starting hair value.
8. The method of Claim 1, further comprising the step of outputting at least one working image depicting the achievable end hair color, wherein the recipient may select said working image as a desired end hair color.
9. The method of Claim 8, wherein the L, a, b color values of said achievable end hair color are adjusted to correct for differences between the actual achievable end hair color and the recipient's perception of the displayed achievable end hair color.
10. The method of Claim 8, further comprising the step of recommending a hair coloring agent to achieve said desired end hair color.
11. The method of Claim 8, wherein said working image is outputted to a display screen selected from the group consisting of a computer screen, a cathode ray tube device, and a liquid crystal display device.
12. The method of Claim 1, wherein said steps are contained within a computer system or within a computer-readable medium or within a computer data signal embedded in a carrier wave.
13. A method for identifying a hair coloring agent based upon at least one starting hair value of a recipient, said method comprising the steps of: a) inputting at least one starting hair value of a recipient; b) inputting a family color selection; and c) outputting at least one working image depicting an achievable end hair color based upon said starting hair value and said family color selection.
14. The method of Claim 13, further comprising the step of inputting of a selection of said achievable end hair color and recommending a hair coloring agent to achieve said selection.
15. A method for outputting an image for a hair color analyzing system, said method comprising the steps of: a) inputting a working image of a person; b) inputting an achievable end hair color; c) converting RGB color values of said working image to L, a, and b color values; d) computing averaged L, a, and b color values using said L, a, and b color values; e) computing new L, a, and b color values based on said L, a, and b color values, said averaged L, a, and b color values and said achievable end hair color; and f) converting said new L, a, and b color values to converted RGB values.
16. The method of Claim 15, wherein the image outputted is displayed on a screen selected from the group consisting of a computer screen, a cathode ray tube device, and a liquid crystal display device.
17. A method for providing a hair coloring product to a consumer comprising the steps of: a) identifying achievable end hair colors for the consumer, b) depicting the achievable end colors to the consumer, c) allowing the consumer to select a desired end hair color, and d) recommending to the consumer a hair coloring agent to achieve said desired end hair color, wherein these steps are conducted according to the method of Claim 10.
18. The method of Claim 17, wherein said method is conducted in a place selected from the group consisting of a salon, a retail store, and a consumer's home.
PCT/US2001/015165 2000-05-12 2001-05-10 Method for analyzing hair and predicting achievable hair dyeing ending colors WO2001087245A2 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
JP2001583714A JP2003533283A (en) 2000-05-12 2001-05-10 Hair analysis and prediction of achievable hair dye finish colors
CA002407457A CA2407457C (en) 2000-05-12 2001-05-10 Method for analyzing hair and predicting achievable hair dyeing ending colors
AU2001261404A AU2001261404B2 (en) 2000-05-12 2001-05-10 Method for analyzing hair and predicting achievable hair dyeing ending colors
DE60132192T DE60132192T2 (en) 2000-05-12 2001-05-10 METHOD OF HAIR ANALYSIS AND PREDICTION OF EFFICIENT HAIR COLORING
MXPA02011157A MXPA02011157A (en) 2000-05-12 2001-05-10 Method for analyzing hair and predicting achievable hair dyeing ending colors.
EP01935297A EP1281054B1 (en) 2000-05-12 2001-05-10 Method for analyzing hair and predicting achievable hair dyeing ending colors
AU6140401A AU6140401A (en) 2000-05-12 2001-05-10 Method for analyzing hair and predicting achievable hair dyeing ending colors

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US57029200A 2000-05-12 2000-05-12
US09/570,292 2000-05-12
US09/844,585 US6707929B2 (en) 2000-05-12 2001-04-27 Method for analyzing hair and predicting achievable hair dyeing ending colors
US09/844,585 2001-04-27

Publications (2)

Publication Number Publication Date
WO2001087245A2 true WO2001087245A2 (en) 2001-11-22
WO2001087245A3 WO2001087245A3 (en) 2002-03-07

Family

ID=27075299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/015165 WO2001087245A2 (en) 2000-05-12 2001-05-10 Method for analyzing hair and predicting achievable hair dyeing ending colors

Country Status (9)

Country Link
EP (1) EP1281054B1 (en)
JP (1) JP2003533283A (en)
CN (1) CN1297807C (en)
AT (1) ATE382854T1 (en)
AU (2) AU6140401A (en)
CA (1) CA2407457C (en)
ES (1) ES2296753T3 (en)
MX (1) MXPA02011157A (en)
WO (1) WO2001087245A2 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002083282A1 (en) * 2001-04-10 2002-10-24 The Procter & Gamble Company Customized hair colorant formulating and dispensing apparatus and method
JP2004102495A (en) * 2002-09-06 2004-04-02 Dainippon Printing Co Ltd Color reproduction method for color simulation
JP2004159830A (en) * 2002-11-12 2004-06-10 Lion Corp Hair evaluation system
US6761697B2 (en) 2001-10-01 2004-07-13 L'oreal Sa Methods and systems for predicting and/or tracking changes in external body conditions
WO2005024657A2 (en) * 2003-09-05 2005-03-17 Henkel Kommanditgesellschaft Auf Aktien Computer system and method for processing a plurality of digital images
EP1531720A2 (en) * 2002-06-28 2005-05-25 Seethrough Ltd. Hair color measurement and treatment
JP2005530589A (en) * 2002-06-26 2005-10-13 ザ プロクター アンド ギャンブル カンパニー Method and apparatus for predicting coloring results
US7324668B2 (en) 2001-10-01 2008-01-29 L'oreal S.A. Feature extraction in beauty analysis
US7437344B2 (en) 2001-10-01 2008-10-14 L'oreal S.A. Use of artificial intelligence in providing beauty advice
US7634103B2 (en) 2001-10-01 2009-12-15 L'oreal S.A. Analysis using a three-dimensional facial image
KR101199746B1 (en) * 2002-06-28 2012-11-08 콜로라이트 리미티드 Hair color measurement and treatment
EP2689771A1 (en) * 2012-07-27 2014-01-29 Kao Germany GmbH Hair coloring process
DE102015225458A1 (en) 2015-12-16 2017-06-22 Henkel Ag & Co. Kgaa Method and data processing device for computer-aided determination of properties of hair colors
WO2017103056A1 (en) * 2015-12-16 2017-06-22 Henkel Ag & Co. Kgaa Method and data processing device for the computer-assisted determination of a hair dyeing agent for dyeing hair in a desired hair color and device for producing an individually determined hair dyeing agent
JP2017519192A (en) * 2014-04-27 2017-07-13 コロライト エルティーディー.ColoRight Ltd. Apparatus and method for adjusted hair color
US11282190B2 (en) 2018-05-17 2022-03-22 The Procter And Gamble Company Systems and methods for hair coverage analysis
US11457859B2 (en) 2016-07-05 2022-10-04 Henkel Ag & Co. Kgaa Method for determining a user-specific hair treatment
US11633148B2 (en) 2018-05-17 2023-04-25 The Procter & Gamble Company Systems and methods for hair analysis

Families Citing this family (15)

Publication number Priority date Publication date Assignee Title
US7523018B2 (en) * 2005-02-28 2009-04-21 Seethrough Ltd. Hair coloring system
US7508508B2 (en) * 2006-09-19 2009-03-24 Seethrough Ltd. Device and method for inspecting a hair sample
KR102033159B1 (en) * 2011-03-21 2019-10-16 콜로라이트 리미티드 Method for predicting a result of a treatment of keratinous fibers
US10694832B2 (en) * 2014-12-30 2020-06-30 L'oréal Hair color system using a smart device
WO2017077498A1 (en) 2015-11-04 2017-05-11 Coloright Ltd. Method and system for customized hair-coloring
EP3481277B1 (en) 2016-07-05 2020-09-02 Henkel AG & Co. KGaA Method for determining a user-specific hair treatment
EP3481280B1 (en) * 2016-07-05 2020-09-02 Henkel AG & Co. KGaA Method for establishing a user-specific hair care treatment
EP3296961A1 (en) * 2016-09-16 2018-03-21 Noxell Corporation Calculating a composition for a preparation for treating hair fibers
DE102016222190A1 (en) * 2016-11-11 2018-05-17 Henkel Ag & Co. Kgaa Method and device for determining a color homogeneity of hair
FR3061499B1 (en) * 2016-12-30 2019-05-24 L'oreal SYSTEM FOR REALIZING TREATMENT OF THE HAIR TO CONNECT AT AT LEAST ONE ARRIVAL OF WATER
DE102017211596A1 (en) 2017-07-07 2019-01-10 Henkel Ag & Co. Kgaa System for managing hair condition information and method for optimizing a cosmetic counseling system
WO2019036330A1 (en) * 2017-08-15 2019-02-21 True Hair Llc Method and system for generating a 3d digital model used in hairpiece manufacturing
EP3671754A1 (en) * 2018-12-18 2020-06-24 Henkel AG & Co. KGaA Method for determining a coloration product recommendation
CN111340818A (en) * 2018-12-19 2020-06-26 北京京东尚科信息技术有限公司 Hair dyeing effect picture generation method and device and computer readable storage medium
CN110646354B (en) * 2019-08-28 2021-10-26 东华大学 Color testing device and method for cotton fibers

Citations (3)

Publication number Priority date Publication date Assignee Title
EP0290327A1 (en) * 1987-04-30 1988-11-09 Schwarzkopf France Apparatus for automatically determining a hair dyeing formula in order to obtain a desired colouring
WO1996041139A1 (en) * 1983-07-18 1996-12-19 Chromatics Color Sciences International, Inc. Method and apparatus for hair color characterization and treatment
WO2000015073A1 (en) * 1998-09-11 2000-03-23 Wella Ag Device for determining method data of a method for cosmetically treating hair on a person's head

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US4258478A (en) * 1978-05-09 1981-03-31 Magic Mirror, Inc. Fashion previewing system
JPS6113904A (en) * 1984-06-29 1986-01-22 ライオン株式会社 Dyed hair simulation apparatus
US5060171A (en) * 1989-07-27 1991-10-22 Clearpoint Research Corporation A system and method for superimposing images
CZ393197A3 (en) * 1995-06-07 1998-06-17 Chromatics Color Sciences International, Inc. Process and apparatus for for detecting and measuring color influencing conditions
JP3792286B2 (en) * 1996-01-29 2006-07-05 株式会社資生堂 Hair color chart creation method and hair color chart creation apparatus
DE19816631A1 (en) * 1998-04-15 1999-10-21 Ruediger Gregori Optical presentation of hair styles on the head

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
WO1996041139A1 (en) * 1983-07-18 1996-12-19 Chromatics Color Sciences International, Inc. Method and apparatus for hair color characterization and treatment
EP0290327A1 (en) * 1987-04-30 1988-11-09 Schwarzkopf France Apparatus for automatically determining a hair dyeing formula in order to obtain a desired colouring
WO2000015073A1 (en) * 1998-09-11 2000-03-23 Wella Ag Device for determining method data of a method for cosmetically treating hair on a person's head

Non-Patent Citations (1)

Title
See also references of EP1281054A2 *

Cited By (29)

Publication number Priority date Publication date Assignee Title
WO2002083282A1 (en) * 2001-04-10 2002-10-24 The Procter & Gamble Company Customized hair colorant formulating and dispensing apparatus and method
US7634103B2 (en) 2001-10-01 2009-12-15 L'oreal S.A. Analysis using a three-dimensional facial image
US6761697B2 (en) 2001-10-01 2004-07-13 L'oreal Sa Methods and systems for predicting and/or tracking changes in external body conditions
US7437344B2 (en) 2001-10-01 2008-10-14 L'oreal S.A. Use of artificial intelligence in providing beauty advice
US7324668B2 (en) 2001-10-01 2008-01-29 L'oreal S.A. Feature extraction in beauty analysis
JP2005530589A (en) * 2002-06-26 2005-10-13 ザ プロクター アンド ギャンブル カンパニー Method and apparatus for predicting coloring results
US6980888B2 (en) 2002-06-26 2005-12-27 The Proctor & Gamble Company Method and apparatus for predicting the result of a coloration
EP1531720A4 (en) * 2002-06-28 2009-08-05 Seethrough Ltd Hair color measurement and treatment
EP1531720A2 (en) * 2002-06-28 2005-05-25 Seethrough Ltd. Hair color measurement and treatment
US7110117B2 (en) 2002-06-28 2006-09-19 Seethrough Ltd. Hair color measurement and treatment
US7304739B2 (en) 2002-06-28 2007-12-04 Seethrough Ltd. Hair color measurement and treatment
AU2003238653B2 (en) * 2002-06-28 2008-10-30 Coloright Ltd Hair color measurement and treatment
KR101199746B1 (en) * 2002-06-28 2012-11-08 콜로라이트 리미티드 Hair color measurement and treatment
JP2004102495A (en) * 2002-09-06 2004-04-02 Dainippon Printing Co Ltd Color reproduction method for color simulation
JP2004159830A (en) * 2002-11-12 2004-06-10 Lion Corp Hair evaluation system
WO2005024657A3 (en) * 2003-09-05 2005-04-28 Henkel Kgaa Computer system and method for processing a plurality of digital images
WO2005024657A2 (en) * 2003-09-05 2005-03-17 Henkel Kommanditgesellschaft Auf Aktien Computer system and method for processing a plurality of digital images
WO2014016193A1 (en) * 2012-07-27 2014-01-30 Kao Germany Gmbh Hair coloring process
EP2689771A1 (en) * 2012-07-27 2014-01-29 Kao Germany GmbH Hair coloring process
US9326585B2 (en) 2012-07-27 2016-05-03 Kao Germany Gmbh Hair coloring process
EP2877151B1 (en) * 2012-07-27 2019-09-04 Kao Germany GmbH Hair coloring process
JP2017519192A (en) * 2014-04-27 2017-07-13 コロライト エルティーディー.ColoRight Ltd. Apparatus and method for adjusted hair color
DE102015225458A1 (en) 2015-12-16 2017-06-22 Henkel Ag & Co. Kgaa Method and data processing device for computer-aided determination of properties of hair colors
WO2017103056A1 (en) * 2015-12-16 2017-06-22 Henkel Ag & Co. Kgaa Method and data processing device for the computer-assisted determination of a hair dyeing agent for dyeing hair in a desired hair color and device for producing an individually determined hair dyeing agent
US11140967B2 (en) 2015-12-16 2021-10-12 Henkel Ag & Co. Kgaa Method and data processing device for ascertaining properties of hair colors in a computer-assisted manner
US11367510B2 (en) 2015-12-16 2022-06-21 Henkel Ag & Co. Kgaa Method and data processing device for the computer-assisted determination of a hair dyeing agent for dyeing hair in a desired hair color and device for producing an individually determined hair dyeing agent
US11457859B2 (en) 2016-07-05 2022-10-04 Henkel Ag & Co. Kgaa Method for determining a user-specific hair treatment
US11282190B2 (en) 2018-05-17 2022-03-22 The Procter And Gamble Company Systems and methods for hair coverage analysis
US11633148B2 (en) 2018-05-17 2023-04-25 The Procter & Gamble Company Systems and methods for hair analysis

Also Published As

Publication number Publication date
CA2407457A1 (en) 2001-11-22
EP1281054A2 (en) 2003-02-05
JP2003533283A (en) 2003-11-11
AU6140401A (en) 2001-11-26
CN1440503A (en) 2003-09-03
WO2001087245A3 (en) 2002-03-07
MXPA02011157A (en) 2003-03-10
CA2407457C (en) 2006-07-04
ATE382854T1 (en) 2008-01-15
CN1297807C (en) 2007-01-31
EP1281054B1 (en) 2008-01-02
AU2001261404B2 (en) 2006-01-19
ES2296753T3 (en) 2008-05-01

Similar Documents

Publication Publication Date Title
US6707929B2 (en) Method for analyzing hair and predicting achievable hair dyeing ending colors
CA2407457C (en) Method for analyzing hair and predicting achievable hair dyeing ending colors
AU2001261404A1 (en) Method for analyzing hair and predicting achievable hair dyeing ending colors
EP1428168B1 (en) Advanced cosmetic color analysis system
US8860748B2 (en) Computerized, personal-color analysis system
US6888963B2 (en) Image processing apparatus and image processing method
AU6315996A (en) Method and apparatus for hair color characterization and treatment
CN108431561B (en) Method and data processing device for computer-aided determination of hair color properties
ES2832612T3 (en) Procedure and data processing device for computer-aided determination of a hair coloring agent for coloring hair in a desired hair color, and device for the preparation of an individually determined hair coloring agent
US20180374140A1 (en) Computerized, personal beauty product analysis system
CN112740266A (en) Method and system for determining characteristics of keratin surfaces and method and system for treating said keratin surfaces
US20150145884A1 (en) Makeup color simulation apparatus and method
JP3321794B2 (en) How to predict makeup skin color
KR20140077322A (en) Method for recommending cosmetic products and apparatus using the method
KR20190136483A (en) Method for providing beauty contents
US11896114B2 (en) System for managing hair condition information and method for optimizing a cosmetic consultation system
JP5290498B2 (en) Makeup method
KR102090370B1 (en) Coaching system of personal color based on color psychology
JP2001025460A (en) Method for displaying skin color information
JP2020536244A (en) The process for deciding on a hair color crossmaker proposal
US20230401627A1 (en) Method for determining a hair coloration treatment
Sun Spectral imaging of human portraits and image quality

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

WWE Wipo information: entry into national phase

Ref document number: 2407457

Country of ref document: CA

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWE Wipo information: entry into national phase

Ref document number: 2001261404

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: PA/a/2002/011157

Country of ref document: MX

Ref document number: 018121748

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2001935297

Country of ref document: EP

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWP Wipo information: published in national office

Ref document number: 2001935297

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 2001261404

Country of ref document: AU

WWG Wipo information: grant in national office

Ref document number: 2001935297

Country of ref document: EP