This is the US National Stage of PCT/EP 02/09685, filed on Aug. 30, 2002, in Europe. The invention described in the following description is also described and claimed in DE 101 42 526.0, which was filed on Aug. 30, 2001, in Germany. The aforesaid German Patent Application provides the basis for a claim of priority of invention for the invention described herein below under 35 U.S.C. 119 (a) to (d).
1. The Field of the Invention
The invention relates to a method for a hair color consultation or hair color simulation in which, by means of a video camera, a computer, and at least one screen, a person selects a desired hair color that is displayed on the at least one screen.
2. The Description of the Related Art
One such method for a hair color consultation is known for instance from European Patent Disclosure EP 1147722 A1, which uses a static digital portrait that has been taken, in which a desired hair color is selectively input and shown on a screen (display).
Before deciding on a new hair color, a person, for instance a hair stylist's customer, has to overcome an inhibition threshold, because she does not know in advance how she will look afterward. To support the hairdresser in his consultation with her, the hair color consultation computer system of EP 1147722 A1 provides a two-dimensional picture (digital photograph) of the person with a new, simulated hair color and shows it on a screen; the result looks static and unnatural. The results of conventional hair color consultation computer systems have a static effect because they show only a still picture. A photograph cannot replace a moving, living image of the kind the person sees, for instance, when she looks into the mirror after her treatment.
The object of the invention is to create a generic method for a hair color consultation that does not have these disadvantages.
According to the invention the foregoing object is attained by a method of hair color consultation or hair color simulation, which comprises the steps of:
a) recording current individual images of the person by means of a video camera and transmitting the recorded individual images into a computer in a continuous image sequence in real time;
b) automatically editing and processing the individual images by means of a device, in which the computer continuously marks hair regions of the individual images by an automatic analysis of close and similar pixel color values and/or of contiguous texture and/or of contiguous morphological properties and changes the hair color of the marked hair regions according to predetermined specifications to produce altered individual images; and then
c) displaying the altered individual images with the changed hair color on the at least one screen and/or touch screen in real time.
By means of this method for a hair color consultation, it is possible for the result of hair coloring to be visualized before the actual color change is made. In contrast to existing hair color consultation systems, it is based on the one hand on a dynamic video image and on the other on a change in color of the natural hair of the person. The video image is prepared in real time by the method and reproduced on a screen in such a way that the impression is of using a mirror. Other advantageous features of the invention are disclosed in the dependent claims.
The invention selectively simulates looking at the image either as others would see it or as the person's own mirror image. To that end, the current picture of the person is taken with a video camera, automatically prepared by a computer, and displayed on a screen. By means of this dynamic display in real time, the screen becomes a mirror. The task of the computer is to modify the initial picture in such a way that the result of hair coloring is simulated. To that end, the computer system must recognize the hair of the person automatically and then change its color in accordance with the person's specifications or those of the stylist. In this way, a dynamic simulation with a natural effect is created. Since only the natural hair in motion is changed in color, the effect of “computerized wigs” placed on the head with appropriate coloring is eliminated. The hair falls into place naturally and is not rigid, and the person can move in front of the virtual mirror, turn her head, look at herself from the side, change her facial expression, and, with the new hair color, decide on her own whether the selected desired hair color 36 meets her expectations. If not, other desired hair colors 36 can immediately be simulated instead.
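By way of illustration only, the following is a minimal sketch of such a real-time virtual-mirror loop, assuming OpenCV and NumPy; the functions build_hair_mask and recolor_hair are hypothetical stand-ins for the marking and recoloring steps described further below, and the mirror flip can be toggled to show the image as others would see it. None of the names or values are taken from the patent itself.

```python
import cv2
import numpy as np

def build_hair_mask(frame_bgr):
    """Hypothetical stand-in for the automatic marking of the hair region 32.
    Here simply: treat dark pixels as hair; a real system uses the analyses below."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return (gray < 60).astype(np.uint8)

def recolor_hair(frame_bgr, mask, desired_bgr=(40, 40, 180)):
    """Hypothetical stand-in: blend the desired hair color 36 into the masked pixels."""
    overlay = np.empty_like(frame_bgr)
    overlay[:] = desired_bgr
    blended = cv2.addWeighted(frame_bgr, 0.4, overlay, 0.6, 0)
    out = frame_bgr.copy()
    out[mask.astype(bool)] = blended[mask.astype(bool)]
    return out

mirror_view = True                       # True: own mirror image; False: as others see it
cap = cv2.VideoCapture(0)                # video camera (assumed device index 0)
while True:
    ok, frame = cap.read()               # current individual image (step a)
    if not ok:
        break
    mask = build_hair_mask(frame)        # mark the hair region (step b)
    frame = recolor_hair(frame, mask)    # change to the desired hair color (step b)
    if mirror_view:
        frame = cv2.flip(frame, 1)       # mirror inversion about the vertical axis
    cv2.imshow("virtual mirror", frame)  # real-time display (step c)
    key = cv2.waitKey(1) & 0xFF
    if key == ord("m"):
        mirror_view = not mirror_view    # toggle between mirror image and external view
    elif key == 27:                      # Esc ends the simulation
        break
cap.release()
cv2.destroyAllWindows()
```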
The invention will be described in further detail below in terms of an exemplary embodiment shown in the drawings.
Before an actual simulation with a new desired hair color 36, certain starting parameters are imparted to the computer 13 (initialization). To this end, the device 31 continuously and automatically processes starting parameters of a picture region 34 that are input manually; as at least one starting parameter, a hair region 32 is marked and covered with a selectable desired hair color 36. As further combinations of starting parameters, provision can be made to identify which picture regions 34 represent a hair region 32 and which represent skin 35, what these picture regions are characterized by (color tones, texture, morphology, and so forth), what the predominant (natural) initial hair color 33 is, and what color (desired hair color 36) is wanted. This can be done by selecting or masking picture regions 34 or selection areas of one or more control menus on the screen 14, 15 or touch screen 22, 23. The input devices needed for this purpose can be provided for instance in the form of a mouse 17 or the like (trackball, touchpad, etc.), as a keyboard 16, and/or as at least one touch screen 22, 23. The initialization phase may be necessary, among other reasons, in order to tell the computer 13 which picture regions 34 of the hair region 32 represent the hairstyle 37, which is to be separated from the remaining image and changed in color. One way this can be done is for the user, in at least one of the picture regions 34, to mark one or more points or one or more regions of the hair 18, the skin 19, and a background 38. Moreover, it is possible to mark regions that are explicitly not to be changed in color, such as eyebrows, sideburns, or a beard. To make it easier for the computer 13 to recognize the hair regions 32 and to keep the error susceptibility of the hair color consultation system 1 low, it is advantageous to take the picture of the person 11 in front of a homogeneous, single-color background 38, with lighting that is as well defined as possible.
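Purely as an illustration of such an initialization, and not as the claimed implementation, the following sketch (assuming OpenCV and NumPy) lets the user click seed points for hair, skin, and background on a captured frame; the keys, window name, and statistics are illustrative assumptions. The mean and spread of the hair seeds are stored as the reference values used by the later automatic marking.

```python
import cv2
import numpy as np

# Illustrative seed collection on a single captured frame; the keys h, s, and b
# switch which label the next mouse clicks are stored under.
seeds = {"hair": [], "skin": [], "background": []}
current = ["hair"]

def on_click(event, x, y, flags, param):
    if event == cv2.EVENT_LBUTTONDOWN:
        seeds[current[0]].append(param[y, x].astype(np.float64))  # BGR value at the click

cap = cv2.VideoCapture(0)                  # assumed camera device index 0
ok, frame = cap.read()                     # one current individual image for initialization
cap.release()

cv2.namedWindow("initialization")
cv2.setMouseCallback("initialization", on_click, frame)
while True:
    cv2.imshow("initialization", frame)
    key = cv2.waitKey(30) & 0xFF
    if key == ord("h"):
        current[0] = "hair"                # next clicks mark the hair
    elif key == ord("s"):
        current[0] = "skin"                # next clicks mark the skin
    elif key == ord("b"):
        current[0] = "background"          # next clicks mark the background
    elif key == 27:                        # Esc ends the initialization
        break
cv2.destroyAllWindows()

# Starting parameters for the automatic marking: reference color of the hair seeds.
if seeds["hair"]:
    samples = np.array(seeds["hair"])
    hair_mean = samples.mean(axis=0)       # predominant initial hair color 33 (BGR)
    hair_tol = samples.std(axis=0) + 10.0  # tolerance for "close and similar" pixel values
    print("hair reference (BGR):", hair_mean, "tolerance:", hair_tol)
```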
The hair region 32 is marked continuously by means of an automatic analysis of close and similar pixel color values and covered accordingly with the desired hair color 36, making it simple to track the hair region 32, which changes continuously as the person 11 moves. To that end, for analyzing close and similar pixel color values, a representative portion of the hair region 32 is selected manually, and the automatic analysis is then started; as a result, the device 31 for continuous picture preparation only has to prepare the hair region 32 with the desired hair color 36. This automatic marking is based on the idea that the initial hair color 33 forms, in the hair region 32, a cohesive area of close and similar pixel color values that can be clearly differentiated from the remaining picture region 34.
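A minimal sketch of this per-frame marking by close and similar pixel color values follows, assuming the reference mean and tolerance obtained from a manually selected representative portion as in the initialization sketch above; the cutoff value thresh is an illustrative assumption, not a value taken from the patent.

```python
import numpy as np

def mark_hair_region(frame_bgr, hair_mean, hair_tol, thresh=2.5):
    """Continuously mark the hair region 32 by close and similar pixel color values.

    hair_mean and hair_tol are the per-channel mean and spread of the manually
    selected representative portion of the hair region; thresh (illustrative)
    limits how far a pixel may deviate and still count as the initial hair color 33.
    """
    deviation = np.abs(frame_bgr.astype(np.float64) - hair_mean) / hair_tol
    return (deviation.max(axis=2) < thresh).astype(np.uint8)   # 1 = hair region, 0 = rest

# Called once per incoming video frame, so the marked region follows the person's movement:
# mask = mark_hair_region(frame, hair_mean, hair_tol)
```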
Alternatively, the hair region 32 can also be marked continuously by means of an automatic analysis of a cohesive texture and/or of cohesive morphological properties, in order to differentiate it clearly from the remaining picture region 34.
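As a sketch of the morphological variant (assuming OpenCV; the kernel size is an illustrative assumption), a raw color-based mask can be cleaned up by morphological opening and closing and reduced to its largest contiguous component, so that the marked area remains one cohesive hair region:

```python
import cv2
import numpy as np

def refine_mask_morphologically(raw_mask):
    """Refine a raw hair mask using contiguous morphological properties.

    Opening removes isolated false-positive pixels, closing fills small holes,
    and keeping the largest connected component enforces one cohesive region.
    """
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(raw_mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Keep only the largest contiguous component (label 0 is the background).
    count, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    if count <= 1:
        return mask
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return (labels == largest).astype(np.uint8)
```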
Depending on where it is to be used (salon/retail), besides the person 11 an advisor, for instance a hairdresser, can also be present. In that case, it is appropriate for the hairdresser to have his own, second screen 15 or touch screen 23 available, for operating the program and for initialization. On this second screen 15 or touch screen 23, in addition to the camera image, important control information can be shown, such as the color palette 47.
In a preferred embodiment, the continuous video images of the person 11 are displayed by the computer 13 constantly in mirror inversion about a vertical axis on the screen 14 in the form of a mirror image 39.
In a special embodiment, it is provided that the hair color 33 is changed continuously and in a cycle in accordance with a color-wheel spectrum 48.
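A minimal sketch of such a cyclic run through a color wheel is given below, assuming OpenCV's hue range of 0–179; the hue step, saturation, and value are illustrative assumptions. The desired color is advanced by a small hue step per frame and fed into the recoloring step of the loop sketched earlier.

```python
import cv2
import numpy as np

def next_wheel_color(frame_index, hue_step=2, saturation=200, value=200):
    """Return the desired hair color (BGR) for this frame on a cyclic color-wheel run.

    OpenCV stores hue in the range 0..179, so the color wraps after a full cycle."""
    hue = (frame_index * hue_step) % 180
    hsv_pixel = np.uint8([[[hue, saturation, value]]])
    b, g, r = cv2.cvtColor(hsv_pixel, cv2.COLOR_HSV2BGR)[0, 0]
    return int(b), int(g), int(r)

# Inside the per-frame loop of the simulation:
# desired_bgr = next_wheel_color(frame_index)
# frame = recolor_hair(frame, mask, desired_bgr)
```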
Since only a partial aspect of the real image is modified, there are no discrepancies between real and virtual picture elements, such as angular and positional inaccuracies, different lighting of the various picture elements, scaling errors, and so forth. Nor is there any problem of coverage, since there is no need to calculate which parts of a virtual picture element are covered by the real picture. Since the real picture is changed only in its color, the hair of the person automatically behaves in a physically correct way, which creates a natural effect, including during movements by the person, upon reproduction on the screen or touch screen.
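One way to change only the color while leaving the brightness of every pixel untouched, so that the hair's texture, shading, and movement carry over unchanged, is sketched below under the assumption of OpenCV's HSV representation; this is an illustration, not the patent's actual color model.

```python
import cv2
import numpy as np

def recolor_hair_preserving_lightness(frame_bgr, mask, desired_bgr):
    """Replace hue and saturation of the marked hair region with the desired
    hair color 36, keeping each pixel's original brightness so that highlights,
    shadows, and the hair's natural structure remain visible."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    target = cv2.cvtColor(np.uint8([[desired_bgr]]), cv2.COLOR_BGR2HSV)[0, 0]

    sel = mask.astype(bool)
    hsv[..., 0][sel] = target[0]          # desired hue
    hsv[..., 1][sel] = target[1]          # desired saturation
    # hsv[..., 2] (brightness) is left untouched, so the real hair shows through.
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
```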
Number | Date | Country | Kind |
---|---|---|---|
101 42 526 | Aug 2001 | DE | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/EP02/09685 | 8/30/2002 | WO | 00 | 7/22/2004 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO03/020072 | 3/13/2003 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
3809785 | Calabrese et al. | May 1974 | A |
4258478 | Scott et al. | Mar 1981 | A |
4297724 | Masuda et al. | Oct 1981 | A |
4731743 | Blancato | Mar 1988 | A |
4872056 | Hicks et al. | Oct 1989 | A |
5060171 | Steir et al. | Oct 1991 | A |
5506946 | Bar et al. | Apr 1996 | A |
5508718 | Haikin | Apr 1996 | A |
5562109 | Tobiason | Oct 1996 | A |
5689286 | Wugofski | Nov 1997 | A |
5937081 | O'Brill et al. | Aug 1999 | A |
6034698 | Yasuda | Mar 2000 | A |
6081611 | Linford et al. | Jun 2000 | A |
6141431 | Munetsugu et al. | Oct 2000 | A |
6147692 | Shaw et al. | Nov 2000 | A |
6226010 | Long | May 2001 | B1 |
6293284 | Rigg | Sep 2001 | B1 |
6437866 | Flynn | Aug 2002 | B1 |
6633289 | Lotens et al. | Oct 2003 | B1 |
6692436 | Bluth et al. | Feb 2004 | B1 |
6719565 | Saita et al. | Apr 2004 | B1 |
6792401 | Nigro et al. | Sep 2004 | B1 |
6813591 | Ohwi | Nov 2004 | B1 |
6824387 | Sakai et al. | Nov 2004 | B2 |
6937755 | Orpaz et al. | Aug 2005 | B2 |
7151851 | Ladjevardi | Dec 2006 | B2 |
7634103 | Rubinstenn et al. | Dec 2009 | B2 |
20020010556 | Marapane et al. | Jan 2002 | A1 |
20020054714 | Hawkins et al. | May 2002 | A1 |
20030063102 | Rubinstenn et al. | Apr 2003 | A1 |
20030065255 | Giacchetti et al. | Apr 2003 | A1 |
20040176977 | Broderick et al. | Sep 2004 | A1 |
20050244057 | Ikeda et al. | Nov 2005 | A1 |
Number | Date | Country |
---|---|---|
3213806 | Oct 1983 | DE |
4224922 | Feb 1994 | DE |
196 35 753 | Apr 1998 | DE |
0725364 | Aug 1996 | EP |
1147722 | Oct 2001 | EP
WO-9703517 | Jan 1997 | WO |
WO-9923609 | May 1999 | WO |
WO-0132051 | May 2001 | WO
Number | Date | Country
---|---|---
20040239689 A1 | Dec 2004 | US |