The present invention relates to apparatus and methods for assessing and analyzing the skin, and more particularly to the capture and analysis of digital photographs of a subject. The analysis may be quantitative and comparative, relative to another photograph of the same subject taken at another time or relative to a photograph or photographs taken of another person or persons.
Various imaging systems have been proposed that photographically capture images of a person's face for analysis of the health and aesthetic appearance of the skin. Different images, captured at different times or under different lighting conditions, can be used and/or compared to one another to gain insight into the condition of the skin and its response to treatment. This has typically been done by human operators inspecting the photographs to identify certain visual indicators of skin condition and to ascertain changes between photographs. It would be beneficial for imaging systems to be less reliant on human perception and manual input. For example, in analyzing the skin, e.g., of a person's face, it is beneficial to examine specific regions of the face for specific associated attributes, since the different regions of the face are specialized in form and function and interact with the environment differently. Some skin imaging systems utilize a trained human operator to identify facial regions, either by touching fiducial points on a displayed facial image (with a stylus on a touch-sensitive input/output screen), by pointing to them (with a cursor and clicking or otherwise indicating), or by drawing polygons on an image (with a stylus or cursor/mouse) to delineate the facial regions of interest. While effective, such manual operations are labor intensive and require trained operators. It would therefore be beneficial for imaging systems to identify facial regions on images automatically, to increase the speed and consistency of identification of the facial regions and to decrease the reliance upon operator input.
While the science of digital skin imaging analysis has identified various skin responses that are useful indicators of various skin condition parameters, it would still be desirable to identify and use additional specific responses of the skin that are indicative of skin condition. For example, it would be beneficial to identify skin imaging techniques that indicate photodamage and that reliably quantify and measure such an indicative skin response. One of the indicators of skin condition is color response. Skin color is also important to the selection and design of cosmetics. There are limitations inherent in expressing color in terms of RGB pixel intensity. It would therefore be beneficial to improve current methods of assessing skin color/brightness, e.g., for assessing skin condition and/or for the selection and design of skin products, such as cosmetics.
Since imaging is a technical activity using complex apparatus, it remains an objective to improve the user-friendliness of digital imaging and analysis apparatus, to promote the ease and effectiveness of a digital imaging session, and to enhance the interaction between therapists/clinicians and patients/clients undergoing digital imaging sessions.
The disadvantages and limitations of known apparatus and methods for skin imaging and analysis are overcome by the present invention, which includes an imaging station for capturing images of a subject in a given ambient lighting environment. The imaging station has a housing with an aperture where the subject is presented for capturing images of the subject, a digital image capture device, a light for illuminating the subject during image capture, and a computer for controlling the image capture device and the light. The housing contains the digital image capture device and the light, and at least partially limits ambient light, if present in the ambient lighting environment, when a digital image of the subject is captured. In one embodiment of the present invention, the imaging station has the capability of taking a plurality of digital images of a subject under a plurality of illuminating conditions and of storing, displaying and analyzing those digital images.
Since the images recorded are in digital form, i.e., in numerical pixel intensity values, the images lend themselves to quantitative analysis, such as by a program or programs residing on computer 17. For example, instead of merely noting, as discerned by a skilled human observer, that the cheek of a subject appears more or less red in an image taken at time T2 than in an image taken at time T1, the intensity values of the red pixels in the specific area of the cheek at times T1 and T2 may be compared quantitatively. For example, the two values may be mathematically processed to determine the actual change in red intensity for each pixel. Digital image quantification can be used to discern average values for the skin in specified regions, e.g., by summing the values of the pixels in the specific region of interest and then dividing by the number of pixels. In this manner, a whole area of skin, e.g., of the face, may be characterized quantitatively. Various other quantified analyses may be conducted, e.g., the imaged area of skin may be tested for standard deviation in pixel intensity.
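By way of a non-limiting illustration, such region statistics reduce to a few lines of array arithmetic. The following is a minimal sketch, assuming 8-bit RGB images held as NumPy arrays; the function name, the placeholder cheek mask and the synthetic images are illustrative assumptions and do not come from the present disclosure.

```python
import numpy as np

def region_stats(image, region_mask, channel=0):
    """Mean and standard deviation of one color channel (default: red,
    channel 0) over the pixels selected by a boolean region mask."""
    values = image[..., channel][region_mask].astype(float)
    return values.mean(), values.std()

# Hypothetical comparison of the same cheek region at times T1 and T2,
# using synthetic stand-in images.
rng = np.random.default_rng(0)
img_t1 = rng.integers(0, 256, (480, 640, 3), dtype=np.uint8)
img_t2 = rng.integers(0, 256, (480, 640, 3), dtype=np.uint8)
cheek = np.zeros((480, 640), dtype=bool)
cheek[200:300, 150:250] = True  # placeholder region of interest

mean_t1, _ = region_stats(img_t1, cheek)
mean_t2, std_t2 = region_stats(img_t2, cheek)
print(f"change in mean red intensity: {mean_t2 - mean_t1:+.2f}")
```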
The present invention recognizes that digital imaging apparatus 10 may include communications apparatus, such as modems, e.g., cable and DSL modems, to communicate digital image data and related communications from a digital imaging computer 17 to a remote computer 20, which may be in the home or workplace of an interested person, such as a fellow clinician, a cosmetologist, a patient or a cosmetics customer. The recipient can then review the image data and other related communications on the monitor 22 of their computer 20. These communications may be conducted over the Internet using wired or wireless connections. In addition to sending image data to others, the imaging apparatus 10 may also receive image and related data from others, such as fellow clinicians, patients, etc., via the same means of communication. For example, a cosmetics customer may capture their own image with a local CCD camera 24 and relay that image to the imaging system computer 17 via the Internet, e.g., as an e-mail attachment. The image may be examined and a responsive communication prepared by a cosmetologist and sent to the customer. The responsive communication may contain images, e.g., past images taken of the customer that are relevant to the customer, images relevant to cosmetics, etc. These can be sent as attachments to an e-mail message. The present invention also recognizes that a cell phone 26 may be used to capture and communicate image information and related communications between parties at the imaging station 10 and parties remote therefrom.
In documenting and assessing skin from cosmetic and health perspectives, the better controlled the light properties (e.g., reflection, scattering, polarization, excitation wavelength and excitation bands), the better the resulting data and analysis of the skin.
The imaging station 30 may be provided with all the image capture and processing apparatus described in Publication Nos. 2004/0146290, 2005/0195316 and 2006/0092315, which is contained within the housing 32. The imaging station preferably has the capability to rotate at least 90 degrees, e.g., either on a supporting surface or a rotatable platform, and/or to tilt up and down and/or to raise or lower in height to accommodate users of various heights seated on various seating arrangements. The tunnel 36, the chin rest 34 and the self-visualizing panel 38 cooperate to position the subject person in an optimal position (relative to the light 44 and camera 42) for imaging the face of the subject person. The tunnel 36 also reduces the intrusion of environmental lighting during imaging sessions, limits the light projected into the environment when the light 44 is operated during an imaging session and provides the subject person with privacy. The operator display 40 may be used to visualize the subject person before, during and after imaging, i.e., the camera 42 may be operated in real-time/video mode to show the moving image of the subject person as they place themselves on the chin rest 34 and otherwise prepare to undergo imaging (close their eyes, etc.). This real-time feed may be used to display the subject on the operator display 40, as well as on the self-visualizing panel 38 (display). In this manner, the operator display 40 may be used to display the view that the camera 42 “sees”. As shown, e.g., in Publication No. 2005/0195316 A1, the camera 42 and the illuminating light or lights 44 would be positioned adjacent to the self-visualizing panel 38. In the case of a mirror-type self-visualizing panel, the mirror material may be partially or entirely made of “one-way” glass or a half-mirror. Of course, if the functionality of subject self-visualization is not needed or desired, e.g., in the context of an imaging station 30 which is not intended for subject control or which relies on verbal commands to the subject and does not need visual cues to the subject, the self-visualizing panel 38 may be omitted or replaced with a clear panel, such as clear glass or plastic. With respect to ensuring that the subject person has their eyes closed, the operator display 40 may be used to apprise the operator of this condition, so the operator may then know it is safe to trigger imaging, e.g., by depressing a virtual button on the operator display 40.
The computer 50 is preferably provided with a plurality of data ports, e.g., USB, Ethernet and/or IEEE 1394 (“FireWire”) ports, wireless capability or other interface standards, for sharing the data captured by the imaging station 30 with other computers, e.g., on a network like the Internet, and for receiving data from other computers for display on the imaging station 30, e.g., imaging data from population studies, skin care product information, etc. A printer (not shown) may also be used in conjunction with the imaging station 30 to print out images, advice and/or information concerning the skin condition and treatment of the subject person.
As described in a commonly owned patent application entitled, Method and Apparatus for Identifying Facial Regions, which was filed on Oct. 2, 2006 as Provisional Application No. 60/848,741 and which is incorporated in its entirety herein, quantitative analysis of skin condition can be facilitated and improved by identifying specific regions of the skin, e.g., of the face. This identification of regions of interest can be conducted automatically, using the pupils or flash glints as fiducial reference points for mapping the facial regions.
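While the referenced provisional's procedure is not reproduced here, a pupil/flash-glint search of the kind described can be sketched as the detection of small, bright, compact spots. The sketch below uses SciPy's connected-component labeling; the threshold and size limits are illustrative assumptions, not values from the application.

```python
import numpy as np
from scipy import ndimage

def find_flash_glints(gray, thresh=250, min_px=3, max_px=80):
    """Return (row, col) centroids of small, bright, compact spots,
    candidate corneal flash glints usable as fiducial points."""
    labels, n = ndimage.label(gray >= thresh)
    centroids = []
    for i in range(1, n + 1):
        spot = labels == i
        if min_px <= int(spot.sum()) <= max_px:
            centroids.append(ndimage.center_of_mass(spot))
    # A fuller implementation would then keep the pair of spots whose
    # size similarity, spacing and horizontal alignment are plausible
    # for two pupils, and lay out facial regions relative to them.
    return centroids
```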
As described in a commonly owned patent application entitled, Apparatus and Method for Analyzing Skin Using L*a*b* Colorspace, which was filed on Oct. 2, 2006 as Provisional Application No. 60/848,768 and which is incorporated in its entirety herein, identification of skin photo-response indicative of skin condition and quantitative analysis of skin condition, color and brightness can be facilitated and improved by converting digital image data from RGB format to L*a*b* format. As described in Application No. 60/848,768, the process for converting RGB image data to L*a*b* colorspace data is known to one of ordinary skill in the art, e.g., as described in Charles Poynton, A Technical Introduction to Digital Video (J. Wiley & Sons), Chapter 7, “Color Science”. Application No. 60/848,768 also discloses that facial regions may be identified by an operator marking a displayed image with a cursor tool or by programmatically analyzing the image data to identify pixel values consistent with unique fiducial reference points, such as the pupils, which would have unique color (black, or flash glint), shape (round), size, spacing and orientation. Once the pupils are identified, facial regions may be calculated relative thereto in accordance with empirically determined spatial/dimensional relationships to the pupils. RGB to L*a*b* conversion may be used to aid in selecting cosmetics for an individual from a palette of available cosmetic colors, or to select/design hues of cosmetics in defining a color palette for use by a population.
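Although the conversion procedure itself is found in the cited references rather than reproduced here, the standard path runs from gamma-encoded RGB through linear light and CIE XYZ to L*a*b*. The sketch below assumes 8-bit sRGB input and a D65 white point; both assumptions, and the function name, are illustrative.

```python
import numpy as np

def rgb_to_lab(rgb):
    """Convert 8-bit sRGB data of shape (..., 3) to CIE L*a*b* (D65)."""
    c = rgb.astype(float) / 255.0
    # sRGB gamma expansion to linear light
    lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    # linear RGB -> CIE XYZ (sRGB primaries, D65 white point)
    m = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = lin @ m.T
    xyz /= np.array([0.95047, 1.0, 1.08883])  # normalize to D65 white
    # CIE L*a*b* nonlinearity
    eps = (6 / 29) ** 3
    f = np.where(xyz > eps, np.cbrt(xyz), xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)
```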
As described in a commonly owned patent application entitled, Apparatus and Method for Measuring Photodamage to Skin, which was filed on Oct. 2, 2006 as Provisional Application No. 60/848,767 and which is incorporated in its entirety herein, the green response intensity of skin to illumination by blue light can be used to identify and quantify photodamage to the skin, e.g., due to the presence of elastotic material. Further, a population's digital skin imaging data concerning photodamage and/or other skin conditions can be characterized numerically and analyzed statistically. Quantitative analysis of digital images of the skin response, such as the degree of variation of the green signal response over a surface of the face as an indicator of photodamage, can be used to assess photodamage and to develop a numerical score of skin condition relative to a population.
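The statistic used to express the degree of variation is defined in the referenced provisional rather than here; as one plausible illustration only, the coefficient of variation of the green channel over a region, placed against a population distribution, might be computed as follows (both the choice of statistic and the function names are assumptions).

```python
import numpy as np

def green_variation_score(image, region_mask):
    """Coefficient of variation of the green channel within a region,
    one plausible expression of 'variation of green signal response'."""
    g = image[..., 1][region_mask].astype(float)
    return g.std() / g.mean()

def percentile_rank(score, population_scores):
    """Place an individual's score within a population distribution,
    yielding a number relative to that population (0-100)."""
    population_scores = np.asarray(population_scores, dtype=float)
    return float((population_scores < score).mean() * 100.0)
```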
As described in a commonly owned patent application entitled, Calibration Apparatus and Method for Fluorescent Imaging, which was filed on Oct. 2, 2006 as Provisional Application No. 60/848,707 and which is incorporated in its entirety herein, a calibration standard may be employed during an imaging session, e.g., during blue fluorescent photography, to identify variations in illumination intensity between images taken at different times. Having identified a circumstance where illumination intensity has varied, the operator of the imaging apparatus can be notified to correct the conditions leading to illumination variations. Alternatively, the imaging apparatus may compensate for the variations by adjusting the intensity of the illuminating light or normalizing the images by digital image processing.
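As a sketch of the normalization alternative mentioned above: if the calibration standard occupies a known patch of each image, a per-image gain can restore the patch to a reference level recorded in an earlier session. The patch mask, reference level and function name are assumptions for illustration.

```python
import numpy as np

def normalize_to_reference(image, patch_mask, reference_level):
    """Scale an 8-bit image so that the mean intensity over the
    calibration-standard patch matches a previously recorded level,
    compensating for session-to-session illumination drift."""
    observed = image[patch_mask].astype(float).mean()
    gain = reference_level / observed
    return np.clip(image.astype(float) * gain, 0, 255).astype(np.uint8)
```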
Each of the inventions disclosed in the foregoing applications incorporated by reference may be utilized and incorporated in the imaging station 30 of the present invention. In doing so, certain synergies are realized through their combination and interrelationship. For example, the capability of manually or automatically identifying facial regions has utility in ascertaining the color of certain facial regions, e.g., the cheek, or the “average color” of the combination of cheek, forehead and chin, as sketched below. This capability of identifying skin color in certain regions can be helpful in matching cosmetics to those regions or combinations of those regions. For example, in selecting or designing colors for cosmetic foundations, it would be beneficial to know the average color of the cheek, forehead and chin regions while excluding the lips, eyebrows and eyes from the contributing colors.
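One way to sketch such an average color, building on the rgb_to_lab sketch given earlier (region masks and names again illustrative):

```python
import numpy as np

def average_lab(image, masks):
    """Mean L*a*b* color over the union of facial-region masks,
    e.g., cheek, forehead and chin; lips, eyebrows and eyes are
    excluded simply by never adding them to the union."""
    union = np.zeros(image.shape[:2], dtype=bool)
    for m in masks:
        union |= m
    lab = rgb_to_lab(image)          # from the earlier conversion sketch
    return lab[union].mean(axis=0)   # -> (L*, a*, b*)
```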
The identification of facial regions may also be used in the imaging station 30 in conjunction with the teachings of the application directed to ascertaining photodamage, in that specific regions of the skin are more prone to photodamage due to skin composition and likelihood of exposure. The calibration apparatus and techniques taught in the application pertaining to calibration can be used in the imaging station 30, in that any set of digital images can be normalized to compensate for variations in illumination intensity, as distinguished from variations in response intensity attributable to skin variation.
The imaging station 30 provides an apparatus and method for obtaining digital image data concerning the appearance and light response characteristics of a person's skin. Since the present invention can be used by numerous persons, the data obtained from its use, as well as data obtained from other sources, such as digital images made by others using different apparatus and methods, may be used to analyze the individual relative to others. As noted, e.g., in the application entitled Apparatus and Method for Measuring Photodamage to Skin, which is incorporated by reference herein, digital images can be analyzed quantitatively to provide a discrete measurement of skin condition(s). An alternative, traditional method for assessing skin condition is through the judgment of a person based upon the appearance of the image. Over the years it has been shown that experienced clinicians can accurately and effectively discern changes in skin condition based upon their visual perception, e.g., in comparing two images taken at two different times. Accordingly, the present invention recognizes the value of human perceptive judgments of skin condition in forming an accurate assessment of skin condition. This human perceptive measure can be used in parallel with quantitative, automated methods to confirm one with the other and thereby increase the credibility of both.
The present invention includes a method for training, implementing and testing visual analysis of images of subject persons. More particularly, to train persons to perceptively and accurately judge skin condition based upon images, training sets of images may be assembled. A training set may be a set of images taken of a person over time, which illustrates their skin condition over time. The image set can be divided into various categories, e.g., one category may be images taken with blue light and sensitive to the green signal response of the skin, to indicate photodamage. The image sets can then be reviewed by one or more persons who are trained professionals with a proven proficiency in ascertaining skin condition based upon review of images of the skin of subject persons. The proficient individual or individuals will then order the image set into a range of conditions extending from the worst condition to the best condition. This ordering can be conducted in conjunction with quantified analysis. For example, six images may be considered and arranged from best to worst, illustrating the spectrum of skin condition exhibited by the subject person relative to photodamage. Having developed this reference set or training set of images, persons to be trained are taught the visual indicators to look for to discern photodamage. The trainees are then shown the training image set, e.g., two images at a time, randomly selected by a computer, and asked to characterize which of the two displayed images is better or worse, e.g., with respect to photodamage. In presenting the images, the position of the better and worse images (whether in the right or left, upper or lower positions on the display) is randomized, such that the display position does not correlate to the skin condition. After the trainees can consistently identify the better and worse images, i.e., match the conclusions of a professional with demonstrated expertise, the trainees can be adjudged to have assimilated the perception necessary to discern skin condition from digital images.
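A minimal sketch of such a randomized pairwise drill, assuming the training set is already ordered by the expert (index 0 = worst, last index = best); the `ask` callable stands in for whatever display-and-prompt mechanism is used and is purely illustrative.

```python
import random

def pairwise_training_round(ranked_images, ask, n_trials=20, seed=None):
    """Drill a trainee on random pairs from an expert-ordered set.
    `ask(img_a, img_b)` returns 'first' or 'second' for whichever
    image the trainee judges better; presentation order is randomized
    so that position never correlates with skin condition."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        i, j = rng.sample(range(len(ranked_images)), 2)
        order = rng.sample([i, j], 2)  # randomize display position
        answer = ask(ranked_images[order[0]], ranked_images[order[1]])
        truth = "first" if order[0] > order[1] else "second"
        correct += answer == truth
    return correct / n_trials

# Simulated perfect trainee on a toy set of six "images" labeled 0..5:
images = list(range(6))
oracle = lambda a, b: "first" if a > b else "second"
print(pairwise_training_round(images, oracle, seed=1))  # -> 1.0
```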
Assuming that a person is available with suitable training in judging skin condition, the same automated technique of presenting a plurality of images and receiving a judgment of better or worse can be applied to the process of evaluating an individual's skin condition over time. For example, if a person has had a series of imaging sessions, e.g., ten sessions over a period of two years, then a clinician can be automatically and randomly presented with a series of images taken from these imaging sessions and can categorize each as better or worse than another image displayed. This process can be repeated until an unbiased, repeating pattern of relative ranking is clearly established, demonstrating a reproducible clinical judgment. From this judgment, the subject person can be advised as to when their skin exhibited its best condition, its worst condition, the pattern of improvement or worsening, etc. This type of analysis can be applied to measure the success or failure of treatment and can be used in conjunction with quantitative methods to provide a balanced, informed analysis.
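The same mechanism can aggregate a clinician's repeated better/worse judgments over a patient's session history into a ranking; in the sketch below, win-counting is an assumed (not specified) aggregation rule.

```python
import random
from collections import Counter

def rank_sessions(images, judge, rounds=50, seed=None):
    """Show random pairs to `judge` (a callable returning 'first' or
    'second') and rank imaging sessions by wins; repeating until the
    order stabilizes approximates a reproducible clinical judgment."""
    rng = random.Random(seed)
    wins = Counter()
    for _ in range(rounds):
        a, b = rng.sample(range(len(images)), 2)
        winner = a if judge(images[a], images[b]) == "first" else b
        wins[winner] += 1
    return sorted(range(len(images)), key=lambda k: wins[k], reverse=True)
```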
The foregoing methods for training and/or clinically assessing skin condition can be conducted on the imaging station 30. For example, after an imaging session, the clinician can perform a clinical assessment witnessed by the subject person.
It should be understood that the embodiments described herein are merely exemplary, and that a person skilled in the art may make many variations and modifications without departing from the spirit and scope of the invention. All such variations and modifications are intended to be included within the scope of the invention.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 60/848,765 filed Oct. 2, 2006, the disclosure of which is incorporated herein by reference in its entirety.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
2005426 | Land | Jun 1935 | A |
3353884 | Chaffee et al. | Nov 1967 | A |
3904293 | Gee | Sep 1975 | A |
3971065 | Bayer | Jul 1976 | A |
4070096 | Jasgur | Jan 1978 | A |
4170987 | Anselmo et al. | Oct 1979 | A |
4398541 | Pugliese | Aug 1983 | A |
4592726 | Brilliant | Jun 1986 | A |
4842523 | Bourdier | Jun 1989 | A |
4905700 | Wokalek et al. | Mar 1990 | A |
4911544 | Walsh | Mar 1990 | A |
5005975 | Kawai et al. | Apr 1991 | A |
5016173 | Kenet et al. | May 1991 | A |
5198875 | Bazin et al. | Mar 1993 | A |
5241468 | Kenet | Aug 1993 | A |
5363854 | Martens et al. | Nov 1994 | A |
5446515 | Wolfe et al. | Aug 1995 | A |
5456260 | Kollias et al. | Oct 1995 | A |
5539540 | Spaulding et al. | Jul 1996 | A |
5556612 | Anderson et al. | Sep 1996 | A |
5640957 | Kaminski | Jun 1997 | A |
5742392 | Anderson et al. | Apr 1998 | A |
5760407 | Margosiak et al. | Jun 1998 | A |
5785960 | Rigg et al. | Jul 1998 | A |
5836872 | Kenet et al. | Nov 1998 | A |
5945112 | Flynn et al. | Aug 1999 | A |
5973779 | Ansari et al. | Oct 1999 | A |
5991433 | Osanai et al. | Nov 1999 | A |
6018586 | Kamei | Jan 2000 | A |
6021344 | Lui et al. | Feb 2000 | A |
6032071 | Binder | Feb 2000 | A |
6076904 | Shepherd et al. | Jun 2000 | A |
6081612 | Gutkowicz-Krusin et al. | Jun 2000 | A |
6134011 | Klein et al. | Oct 2000 | A |
6148092 | Qian | Nov 2000 | A |
6208749 | Gutkowicz-Krusin et al. | Mar 2001 | B1 |
6215893 | Leshem et al. | Apr 2001 | B1 |
6251070 | Khazaka | Jun 2001 | B1 |
6278999 | Knapp | Aug 2001 | B1 |
6293284 | Rigg | Sep 2001 | B1 |
6317624 | Kollias et al. | Nov 2001 | B1 |
6419638 | Hay et al. | Jul 2002 | B1 |
6436127 | Anderson et al. | Aug 2002 | B1 |
6437856 | Jacques | Aug 2002 | B1 |
6441854 | Fellegara et al. | Aug 2002 | B2 |
6507747 | Gowda et al. | Jan 2003 | B1 |
6510366 | Murray et al. | Jan 2003 | B1 |
6597392 | Jenkins et al. | Jul 2003 | B1 |
6600947 | Averback et al. | Jul 2003 | B2 |
6603552 | Cline et al. | Aug 2003 | B1 |
6619860 | Simon | Sep 2003 | B1 |
6624843 | Lennon | Sep 2003 | B2 |
6711426 | Benaron et al. | Mar 2004 | B2 |
6728560 | Kollias et al. | Apr 2004 | B2 |
7004599 | Mullani | Feb 2006 | B2 |
7015929 | Satomi et al. | Mar 2006 | B2 |
D564663 | Udagawa et al. | Mar 2008 | S |
20010013897 | Kowno et al. | Aug 2001 | A1 |
20020059030 | Otworth et al. | May 2002 | A1 |
20020065468 | Utzinger et al. | May 2002 | A1 |
20020071246 | Stewart | Jun 2002 | A1 |
20020093698 | Kagawa | Jul 2002 | A1 |
20020133080 | Apruzzese et al. | Sep 2002 | A1 |
20020161664 | Shaya et al. | Oct 2002 | A1 |
20020177778 | Averback et al. | Nov 2002 | A1 |
20020181752 | Wallo et al. | Dec 2002 | A1 |
20020182235 | Slavtcheff et al. | Dec 2002 | A1 |
20030007071 | Goto | Jan 2003 | A1 |
20030045916 | Anderson et al. | Mar 2003 | A1 |
20030067545 | Giorn et al. | Apr 2003 | A1 |
20030086703 | Kollias et al. | May 2003 | A1 |
20030086712 | Merola et al. | May 2003 | A1 |
20030108542 | Pruche et al. | Jun 2003 | A1 |
20030138249 | Merola et al. | Jul 2003 | A1 |
20030191379 | Benaron et al. | Oct 2003 | A1 |
20040006553 | de Vries et al. | Jan 2004 | A1 |
20040020509 | Waisman | Feb 2004 | A1 |
20040077951 | Lin et al. | Apr 2004 | A1 |
20040125996 | Eddowes et al. | Jul 2004 | A1 |
20040136701 | Nakanishi et al. | Jul 2004 | A1 |
20040146290 | Kollias et al. | Jul 2004 | A1 |
20040174525 | Mullani | Sep 2004 | A1 |
20040186363 | Smit et al. | Sep 2004 | A1 |
20040263880 | Ito et al. | Dec 2004 | A1 |
20050116039 | Zhu et al. | Jun 2005 | A1 |
20050129288 | Chen et al. | Jun 2005 | A1 |
20050131304 | Stamatas et al. | Jun 2005 | A1 |
20050146863 | Mullani | Jul 2005 | A1 |
20050195316 | Kollias et al. | Sep 2005 | A1 |
20060092315 | Payonk et al. | May 2006 | A1 |
20070002479 | Menke et al. | Jan 2007 | A1 |
20070004972 | Cole et al. | Jan 2007 | A1 |
20070005393 | Cole et al. | Jan 2007 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
0682236 | Nov 1995 | EP |
0737932 | Oct 1996 | EP |
1089208 | Apr 2001 | EP |
1118845 | Jul 2001 | EP |
1194898 | Mar 2003 | EP |
1297782 | Apr 2003 | EP |
1376444 | Jan 2004 | EP |
1433418 | Jun 2004 | EP |
1434156 | Jun 2004 | EP |
1541084 | Jun 2005 | EP |
2821152 | Aug 2002 | FR |
208 314 | Apr 2003 | FR |
2106241 | Apr 1983 | GB |
2293648 | Apr 1996 | GB |
7075629 | Mar 1995 | JP |
7323014 | Dec 1995 | JP |
WO 9424936 | Nov 1994 | WO |
WO 9616698 | Jun 1996 | WO |
WO 9705473 | Feb 1997 | WO |
WO 9747235 | Dec 1997 | WO |
WO 9824360 | Jun 1998 | WO |
WO 9837811 | Sep 1998 | WO |
WO 9917668 | Apr 1999 | WO |
WO 0076398 | Dec 2000 | WO |
WO 0104839 | Jan 2001 | WO |
WO 0122741 | Mar 2001 | WO |
WO 0122869 | Apr 2001 | WO |
WO 0135827 | May 2001 | WO |
WO 0145557 | Jun 2001 | WO |
WO 0172216 | Oct 2001 | WO |
WO 0182786 | Nov 2001 | WO |
WO 02061405 | Aug 2002 | WO |
WO 03040878 | May 2003 | WO |
Prior Publication Data

Number | Date | Country |
---|---|---|
20080079843 A1 | Apr 2008 | US |
Related U.S. Application Data

Number | Date | Country |
---|---|---|
60848765 | Oct 2006 | US |