The present disclosure relates to methods of quantifying the clinical efficacy of skin care products and, more specifically, to methods for quantitative measurement of under eye dark circles and other color-related phenomena in eye area skin.
Currently in the industry, the level of under eye dark circles is assessed using the following methods: (1) visual grading by a trained clinical grader; (2) instrumental colorimetric measurement; and (3) general image analysis of facial photographs. Of these current practices, method (1), visual grading, is a subjective method based on numbers assigned by a clinician on a ten-point scale. Accuracy and reproducibility are major concerns of this method. The instrumental colorimetric method, current method (2), is objective. However, the need for contact between the instrument and the skin during a measurement has been a factor causing significant variation, specifically in the red color component of the skin. In addition, the limited accessibility of the eye area makes it difficult to obtain accurate measurements using commercial colorimeters. The general image analysis method, current method (3), is the method most closely related to this invention. However, lack of specificity and lack of automation are common drawbacks of such a method. Exact alignment of eye area features has not been addressed in these methods, which results in relatively large variation when pictures from two different time points are analyzed. Thus, there is a current need for improved methods of assessing under eye skin color that are quantitative in nature.
The present disclosure provides a method of image analysis of facial digital photographs that was developed to quantitatively measure dark circles in the eye area skin. The first step is to take a digital photograph, which in one embodiment is preferably three megapixels or more, using any commercially available digital camera. A set of color palettes with known standard color values is used when the picture is taken to provide color references for the subsequent color correction step. The color correction process is then carried out using a set of computer algorithms. In one embodiment the color correction is performed as described in U.S. Pat. No. 8,319,857 to Qu et al.
Next, a set of newly developed computer algorithms is used to automatically carry out the following steps that are important to obtaining a consistent measurement method of eye area skin: (a) detecting the locations of eyes and eyebrows from a digital picture; (b) zooming in to reveal the details of eye area features on a specified eye; (c) prompting the operator to manually confirm two fine features of the eye; (d) rotating the picture to align the eye horizontally using the identified features as reference; (e) calculating and drawing an oval shaped region of interest around the eye; (f) cropping out the region of interest as a single picture file; (g) prompting the operator to manually confirm the upper and lower boundaries of the eye; (h) calculating skin reflective intensity in a fashion which scans across the region of interest stepwise from the upper eyelid area down to the under eye area, and in one embodiment this is performed in a total of one hundred increments; (i) generating a data file from the above step to be plotted to show a skin reflective intensity profile around the eye of specific concern; (j) applying the same analysis to another picture of the same individual's eye (e.g., taken at a different date during a clinical trial of product treatment); and (k) comparing the two profiles before and after product treatment to detect changes in skin reflective intensity. Clinical efficacy can be quantified from these profiles.
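The stepwise scan of steps (h) and (i) can be sketched as follows. This is a minimal illustration assuming the region of interest has already been cropped, color corrected, and converted to a grayscale NumPy array; the function name and the use of equal-height horizontal strips are illustrative assumptions, not details taken from the disclosure.

```python
import numpy as np

def intensity_profile(roi, n_steps=100):
    """Scan an eye-area ROI from the upper eyelid down to the under eye
    area in horizontal strips, returning the mean reflective intensity
    of each strip. One hundred increments mirrors the embodiment above."""
    strips = np.array_split(roi, n_steps, axis=0)         # one strip per step
    return np.array([strip.mean() for strip in strips])   # length n_steps

# Synthetic ROI: brighter near the eyelid, darker toward the under eye area
roi = np.tile(np.linspace(200.0, 120.0, 300)[:, None], (1, 400))
profile = intensity_profile(roi)
assert profile.shape == (100,)
assert profile[0] > profile[-1]   # intensity falls toward the dark-circle region
```

Plotting the returned array over the scan position gives the skin reflective intensity profile described in step (i).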
Thus, the present disclosure provides a method of quantifying an under eye appearance of a human subject, comprising: (a) detecting an eye area to be evaluated on a digital photograph of the human subject; (b) selecting a region of interest in the eye area; (c) calculating skin reflective intensity data within the region of interest; (d) generating a skin reflective intensity profile from the intensity data; and (e) quantifying a value from the intensity profile of the subject's under eye appearance. In further embodiments, the region of interest is an oval. In still further embodiments, the quantifying step (e) uses a mean square error method to quantify the under eye appearance.
The present disclosure further provides a method of determining efficacy of a skin care product on a human subject's under eye appearance, comprising: (a) detecting an eye area to be evaluated on a digital photograph of the human subject; (b) selecting a region of interest in the eye area; (c) calculating skin reflective intensity data within the region of interest; (d) generating a skin reflective intensity profile from the intensity data; (e) quantifying a pre-treatment value from the intensity profile of the subject's under eye appearance; (f) treating the subject with the skin care product; (g) repeating steps (a) through (d) after treatment with the skin care product to generate a second intensity profile; (h) quantifying a post-treatment value of the subject's under eye appearance from the second intensity profile; and (i) comparing the post-treatment value with the pre-treatment value to determine the efficacy of the skin care product. In still further embodiments, the quantifying steps (e) and (h) use a mean square error method to quantify the under eye appearance.
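The mean square error quantification referenced above can be sketched as follows. The disclosure does not spell out the reference against which the error is computed; this sketch assumes the error of the intensity profile about its own mean, so that an even-toned profile yields a small value and a pronounced dark circle a larger one, and the pre/post comparison then reduces to comparing two scalars.

```python
import numpy as np

def profile_mse(profile):
    """MSE of a reflective-intensity profile about its own mean. The
    reference used by the disclosure's MSE step is not specified; the
    profile mean is an assumption made here for illustration."""
    p = np.asarray(profile, dtype=float)
    return float(np.mean((p - p.mean()) ** 2))

# Synthetic pre/post profiles: a marked dark circle, then a lightened one
pre  = np.concatenate([np.full(50, 180.0), np.full(50, 120.0)])
post = np.concatenate([np.full(50, 180.0), np.full(50, 160.0)])
assert profile_mse(post) < profile_mse(pre)   # smaller MSE -> more even tone
```

Under this reading, a drop in MSE from the pre-treatment to the post-treatment profile indicates a more uniform skin tone around the eye.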
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the Office upon request and payment of the necessary fee.
All patents, patent applications, and literature references cited in this specification are hereby incorporated herein by reference in their entirety. In case of conflict, the present description, including definitions, will control.
Under eye dark circles are a common concern of eye area skin. Currently, it is believed that they are caused by two factors: stagnant blood flow and hyperpigmentation in the eye area skin. Current skin care technologies aim to improve blood flow and to inhibit pigmentation. What is needed at this stage is an accurate, objective method to evaluate the phenomenon and to assess treatment efficacy.
Currently, three different methods are seen in the literature. Visual grading is by far the most commonly used method. However, subjectivity is the primary concern when it comes to accuracy and repeatability. Colorimetric measurement is another common method. Skin contact of the instrument, however, introduces variation, particularly in the red component of skin color. It is also difficult to operate in the eye area. Image analysis of facial photographs has become increasingly popular. So far, however, its output has been general. To our knowledge, there has not been a method specific for under eye dark circle measurement. In addition, proper color correction of digital images is often in question when using this general image analysis method.
A quantitative method was developed for measuring under eye dark circles by using a digital image of a subject and optionally applying one or more image analysis algorithms. In this regard, the digital image may be obtained by a digital color video camera of the TRI-CCD type, available commercially from SONY, or a digital still camera of type D70S, available commercially from NIKON. One of skill in the art, however, will understand that any suitable apparatus for obtaining a digital image of a subject may be used so long as it meets the objects of the present method.
Desirably, the digital image capturing device is configured to capture the image with at least 2 megapixels, and in other embodiments at least 3, 4, 5, or 10 megapixels or more. Of course, it may be understood that the larger the number of pixels of an image to be captured, the easier the analysis may be.
In one embodiment, the digital image is obtained using a VISIA system, such as an image obtained with a VISIA-CR apparatus. It was an objective to use the method to assess the severity of this skin condition, to evaluate treatment efficacy, and to provide a screening technology for new product development.
A digital facial image may be obtained using a digital image capturing device described above, such as, for example, VISIA-CR. Then, the image is color corrected using a set of developed algorithms. One example of color correction algorithms is set forth in U.S. Pat. No. 8,319,857 to Qu et al., which is incorporated herein by reference in its entirety. Of course, it is to be understood that other means of color correction may be implemented.
Next, the locations of the eyebrow and the eye are detected and identified. In this regard, the measured distance between the eyebrow and the eye is used as a reference for the ROI determination. The image may also be magnified to reveal details of eye area features on a specified eye. An operator may also be prompted to manually confirm two fine features of the eye. For example, as shown in
Alignment of the features of an eye is an important step in this process. Facial expression may change over time and therefore the alignment step is important for a consistent ROI determination. Leveling also makes it easy to scan the skin around the eye for intensity measurement.
A first region of interest, which is generally an oval shape, is created based on the dimension of the straight line and the locations of the eye and eyebrow. The digital image may then be cropped to focus substantially on the ROI. For example,
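An elliptical ROI mask of the kind described above can be sketched as follows. In this illustration the oval's semi-axes are passed in directly, whereas in practice they would be scaled from the measured eyebrow-to-eye distance and the eye-corner reference line; the function name and its parameters are assumptions for illustration.

```python
import numpy as np

def oval_roi_mask(shape, center, semi_axes):
    """Boolean mask selecting an oval region of interest inside an image
    of the given (height, width) shape. A pixel is inside the oval when
    the standard ellipse inequality holds."""
    h, w = shape
    cy, cx = center
    ry, rx = semi_axes
    y, x = np.ogrid[:h, :w]
    return ((y - cy) / ry) ** 2 + ((x - cx) / rx) ** 2 <= 1.0

mask = oval_roi_mask((200, 300), center=(100, 150), semi_axes=(60, 120))
assert mask[100, 150]      # the center lies inside the oval
assert not mask[0, 0]      # an image corner lies outside
```

Cropping the image to the bounding box of this mask gives the single picture file of step (f) above.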
Further,
The left-hand side of
The present disclosure provides an objective method which reduces or eliminates the subjectivity associated with a visual grading method. The method provides an analysis of skin reflective intensity properties from digital photographs. It therefore eliminates the concern of instrument/skin contact inherent to the colorimetric method.
The unique eye feature alignment step is an important improvement over the current general image analysis method. It is well known that features in two pictures may be slightly different when images are taken at two different points in time, particularly for those covering a long time span. The software incorporating the methods of the present disclosure first identifies the eye area. It then provides a zoomed-in image for the operator to confirm the eye area features by drawing a line connecting the two corners of the eye. This line is subsequently used as a reference to rotate the image and level the eye horizontally. From there, a region of interest for eye area skin analysis is determined relative to the reference line. The ratio of skin area under evaluation is therefore fixed, relative to the features of the eye (
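The leveling step described above, rotating the image so the line joining the two confirmed eye corners becomes horizontal, can be sketched with plain coordinate geometry. The helper names below are illustrative; a production implementation would apply the same angle to the full image, for example with an affine warp.

```python
import math

def rotation_angle(inner_corner, outer_corner):
    """Angle (degrees) of the line joining the two eye corners relative
    to horizontal; rotating the image by this angle levels the eye."""
    (x1, y1), (x2, y2) = inner_corner, outer_corner
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

def rotate_points(points, angle_deg, center):
    """Rotate points by -angle_deg about center, i.e. undo the tilt."""
    t = math.radians(-angle_deg)
    c, s = math.cos(t), math.sin(t)
    cx, cy = center
    return [(cx + c * (x - cx) - s * (y - cy),
             cy + s * (x - cx) + c * (y - cy)) for x, y in points]

corners = [(50.0, 120.0), (150.0, 100.0)]       # tilted eye-corner line
angle = rotation_angle(*corners)
leveled = rotate_points(corners, angle, center=corners[0])
assert abs(leveled[0][1] - leveled[1][1]) < 1e-9   # corners now at equal height
```

Because the ROI is constructed relative to this leveled reference line, the same patch of skin is evaluated in images taken at different time points.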
The method of the present disclosure improves the accuracy of under-eye dark circle measurements. To validate that point, an experiment was conducted on a facial picture. The intensity profile was first measured. The pixel brightness of the skin in the under-eye area of the picture was then artificially increased using image analysis software. The increase was slight, about 1.28% of the original brightness in a small area of under-eye skin, and was not visible to the naked eye in the picture. The intensity profile of the same area was measured again, and a clear step increase in skin reflective intensity is shown when the two profiles are superimposed (
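The sensitivity experiment described above can be reproduced on synthetic data: brighten a small patch of a uniform "skin" image by roughly 1.28% and confirm that the row-wise intensity profile resolves the step even though the change would be invisible to the eye. The array sizes and patch location below are arbitrary choices for illustration.

```python
import numpy as np

# A flat synthetic skin patch; mean intensity per row forms the profile
roi = np.full((100, 200), 128.0)
baseline = roi.mean(axis=1)

# Brighten a small band of under-eye rows by ~1.28%, as in the experiment
perturbed = roi.copy()
perturbed[70:80, 60:140] *= 1.0128
profile = perturbed.mean(axis=1)

step = profile - baseline
assert step[:70].max() == 0.0     # untouched rows are unchanged
assert step[70:80].min() > 0.0    # the step increase is clearly resolved
```

The superimposed profiles differ only over the perturbed rows, mirroring the step increase reported for the real photograph.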
The methods of the present disclosure can also be practiced in a different way to produce a different kind of output. During the steps that obtain skin reflective intensity from a picture, instead of scanning in a one-dimensional fashion, the scan can be carried out in a two-dimensional fashion over the region of interest by using a small incremental area of one or several pixels. In one embodiment, the scan starts from the upper left corner of the region of interest, proceeds across the horizontal length, then moves down one incremental step and repeats until the entire area is covered. The result of this two-dimensional scan is a contour map of reflective intensity of the eye area skin (
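The two-dimensional scan can be sketched as a tiled average over the ROI, where each small tile of a few pixels contributes one point of the resulting contour map. The tile size and the trimming of partial tiles are illustrative assumptions.

```python
import numpy as np

def intensity_map(roi, step=4):
    """Two-dimensional scan of the ROI: average reflective intensity over
    step x step tiles, producing a coarse contour map of eye area skin.
    Partial tiles at the edges are trimmed for simplicity."""
    h, w = roi.shape
    h, w = h - h % step, w - w % step
    tiles = roi[:h, :w].reshape(h // step, step, w // step, step)
    return tiles.mean(axis=(1, 3))

roi = np.arange(64, dtype=float).reshape(8, 8)
cmap = intensity_map(roi, step=4)
assert cmap.shape == (2, 2)    # one map point per 4x4 tile
```

Contour-plotting the returned array visualizes where the under-eye skin is darkest, rather than reducing the region to a single one-dimensional profile.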
Repeatability on a mannequin: To validate the method, we used a mannequin as a model as illustrated in
Repeatability on human skin:
Sensitivity test:
Quantification of product efficacy. The immediate effect of a concealer is illustrated in
Side-to-side scan embodiment:
Laser ablation:
Quantification:
Application of the method to the quantification of clinical efficacy of a facial essence in the under eye area. The results are seen in
Description of clinical study design: An in-use clinical efficacy study was conducted at an independent testing lab in Texas, U.S.A. for six weeks. Thirty-four Asian and Caucasian female volunteers, ages 18-60, participated in the study. The volunteers were instructed to apply a test facial essence formulation to their faces at home twice a week for 6 weeks. Facial images of each subject were taken at baseline and at week 6 using VISIA-CR in the testing lab. The images were later used to measure changes in skin color under the eye to quantify the treatment efficacy of the test product.
Results: Using the method of the present invention, the following result table was obtained which includes, for each volunteer: (1) Average skin reflective intensity in the under eye area at study baseline; (2) Average skin reflective intensity in the under eye area after 6 week use of product; (3) Mean square error of the intensity profile in the under eye area at study baseline; and (4) Mean square error of the intensity profile in the under eye area after 6 weeks.
Statistical Analysis
Changes in skin reflective intensity before and after 6-week product application are illustrated in
Changes in MSE (mean square error) before and after 6-week product application are illustrated in
Sample images before and after six-week product application are provided in
Sample product effect of lightening the under eye area skin after product application in Subject #034, illustrated in
While the present invention is described herein with reference to illustrated embodiments, it should be understood that the invention is not limited thereto. Those having ordinary skill in the art and access to the teachings herein will recognize additional modifications and embodiments within the scope thereof. Therefore, the present invention is limited only by the claims appended hereto.
This application claims priority to U.S. Application No. 61/611,136 filed Mar. 15, 2012, the entire contents of which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
7349857 | Manzo | Mar 2008 | B2 |
8218862 | Demirli et al. | Jul 2012 | B2 |
8238623 | Stephan et al. | Aug 2012 | B2 |
8290257 | Demirli et al. | Oct 2012 | B2 |
8319857 | Qu et al. | Nov 2012 | B2 |
8358348 | Mohammadi et al. | Jan 2013 | B2 |
8467583 | Smith et al. | Jun 2013 | B2 |
8915562 | Edgar et al. | Dec 2014 | B2 |
20030088437 | Iobst et al. | May 2003 | A1 |
20030093297 | Schilling et al. | May 2003 | A1 |
20040218810 | Momma | Nov 2004 | A1 |
20070086651 | Stephan et al. | Apr 2007 | A1 |
20070258656 | Aarabi | Nov 2007 | A1 |
20080080755 | Payonk et al. | Apr 2008 | A1 |
20080212894 | Demirli et al. | Sep 2008 | A1 |
20080270175 | Rodriguez et al. | Oct 2008 | A1 |
20090028380 | Hillebrand et al. | Jan 2009 | A1 |
20090196475 | Demirli | Aug 2009 | A1 |
20090201365 | Fukuoka et al. | Aug 2009 | A1 |
20100284610 | Yoshikawa | Nov 2010 | A1 |
20110123703 | Mohammadi et al. | May 2011 | A1 |
20110196616 | Gunn | Aug 2011 | A1 |
20110202480 | Maes et al. | Aug 2011 | A1 |
20110206254 | Patwardhan | Aug 2011 | A1 |
20110300196 | Mohammadi et al. | Dec 2011 | A1 |
20110301441 | Bandic et al. | Dec 2011 | A1 |
20120300050 | Korichi et al. | Nov 2012 | A1 |
20120325141 | Mohammadi et al. | Dec 2012 | A1 |
20130169827 | Santos | Jul 2013 | A1 |
Number | Date | Country |
---|---|---|
2004166801 | Jun 2004 | JP |
WO2012001289 | Jan 2012 | WO |
Entry |
---|
English Translation of WO2012/001289, Boulay published Jan. 2012. |
English Translation of JP2004-166801, Fuji published Jun. 2004. |
Number | Date | Country | |
---|---|---|---|
20130245459 A1 | Sep 2013 | US |
Number | Date | Country | |
---|---|---|---|
61611136 | Mar 2012 | US |