The current invention relates to automated, computer-controlled methods for selectively and precisely applying one or more reflectance modifying agents, such as dyes or pigments, to human skin as cosmetics, to change the appearance of human features based on a model comprising at least one digital image.
Prior Cosmetic Techniques and Their Disadvantages
Prior art techniques for modifying the appearance of skin include natural tanning, artificial tanning, and the deliberate application of cosmetics. Each of these prior art techniques has limitations.
Typically, the application of cosmetic substances to skin is largely manual, for example through the use of brushes, application tubes, pencils, pads, and fingers. These application methods make prior art cosmetics imprecise, labor intensive, expensive, and sometimes harmful when compared to the computerized techniques of the present invention.
Most prior art cosmetic approaches are based on the application of opaque substances. As explained in the cross-referenced application U.S. Ser. No. 11/503,806, there is a need for the precise computer-controlled application of reflectance modifying agents (RMAs), such as transparent dyes, to provide a more effective modification of appearance. In this specification, the terms “reflectance modifying agent” or “RMA” refer to any compound useful for altering the reflectance of another material, and are explained in further detail below. Some examples of RMAs are inks, dyes, pigments, bleaching agents, chemically altering agents, and other substances that can alter the reflectance of human skin and other features. The terms “dye” and “transparent dye” are used for brevity in this specification to represent any RMA.
Moreover, cosmetics are typically applied manually to make people look more like certain images. For example, cosmetics may be applied to reconstruct the former appearance of people whose features have been altered or damaged, such as the skin of patients who have been burned, to make the burned skin appear to have the color and texture it had before the burns. Cosmetics may be used to create the appearance of eyebrows on cancer patients who have lost their hair as a result of chemotherapy or radiation treatment. And cosmetics are used generally to make older people look more as they did when young. In addition, cosmetics may be used to make people look more like any desired model of themselves that they might have.
Typically the models used as the basis for these cosmetic applications are models of the persons themselves, for example images of their own former or desired appearance.
Ideal models derived from certain people are also used as the basis for cosmetic applications on other people. For example, a makeup artist may create a “look,” consisting of a certain combination of colors, shading contrasts, and even feature shapes, such as eyebrow shapes, that is used as a model for cosmetics applied to many different people. Such a look may be based on the appearance of a popular actress, for example a Nicole Kidman look or a Catherine Zeta-Jones look, because many women would like to look like those actresses. Makeup artists can apply cosmetics to make different people all have a similar look, or individuals may apply their own cosmetics to create such effects, for example based on magazine pictures or digital images of actresses.
However, manual techniques of applying cosmetics to effect such changes based on images, such as digital images, can be time consuming and require considerable skill to be done well, as anyone who has tried to draw on eyebrows for a distressed cancer patient to match a digital image of her can attest.
Therefore, there is a need for the precise application of reflectance modifying agents (RMAs) to provide a more effective, more automated, faster, less expensive, and less dangerous modification of the appearance of skin to cosmetically change people's features based on digital images.
These and other needs are addressed by the present invention. The following explanation describes the present invention by way of example and not by way of limitation.
It is an aspect of the present invention to automatically change the appearance of human features based on a model digital image.
It is another aspect of the present invention to automatically reconstruct the appearance of human features based on digital images, through the application of RMAs. A useful technique is to employ feature recognition software to compare a person's current features with that person's features in one or more digital images. These images may have been provided previously by the present invention's system and method, or they may be provided by other compatible means. The present invention's enhancement software can then determine reconstructive enhancements based on those digital images and can apply those enhancements to the person precisely and automatically.
It is still another aspect of the present invention to automatically enhance the appearance of a person's features based on a digital image of another person, through the application of RMAs. A useful technique is to employ feature recognition software to compare the person's features with a model person's features in one or more digital images. These images may have been provided previously by the present invention's system and method, or they may be provided by other compatible means. The present invention's software can then determine enhancements based on those digital images and can apply those enhancements to the person precisely and automatically.
These and other aspects, features, and advantages are achieved according to the system and method of the present invention. In accordance with the present invention, a computer-controlled system determines attributes of a frexel, which is an area of human skin, and applies a reflectance modifying agent (RMA) at the pixel level to automatically change the appearance of human features based on one or more digital images.
One embodiment may change the appearance of human features based on one or more digital images of the same frexel, as seen in a prior digital photograph. In an embodiment, the digital images are captured previously by the computer-controlled system. The system scans the frexel and uses feature recognition software to compare the person's current features in the frexel with that person's features in the digital images. It then calculates enhancements to make the current features appear more like the features in the digital images, and it applies the RMA to the frexel, typically with an inkjet printer, to accomplish the enhancements. The identified attributes in the frexel may relate to reflectance characteristics and to the surface topology of the skin.
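For concreteness only, the enhancement calculation for a single frexel might be sketched as below in Python. The sketch assumes reflectance values normalized to the range [0, 1] and a purely subtractive agent (a transparent dye can darken a frexel but not lighten it); the function name and encoding are illustrative and are not taken from this specification.

```python
import numpy as np

def rma_amounts(current: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Per-pixel amount of RMA to deposit on a frexel so that its
    reflectance approaches the target taken from the model image.
    Clipped at zero: a subtractive dye cannot raise reflectance."""
    return np.clip(current - target, 0.0, 1.0)

# A frexel lighter than the model (0.8 vs 0.6) receives 0.2 units of
# agent; a frexel already darker than the model (0.5 vs 0.6) receives none.
current = np.array([[0.8, 0.5]])
target = np.array([[0.6, 0.6]])
print(rma_amounts(current, target))  # [[0.2 0. ]]
```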
Another embodiment may enhance the appearance of a person's features based on a digital image of another person, through the application of RMAs. A useful technique is to employ feature recognition software to compare the person's features with a model person's features in one or more digital images. These images may have been provided previously by the present invention's system and method, or they may be provided by other compatible means. The present invention's software can then determine enhancements based on those digital images and can apply those enhancements to the person precisely and automatically.
The following embodiment of the present invention is described by way of example only, with reference to the accompanying drawings, in which:
The present invention comprises the application of one or more reflectance modifying agents (RMAs) through a computer-controlled system and method to change a person's appearance. For example, the invention may be used to automatically reconstruct the appearance of a person's damaged or altered features, based on the appearance of that person in one or more digital images. To cite another example, the invention may be used to automatically enhance a first person's appearance based on the appearance of a second person in a digital image.
U.S. application Ser. No. 11/503,806 filed Aug. 14, 2006 by the present applicants claims the computer-controlled system and method that scans an area of human skin, identifies unattractive attributes, and applies the RMA, typically with an inkjet printer, to improve the appearance of that area of skin. The present invention comprises new innovations to that system and method to accomplish the changes mentioned above.
Enhancement System
Frexels
In this patent specification, the term “frexel” is defined as a small pixel-like region of the skin. A frexel might correspond to a small portion of a freckle or other skin feature, or it may correspond to an area of the skin that does not have special features. A frexel thus refers to skin rather than to an independent coordinate system. The term frexel is used to suggest that what is being measured is on a 3-D surface rather than a flat surface.
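Purely as an illustration of this definition, a frexel record in software might be addressed on the skin surface itself and carry measured attributes; the field names below are hypothetical and are not taken from this specification.

```python
from dataclasses import dataclass

@dataclass
class Frexel:
    """A small pixel-like region of skin, addressed on the 3-D skin
    surface rather than in an independent flat coordinate system."""
    u: float            # position along the skin surface
    v: float            # position across the skin surface
    reflectance: float  # measured reflectance of this region
    normal: tuple       # 3-D surface orientation at this region

# Example: a frexel on a cheek, on a slightly tilted patch of skin.
cheek = Frexel(u=12.5, v=3.25, reflectance=0.74, normal=(0.1, 0.0, 0.99))
print(cheek)
```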
Reconstruction System
If a patient's left eyebrow needs to be reconstructed, for example, a digital image 702 that was previously captured of that patient, stored in storage 250, and shows the patient's left eyebrow may be used. A woman who has stored a model digital image 702 of her face at age 20 may use that digital image 702 years later, for example when she is 50, as the basis for enhancements with RMAs throughout her face. The same woman could also choose to use as a basis for reconstruction a previous image of herself when she received a special makeup look from a professional makeup artist.
Topographic aspects of features can also be reconstructed to varying degrees. For example, dark hollows under the eyes and below the cheekbones can be lightened to make a face appear younger and healthier.
In another embodiment, multiple previous digital images of the person may be used to derive a digital image 702, useful for reconstruction, for example by averaging the relevant values of the multiple digital images. In addition, a current frexel may be reconstructed by using data from different but relevant frexels. For example, a frexel representing a point on a cheek may be reconstructed on the basis of previous images 702 showing adjacent frexels or frexels on different points of the body.
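One plausible reading of this averaging, offered only as a sketch, is a per-pixel mean over several images of the person that have already been aligned to a common coordinate frame (the alignment itself is assumed here, not shown):

```python
import numpy as np

def average_model(images: list) -> np.ndarray:
    """Derive a single model image as the per-pixel mean of several
    aligned digital images of the same person."""
    stack = np.stack([np.asarray(im, dtype=np.float64) for im in images])
    return stack.mean(axis=0)

# Two aligned 2x2 images with reflectances 0.6 and 0.8 average to 0.7.
imgs = [np.full((2, 2), 0.6), np.full((2, 2), 0.8)]
print(average_model(imgs))
```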
In still another embodiment, a computer display 102 (not shown) and interface 104 (not shown), for example a keyboard, may be used to allow a consumer to select a stored digital image 702, or elements from that image 702 or from multiple stored images, to derive a basis for one or more reconstructions, as will be evident to those skilled in the art.
In other embodiments, the digital image 702 may be created remotely and transferred to computing environment 100 by any method known to those skilled in the art or not yet known. For example, it may be sent over the Internet from a remote computer to computing environment 100. It may be loaded onto computing environment 100 by means of a diskette. It may also be transferred through wireless technology, as well as by many other means.
In one embodiment, image receiving means 712 enable the application system 200, shown in
This digital image 702 may represent not only a prior appearance of the person but the way the person would like to be seen. For example, the person might want to look like his or her appearance in an idealized drawing, an enhanced photograph, or a computer-generated image. These idealized images may be created by the person or someone working on the person's behalf, such as an artist or a makeup designer.
The digital image 702 may be received by the application system 200 by any of multiple image receiving means 712, shown in
Moreover, multiple digital images 702 may be received by the application system 200 to derive a single model for reconstruction. For example, the application algorithm 230 can average multiple digital images 702 to derive more representative values for features in the captured digital data, their locations, and their reflectance patterns, to improve the quality of the printable reconstruction image 708.
Another embodiment comprises scanner 2222, shown in
To use images created through other methods, the present invention employs digital image feature recognition software 710 that identifies features in one or more received digital images 702. In an embodiment, this digital image feature recognition software 710 employs the techniques for the analysis of reflectance patterns explained in detail above, but applied to digital images. In other embodiments, it may use other techniques for feature recognition, for example Optasia™, the model-based feature-recognition platform developed by Image Metrics, Plc. The “Technical White Paper” on the Image Metrics website states that “The Optasia engine can perform rapid model-to-image matching regardless of the model type, including those with high-frequency elements such as texture.” These other techniques may be used in combination with the techniques for the analysis of reflectance patterns explained above or independently of them.
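The Optasia engine itself is proprietary, but the general idea of matching a model to an image can be illustrated with ordinary normalized cross-correlation, here using OpenCV. The file names are placeholders and the confidence threshold is arbitrary; this is a stand-in for, not a description of, the feature recognition software 710.

```python
import cv2

# Placeholder inputs: a captured image of the person's face and a small
# patch cropped from a stored model image (e.g., an eyebrow region).
scene = cv2.imread("captured_face.png", cv2.IMREAD_GRAYSCALE)
patch = cv2.imread("model_eyebrow.png", cv2.IMREAD_GRAYSCALE)

# Slide the patch over the scene and score each position by normalized
# cross-correlation; the best-scoring position locates the feature.
scores = cv2.matchTemplate(scene, patch, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_xy = cv2.minMaxLoc(scores)

if best_score > 0.8:  # arbitrary confidence threshold
    print(f"feature located at {best_xy}, score {best_score:.2f}")
else:
    print("feature not found with sufficient confidence")
```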
The digital image feature recognition software 710 is used to create a digital image feature map 704 that identifies features in the digital image 702, their locations, and their reflectance patterns.
The application algorithm 230 in turn creates a frexel feature map 706 from data captured for the person scanned, as explained above. The frexel feature map 706 identifies features in that captured data, their locations, and their reflectance patterns.
The application algorithm 230 then compares the analogous information in the digital image feature map 704 and the frexel feature map 706 and uses the data in the digital image feature map 704 as a model to reconstruct the data in the frexel feature map 706. In an embodiment, this reconstruction can be accomplished by subtracting the frexel values from the analogous digital image values to derive the values of a reconstruction image.
For example, an eyebrow that has become fainter in the frexel feature map 706 may be reconstructed, with pixel-level precision, from a darker eyebrow present in the digital image feature map 704. Skin discolored by burns can be altered to more closely match its previous color.
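To make the eyebrow example concrete, the two feature maps can be pictured as small keyed records and the reconstruction derived by the subtraction stated above (digital-image values minus frexel values). The structures and numbers below are toy values for illustration only.

```python
import numpy as np

# Each map entry records a feature, its location, and its reflectance
# pattern, as described above (field names are illustrative).
digital_image_map = {  # the model: a dark (low-reflectance) eyebrow
    "left_eyebrow": {"location": (120, 88),
                     "pattern": np.array([[0.25, 0.20], [0.22, 0.18]])}
}
frexel_map = {         # the person today: the eyebrow has become fainter
    "left_eyebrow": {"location": (118, 90),
                     "pattern": np.array([[0.60, 0.55], [0.58, 0.50]])}
}

# Reconstruction values: digital-image values minus frexel values.
reconstruction = {
    name: digital_image_map[name]["pattern"] - frexel_map[name]["pattern"]
    for name in digital_image_map
}
# Negative entries mean the skin is currently lighter than the model
# there, i.e., RMA must be deposited to darken those pixels.
print(reconstruction["left_eyebrow"])
```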
The application algorithm 230 uses this reconstruction to define a printable reconstruction image 708 that is used as a basis to apply an RMA to an area of skin 302, typically by inkjet printing, as explained above.
Enhancement System for Changing a First Person's Appearance Based on a Digital Image of a Second Person
In an embodiment, the digital image 702 shown in
For example, the distinctive arch of Nicole Kidman's eyebrow can be used as a model to enhance millions of girls next door by making their eyebrows more arched. Or Catherine Zeta-Jones's olive complexion, individual beauty marks, and full crimson lips can lend those girls a degree of her special charm.
On the other hand, the model digital image 702 may represent a second person whose appearance is older and more responsible-looking than the first person's, for example for the purpose of a job interview when the first person is very young and wants to look more mature.
The model digital image 702 may further represent a particular kind of appearance of any second person desired to serve as a model for the enhancements to the first person. For example, the model digital image 702 may be one of Nicole Kidman made up to have a special look for a formal event.
Filtering techniques may be used to determine the degree of similarity to be achieved. With no filtering, a very close resemblance between the second and first person may be achieved, depending on the original similarity between the two people. A higher degree of filtering may achieve a softer suggestion of similarity.
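One hypothetical formalization of this filtering is a linear blend between the model's values and the first person's own values, with the filtering degree controlling how much of the person's own appearance is retained; the specification does not fix a formula, so the sketch below is only one possibility.

```python
import numpy as np

def filtered_target(current, model, filtering):
    """Target appearance after filtering. filtering=0.0 aims for a very
    close resemblance to the model; values nearer 1.0 keep more of the
    person's own appearance, a softer suggestion of similarity."""
    alpha = float(np.clip(filtering, 0.0, 1.0))
    return alpha * np.asarray(current) + (1.0 - alpha) * np.asarray(model)

print(filtered_target([0.8], [0.5], 0.0))   # [0.5]   close resemblance
print(filtered_target([0.8], [0.5], 0.75))  # [0.725] softer suggestion
```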
Topographic aspects of features can also be enhanced to varying degrees. On the first person's face, for example, cheekbones can be darkened to make them appear more like those of a second person with prominent, attractive cheekbones.
In another embodiment, multiple digital images of the second person may be used to derive a model digital image 702, useful for enhancements, for example by averaging the relevant values of those images. In addition, a frexel on the first person may be enhanced by using data from different but relevant frexels on the second person. For example, a frexel representing a point on a cheek on a first person may be enhanced on the basis of one or more model digital images 702 showing adjacent frexels or frexels on different points of the body of the second person.
In still another embodiment, a computer display 102 (not shown) and an interface 104 (not shown), for example a keyboard, may be used to allow a first person to select a stored model digital image 702 of a second person, or elements from that model digital image 702 or from multiple stored images, to derive a basis for one or more enhancements, as will be evident to those skilled in the art.
In other embodiments, the digital image 702 may be created remotely on a first instance of the present invention and transferred by any method, known to those skilled in the art or not yet known, to computing environment 100, which represents a second instance of the present invention. For example, it may be sent over the Internet from a remote computer to computing environment 100. It may be loaded onto computing environment 100 by means of a diskette. It may also be transferred through wireless technology, as well as by many other means.
Step 730 in
As explained above, the digital image 702 may have been stored previously by application system 200, shown in
Step 732 in
The application system 200, shown in
Step 734 in
The application algorithm 230, shown in
Step 736 in
The reconstructed values derived in Step 734 are treated as a printable reconstruction image 708.
Step 738 in
The application system 200, shown in
Step 740 in
As explained above, the digital image 702 can be received by the application system 200 shown in
Step 742 in
The digital image feature recognition software 710, shown in
Step 744 in
The application system 200, shown in
Step 746 in
The application algorithm 230, shown in
Step 748 in
The application algorithm 230, shown in
Step 750 in
The reconstructed values derived in Step 748 are treated as a printable reconstruction image 708.
Step 752 in
The application system 200, shown in
Advantages of Reconstruction
An advantage of the reconstruction system and method described above is that they enable RMAs to be applied more automatically and more precisely, at the pixel level, to reconstruct the appearance of people's features, based on the appearance of those features in digital images.
In addition, this process may be usefully applied with substances other than RMAs. For example, human skin and artificial reconstruction materials may be applied.
Step 830 in
As explained above, the model digital image 702 may be created by the application system 200 shown in
Step 832 in
The application system 200, shown in
Step 834 in
The application algorithm 230, shown in
Step 836 in
The enhancement values derived in Step 834 are treated as a printable enhancement image 234.
Step 838 in
The application system 200, shown in
Step 840 in
As explained above, the model digital image 702 can be received by the application system 200 shown in
Step 842 in
The digital image feature recognition software 710, shown in
Step 844 in
The application system 200, shown in
Step 846 in
The application algorithm 230, shown in
Step 848 in
The application algorithm 230, shown in
Step 850 in
The enhancement values derived in Step 848 are treated as a printable enhancement image 234.
Step 852 in
The application system 200, shown in
It will be apparent to those skilled in the art that different embodiments of the present invention may employ a wide range of possible hardware and of software techniques. The scope of the current invention is not limited by the specific examples described above.
This patent application is a continuation of and claims priority to U.S. application Ser. No. 15/269,091, filed Sep. 19, 2016, which is a continuation of and claims priority to U.S. application Ser. No. 14/068,894, filed Oct. 31, 2013, now U.S. Pat. No. 9,449,382, which is a continuation of and claims priority to U.S. application Ser. No. 13/476,320, filed May 21, 2012, now U.S. Pat. No. 8,582,830, which is a continuation of and claims priority to U.S. application Ser. No. 12/029,534 filed Feb. 12, 2008, now U.S. Pat. No. 8,184,901, which claims benefit of U.S. Provisional Patent Application No. 60/889,297 filed Feb. 12, 2007 by the present inventors for “SYSTEM AND METHOD FOR APPLYING A REFLECTANCE MODIFYING AGENT TO RECONSTRUCT A PERSON'S APPEARANCE BASED ON A DIGITAL IMAGE OF THE PERSON” and U.S. Provisional Patent Application No. 60/889,298 filed Feb. 12, 2007 by the present inventors for “SYSTEM AND METHOD FOR APPLYING A REFLECTANCE MODIFYING AGENT TO ENHANCE A PERSON'S APPEARANCE BASED ON A DIGITAL IMAGE OF ANOTHER PERSON”, the disclosures of which are expressly incorporated herein by reference in their entirety. This patent application incorporates by reference the specification, drawings, and claims of U.S. patent application Ser. No. 11/503,806 filed Aug. 14, 2006 by the present inventors for “SYSTEM AND METHOD FOR APPLYING A REFLECTANCE MODIFYING AGENT TO IMPROVE THE VISUAL ATTRACTIVENESS OF HUMAN SKIN”.
Number | Name | Date | Kind |
---|---|---|---|
4190056 | Tapper et al. | Feb 1980 | A |
4401122 | Clark | Aug 1983 | A |
4628356 | Spillman et al. | Dec 1986 | A |
4771060 | Nakagawa | Sep 1988 | A |
4807991 | Carew | Feb 1989 | A |
4882492 | Schlager | Nov 1989 | A |
5027817 | John | Jul 1991 | A |
5156479 | Iizuka | Oct 1992 | A |
5241468 | Kenet | Aug 1993 | A |
5268166 | Barnett et al. | Dec 1993 | A |
5431911 | Reynolds | Jul 1995 | A |
5836872 | Kenet | Nov 1998 | A |
5931166 | Weber et al. | Aug 1999 | A |
6021344 | Lui et al. | Feb 2000 | A |
6067996 | Weber et al. | May 2000 | A |
6111653 | Bucknell et al. | Aug 2000 | A |
6122042 | Wunderman et al. | Sep 2000 | A |
6151031 | Atkins | Nov 2000 | A |
6208749 | Gutkowicz-Krusin et al. | Mar 2001 | B1 |
6250927 | Narlo | Jun 2001 | B1 |
6286517 | Weber et al. | Sep 2001 | B1 |
6292277 | Kikinis | Sep 2001 | B1 |
6293284 | Rigg | Sep 2001 | B1 |
6295737 | Patton et al. | Oct 2001 | B2 |
6312124 | Desormeaux | Nov 2001 | B1 |
6341831 | Weber et al. | Jan 2002 | B1 |
6385487 | Henley | May 2002 | B1 |
6436127 | Anderson et al. | Aug 2002 | B1 |
6477410 | Henley | Nov 2002 | B1 |
6487440 | Deckert et al. | Nov 2002 | B2 |
6502583 | Utsugi | Jan 2003 | B1 |
6543893 | Desormeaux | Apr 2003 | B2 |
6554452 | Bourn et al. | Apr 2003 | B1 |
6575751 | Lehman | Jun 2003 | B1 |
6578276 | Patton | Jun 2003 | B2 |
6641578 | Mukai | Nov 2003 | B2 |
6706035 | Cense et al. | Mar 2004 | B2 |
6719467 | Hess et al. | Apr 2004 | B2 |
6810130 | Aubert et al. | Oct 2004 | B1 |
7027619 | Pavlidis et al. | Apr 2006 | B2 |
7233693 | Momma | Jun 2007 | B2 |
7369692 | Shirai et al. | May 2008 | B2 |
7382400 | Sablak | Jun 2008 | B2 |
7433102 | Takahashi et al. | Oct 2008 | B2 |
7602942 | Bazakos et al. | Oct 2009 | B2 |
7890152 | Edgar et al. | Feb 2011 | B2 |
8007062 | Edgar et al. | Aug 2011 | B2 |
8026942 | Payonk | Sep 2011 | B2 |
8027505 | Edgar | Sep 2011 | B2 |
8182425 | Stamatas | May 2012 | B2 |
8184901 | Edgar et al. | May 2012 | B2 |
8231292 | Rabe | Jul 2012 | B2 |
8384793 | Ciuc | Feb 2013 | B2 |
8464732 | Wong | Jun 2013 | B2 |
8582830 | Edgar et al. | Nov 2013 | B2 |
8610767 | Uzenbajakava et al. | Dec 2013 | B2 |
8695610 | Samain | Apr 2014 | B2 |
8899242 | Wong | Dec 2014 | B2 |
8915562 | Edgar | Dec 2014 | B2 |
8942775 | Edgar | Jan 2015 | B2 |
8977389 | Witchell | Mar 2015 | B2 |
9247802 | Edgar | Feb 2016 | B2 |
9277799 | Takaleh | Mar 2016 | B2 |
9333156 | Ito | May 2016 | B2 |
9449382 | Edgar | Sep 2016 | B2 |
10016046 | Edgar | Jul 2018 | B2 |
20010040982 | Kim | Nov 2001 | A1 |
20020054714 | Hawkins et al. | May 2002 | A1 |
20020064302 | Massengill | May 2002 | A1 |
20020070988 | Desormeaux | Jun 2002 | A1 |
20020081003 | Sobol | Jun 2002 | A1 |
20020105662 | Patton | Aug 2002 | A1 |
20020107456 | Leveque | Aug 2002 | A1 |
20020128780 | De Rigal et al. | Sep 2002 | A1 |
20020155069 | Pruche et al. | Oct 2002 | A1 |
20020172419 | Lin et al. | Nov 2002 | A1 |
20020176926 | Pletcher et al. | Nov 2002 | A1 |
20030010083 | Minnerop et al. | Jan 2003 | A1 |
20030045799 | Bazin et al. | Mar 2003 | A1 |
20030050561 | Bazin et al. | Mar 2003 | A1 |
20030053664 | Pavlidis et al. | Mar 2003 | A1 |
20030053685 | Lestideau | Mar 2003 | A1 |
20030060810 | Syrowicz et al. | Mar 2003 | A1 |
20030062058 | Utsugi | Apr 2003 | A1 |
20030063102 | Rubinstenn et al. | Apr 2003 | A1 |
20030067545 | Giron et al. | Apr 2003 | A1 |
20030100837 | Lys et al. | May 2003 | A1 |
20030108228 | Garnier | Jun 2003 | A1 |
20030130575 | Desai | Jul 2003 | A1 |
20030208190 | Roberts et al. | Nov 2003 | A1 |
20030223622 | Simon | Dec 2003 | A1 |
20030229514 | Brown | Dec 2003 | A2 |
20040005086 | Wolff et al. | Jan 2004 | A1 |
20040007827 | Hahn | Jan 2004 | A1 |
20040073186 | Cameron | Apr 2004 | A1 |
20040078278 | Dauga et al. | Apr 2004 | A1 |
20040125996 | Eddowes et al. | Jul 2004 | A1 |
20040170337 | Simon et al. | Sep 2004 | A1 |
20040174525 | Mullani | Sep 2004 | A1 |
20040179101 | Bodnar et al. | Sep 2004 | A1 |
20040201694 | Gartstein et al. | Oct 2004 | A1 |
20040236229 | Freeman et al. | Nov 2004 | A1 |
20040254546 | Lefebvre | Dec 2004 | A1 |
20040257439 | Shirai et al. | Dec 2004 | A1 |
20040267189 | Mavor et al. | Dec 2004 | A1 |
20050004475 | Giron | Jan 2005 | A1 |
20050010102 | Marchesini et al. | Jan 2005 | A1 |
20050019285 | Lee et al. | Jan 2005 | A1 |
20050053628 | Montanari et al. | Mar 2005 | A1 |
20050053637 | Ma Or et al. | Mar 2005 | A1 |
20050063197 | Nightingale et al. | Mar 2005 | A1 |
20050069208 | Morisada | Mar 2005 | A1 |
20050154382 | Altshuler | Jul 2005 | A1 |
20060104507 | John | May 2006 | A1 |
20060153470 | Simon et al. | Jul 2006 | A1 |
20060228037 | Simon | Oct 2006 | A1 |
20060228038 | Simon et al. | Oct 2006 | A1 |
20060228039 | Simon et al. | Oct 2006 | A1 |
20060228040 | Simon et al. | Oct 2006 | A1 |
20060282137 | Nightingale et al. | Dec 2006 | A1 |
20070016078 | Hoyt et al. | Jan 2007 | A1 |
20070035815 | Edgar et al. | Feb 2007 | A1 |
20070047761 | Wasilunas et al. | Mar 2007 | A1 |
20070049832 | Edgar et al. | Mar 2007 | A1 |
20070134192 | Shimizu | Jun 2007 | A1 |
20070203413 | Frangioni | Aug 2007 | A1 |
20070255589 | Rodriguez | Nov 2007 | A1 |
20080192999 | Edgar et al. | Aug 2008 | A1 |
20080193195 | Edgar et al. | Aug 2008 | A1 |
20080194971 | Edgar et al. | Aug 2008 | A1 |
20080219528 | Edgar et al. | Sep 2008 | A1 |
20090025747 | Edgar et al. | Jan 2009 | A1 |
20090209833 | Waagen et al. | Aug 2009 | A1 |
20090231356 | Barnes | Sep 2009 | A1 |
20100114265 | Lechthaler | May 2010 | A1 |
20100139682 | Edgar et al. | Jun 2010 | A1 |
20100224205 | Mitra et al. | Sep 2010 | A1 |
20100224211 | Rabe et al. | Sep 2010 | A1 |
20110124989 | Edgar et al. | May 2011 | A1 |
20110270200 | Edgar | Nov 2011 | A1 |
20130149365 | Rajagopal et al. | Jun 2013 | A1 |
20130302078 | Edgar | Nov 2013 | A1 |
20140050377 | Edgar et al. | Feb 2014 | A1 |
20150196109 | Edgar | Jul 2015 | A1 |
20150237991 | Edgar | Aug 2015 | A1 |
20150359315 | Rabe | Dec 2015 | A1 |
20150359714 | Rabe | Dec 2015 | A1 |
20170004635 | Edgar | Jan 2017 | A1 |
20170256084 | Iglehart | Sep 2017 | A1 |
20180033161 | Nichol | Feb 2018 | A1 |
Number | Date | Country |
---|---|---|
101287607 | Oct 2008 | CN |
202004003148 | Mar 2005 | DE |
1184663 | Mar 2002 | EP |
1210909 | Jun 2002 | EP |
1304056 | Apr 2003 | EP |
1495781 | Jan 2005 | EP |
1677254 | Jul 2006 | EP |
1763380 | Mar 2007 | EP |
2810761 | Dec 2001 | FR |
59171280 | Sep 1984 | JP |
05281041 | Oct 1993 | JP |
06201468 | Jul 1994 | JP |
11019050 | Jan 1999 | JP |
11019051 | Jan 1999 | JP |
2000139846 | May 2000 | JP |
2000331167 | Nov 2000 | JP |
2001112722 | Apr 2001 | JP |
2002017689 | Jan 2002 | JP |
2002263084 | Sep 2002 | JP |
2003052642 | Feb 2003 | JP |
2003057169 | Feb 2003 | JP |
2003057170 | Feb 2003 | JP |
2003513735 | Apr 2003 | JP |
2003519019 | Jun 2003 | JP |
2003210248 | Jul 2003 | JP |
2004501707 | Jan 2004 | JP |
2004105748 | Apr 2004 | JP |
2004315416 | Nov 2004 | JP |
2004315426 | Nov 2004 | JP |
2008526284 | Jul 2006 | JP |
2006271654 | Oct 2006 | JP |
2007231883 | Sep 2007 | JP |
2008526241 | Jul 2008 | JP |
2336866 | Oct 2008 | RU |
WO2001026735 | Apr 2001 | WO |
WO2001049360 | Jul 2001 | WO |
WO2001077976 | Oct 2001 | WO |
WO2004028420 | Apr 2004 | WO |
WO2004091590 | Oct 2004 | WO |
WO2004095372 | Nov 2004 | WO |
WO2005123172 | Dec 2005 | WO |
WO2006008414 | Jan 2006 | WO |
WO2006074881 | Jul 2006 | WO |
WO2007022095 | Feb 2007 | WO |
Entry |
---|
“Lehrstuhl für Optik 2004 Annual Report” Jun. 2005 (2005-2006), Lehrstuhl Für Optik, Institute Für Optik, Information and Photonik, Max-Planck-Forschungsgruppe, Universität Erlangen-Nürnberg, Erlangen, Germany, XP002460048, 2 pages. |
EPO Office Action in App. No. 06 801 295.4, dated Feb. 3, 2010, 3 pages. |
Authorized Officer Nora Lindner, International Preliminary Report on Patentability and Written Opinion of the International Searching Authority for International Application No. PCT/US2006/031441, dated Feb. 12, 2008, 9 pages. |
Russian Official Action (including translation) for Application No. 2008109234, dated Apr. 2, 2009, 7 pages. |
EPO Office Action in Application No. 06 801 295.4, dated Jun. 10, 2008, 3 pages. |
Authorized Officer Moritz Knupling, International Search Report for International Application No. PCT/US2006/031441, dated Dec. 7, 2007, 2 pages. |
Authorized Officer Lars-Oliver Romich, International Search Report and the Written Opinion for International Application No. PCT/US2006/031441, dated Dec. 7, 2007, 14 pages. |
Notification of the First Office Action (including translation) in Application No. 200680037564.6, dated Jul. 31, 2009, 7 pages. |
Examiner's First Report in Application No. 2006279800, dated Feb. 2, 2011, 2 pages. |
Russian Deputy Chief S.V. Artamonov, Decision on Grant Patent for Invention (including translation) in Application 2008109235, dated Feb. 19, 2009. |
Authorized Officer Dorothee Mulhausen, International Preliminary Report on Patentability for International Application No. PCT/US2006/031657, dated Feb. 12, 2008, 7 pages. |
Authorized Officer Laure Acquaviva, Invitation to Pay Additional Fees and, where applicable, Protest Fees International Application No. PCT/US2008/053527, dated Jul. 7, 2008, 8 pages. |
Examiner's First Report in Application No. 2006279652, dated Jan. 28, 2011, 2 pages. |
Notification of the First Office Action (including translation) in Application No. 200680037560.8, dated Jul. 17, 2009, 8 pages. |
EPO Office Action in Application No. 06 789 746.2, dated Apr. 3, 2009, 3 pages. |
Authorized Officer Wolfhard Wehr, International Search Report for International Application No. PCT/US2006/031657, dated Dec. 20, 2006, 2 pages. |
Authorized Officer Athina Nickitas-Etienne, International Preliminary Report on Patentability for International Application No. PCT/US2008/053640, dated Aug. 19, 2009, 5 pages. |
Authorized Officer Michael Eberwein, International Search Report and Written Opinion for International Application No. PCT/US2008/053640, dated Jun. 3, 2008, 9 pages. |
European Patent Office Action for Application No. 08 729 481.5, dated Aug. 23, 2010, 5 pages. |
Authorized Officer Jens Clevorn, International Search Report for Application No. PCT/US2008/053528, dated Nov. 13, 2008, 4 pages. |
Authorized Officer Jens Clevorn, International Preliminary Report on Patentability and Written Opinion of the International Searching Authority for Application No. PCT/US2008/053528, dated Aug. 11, 2009, 9 pages. |
Notification of First Office Action for Application No. 200880009579.0, dated Jul. 14, 2010, 10 pages. |
Authorized Officer Simin Baharlou, International Preliminary Report on Patentability and Written Opinion of the International Searching Authority for Application No. PCT/US2008/065168, dated Dec. 1, 2009, 8 pages. |
Anonymous, “Circular Polarizer Films,” Internet Article, [Online] 2005, http://www.optigrafix.com/circular.htm [retrieved on Sep. 5, 2008]. |
Authorized Officer Carlos Nicolas, International Search Report and Written Opinion for Application No. PCT/US2008/065168, dated Sep. 19, 2008, 13 pages. |
Examination Report for European Application No. 08769826.2, dated Jul. 16, 2013, 6 pages. |
Mike Topping et al., “The Development of Handy 1, A Robotic System to Assist the Severely Disabled,” ICORR '99, Sixth International Conference of Rehabilitation Robotics, Stanford, CA, Jul. 1-2, 1999, pp. 244-249. |
Robot News, “Handy1—Rehabilitation robot for the severely disabled; helping you to eat and drink and brush and even do make-up!”, posted on Apr. 3, 2006, http://robotnews.wordpress.com/2006/04/03/handy1-rehabiliation-robot-for-the-severely-disabledhelping-you-to-eat-and-drink-and-brush-and-even-do-make-up/, 6 pages. |
Mike Topping, “An Overview of the Development of Handy 1, a Rehabilitation Robot to Assist the Severely Disabled” Journal of Intelligent and Robotic Systems, vol. 34, No. 3, 2002, pp. 253-263. |
Notice of Reasons for Rejection for Application No. 2008-526241, dated Aug. 31, 2011, 7 pages. |
Notification of the First Office Action (including translation) in Application No. 200880009069.3, dated Jul. 1, 2011, 8 pages. |
EPO Office Action in App. No. 06 801 295.4, dated Oct. 10, 2011, 5 pages. |
Cula O G et al., “Bidirectional Imaging and Modeling of Skin Texture,” IEEE Transactions on Biomedical Engineering, IEEE Service Center, Piscataway, NJ, USA, vol. 51, No. 12, Dec. 1, 2004, pp. 2148-2159. |
Second Examiner's Report in Application No. 2006279652, dated Nov. 3, 2011, 2 pages. |
Francois-Xavier Bon et al., “Quantitative and Kinetic Evolution of Wound Healing through Image Analysis,” 2000 IEEE Transactions on Medical Imaging, vol. 19, No. 7, Jul. 2000, pp. 767-772. |
Divya Railan et al., “Laser Treatment of Acne, Psoriasis, Leukoderma and Scars,” Seminars in Cutaneous Medicine and Surgery, Dec. 2008, pp. 285-291. |
Robert J. Chiu et al., “Fractionated Photothermolysis: The Fraxel 1550-nm Glass Fiber Laser Treatment,” Facial Plastic Surgery Clinics of North America (2007), vol. 15, Issue 2, May 2007, pp. 229-237. |
Hans Laubach et al., “Effects of Skin Temperature on Lesion Size in Fractional Photothermolysis,” Lasers in Surgery and Medicine, Jan. 2007, pp. 14-18. |
Oana G. Cula et al., “Bidirectional Imaging and Modeling of Skin Texture,” IEEE Engineering of Medicine and Biology Society, Nov. 2004, pp. 1-6. |
Examiner's First Report in Application No. 2008260040, dated Apr. 13, 2012, 2 pages. |
Notice to File a Response in Application No. 10-2008-7006079, dated Aug. 6, 2012, 10 pages. |
Notice to File a Response in Application No. 10-2008-7006079, dated Jun. 25, 2013, 5 pages. |
Notice of Reasons for Rejection for Application No. 2008-526284, dated Apr. 18, 2012, 10 pages. |
Notification of the Second Office Action for Application No. 200880009579.0, dated Mar. 1, 2012, 4 pages. |
Office Action for Application No. 2009148819, dated May 30, 2012, 7 pages. |
Notification of the Third Office Action for Application No. 200880009579.0, dated Jan. 7, 2013, 8 pages. |
Notice to File a Response in Application No. 10-2008-7006041, dated Jan. 29, 2013, 10 pages. |
Chujit Jeamsinkul, “MasqueArray Automatic Makeup Selector/Applicator”, Nov. 11, 1998, Rochester Institute of Technology, 79 pages. |
Office Action for Japanese Patent Application No. 2009-549296, dated Apr. 30, 2013, 12 pages. |
Office Action for Korean Patent Application No. 10-2009-7019063, dated Mar. 24, 2014, 8 pages. |
Examination Report for Canadian Patent Application No. 2,618,706, dated Jul. 31, 2014, 3 pages. |
Weyrich et al., “Analysis of Human Faces using a Measurement-Based Skin Reflectance Model,” Association for Computing Machinery, Inc. 2006, pp. 1-12 (1013-1024). |
Donner et al., “A Layered, Heterogeneous Reflectance Model for Acquiring and Rendering Human Skin” ACM Transactions on Graphics, vol. 27, No. 5, Article 140, Publication date: Dec. 2008 pp. 1-12. |
Examination Report for Canadian Patent Application No. 2,618,519, dated Jan. 16, 2015, 2 pages. |
Examination Report for Australian Patent Application No. 2013200395, dated Feb. 5, 2015, 4 pages. |
Examination Report for Canadian Patent Application No. 2,618,706, dated Jul. 30, 2015, 4 pages. |
Examination Report for Canadian Patent Application No. 2,618,519, dated Jul. 9, 2015, 3 pages. |
Patent Examination Report No. 1 for Australian Patent Application No. 2014200087, dated Jun. 30, 2015, 3 pages. |
Examination Report for Canadian Patent Application No. 2,618,706, dated Jun. 17, 2016, 3 pages. |
Examination Report for Indian Patent Application No. 5301/CHENP/2009, dated Jan. 19, 2017, 7 pages. |
European Search Report for Application No. 11160161.3 dated Mar. 31, 2017, 4 pages. |
European Examination Report for Application No. 11160161.3 dated Apr. 11, 2017, 6 pages. |
European Examination Report for Application No. 11160161.3 dated Mar. 22, 2018, 6 pages. |
Number | Date | Country |
---|---|---|
20180315219 A1 | Nov 2018 | US |
Number | Date | Country | |
---|---|---|
60889298 | Feb 2007 | US | |
60889297 | Feb 2007 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15269091 | Sep 2016 | US |
Child | 16028948 | | US |
Parent | 14068894 | Oct 2013 | US |
Child | 15269091 | | US |
Parent | 13476320 | May 2012 | US |
Child | 14068894 | | US |
Parent | 12029534 | Feb 2008 | US |
Child | 13476320 | | US |