Imaging apparatus and methods for capturing and analyzing digital images of the skin

Information

  • Patent Grant
  • Patent Number
    7,764,303
  • Date Filed
    Friday, September 28, 2007
  • Date Issued
    Tuesday, July 27, 2010
Abstract
An imaging station for taking a plurality of digital images of a subject under a plurality of illuminating conditions, and for storing and analyzing the digital images, includes a housing containing the digital image capturing apparatus (e.g., a camera or video recorder), a computer for processing the image data, and one or more displays for displaying images of the subject. The imaging station aids in controlling lighting during image capture and may be used to optimally position the subject for imaging. The computer may be programmed to conduct various image processing functions and may be networked to allow image sharing. A display, which may be provided on the exterior of the housing, allows an operator to visualize the subject and to control the imaging process. The imaging station may be used for teaching purposes.
Description
FIELD OF THE INVENTION

The present invention relates to apparatus and methods for assessing and analyzing the skin and more particularly to digital imaging and analysis of digital photographs taken of a subject. The analysis may be quantitative and comparative relative to another photograph of the same subject taken at another time or relative to a photograph or photographs taken of another person or persons.


BACKGROUND OF THE INVENTION

Various imaging systems have been proposed that photographically capture images of a person's face for analysis of the health and aesthetic appearance of the skin. Different images, captured at different times or under different lighting conditions can be used and/or compared to one another to gain insight into the condition of the skin and its response to treatment. This was typically done by human operators inspecting the photographs to identify certain visual indicators of skin condition and to ascertain changes between photographs. It would be beneficial for imaging systems to be less reliant on human perception and manual input. For example, in analyzing the skin, e.g., of a person's face, it is beneficial to examine specific regions of the face for specific associated attributes, since the different regions of the face are specialized in form and function and interact with the environment differently. Some skin imaging systems utilize a trained human operator to identify facial regions by manually touching (a stylus to a touch-sensitive input/output screen) or pointing to (with a cursor and clicking or otherwise indicating) fiducial points on a displayed facial image or drawing (with a stylus or cursor/mouse) polygons on an image to identify the facial regions of interest. While effective, such manual operations are labor intensive and require trained operators. It would therefore be beneficial for imaging systems to identify facial regions on images automatically to increase the speed and consistency of identification of the facial regions and to decrease the reliance upon operator input.


While the science of digital skin imaging analysis has identified various skin responses that are useful indicators of various skin condition parameters, it would still be desirable to identify and use additional specific responses of the skin that are indicative of skin condition. For example, it would be beneficial to identify skin imaging techniques that indicate photodamage and that reliably quantify and measure such indicative skin response. One of the indicators of skin condition is color response. Skin color is also important relative to the selection and design of cosmetics. There are limitations inherent in the expression of color in terms of RGB pixel intensity. It would therefore be beneficial to improve current methods of assessing skin color/brightness, e.g., for assessing skin condition and/or for the selection of and design of skin products, such as cosmetics.


Since imaging is a technical activity using complex apparatus, it remains an objective to improve the user-friendliness of digital imaging and analysis apparatus and to promote the ease and effectiveness of a digital imaging session, as well as enhancing the interaction between therapists/clinicians and patients/clients undergoing digital imaging sessions.


SUMMARY OF THE INVENTION

The disadvantages and limitations of known apparatus and methods for skin imaging and analysis are overcome by the present invention, which includes an imaging station for capturing images of a subject in a given ambient lighting environment. The imaging station has a housing with an aperture where the subject is presented for capturing images of the subject, a digital image capture device, a light for illuminating the subject during image capture, and a computer for controlling the image capture device and the light. The housing contains the digital image capture device and the light, and at least partially limits ambient light, if present in the ambient lighting environment, when a digital image of the subject is captured. In one embodiment of the present invention, the imaging station has the capability of taking a plurality of digital images of a subject under a plurality of illuminating conditions, and of storing, displaying and analyzing the digital images.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagrammatic view of a digital imaging system in accordance with the present invention.



FIG. 2 is a perspective view of an image capturing apparatus in accordance with an embodiment of the present invention.



FIGS. 3 and 4 are phantom views of the imaging station of FIG. 2, looking from the back and side, respectively.



FIG. 5 is a phantom view like FIGS. 3 and 4 but showing an alternative embodiment of the present invention.



FIG. 6 is a perspective view of a light guide in accordance with an embodiment of the present invention.





DETAILED DESCRIPTION OF THE INVENTION


FIG. 1 diagrammatically illustrates a digital imaging apparatus 10 having a source of illuminating light 12 (e.g., a photo flash) and a digital camera 14. An image of the subject S is recorded by the camera 14 in the form of a matrix 16 of pixel values in RGB format (red, green, blue). The matrix 16 of values is typically produced by a Bayer-filtered charge-coupled device (CCD) and the information is stored in a memory device, such as random access memory (RAM) of computer 17 or on a flash memory card. The RGB data can be separated into channels or planes, R, G and B, one plane for each color. Illuminating light LI of various frequencies, disposed at varying positions relative to the subject S, may be used to capture digital images of the subject S in order to capture different information about the skin of the subject. Filters 18, 20 may be employed on the light 12 and the camera 14, respectively, to control the light frequency/polarity of light LI which is projected on the subject S, as well as controlling the light LR (reflected or emitted from the subject S), which is admitted into the camera 14. Imaging of this type is described at length in U.S. patent application Ser. No. 10/008,753, entitled, “Method of Taking Images of the Skin Using Blue Light and the Use Thereof”, which was published as Pub. No. US 2004/0146290 A1, U.S. patent application Ser. No. 10/978,284, entitled, “Apparatus for and Method of Taking and Viewing Images of the Skin”, which was published as Pub. No. US 2005/0195316 A1, and U.S. patent application Ser. No. 11/169,813, entitled, “Skin Imaging System with Probe”, which was published as Pub. No. US 2006/0092315 A1, all of which are incorporated in their entirety by reference herein and which are appended hereto as Appendices 1-3. Unless specifically described otherwise, the present invention as described herein in reference to FIGS. 1-4 has the features and functionality disclosed in the foregoing U.S. patent applications.
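As a minimal sketch of the separation described above, the stored RGB matrix can be split into its three color planes. The array shape and pixel values below are illustrative only, standing in for the matrix 16 produced by the camera's sensor:

```python
import numpy as np

# Hypothetical 4x4 RGB image as an HxWx3 matrix of 8-bit pixel values.
image = np.zeros((4, 4, 3), dtype=np.uint8)
image[..., 0] = 200  # red values
image[..., 1] = 120  # green values
image[..., 2] = 60   # blue values

# Separate the RGB data into its three planes, one per color.
r_plane = image[..., 0]
g_plane = image[..., 1]
b_plane = image[..., 2]

print(r_plane.shape)  # each plane is a 4x4 matrix of intensities
```

Each plane can then be stored, displayed, or analyzed independently, which is the basis for the channel-specific analyses discussed later.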
The foregoing applications disclose various correlations between skin condition and the images produced by various photographic parameters, i.e., specific combinations of illuminating light LI, filters 18, 20, etc. The skin response to different illuminating frequencies, angles, polarity, etc. can reveal information about skin condition and this evidence of skin condition can be recorded and recalled in digital images for purposes of analysis.


Since the images recorded are in digital form, i.e., in numerical pixel intensity values, the images lend themselves to quantitative analysis, such as by a program or programs residing on computer 17. For example, instead of a skilled human observer merely noting that the cheek of a subject appears more or less red in an image taken at time T2 than in an image taken at time T1, the values of the intensity of the red pixels in the specific area of the cheek at times T1 and T2 may be quantitatively compared. For example, the two values may be mathematically processed to determine the actual change in red intensity for that pixel. Digital image quantification can be used to discern average values for the skin in specified regions, e.g., by summing the values of pixels in the specific region of interest and then dividing by the number of pixels. In this manner, a whole area of skin, e.g., of the face, may be characterized quantitatively. Various other quantified analyses may be conducted, e.g., the imaged area of skin may be tested for standard deviation in pixel intensity.
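The region-averaging and comparison steps described above can be sketched as follows; the two cheek crops and their intensity values are illustrative, not measured data:

```python
import numpy as np

# Hypothetical red-channel crops of the same cheek region, captured
# at times T1 and T2 (8-bit intensities; example values only).
cheek_t1 = np.array([[100, 110], [105, 115]], dtype=np.float64)
cheek_t2 = np.array([[120, 130], [125, 135]], dtype=np.float64)

# Average red intensity over the region: sum of the pixel values
# divided by the number of pixels, as described in the text.
mean_t1 = cheek_t1.sum() / cheek_t1.size
mean_t2 = cheek_t2.sum() / cheek_t2.size

# Quantitative change in redness between the two sessions.
delta = mean_t2 - mean_t1
print(delta)  # 20.0

# Spread of intensities within the region (standard deviation),
# another of the quantified analyses mentioned above.
spread_t2 = cheek_t2.std()
```

The same pattern extends to any channel and any region of interest once the region's pixels have been identified.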


The present invention recognizes that digital imaging apparatus 10 may include communications apparatus such as modems, e.g., cable and DSL modems, to communicate digital image data and related communications from a digital imaging computer 17 to a remote computer 20 that may be in the home or workplace of an interested person, such as a fellow clinician, cosmetologist, a patient or a cosmetics customer. The receiver can then review the image data and other related communication on the monitor 22 of their computer 20. These communications may be conducted over the Internet using wired or wireless connections. In addition to sending image data to others, the imaging apparatus 10 may also receive image and related data from others, such as fellow clinicians, patients, etc. via the same means of communication. For example, a cosmetics customer may capture their own image with a local CCD camera 24 and relay that image to the imaging system computer 17 via the Internet, e.g., as an e-mail attachment. The image may be examined and a responsive communication prepared by a cosmetologist and sent to the customer. The responsive communication may contain images, e.g., past images taken of the customer that are relevant to the customer, images relevant to cosmetics, etc. These can be sent as attachments to an e-mail message. The present invention also recognizes that a cell phone 26 may be used to capture and communicate image information and related communications between parties at the imaging station 10 and parties remote therefrom.



FIG. 2 shows an imaging station 30 having a generally spherical outer housing 32. The housing 32 may be supported on a flat surface such as a counter or table or be provided with a rotatable mounting, as would be known to one of ordinary skill in the art. A chin rest 34 is positioned at the open end of a tunnel or “funnel” 36, which extends into the interior of the housing 32. At the other end of the tunnel 36, opposite the chin rest 34, a self-visualizing panel 38, such as a mirror or, alternatively, a display, such as a flat panel display, allows a user of the imaging station 30 to see themselves when they are positioned with their chin on the chin rest 34. An operator display 40 on the exterior surface of the housing 32 allows an operator, such as a dermatologist, clinician or cosmetologist, to view digital images, e.g., those taken of a subject who uses the imaging station 30. The operator display 40 may be a touch-screen display to function as an operator interface whereby the operator can provide input to the imaging station, e.g., in the form of commands, data, menu selections, or graphic entry by way of drawing on the display 40 with a stylus. After digital images have been captured, the operator display 40 may be used to display the captured images to the operator and to the subject person, who can simultaneously view and discuss the images and what is shown in them.



FIGS. 3 and 4 show that the imaging station 30 has a digital camera 42 positioned proximate the self-visualizing panel 38 for capturing digital images of a subject person (user of the imaging station). A light 44, such as a strobe light or other photo flash, is positioned proximate to the camera 42 for illuminating the subject when photos are taken. Filter wheels 46, 48, with associated positioning motors and position sensors (not shown), filter the light emitted from the light 44 and entering the camera 42, respectively. The light 44, camera 42 and/or associated filter wheels 46, 48 may all be moveable to achieve different orientations for imaging. Computer 50, e.g., a personal computer (PC), controls the imaging session, providing instructions to the operator through a speaker and/or on the operator display 40 and receiving input from the operator, powering and triggering the light 44, the filter wheels 46, 48 and the camera 42. The digital images are stored in the camera 42 and by the computer 50 on at least one storage device, such as in RAM and/or on a DVD recorder 52. The housing also contains other necessary peripherals, such as strobe power pack 54, a power supply 56 for the computer 50 and a power supply 58 for the operator display 40. The positioning of the various components of the imaging station 30 is determined by their physical packing and accommodation into the housing 32, and the components may be repositioned as needed depending upon their specific exterior dimensions.


In documenting and assessing skin from cosmetic and health perspectives, the better controlled the light properties (e.g., reflection, scattering, polarization, excitation wavelength and excitation bands), the better the resulting data and analysis of the skin. FIGS. 5 and 6 show light guides 60 and 62, which can be used to control light emitted from light 44 and captured by camera 42, respectively. This is because the light leaving a light source (i.e., a strobe) will naturally begin to scatter and disperse. Light guide 60 helps to maintain the proper optical properties in order to illuminate a face or other object while minimizing any interference from surfaces, de-polarization of light, or fluorescence of materials (objects or surfaces) other than the analyzed face or object, which would cause the light to produce poor quality images. In addition, scattered light captured by camera 42 may interfere with the sharpness of the image. Light guide 62 helps to eliminate this scattered light. The light guides 60, 62 may be formed of one or more flat or curved surfaces placed in front of the light 44 and camera 42, respectively. The light guides 60, 62 may be formed into a square, flared, conical, frustoconical, or round hood. Preferably, the color and texture of the interior surfaces 64 of the light guides 60, 62 absorb light, e.g., being painted or made of a flat black material. The color and/or texture of the outside surfaces should not affect the quality of the image captured by the camera 42. It should be understood that the light guides 60, 62 depicted in FIG. 4 could be recessed further into the housing 32 by moving the light 44 and the camera 42 deeper into the housing 32, e.g., behind screen 38. As a result, the light guides 60, 62 can be hidden within the housing 32. As shown in FIG. 5, the filters 46′, 48′ can be disposed on a common wheel 47.


The imaging station 30 may be provided with all the image capture and processing apparatus described in Publication Nos. 2004/0146290, 2005/0195316 and 2006/0092315, which is contained within the housing 32. The imaging station preferably has the capability to rotate at least 90 degrees, e.g., either on a supporting surface or rotatable platform, and/or to rotate up and down and/or lift or drop in height to accommodate users of various heights seated on various seating arrangements. The tunnel 36, in conjunction with the chin rest 34 and self-visualizing panel 38, cooperates to position the subject person in an optimal position (relative to the light 44 and camera 42) for imaging the face of the subject person. The tunnel 36 also reduces the intrusion of environmental lighting during imaging sessions, limits the light projected into the environment when the light 44 is operated during an imaging session and provides the subject person with privacy. The operator display 40 may be used to visualize the subject person before, during and after imaging, i.e., the camera 42 may be operated in real-time/video mode to show the moving image of the subject person as they place themselves on the chin rest 34 and otherwise prepare to undergo imaging (close their eyes, etc.). This real-time feed may be used to display the subject on the operator display 40, as well as on the self-visualizing panel 38 (display). In this manner, the operator display 40 may be used to display the view that the camera 42 “sees”. As shown, e.g., in Publication No. 2005/0195316 A1, the camera 42 and illuminating light or lights 44 would be positioned adjacent to the self-visualizing panel 38. In the case of a mirror-type self-visualizing panel, the mirror material may be partially or entirely made of “one-way” glass or a half-mirror.
Of course, if the functionality of subject self-visualization is not needed or desired, e.g., in the context of an imaging station 30 which is not intended for subject control, or one which relies on verbal commands to the subject and does not need visual cues to the subject, the self-visualizing panel 38 may be omitted or replaced with a clear panel, such as clear glass or plastic. With respect to ensuring that the subject person has their eyes closed, the operator display 40 may be used to apprise the operator of this condition, so the operator may then know it is safe to trigger imaging, e.g., by depressing a virtual button on the operator display 40.


The computer 50 is preferably provided with a plurality of data ports, e.g., USB and/or Ethernet ports and/or 1394 “firewire”, wireless capacity or other interface standards, for sharing the data captured by the imaging station 30 with other computers, e.g., on a network like the Internet, and for receiving data from other computers for display on the imaging station 30, e.g., imaging data from population studies, skin care product information, etc. A printer (not shown) may also be used in conjunction with the imaging station 30 to print out images, advice and/or information concerning the skin condition and treatment of the subject person.


As described in a commonly owned patent application entitled, Method and Apparatus for Identifying Facial Regions, which was filed on Oct. 2, 2006 as Provisional Application No. 60/848,741 and which is incorporated in its entirety herein, quantitative analysis of skin condition can be facilitated and improved by identifying specific regions of the skin, e.g., of the face. This identification of regions of interest can be conducted automatically through pupil or flash glint identification as fiducial reference points for mapping the facial regions.
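The idea of using the pupils as fiducial reference points for mapping facial regions can be sketched as follows. The pupil coordinates, region names, and offset ratios below are all illustrative assumptions for the sketch, not the empirically determined relationships of the referenced application:

```python
# Sketch: once the pupils are located (e.g., by dark-circle or flash-glint
# detection), facial-region boxes are placed at offsets scaled by the
# interpupillary distance. All ratios here are hypothetical examples.

def facial_regions(left_pupil, right_pupil):
    """Return example bounding boxes (x0, y0, x1, y1) keyed by region name."""
    (lx, ly), (rx, ry) = left_pupil, right_pupil
    ipd = rx - lx                 # interpupillary distance in pixels
    cy = (ly + ry) / 2            # vertical midpoint of the pupils
    return {
        "forehead":    (lx - 0.25 * ipd, cy - 1.2 * ipd, rx + 0.25 * ipd, cy - 0.5 * ipd),
        "left_cheek":  (lx - 0.5 * ipd,  cy + 0.3 * ipd, lx + 0.3 * ipd,  cy + 1.0 * ipd),
        "right_cheek": (rx - 0.3 * ipd,  cy + 0.3 * ipd, rx + 0.5 * ipd,  cy + 1.0 * ipd),
        "chin":        (lx,              cy + 1.2 * ipd, rx,              cy + 1.8 * ipd),
    }

# Hypothetical pupil centers found in a 1024x768 image:
regions = facial_regions((400, 300), (520, 300))
print(regions["forehead"])
```

Because the boxes scale with the interpupillary distance, the same ratios apply regardless of how close the subject is to the camera.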


As described in a commonly owned patent application entitled, Apparatus and Method for Analyzing Skin Using L*a*b* Colorspace, which was filed on Oct. 2, 2006 as Provisional Application No. 60/848,768 and which is incorporated in its entirety herein, identification of skin photo-response indicative of skin condition, quantitative analysis of skin condition, color and brightness can be facilitated and improved by converting digital image data from RGB format to L*a*b* format. As described in Application No. 60/848,768, the process for converting RGB image data to L*a*b* colorspace data is known to one of ordinary skill in the art, e.g., as described in Charles Poynton, A Technical Introduction to Digital Video (J. Wiley & Sons) Chapter 7, “Color Science.” Application No. 60/848,768 also discloses that facial regions may be identified by an operator marking a displayed image with a cursor tool or by programmatically analyzing the image data to identify pixel values consistent with unique fiducial reference points, such as the pupils, which would have unique color (black or flash glint), shape (round), size, spacing and orientation. Once pupils are identified, facial regions may be calculated relative thereto in accordance with empirically determined spatial/dimensional relationships to the pupils. RGB to L*a*b* conversion may be used to aid in selecting cosmetics for an individual from a palette of available cosmetic colors or may be used to select/design hues of cosmetics in defining a color palette for use by a population.


As described in a commonly owned patent application entitled, Apparatus and Method for Measuring Photodamage to Skin, which was filed on Oct. 2, 2006 as Provisional Application No. 60/848,767 and which is incorporated in its entirety herein, the green response intensity of skin to illumination by blue light can be used to identify and quantify photodamage to the skin, e.g., due to the presence of elastotic material. Further, a population's digital skin imaging data concerning photodamage and/or other skin conditions can be characterized numerically and analyzed statistically. Quantitative analysis of digital images of the skin response, such as the degree of variation of green signal response over a surface of the face as an indicator of photodamage, can be used to assess photodamage, and develop a number or score of skin condition relative to a population.
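One way to turn the green-response variation described above into a population-relative score is sketched below. The coefficient-of-variation metric, the simulated green-channel data, and the population statistics are all illustrative assumptions, not the method of the referenced application:

```python
import numpy as np

def photodamage_score(green_region, population_mean, population_std):
    """Variation of the green response over a region (coefficient of
    variation), expressed as a z-score against hypothetical population
    statistics for the same metric."""
    cv = green_region.std() / green_region.mean()
    return (cv - population_mean) / population_std

# Simulated green-channel crop of a facial region imaged under blue light
rng = np.random.default_rng(0)
region = rng.normal(loc=90.0, scale=12.0, size=(64, 64))

# Hypothetical population statistics for the coefficient of variation
score = photodamage_score(region, population_mean=0.10, population_std=0.03)
print(round(score, 2))
```

A higher score would indicate more variation in green signal response than the population norm, which the text identifies as an indicator of photodamage.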


As described in a commonly owned patent application entitled, Calibration Apparatus and Method for Fluorescent Imaging, which was filed on Oct. 2, 2006 as Provisional Application No. 60/848,707 and which is incorporated in its entirety herein, a calibration standard may be employed during an imaging session, e.g., during blue fluorescent photography, to identify variations in illumination intensity between images taken at different times. Having identified a circumstance where illumination intensity has varied, the operator of the imaging apparatus can be notified to correct the conditions leading to illumination variations. Alternatively, the imaging apparatus may compensate for the variations by adjusting the intensity of the illuminating light or normalizing the images by digital image processing.
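The normalization alternative mentioned above can be sketched as a simple gain correction derived from the calibration standard. The patch values and image below are illustrative, and the linear-gain model is an assumption for the sketch:

```python
import numpy as np

def normalize_to_reference(image, patch_value, reference_patch_value):
    """Scale an image so that its calibration-patch reading matches the
    reference session's reading, compensating for illumination drift
    (assuming response is linear in illumination intensity)."""
    gain = reference_patch_value / patch_value
    return image * gain

session1_patch = 200.0                  # calibration standard, session 1
session2_patch = 100.0                  # same standard reads dimmer in session 2
session2_image = np.full((2, 2), 90.0)  # stand-in for the session-2 image

corrected = normalize_to_reference(session2_image, session2_patch, session1_patch)
print(corrected[0, 0])  # 180.0
```

A large gain (far from 1.0) could also serve as the trigger for notifying the operator that illumination conditions need correction.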


Each of the inventions disclosed in the foregoing applications incorporated by reference may be utilized and incorporated in the imaging station 30 of the present invention. In doing so, certain synergies are realized through their combination and interrelationship. For example, the capability of manually or automatically identifying facial regions has utility in the color ascertainment of certain facial regions, e.g., the cheek or the “average color” of the combination of cheek, forehead and chin. This capability of identifying skin color in certain regions can be helpful in matching cosmetics to those regions or combinations of those regions. For example, in selecting or designing colors for cosmetic foundations, it would be beneficial to know the average color of the facial regions: cheek, forehead and chin and exclude the lips, eyebrows and eyes from the contributing colors.


The identification of facial regions may also be used in the imaging station 30 in conjunction with the teachings of the application directed to ascertaining photodamage, in that specific regions of the skin are more prone to photodamage due to skin composition and exposure likelihood. The calibration of the apparatus and techniques taught in the application pertaining to same can be used in the imaging station 30 in that any set of digital images can be normalized to compensate for variations in illumination intensity as distinguished from variations in response intensity attributable to skin variation.


The imaging station 30 provides an apparatus and method of obtaining digital image data information concerning the appearance and light response characteristics of a person's skin. Since the present invention can be used by numerous persons, the data obtained from the use of the present invention, as well as data obtained from other sources, such as digital images made by others using different apparatus and methods, may be used to analyze the individual relative to others. As noted, e.g., in the Application entitled Apparatus and Method for Measuring Photodamage to Skin, which is incorporated by reference herein, digital images can be analyzed quantitatively to provide a discrete measurement of skin condition(s). An alternative, traditional method for assessing skin condition is through the judgment of a person based upon the appearance of the image. Over the years it has been shown that experienced clinicians can accurately and effectively discern changes in skin condition based upon their visual perception, e.g., in comparing two images taken at two different times. Accordingly, the present invention recognizes the value of human perceptive judgments of skin condition in forming an accurate assessment of skin condition. This human perceptive measure can be used in parallel with quantitative, automated methods to confirm one with the other and thereby increase the credibility of both.


The present invention includes a method for training, implementing and testing visual analysis of images of subject persons. More particularly, to train persons to perceptively and accurately judge skin condition based upon images, training sets of images may be assembled. A training set may be a set of images taken of a person over time, which illustrates their skin condition over time. The image set can be divided into various categories, e.g., one category may be images taken with blue light and sensitive to green signal response of the skin to indicate photodamage. The image sets can then be reviewed by one or more persons who are trained professionals with a proven proficiency in ascertaining skin condition based upon review of images of the skin of subject persons. The proficient individual or individuals will then order the image set into a range of conditions extending from the worst condition to the best condition. This ordering can be conducted in conjunction with quantified analysis. For example, six images may be considered and arranged from best to worst, illustrating the spectrum of skin condition exhibited by the subject person relative to photodamage. Having developed this reference set or training set of images, persons to be trained are taught the visual indicators to look for to discern photodamage. The trainees are then shown the training image set, e.g., two images at a time, randomly selected by a computer, and asked to characterize which of the two displayed images is better or worse, e.g., with respect to photodamage. In presenting the images, the position of the better and worse images (either in the right or left, upper or lower positions on the display) is randomized, such that the display position does not correlate to the skin condition.
After the trainees can successfully identify the better and worse images consistently, i.e., match the conclusions of a professional with demonstrated expertise, then the trainee can be adjudged to have assimilated the necessary perception to discern skin condition from digital images.
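The randomized pairwise drill and its scoring against the expert ordering can be sketched as follows. The image labels, trial count, and pass criterion are illustrative assumptions:

```python
import random

# Expert-ordered training set, best (index 0) to worst. Labels are
# hypothetical placeholders for the six images described in the text.
expert_order = ["img_a", "img_b", "img_c", "img_d", "img_e", "img_f"]
rank = {name: i for i, name in enumerate(expert_order)}

def run_drill(answer, trials=20, seed=1):
    """Present randomly chosen pairs in randomized left/right positions;
    `answer(left, right)` returns the image the trainee judges worse.
    Returns the fraction of trials agreeing with the expert ordering."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        left, right = rng.sample(expert_order, 2)   # random pair, random positions
        expert_worse = left if rank[left] > rank[right] else right
        if answer(left, right) == expert_worse:
            correct += 1
    return correct / trials

# A trainee who has fully assimilated the expert's ordering:
perfect = lambda l, r: l if rank[l] > rank[r] else r
print(run_drill(perfect))  # 1.0
```

A trainee whose agreement fraction stays consistently near 1.0 across repeated drills would, in the terms of the text, be adjudged to have assimilated the necessary perception.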


Assuming that a person is available with suitable training in judging skin condition, the same automated technique of presenting a plurality of images and receiving the judgment of better or worse can be applied to the process of evaluating an individual's skin condition over time. For example, if a person has had a series of imaging sessions, e.g., ten sessions over a period of two years, then a clinician can be automatically and randomly presented with a series of images taken from these imaging sessions and can categorize each as better or worse than another image displayed. This process can be repeated until an unbiased, repeating pattern of relative ranking is clearly established, demonstrating a reproducible clinical judgment. From this judgment, the subject person can be advised as to when their skin exhibited its best condition, its worst condition, the pattern of improvement or worsening, etc. This type of analysis can be applied to measure the success or failure of treatment and can be used in conjunction with quantitative methods to provide a balanced, informed analysis.


The foregoing methods for training and/or clinically assessing skin condition can be conducted on the imaging station 30. For example, after an imaging session, the clinician can perform a clinical assessment witnessed by the subject person.


It should be understood that the embodiments described herein are merely exemplary, and that a person skilled in the art may make many variations and modifications without departing from the spirit and scope of the invention. All such variations and modifications are intended to be included within the scope of the invention.

Claims
  • 1. An imaging station for capturing images of a subject in a given ambient lighting environment, comprising: a housing with an aperture where the subject is presented for capturing images thereof; a camera for capturing digital images; a light for illuminating the subject during image capture; a first display visible to a user and disposed in the housing for displaying a digital image of the subject captured by said camera and permitting the subject to see themselves as they appear when presented to said aperture; a second display visible on the exterior of the housing for displaying a digital image of the subject captured by said camera, said second display presenting an operational interface through which the imaging station may be controlled; a computer for controlling the camera, said first and second displays and the light, said housing containing the camera and the light and including a tunnel extending from said aperture toward said camera and said light, said tunnel at least partially limiting ambient light, if present in the ambient lighting environment, when a digital image of the subject is captured and when viewing images on the first display.
  • 2. The imaging station of claim 1, wherein said second display is mounted on a side of said housing generally parallel to and beside said tunnel.
  • 3. The imaging station of claim 2, further comprising a first moveable filter for effecting the light emitted by the light for illuminating the subject, such that a plurality of different types of images may be captured with the imaging station.
  • 4. The imaging station of claim 3, wherein said first moveable filter is disposed on a filter wheel within the housing, rotatable before said light to assume at least two positions, a first position filtering light from said light and a second position not filtering light from said light.
  • 5. The imaging station of claim 4, further comprising a second moveable filter having a plurality of positions relative to said camera, a first position filtering light entering said camera and a second position not filtering light entering said camera.
  • 6. The imaging station of claim 5, wherein the first and second moveable filters are disposed on a common filter wheel.
  • 7. The imaging station of claim 1, wherein said housing has a chin rest disposed proximate said aperture for receiving a chin of the subject, said housing being adjustable in position relative to a subject to position the subject's chin on said chin rest.
  • 8. The imaging station of claim 1, further comprising a light guide for guiding at least one of light emitted from said light and light directed toward said camera from the subject.
  • 9. The imaging station of claim 8, wherein said light guide has a tunnel shape with an aperture at either end and a lumen passing therebetween.
  • 10. The imaging station of claim 1, wherein said computer is programmed to identify a plurality of facial regions and to convert RGB colorspace images captured by the imaging station to L*a*b* colorspace values to facilitate image processing and quantitative analysis.
  • 11. An imaging station in accordance with claim 1, further comprising: a plurality of said imaging stations coupled to a network, said network allowing the communication of images to selected ones of said imaging stations.
  • 12. The imaging station of claim 11, wherein an image may be communicated to a selected imaging station as an e-mail attachment.
  • 13. The imaging station of claim 11, wherein a cell phone image may be communicated to a selected imaging station through said network.
  • 14. An imaging system, comprising: a plurality of imaging stations capable of capturing and storing digital images of persons, each imaging station having a digital image capture device, an illuminating light, at least one filter for changing the illuminating light, such that a plurality of types of light are available for capturing a plurality of different types of digital images and a computer for controlling said image capture device, said filter and said illuminating light, said plurality of imaging stations coupled to a network, said network allowing the communication of images to selected ones of said imaging stations, wherein said computer is programmed to quantitatively analyze digital images, such that a plurality of digital images may be quantitatively ordered relative to each other based upon a given criterion exhibited by the plurality of digital images.
  • 15. The imaging system of claim 14, wherein said computer is programmed to present the plurality of digital images to a trainee to test the trainee's capacity to differentiate between condition states of skin as revealed by the plurality of digital images and to record the degree of success/failure exhibited by the trainee.
  • 16. The imaging system of claim 15, wherein the plurality of images are presented in pairs and the trainee is asked to rank the pair of images as to a particular condition state.
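Claim 10 recites converting RGB colorspace images to L*a*b* colorspace values for quantitative analysis. The patent does not spell out the conversion, but the standard per-pixel sRGB → XYZ → L*a*b* transform (D65 reference white) can be sketched as follows; `srgb_to_lab` is an illustrative helper name, not a function from the patent.

```python
# Sketch of a standard sRGB -> CIE L*a*b* conversion (D65 white point).
# General colorimetric formulas; not the patent's specific implementation.

def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB channel values to CIE L*a*b* (D65)."""
    def lin(c):
        # Undo sRGB gamma companding to get linear light.
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = lin(r), lin(g), lin(b)

    # Linear RGB -> CIE XYZ using the sRGB/D65 matrix.
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl

    # Normalize by the D65 reference white.
    xn, yn, zn = x / 0.95047, y / 1.00000, z / 1.08883

    def f(t):
        # CIE nonlinearity, with the linear segment near black.
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116

    fx, fy, fz = f(xn), f(yn), f(zn)
    L = 116 * fy - 16          # lightness
    a = 500 * (fx - fy)        # green-red axis
    b_out = 200 * (fy - fz)    # blue-yellow axis
    return L, a, b_out
```

L*a*b* is useful here because its L* (lightness) and a*/b* (chromatic) axes are approximately perceptually uniform, so differences computed on them track visible skin-color differences better than raw RGB distances.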
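Claims 14 through 16 describe quantitatively ordering a plurality of images by a given criterion and presenting image pairs to a trainee for ranking. A minimal sketch of that ordering and pairing logic is below; the per-image `score` function is an assumed stand-in for whatever quantitative criterion the station computes (the patent does not fix one).

```python
# Hypothetical sketch of claims 14-16: order images by a quantitative
# criterion, then form pairs for a trainee ranking exercise.
from itertools import combinations

def order_images(images, score):
    """Return images sorted by the given quantitative criterion."""
    return sorted(images, key=score)

def make_training_pairs(images, score):
    """Yield image pairs whose criterion values differ, so each pair
    has a defined 'correct' ranking for scoring the trainee."""
    ranked = order_images(images, score)
    return [(a, b) for a, b in combinations(ranked, 2)
            if score(a) != score(b)]
```

With, say, a redness metric as the criterion, `make_training_pairs(images, lambda im: im["redness"])` would produce every distinguishable pair, and the trainee's answers could be checked against the known ordering.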
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Patent Application Ser. No. 60/848,765 filed Oct. 2, 2006, the disclosure of which is incorporated herein by reference in its entirety.

US Referenced Citations (93)
Number Name Date Kind
2005426 Land Jun 1935 A
3353884 Chaffee et al. Nov 1967 A
3904293 Gee Sep 1975 A
3971065 Bayer Jul 1976 A
4070096 Jasgur Jan 1978 A
4170987 Anselmo et al. Oct 1979 A
4398541 Pugliese Aug 1983 A
4592726 Brilliant Jun 1986 A
4842523 Bourdier Jun 1989 A
4905700 Wokalek et al. Mar 1990 A
4911544 Walsh Mar 1990 A
5005975 Kawai et al. Apr 1991 A
5016173 Kenet et al. May 1991 A
5198875 Bazin et al. Mar 1993 A
5241468 Kenet Aug 1993 A
5363854 Martens et al. Nov 1994 A
5446515 Wolfe et al. Aug 1995 A
5456260 Kollias et al. Oct 1995 A
5539540 Spaulding et al. Jul 1996 A
5556612 Anderson et al. Sep 1996 A
5640957 Kaminski Jun 1997 A
5742392 Anderson et al. Apr 1998 A
5760407 Margosiak et al. Jun 1998 A
5785960 Rigg et al. Jul 1998 A
5836872 Kenet et al. Nov 1998 A
5945112 Flynn et al. Aug 1999 A
5973779 Ansari et al. Oct 1999 A
5991433 Osanai et al. Nov 1999 A
6018586 Kamei Jan 2000 A
6021344 Lui et al. Feb 2000 A
6032071 Binder Feb 2000 A
6076904 Shepherd et al. Jun 2000 A
6081612 Gutkowicz-Krusin et al. Jun 2000 A
6134011 Klein et al. Oct 2000 A
6148092 Qian Nov 2000 A
6208749 Gutkowicz-Krusin et al. Mar 2001 B1
6215893 Leshem et al. Apr 2001 B1
6251070 Khazaka Jun 2001 B1
6278999 Knapp Aug 2001 B1
6293284 Rigg Sep 2001 B1
6317624 Kollias et al. Nov 2001 B1
6419638 Hay et al. Jul 2002 B1
6436127 Anderson et al. Aug 2002 B1
6437856 Jacques Aug 2002 B1
6441854 Fellegara et al. Aug 2002 B2
6507747 Gowda et al. Jan 2003 B1
6510366 Murray et al. Jan 2003 B1
6597392 Jenkins et al. Jul 2003 B1
6600947 Averback et al. Jul 2003 B2
6603552 Cline et al. Aug 2003 B1
6619860 Simon Sep 2003 B1
6624843 Lennon Sep 2003 B2
6711426 Benaron et al. Mar 2004 B2
6728560 Kollias et al. Apr 2004 B2
7004599 Mullani Feb 2006 B2
7015929 Satomi et al. Mar 2006 B2
D564663 Udagawa et al. Mar 2008 S
20010013897 Kowno et al. Aug 2001 A1
20020059030 Otworth et al. May 2002 A1
20020065468 Utzinger et al. May 2002 A1
20020071246 Stewart Jun 2002 A1
20020093698 Kagawa Jul 2002 A1
20020133080 Apruzzese et al. Sep 2002 A1
20020161664 Shaya et al. Oct 2002 A1
20020177778 Averback et al. Nov 2002 A1
20020181752 Wallo et al. Dec 2002 A1
20020182235 Slavtcheff et al. Dec 2002 A1
20030007071 Goto Jan 2003 A1
20030045916 Anderson et al. Mar 2003 A1
20030067545 Giorn et al. Apr 2003 A1
20030086703 Kollias et al. May 2003 A1
20030086712 Merola et al. May 2003 A1
20030108542 Pruche et al. Jun 2003 A1
20030138249 Merola et al. Jul 2003 A1
20030191379 Benaron et al. Oct 2003 A1
20040006553 de Vries et al. Jan 2004 A1
20040020509 Waisman Feb 2004 A1
20040077951 Lin et al. Apr 2004 A1
20040125996 Eddowes et al. Jul 2004 A1
20040136701 Nakanishi et al. Jul 2004 A1
20040146290 Kollias et al. Jul 2004 A1
20040174525 Mullani Sep 2004 A1
20040186363 Smit et al. Sep 2004 A1
20040263880 Ito et al. Dec 2004 A1
20050116039 Zhu et al. Jun 2005 A1
20050129288 Chen et al. Jun 2005 A1
20050131304 Stamatas et al. Jun 2005 A1
20050146863 Mullani Jul 2005 A1
20050195316 Kollias et al. Sep 2005 A1
20060092315 Payonk et al. May 2006 A1
20070002479 Menke et al. Jan 2007 A1
20070004972 Cole et al. Jan 2007 A1
20070005393 Cole et al. Jan 2007 A1
Foreign Referenced Citations (33)
Number Date Country
0682236 Nov 1995 EP
0737932 Oct 1996 EP
1089208 Apr 2001 EP
1118845 Jul 2001 EP
1194898 Mar 2003 EP
1297782 Apr 2003 EP
1376444 Jan 2004 EP
1433418 Jun 2004 EP
1434156 Jun 2004 EP
1541084 Jun 2005 EP
2821152 Aug 2002 FR
208 314 Apr 2003 FR
2106241 Apr 1983 GB
2293648 Apr 1996 GB
7075629 Mar 1995 JP
7323014 Dec 1995 JP
WO 9424936 Nov 1994 WO
WO 9616698 Jun 1996 WO
WO 9705473 Feb 1997 WO
WO 9747235 Dec 1997 WO
WO 9824360 Jun 1998 WO
WO 9837811 Sep 1998 WO
WO 9917668 Apr 1999 WO
WO 0076398 Dec 2000 WO
WO 0104839 Jan 2001 WO
WO 0122741 Mar 2001 WO
WO 0122869 Apr 2001 WO
WO 0135827 May 2001 WO
WO 0145557 Jun 2001 WO
WO 0172216 Oct 2001 WO
WO 0182786 Nov 2001 WO
WO 02061405 Aug 2002 WO
WO 03040878 May 2003 WO
Related Publications (1)
Number Date Country
20080079843 A1 Apr 2008 US
Provisional Applications (1)
Number Date Country
60848765 Oct 2006 US