Multispectral detection and presentation of an object's characteristics

Information

  • Patent Grant
  • Patent Number
    11,051,697
  • Date Filed
    Tuesday, October 15, 2013
  • Date Issued
    Tuesday, July 6, 2021
Abstract
An apparatus for capturing a multispectral image of an object is described. The apparatus includes one or more means for transmitting a beam of laser light at a first wavelength and a beam of laser light at one or more additional wavelengths different from the first wavelength. There is a means for causing the beams of laser light to travel in a coaxial path, and a moving mirror. The beams of light bounce off the mirror, thereby producing a two-dimensional projection pattern. This pattern travels from the mirror along a first path to an object, and some of the laser light penetrates the object and travels to an internal structure of the object. The reflection of the laser light returns to a photo detector along a path different from said first path.
Description
FIELD OF THE INVENTION

The present invention relates to improvements in multispectral imaging for determining the characteristics of an object, and more particularly to improvements which are capable of providing imaging of internal structure through trans-illumination apparatus and techniques.


BACKGROUND OF THE INVENTION

The human visual system is able to detect light in a range of wavelengths that is typically described as “visible light.” The longest wavelengths detected appear red, the mid-range appears green, and the shortest wavelengths appear blue. Long wavelength light such as infrared and short wavelength light such as ultraviolet are invisible to the human eye. The characteristics of an object that we can determine with the unaided eye are limited to those that can be detected in this spectrum. Furthermore, the trichromatic system used by the eye is broadband in nature and cannot resolve narrowband features such as would be seen by a spectrophotometer.


Several products have reached the market that emit infrared light onto an object and use the reflected light to detect a pattern of varying contrast in the infrared spectrum. The device then projects an image that follows those contrast changes using a wavelength within the visible spectrum. One such product, the AccuVein AV300, detects a pattern of absorption and reflection in the infrared and re-projects that pattern as red. Given that hemoglobin absorbs infrared light to a greater degree than the surrounding tissue, the projected pattern can be used by a medical practitioner to identify the position of a vein to be used for venipuncture.


In other products, the light is captured and the processed image is displayed on a remote screen such as an LCD panel or through an eyepiece that is in line with the object.


These contrast enhancement products act as color shifters. Just as the human eye would detect variations in absorption and reflection in the three colors it can see (red, green and blue), these contrast enhancers detect the variations at wavelengths outside the visible spectrum and display the corresponding pattern inside the visible spectrum.


OBJECTS OF THE INVENTION





    • 1. It is an object of the invention to use a laser camera to detect characteristics of an observed object based on the reflection and absorption of the laser light or based on the re-emission of absorbed light at a different wavelength than the incident light.

    • 2. It is an object of the invention to use a single or multiple wavelengths of laser light to detect characteristics of an observed object based on the reflection and absorption of the laser light or based on the re-emission of absorbed light at a different wavelength than the incident light.

    • 3. It is an object of the invention to use multi-spectral imaging by capturing light from wavelengths beyond just the visible light range, such as infrared and UV. This allows extraction of additional information that the human eye fails to capture with its receptors for red, green and blue.

    • 4. It is an object of the invention to use hyper-spectral imaging by capturing information from a plurality of wavelengths including and beyond the visible light range, such as infrared and UV. This allows extraction of additional information that the human eye fails to capture with its broadband receptors.

    • 5. It is an object of the invention to detect characteristics of the observed object both of the surface of the observed object when it is opaque and of the surface and below the surface when the object is translucent.

    • 6. It is an object of the invention to improve the quality of detection of characteristics of the observed object by iteratively varying in real time the intensity of the light emitted by the laser camera based on the previously detected characteristics of the observed object.

    • 7. It is an object of the invention to present detected characteristics of the observed object back on to the object itself or on to a display visible to the user of the device or both using contrast, color, false color, icons or text or a combination of these modalities.

    • 8. It is an object of the invention to capture detected characteristics of the observed object for the purpose of record keeping or for the purpose of post processing or for the purposes of detecting changes and trends in the observed object or a combination of these purposes.

    • 9. It is an object of the invention to combine the detected characteristics of the observed object with external sources of data for the purposes of refining and/or extending the meaning of the detected characteristics.

    • 10. It is an object of the invention to improve the detection characteristics of the system by using transillumination.

    • 11. It is an object of the invention to detect characteristics of many types of objects and materials including veins, arteries, teeth, metals and plastics.





BRIEF SUMMARY OF THE INVENTION

The invention disclosed herein extends the concepts described in the parent applications in several novel ways, which can be used individually or in combination.

    • 1. By using more than one wavelength of light for analysis, additional characteristics about the object being scanned can be determined, and this additional information can be re-projected back on to the surface within the visible spectrum, making these characteristics visible to a human either as a color-shifted image or as a false-color image. In this embodiment the device acts like a photo spectrometer that re-projects a visible image back on the object.
    • 2. Contrast enhancement products rely on differential absorption and reflection of light (i.e. they detect contrast changes) and then re-project that contrast pattern. An alternative embodiment can also use fluorescence of the object being scanned, by shining light of one wavelength on to the object, detecting light at another wavelength returned from the object, and then re-projecting this information back on to the surface within the visible spectrum, making these characteristics visible. In this embodiment the device acts like a spectroscope that re-projects a visible image back on the object.
    • 3. The invention can further use fluorescence or color change of a material applied to the object being scanned that, based on the characteristics of the object, exhibits either a color change (and can therefore use contrast enhancement) or fluorescence at one or more wavelengths of incident light.


The invention can be further enhanced by combining some or all of these techniques to detect and project different characteristics of the object being scanned and projecting them back on the object.


While many of the descriptions are for embodiments that use re-projection back on to the object under study as the user interface, embodiments with user interfaces remote from the object such as an LCD screen will be useful in many applications. Furthermore, the combination of re-projection and remote displays will also be useful in many applications.


While the devices and systems described herein focus on multispectral systems and describe specific embodiments of said devices and systems, the methods, features, functions, abilities, and accessories described in the parent applications apply fully.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE FIGURES


FIG. 1 shows the different absorption spectra for human deoxygenated hemoglobin and for oxygenated hemoglobin.



FIG. 2 shows a laser camera system having multiple frequency lasers for multispectral imaging applications.



FIG. 3A shows the laser camera system of FIG. 2, but with a pair of photo detectors positioned to avoid laser light reflected from the skin surface, and positioned for trans-illumination of an internal structure.



FIG. 3B shows the laser camera system of FIG. 3A, but where the photo detectors are configured to have their field-of-view restricted by optical elements to areas of the skin of the body part that are not directly illuminated by the laser light.



FIG. 4 shows a 5×5 array of photo detectors.



FIG. 5 shows a photo detector ring with a circular array of photo detectors arranged around the body part to be penetrated to detect laser light scattered from an inner structure.



FIG. 6 shows a block diagram of a closed loop projection system capable of capturing images with high dynamic range.



FIG. 7 shows a flow chart illustrating the functioning of the system of FIG. 6.



FIG. 8 shows a laser camera system for capturing images of a tooth.



FIG. 9A shows a color filter which limits the response of the photo detector.



FIG. 9B shows a first filter and a second filter being used to limit the response of the photo detector.



FIG. 9C shows the first and second filters of FIG. 9B with a pair of electronic shutters also being used to electro-optically limit the response of the photo detector.



FIG. 9D shows a grating being used to transmit light to three different color-resolved detectors.





DETAILED DESCRIPTION OF THE INVENTION

As is well known in the art, a laser camera works by emitting one or more laser beams and moving those beams in a pattern such that the beams cross over the area of an object of which an image is to be captured. A photo detector element in the camera captures the changes in light reflected from the object and uses those changes to create an image of the observed object. Note that, as with a traditional camera, “object” should be read in this explanation as an object or as a group of objects (e.g., an apple or a still life that includes an apple). The pattern in which the beam is moved is unimportant as long as the position at which it strikes the object can be determined either directly or inferentially. Examples of patterns that can be used include raster and Lissajous scans.
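
The mechanics of this capture process can be illustrated with a short sketch. The Python fragment below is only a minimal model under stated assumptions: a hypothetical read_detector(x, y) function is assumed to return the photo detector reading while the beam dwells at a known scan position, and a raster scan is used, although any pattern whose positions are known would serve.

```python
import numpy as np

def capture_frame(read_detector, width=128, height=128):
    """Assemble an image from a raster-scanned laser camera.

    read_detector(x, y) is assumed to return the photo detector
    reading while the beam illuminates position (x, y); the scan
    pattern only matters insofar as the beam position at each
    sample time is known.
    """
    frame = np.zeros((height, width))
    for y in range(height):          # one scan line at a time
        for x in range(width):       # step the beam across the line
            frame[y, x] = read_detector(x, y)
    return frame

# Example with a synthetic detector: a dark (absorbing) disc on a
# bright background, standing in for a reflective object.
def fake_detector(x, y, cx=64, cy=64, r=20):
    return 0.2 if (x - cx) ** 2 + (y - cy) ** 2 < r ** 2 else 0.9

image = capture_frame(fake_detector)
```

The essential point is that the image is assembled from a single detector plus knowledge of where the beam was pointing at each instant, rather than from a pixel array.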


As previously mentioned, the AccuVein AV300 is a laser camera system that uses a single infrared laser scanned over the object (in this case the human body) to determine the position of hemoglobin as a proxy for the position of a vein. The device uses the general characteristic of hemoglobin in that it absorbs infrared light to a greater degree than surrounding tissue. As seen in FIG. 1, there is a slightly different absorption spectrum for deoxygenated hemoglobin (as would be found in veins) and oxygenated hemoglobin (as would be found in arteries).


Through the addition of a second infrared laser of a different wavelength, or through the addition of a tunable laser allowing the wavelength to be changed, the invention could detect the difference between a vein and an artery. The user interface could then use one or more of several techniques to indicate the type of hemoglobin detected.


While a range of techniques could be used for the detection algorithm, in one embodiment the following truth table could be used (a code sketch of this logic follows the list):

    • 1. Less reflection seen in either wavelength when compared to surrounding tissue? Position contains hemoglobin.
    • 2. Wavelength one reflection < wavelength two reflection? Position contains oxygenated hemoglobin.
    • 3. Wavelength one reflection > wavelength two reflection? Position contains deoxygenated hemoglobin.
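
A non-authoritative sketch of that truth table in Python is shown below; the margin tolerance and the 'tissue'/'hemoglobin' return labels are assumptions added for illustration and are not drawn from the patent.

```python
def classify_pixel(refl_w1, refl_w2, refl_surround, margin=0.05):
    """Apply the illustrative truth table to one scan position.

    refl_w1, refl_w2 -- reflection measured at wavelengths one and two
    refl_surround    -- typical reflection of the surrounding tissue
    margin           -- hypothetical tolerance to suppress noise

    Returns one of: 'tissue', 'oxygenated', 'deoxygenated', 'hemoglobin'.
    """
    # 1. Less reflection at either wavelength than surrounding tissue?
    if refl_w1 >= refl_surround - margin and refl_w2 >= refl_surround - margin:
        return 'tissue'            # no hemoglobin indicated
    # 2. Wavelength one reflection < wavelength two reflection?
    if refl_w1 < refl_w2 - margin:
        return 'oxygenated'
    # 3. Wavelength one reflection > wavelength two reflection?
    if refl_w1 > refl_w2 + margin:
        return 'deoxygenated'
    return 'hemoglobin'            # present, but type indeterminate
```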


Furthermore, a range of techniques could be used for the user interface. These include:

    • 1. Project an image back on to the object that is scanned using a visible wavelength laser, showing contrast changes between “hit” areas and surrounding areas.
    • 2. Project said areas using continuously variable brightness to track the contrast changes.
    • 3. Project said areas using enhanced contrast to highlight the position of the detected hemoglobin.
    • 4. Project said areas using a color map (sometimes known as false color) where different colors represent different characteristics.


While the use of infrared wavelengths to detect different types of hemoglobin is used for this illustrative example, there are many characteristics well known in the art that can be determined from the absorption spectrum of an object and to which the invention would be equally suited.


One embodiment of the invention uses one or more data capture techniques as discussed previously and provides user feedback by re-projecting a re-colored image back on to the area being scanned. Since it is possible for one or more of the wavelengths of light being captured to overlap with the wavelengths of light being projected, it is necessary to implement one or more techniques to prevent the projected light from being confused with the detected light.


These techniques include the following and can be combined:

    • 1. Detect for a short period (e.g., a pixel time) and project for a short period.
    • 2. Detect for a scan line and project for a scan line.
    • 3. Detect for multiple scan lines and project for a scan line.
    • 4. Detect for a scan line and repeat project for multiple scan lines.
    • 5. Detect for a frame and project for a frame.


These illustrative uses of asymmetrical detection and projection allow a balance between the amount of time that might be needed for capturing and processing the captured information and the need to have a sufficiently high projection rate to provide a good user experience. Other asymmetrical combinations are possible.
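
As an illustration of one such asymmetry, the sketch below tags each scan line of a frame as a detection line or a projection line, implementing technique 4 from the list above (detect for one scan line, then repeat the projection over several lines); the 1:4 ratio is an arbitrary assumption, not a value taken from the text.

```python
def schedule_scan_lines(total_lines, detect_lines=1, project_lines=4):
    """Interleave detection and projection scan lines.

    Returns a list of 'detect' / 'project' tags, one per scan line.
    The 1:4 ratio is an illustrative example only.
    """
    schedule = []
    period = detect_lines + project_lines
    for line in range(total_lines):
        if line % period < detect_lines:
            schedule.append('detect')
        else:
            schedule.append('project')
    return schedule

# e.g. schedule_scan_lines(10) -> ['detect', 'project', 'project',
#                                  'project', 'project', 'detect', ...]
```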


An alternative embodiment of the invention is one in which diffuse light of one or more wavelengths is emitted and then reflected by the object under study. A digital camera using technology such as CMOS or CCD sensors captures an image of the object being studied to determine the reflection/absorption spectrum of the object. By controlling the emitted light wavelengths or by modifying the sensitivity spectrum of the image sensor, the spectral characteristics of the object can be determined.


Examples of modifying the response characteristics of the image sensor have been seen in the literature. For example, in “Laser Focus World” there is a discussion in an article titled “CMOS imager with mosaic filter detects skin disorders.” Another related discussion is found in another article with the title “MEDICAL IMAGING: Real-time multispectral imager promises portable-diagnosis.”


Transillumination



FIG. 2 shows a laser camera having multiple frequency lasers for multispectral imaging applications. Multiple lasers (Laser Freq 1-Laser Freq N) are combined so that they exit coaxially from the Beam Combiner. They then bounce off a biaxial moving mirror (or separate X and Y mirrors) to produce a two-dimensional projection pattern. The pattern travels along Path O to a Body Part. Some of the Laser Freq wavelengths penetrate the Body Part and travel to the Internal Structure. The various Laser Freq wavelengths each interact with the internal structure in differing ways (varying levels of absorption and reflection). The reflections of Laser Frequencies 1-N return to Photo Detector A and Photo Detector B along Path A and Path B respectively. Each Photo Detector may be, for example, a photo diode.


In one mode of operation, each Laser Freq 1-N is sequentially turned on for one frame of projection. The reflected light received at Photo Detectors A+B for that frame is then stored in a first frame memory location (not shown). In this manner, by sequentially stepping through Freqs 1-N, a multispectral image is stored in sequential frames of memory locations 1-N.
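
A minimal sketch of this mode of operation follows, assuming a hypothetical laser object with on()/off() methods and a capture_frame() call that returns the detector readings for the current frame; the real hardware interface is not specified here.

```python
import numpy as np

def capture_multispectral(lasers, capture_frame, height=128, width=128):
    """Sequentially enable each laser for one frame and store the result.

    lasers        -- list of objects with hypothetical .on()/.off() methods,
                     one per Laser Freq 1-N
    capture_frame -- function returning a (height, width) array derived from
                     the photo detectors for the current frame

    Returns an N x height x width cube: one frame memory location per
    wavelength, mirroring the sequential mode described above.
    """
    cube = np.zeros((len(lasers), height, width))
    for n, laser in enumerate(lasers):
        laser.on()                 # only Laser Freq n is active this frame
        cube[n] = capture_frame()  # reflected light at Photo Detectors A+B
        laser.off()
    return cube
```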


A characteristic of the system shown in FIG. 2 is that some portion of the projected Laser Freqs 1-N is reflected off the surface of the Body Part back to the Photo Detectors A+B. In the case where one is only interested in the characteristics of the Internal Structure, the reflections off the surface of the Body Part are essentially “noise” to the system. External structures, such as hair, scars, curvature of the body part, and differences in reflectivity of exterior regions of the Body Part, all have the effect of generating noise that detracts from imaging the Internal Structure. Algorithms can be written to help distinguish between the Internal Structure and the “noise”; however, such algorithms are rarely perfect.


One method of penetrating deeper into the Body Part to see deeper Internal Structures is to increase the power output of the Lasers 1-N. However, as laser power is increased, the reflections off the external surface of the Body Part also increase. Eventually the Photo Detectors A+B, and the associated circuitry after them (not shown), become saturated and the details of the Internal Structure get washed away.



FIG. 3A shows a system similar to that of FIG. 2 except that the Photo Detectors A and B are moved and placed in such a way that no light from Lasers 1-N reflected from the surface of the skin can reach them. For example, they may be physically touching the skin of the Body Part (FIG. 3A). This type of system will be referred to hereinafter as a transillumination laser system, wherein the light of Lasers 1-N, upon hitting the Internal Structure, is carried internal to the Body Part, with some portion of the light (shown as Path A and Path B) eventually hitting Photo Detector A and/or B, which are placed against the skin of the Body Part. Accordingly, the Laser Light that reaches Photo Detector A and/or B varies as a function of the Internal Structure's absorption and reflection of the Laser Light. For example, the presence of a highly absorptive tissue in the light path would decrease the signal generated by the Photo Detectors, while the presence of a highly scattering tissue would increase it. It should be noted that the position of the Photo Detectors does not need to be on the side opposite the output laser Path O. The Photo Detectors can be placed anywhere on the Body Part, as long as sufficient internally carried light manages to reach the Photo Detectors.


Nor do the Photo Detectors have to be physically touching the skin of the Body Part. Instead, they may be configured to have their Field-of-View (FOV) restricted to areas of the skin of the Body Part which are not directly illuminated by Lasers 1-N (FIG. 3B). The FOV may be shaped by lenses, Fresnel lenses, curved mirrors or other optical elements. Additionally, the FOV of the Photo Detectors does not have to be stationary. Instead, it can move synchronously with the scanning system in such a manner that no light from Lasers 1-N reflected from the surface of the skin can reach them.


In the transillumination laser system of FIG. 3A/FIG. 3B, as the intensity of the Lasers 1-N is increased, none of the reflections off the surface of the Body Part are projected onto Photo Detector A or B. Accordingly, the power of Lasers 1-N can be significantly increased to allow for imaging of deeper Internal Structures without concern for saturation due to reflections off the surface of the Body Part. Further, surface artifacts such as hair and surface blemishes are largely ignored. Essentially, the transillumination system allows for a greater signal to noise separation between the internal structure (the signal) and the reflections occurring off the surface of the Body Part (the noise). This allows for a much higher contrast ratio image of the underlying Internal Structure. In both FIGS. 2 and 3A/3B, individual Laser Freqs 1-N were shown as individual blocks. It is also possible to purchase a tunable laser that can output a wide range of laser frequencies. This eliminates the need for the Beam Combiner of FIGS. 2 and 3A/3B. OpoTek Inc. sells a tunable laser system called the VIBRANT (HE) 355 II which can output wavelengths in the range of 410-2400 nm. In such a case, the tunable laser will be set to a desired frequency for a single frame and the appropriate image captured. The tunable laser is then set to the next frequency and its corresponding image is captured. This cycle repeats until all laser frequencies are cycled. This allows for a very large number of frequencies to be utilized.


Alternatively, a wide-band laser, which emits light of different wavelengths simultaneously, may be used. Such lasers are known to be constructed with the active medium confined to an optical fiber containing various doping elements with overlapping emission spectra. Alternatively, pulsed lasers with ultra-short pulses may be used, where the spectrum is broadened by the sidebands of the frequencies associated with the pulse duration. One example of such a laser is a mode-locked laser.


In this case, different wavelengths will be detected by Photo Detectors with different spectral responses. In one embodiment, identical Photo Detectors with broadband response may have color filters which limit the response of each Detector to a narrow band of wavelengths (FIG. 9A). Alternatively, the filters may be applied selectively, by moving or masking parts of the filter either mechanically (FIG. 9B) or electro-optically (FIG. 9C), using electronically-controlled optical elements such as LCD shutters. Yet alternatively, color-resolved Detectors may be used, where the light of different wavelengths is directed toward different detector elements by a grating or other suitable optical element (FIG. 9D).


In a transillumination laser system, single, multiple, or arrays of Photo Detectors may be used instead of the two Photo Detectors shown in FIG. 3A/FIG. 3B. FIG. 4 shows a 5 by 5 array of Photo Detectors. This array of Photo Detectors is then placed in contact with the Body Part to receive the internally reflected light of Lasers 1-N. In this embodiment the array can be placed anywhere on the Body Part except along optical Path O of FIG. 3A/FIG. 3B. A large array of Photo Detectors increases the photo detection area, thereby capturing more of the internally reflected light. Further, the Photo Detector array can uniformly distribute the receiving Photo Detectors over an area so that it more uniformly receives the internally reflected light. In this manner, “hot spots” associated with fewer Photo Detectors can be minimized.



FIG. 5 shows an embodiment wherein a Photo Detector Ring is placed around the Projected Area of the Lasers 1-N. More specifically, it is a view from the perspective of the mirrors of FIG. 3A/FIG. 3B. In this embodiment, the ring is placed against the surface of the Body Part in a position such that the light of Lasers 1-N projected along Path O in FIG. 3A/FIG. 3B falls inside the inner edge of the Ring. The Lasers 1-N penetrate into the Body Part and interact with the Inner Structure. The light of Lasers 1-N scatters inside the Body Part, with a portion of the light being returned to the Photo Detector Ring, wherein it is detected. The detected light corresponds to the Inner Structure. In this embodiment, light from Lasers 1-N scattering off the surface of the Body Part does not reach the Photo Detectors on the Photo Detector Ring, and therefore, does not interfere with the signal created when the Lasers 1-N interact with the Inner Structure. Accordingly, the power of the Lasers 1-N can be increased substantially to reach deeper Inner Structures without having the surface reflections creating “noise”.


In FIGS. 2 to 6 the Photo Detectors are not shown with the electronics attaching them to a system. Such connectivity between the Photo Detectors and the system can be via a wired connection, a wireless connection, an optical connection, or any other transmission technique. Accordingly, a wide array of devices can be envisioned. For example, the Photo Detector array of FIG. 4 can be built into an armrest of a phlebotomy chair. In this case, when a person's arm is placed down on the armrest, the Photo Detectors are in contact with the skin. Alternatively, the photo detector array of FIG. 4 can be a wireless patch which gets affixed with some type of temporary adhesive to the body part and which wirelessly communicates the output of the Photo Detectors to the system.


The transillumination laser systems described herein can be utilized as a multispectral system for detecting bruising and erythema (which might indicate developing pressure ulcers). For example, an article in Laser Focus World titled "MEDICAL IMAGING: Real-time multispectral imager promises portable-diagnosis" describes a conventional CCD camera system having a masked filter array for receiving images at the following wavelengths of light: 460, 525, 577 and 650 nm for detection of bruising, or 540, 577, 650 and 970 nm for detection of erythema. However, such a system differs significantly from the transillumination laser system in that the CCD camera receives the light reflected off the skin, and therefore, does not have the same contrast ratio (or signal to noise performance) as a transillumination laser system utilizing the same wavelengths of light for viewing events under the skin. Accordingly, a transillumination laser system utilizing the wavelengths of, for example, 460, 525, 540, 577, 650 and 970 nm can be configured as described in FIGS. 3 to 6 for the detection of both bruising and erythema. The CCD camera system described is further limited in that the number of pixels of the CCD array is reduced by the masked filter. Accordingly, the density of the CCD imaging gets divided down by the number of frequencies in the mask. The laser system does not have this limitation in that a complete frame can be taken with each frequency of laser light.
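
For illustration only, the wavelength bookkeeping implied by this paragraph can be written down as follows; the lists simply restate the figures quoted from the cited article, and the combined set is the six-wavelength example given above.

```python
# Wavelength sets (in nm) quoted above from the Laser Focus World article.
BRUISING_NM = [460, 525, 577, 650]
ERYTHEMA_NM = [540, 577, 650, 970]

# Combining both sets lets one transillumination pass cover bruising and
# erythema, capturing one full frame per wavelength.
COMBINED_NM = sorted(set(BRUISING_NM + ERYTHEMA_NM))
# -> [460, 525, 540, 577, 650, 970]
```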


While the laser system of FIG. 2 and the transillumination laser systems of FIGS. 3 to 5 have been described herein with regard to multispectral laser systems, wherein more than one frequency of light is utilized, the transillumination described herein is applicable to the single frequency detection systems described in the parent applications hereto for the detection of blood vessels within a body.


Closed Loop Projection


Traditional CCD cameras have a large number of pixels that provide a high-resolution image. However, with conventional CCD cameras, each of the pixels has a common exposure time, and the camera lens typically has a single optical aperture setting per picture. Accordingly, it is very difficult to take a very good picture of a very bright item positioned very close to a very dim item. For example, if you were to attempt to take a picture of a seagull next to the sun, and you set the exposure time short and/or the lens aperture opening so small (higher F number) that the sun does not saturate the CCD pixels, you could image the sun but the image of the seagull would be washed out. Conversely, if you set the exposure time long and the lens aperture opened wide (smaller F number), you could image the bird but the sun would saturate the CCD pixels corresponding to it.


Described in FIG. 6 is a closed loop laser imaging system that is capable of capturing images with very high dynamic range. In the parent application hereto, laser image capture systems are described in which the projected laser light is provided by a raster scanned laser beam. In FIG. 6, the laser, scanning mirrors, photo diodes, and mirror drives can all be the same as previously described.


In FIG. 6 the laser beam brightness is controlled by a high speed DAC (digital to analog converter). This DAC is capable of varying the intensity of the laser at a very high rate (hundreds to thousands of times in each horizontal scan). Each segment of a duration corresponding to a desired resolution of the image will be referred to hereafter as a pixel. Each pixel of the image has a memory location in the Frame Memory Buffer. Each pixel has a defined location on the object, defined by a time slot in the frame.


A Photo Detector (or multiple Photo Detectors or a Photo Detector array) receives the reflected light and provides a corresponding voltage to the Amplifier (DC coupled). The output is then provided to the Comparator (One Bit Logic Output), which in turn provides one bit of data. That one bit indicates whether the laser was “too bright” or “too dark” for that pixel. The result is then stored as pixel brightness information and is updated with every frame. The stored pixel brightness is changed up or down depending on the Photo Detector bit. For maximum light contrast sensitivity, pixel data is always changed by at least one bit every frame. In this manner the closed loop projection system is constantly capturing the image.



FIG. 7 is a flow chart illustrating the functioning of the system of FIG. 6.


Depending on the bits of brightness resolution, the system requires multiple frames to fully capture an image. For example, for 8 bits (256 shades), new image capture requires 8 frames. At 60 frames per second, that is about 0.13 seconds to capture. After capture, the image is maintained and updated with every frame. Since the laser brightness (the DAC setting) is adjusted for each pixel, the reflected light for each pixel approaches one value. That value is the midpoint of the analog Photo Detector signal range. This scheme allows the highest contrast sensitivity and highest DC gain in the front end, because the analog signal approaches a flat line. Therefore the dynamic range of the system is not limited by the dynamic range or speed of the Photo Detector amplifier chain.
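
One plausible reading of the "8 bits in 8 frames" figure, together with the per-frame tracking update described above, is a per-pixel successive-approximation search followed by a one-step nudge every subsequent frame. The sketch below is that interpretation, not a confirmed description of the circuit; project_and_compare(dac) is a hypothetical stand-in for driving the per-pixel DAC values for one frame and reading back the comparator bits.

```python
import numpy as np

def capture_successive_approximation(project_and_compare, shape, bits=8):
    """Acquire one image in `bits` frames, one comparator bit per frame.

    project_and_compare(dac) -- assumed hardware call: drives every pixel
    with the per-pixel DAC values for one frame and returns a boolean
    array that is True wherever the photo detector saw "too bright".
    """
    dac = np.zeros(shape, dtype=np.int32)
    for bit in reversed(range(bits)):           # MSB first
        trial = dac | (1 << bit)                # tentatively set this bit
        too_bright = project_and_compare(trial)
        dac = np.where(too_bright, dac, trial)  # keep the bit only if not too bright
    return dac

def track_frame(dac, too_bright, step=1, dac_max=255):
    """After capture, nudge every pixel by at least one step per frame."""
    return np.clip(np.where(too_bright, dac - step, dac + step), 0, dac_max)
```

After convergence the DAC values themselves are the captured image, which is why the analog front end only ever has to resolve small departures from the midpoint.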


It is also possible to further increase the dynamic range and speed up the data acquisition of a closed loop laser imaging system by employing a fast, moderate resolution ADC in place of the single Comparator described above, while still varying the laser power on a pixel-by-pixel basis to ensure nearly uniform brightness of the resulting image. In this case, the dynamic range of the system would be generally equal to the product of the bit resolutions of the laser driver and the ADC, while the number of frames needed to capture a full-resolution image would be equal to the quotient of the bit resolution of the laser driver divided by that of the ADC.
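
A small helper restating that bookkeeping, under the assumption that "product of the bit resolutions" refers to the product of the two ranges (i.e., the sum of the bit widths) and that each frame contributes the ADC's worth of bits per pixel:

```python
def closed_loop_resolution(driver_bits, adc_bits):
    """Rough bookkeeping for the ADC variant described above.

    These formulas restate the claim in the text; they are not
    independently derived limits.
    """
    dynamic_range_bits = driver_bits + adc_bits   # product of ranges = sum of bits
    frames_for_full_image = max(1, driver_bits // adc_bits)
    return dynamic_range_bits, frames_for_full_image

# Example: an 8-bit laser driver with a 4-bit ADC gives roughly 12 bits of
# dynamic range and needs about 2 frames for a full-resolution capture.
```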


The time period during the top scan line of the image is reserved for Laser calibration. During calibration, the laser is driven to a defined maximum and then minimum brightness. During minimum brightness, the DC bias on the Photo Detector amplifier is adjusted to compensate for any change in ambient room lighting.


While FIG. 6 describes a system with a single laser, it is possible to implement a multiple laser system utilizing the closed loop projection method. Each frequency of laser can be sequentially cycled for a frame. Alternatively, multiple photo detectors can be filtered, each arranged to receive only one of the specific frequencies of laser light. In this manner, within each frame each frequency of light can concurrently be processed as shown in FIG. 6. For example, red, green and blue lasers can be utilized, wherein each color has a corresponding Frame Buffer Memory. This would function as a color image capture device. As a further example, a multispectral system can be built utilizing the frequencies described above for detection of bruising and erythema. Further, any frequencies of laser can be utilized provided that the photo detectors are capable of receiving such frequencies.


Additionally, the information captured at one wavelength may be used to adjust the laser power of a different wavelength. Such wavelength cross-coupling may increase accuracy and/or shorten the acquisition time of a multispectral closed loop laser imaging system.


The multispectral laser system of FIG. 2, the Transillumination Laser System of FIGS. 3 to 5 and the Closed Loop Projection system of FIG. 6 can be combined in a single system so that the advantages of each are provided.


Further, the concept described in the parent applications hereto of adding a visible laser as one of the Laser Freqs 1-N can be applied to the multispectral laser system of FIG. 2, the Transillumination Laser System of FIGS. 3 to 5 and the Closed Loop Projection system of FIG. 6.


While it is described herein that the object is a Body Part, the multispectral laser system of FIG. 2, the Transillumination Laser System of FIGS. 3 to 5 and the Closed Loop Projection system of FIG. 6 can be utilized on objects other than Body Parts. For example, they can be used on metals for detecting stress fractures, or on plastic parts for detecting imperfections.


System for Evaluating Teeth



FIG. 8 shows a Laser Camera for capturing images of teeth. The Laser Camera can be designed as previously described in this application and the parent applications. A 1310 nm laser can be utilized as the laser source for imaging. It is known that light at 1310 nm partially passes through teeth. The presence of a cavity or other abnormality will interfere with the reflection of the light. The laser light is transmitted in a raster pattern (or other repetitive pattern) towards the teeth. Given that a tooth is relatively small, the laser beam is focused down to a very small spot size by the focusing lens within the Laser Camera. The maximum angle of the transmitted pattern is made relatively small so that the light falls on a single tooth (or a small number of teeth).


The Laser Camera can be configured as a Transillumination Laser Camera, as previously described. A Photo Detector Insert, containing multiple Photo Diodes, can be placed inside the mouth of the patient and pressed against the backside of the teeth. The Photo Detector Insert will receive the laser light that is transmitted through the tooth. The Photo Detector Insert can be molded out of a transmissive gummy material so that it slightly adheres to the backside of the teeth and provides an optical path for the 1310 nm light that scatters within the tooth, passing the light to the Photo Diodes. The light which is received by the Photo Detector Insert is converted to a signal (circuit not shown) which is then communicated (either wired or wirelessly) to the Laser Camera, where the results are clocked into an image memory. Once a frame of data is clocked into the image memory it can then be output on a Monitor where the user can view the image of the teeth.


The Laser Camera can be designed as a closed loop imaging system as described previously with reference to FIGS. 6 and 7. Without the closed loop imaging system, if there are gaps between the teeth, the projected laser will pass through such gaps and saturate the Photo Detector Insert. The very high dynamic range provided by the closed loop imaging system will be beneficial in being able to pick out subtle details, such as cavities and cracks, that are directly next to the very bright spots caused by the gaps between the teeth. The laser power can be substantially increased at specific pixels requiring more illumination, while being reduced at pixels requiring less illumination (such as the gaps between the teeth).


The Laser Camera can also be a multispectral camera as previously described, wherein the 1310 nm frequency is utilized with other frequency lasers for detecting other characteristics of the teeth.

Claims
  • 1. A trans-illumination imaging system, for use in capturing a multispectral image of a deep internal structure of an object without noise created by reflections from a surface of the object, and for projecting a high contrast-ratio image of the deep internal structure onto the surface of the object, said trans-illumination imaging system comprising: means for transmitting a beam of light comprising two or more different infrared wavelengths and one or more visible wavelengths; means for scanning said beam of light in a first direction and in a second direction, with a first maximum angle between said beam of light at a beginning and an end of said first scan direction, and with a second maximum angle between said beam of light at a beginning and an end of said second scan direction, for scanning said beam of light along a plurality of optical paths in a two dimensional scan pattern directed upon at least a portion of the surface of the object; an array of uniformly distributed photo detectors; means for restricting a field of view for each said photo detector; wherein each of said photo detectors of said array of photo detectors are positioned on an opposite side of the object from said means for scanning, and positioned to receive infrared light from the deep internal structure of the object not aligned with said plurality of optical paths of said scan pattern and not aligned with a reflection of said light of said plurality of optical paths after being reflected from the surface of the object; wherein each of said photo detectors is thereby configured to receive a contrasted image from said two or more infrared wavelengths of light after being differentially absorbed by, and reflected from, portions of the deep internal structure, without noise created by reflections from the surface of the object; and wherein each said photo detector is further configured to output a signal of the contrasted image, and to transmit said signal to said means for transmitting, for scanning of the contrasted image onto the surface of the object by said means for scanning, using said one or more visible wavelengths of light.
  • 2. The trans-illumination imaging system according to claim 1, wherein said photo detectors of said array and each said means for restricting a field of view are configured to move synchronously about the object to remain unaligned with said plurality of optical paths of said scan pattern.
  • 3. The trans-illumination imaging system according to claim 1, wherein said system is configured to image using a first power level and a second power level, with said second power level being higher than said first power level, for said second power level to image deeper into the object than said first power level.
  • 4. The trans-illumination imaging system according to claim 1, wherein each said means for restricting a field of view of said photo detectors of said array are positioned to be in contact with the surface of the object.
  • 5. The trans-illumination imaging system according to claim 1, said means for transmitting a beam of light being further configured to transmit six wavelengths of light comprising light at 460 nm, 525 nm, 540 nm, 577 nm, 650 nm, and 970 nm.
  • 6. A trans-illumination imaging system comprising: means for transmitting a beam of light comprising one or more different infrared wavelengths and one or more visible wavelengths; means for scanning said beam of light in a first direction and in a second direction, with a first maximum angle between said beam of light at a beginning and an end of said first scan direction, and with a second maximum angle between said beam of light at a beginning and an end of said second scan direction, for scanning said beam of light along a plurality of optical paths in a two dimensional scan pattern directed upon at least a portion of a surface of an object; an array of uniformly distributed photo detectors; means for restricting a field of view for each said photo detector; wherein each of said photo detectors of said array of photo detectors are positioned to be in contact with the surface of the object, and positioned to receive infrared light from the deep internal structure of the object not aligned with said plurality of optical paths of said scan pattern and not aligned with a reflection of said light of said plurality of optical paths after being reflected from the surface of the object; wherein each of said photo detectors is thereby configured to receive a contrasted image from said two or more infrared wavelengths of light after being differentially absorbed by, and reflected from, portions of the deep internal structure, without noise created by reflections from the surface of the object; and wherein each said photo detector is further configured to output a signal of the contrasted image, and to transmit said signal to said means for transmitting, for scanning of said contrasted image onto the surface of the object by said means for scanning, using said one or more visible wavelengths of light.
  • 7. The trans-illumination imaging system according to claim 6, wherein said photo detectors of said array and each said means for restricting a field of view are configured to move synchronously about the object to remain unaligned with said plurality of optical paths of said scan pattern.
  • 8. The trans-illumination imaging system according to claim 6, wherein said system is configured to image using a first power level and a second power level, with said second power level being higher than said first power level, for said second power level to image deeper into the object than said first power level.
  • 9. The trans-illumination imaging system according to claim 6, said means for transmitting a beam of light being further configured to transmit six wavelengths of light comprising light at 460 nm, 525 nm, 540 nm, 577 nm, 650 nm, and 970 nm.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 12/925,166, filed on Oct. 14, 2010, which claims priority on Provisional Patent Application Ser. No. 61/278,948, filed Oct. 14, 2009, and which is a continuation-in-part of: application Ser. No. 11/478,322, filed on Jun. 29, 2006, now issued as U.S. Pat. No. 8,478,386; U.S. patent application Ser. No. 11/700,729, filed Jan. 31, 2007; U.S. patent application Ser. No. 11/807,359, filed May 25, 2007, now issued as U.S. Pat. No. 8,489,178; U.S. patent application Ser. No. 12/215,713, filed Jun. 27, 2008; and U.S. patent application Ser. No. 11/823,862, filed Jun. 28, 2007, now issued as U.S. Pat. No. 7,983,738. All of the foregoing disclosures are hereby incorporated by reference.

US Referenced Citations (293)
Number Name Date Kind
3136310 Meltzer Jun 1964 A
3349762 Kapany Oct 1967 A
3511227 Johnson May 1970 A
3527932 Thomas Sep 1970 A
3818129 Yamamoto Jun 1974 A
3984629 Gorog Oct 1976 A
4030209 Dreiding Jun 1977 A
4057784 Tafoya Nov 1977 A
4109647 Stern et al. Aug 1978 A
4162405 Chance et al. Jul 1979 A
4182322 Miller Jan 1980 A
4185808 Donohoe et al. Jan 1980 A
4213678 Pomerantzeff et al. Jul 1980 A
4265227 Ruge May 1981 A
4312357 Andersson et al. Jan 1982 A
4315318 Kato et al. Feb 1982 A
4321930 Jobsis et al. Mar 1982 A
4393366 Hill Jul 1983 A
4495949 Stoller Jan 1985 A
4502075 DeForest et al. Feb 1985 A
4510938 Jobsis et al. Apr 1985 A
4536790 Kruger et al. Aug 1985 A
4565968 Macovski Jan 1986 A
4567896 Barnea et al. Feb 1986 A
4576175 Epstein Mar 1986 A
4586190 Tsuji Apr 1986 A
4590948 Nilsson May 1986 A
4596254 Adrian et al. Jun 1986 A
4619249 Landry Oct 1986 A
4669467 Willett et al. Jun 1987 A
4697147 Moran et al. Sep 1987 A
4699149 Rice Oct 1987 A
4703758 Omura Nov 1987 A
4766299 Tierney et al. Aug 1988 A
4771308 Tejima et al. Sep 1988 A
4780919 Harrison Nov 1988 A
4799103 Muckerheide Jan 1989 A
4817622 Pennypacker et al. Apr 1989 A
4846183 Martin Jul 1989 A
4861973 Hellekson et al. Aug 1989 A
4862894 Fujii Sep 1989 A
4883953 Koashi Nov 1989 A
4899756 Sonek Feb 1990 A
4901019 Wedeen Feb 1990 A
4926867 Kanda et al. May 1990 A
RE33234 Landry Jun 1990 E
4938205 Nudelman Jul 1990 A
5074642 Hicks Dec 1991 A
5088493 Giannini et al. Feb 1992 A
5090415 Yamashita Feb 1992 A
5103497 Hicks Apr 1992 A
5146923 Dhawan Sep 1992 A
5174298 Dolfi et al. Dec 1992 A
5184188 Bull et al. Feb 1993 A
5214458 Kanai May 1993 A
5222495 Clarke et al. Jun 1993 A
5261581 Harden, Sr. Nov 1993 A
5293873 Fang Mar 1994 A
5339817 Nilsson Aug 1994 A
5371347 Plesko Dec 1994 A
5406070 Edgar et al. Apr 1995 A
5418546 Nakagakiuchi et al. May 1995 A
5423091 Lange Jun 1995 A
5436655 Hiyama et al. Jul 1995 A
5445157 Adachi et al. Aug 1995 A
D362910 Creaghan Oct 1995 S
5455157 Hinzpeter et al. Oct 1995 A
5485530 Lakowicz et al. Jan 1996 A
5487740 Sulek et al. Jan 1996 A
5494032 Robinson et al. Feb 1996 A
5497769 Gratton et al. Mar 1996 A
5504316 Bridgelall et al. Apr 1996 A
5519208 Esparza et al. May 1996 A
5541820 McLaughlin Jul 1996 A
5542421 Erdman Aug 1996 A
5598842 Ishihara et al. Feb 1997 A
5603328 Zucker et al. Feb 1997 A
5608210 Esparza et al. Mar 1997 A
5610387 Bard et al. Mar 1997 A
5625458 Alfano et al. Apr 1997 A
5631976 Bolle et al. May 1997 A
5655530 Messerschmidt Aug 1997 A
5678555 O'Connell Oct 1997 A
5716796 Bull et al. Feb 1998 A
5719399 Alfano et al. Feb 1998 A
5740801 Branson Apr 1998 A
5747789 Godik May 1998 A
5756981 Roustaei et al. May 1998 A
5758650 Miller et al. Jun 1998 A
5772593 Hakamata Jun 1998 A
5787185 Clayden Jul 1998 A
5814040 Nelson et al. Sep 1998 A
5836877 Zavislan Nov 1998 A
5847394 Alfano et al. Dec 1998 A
5860967 Zavislan et al. Jan 1999 A
5929443 Alfano et al. Jul 1999 A
5946220 Lemelson Aug 1999 A
5947906 Dawson, Jr. et al. Sep 1999 A
5966204 Abe Oct 1999 A
5966230 Swartz et al. Oct 1999 A
5969754 Zeman Oct 1999 A
5982553 Bloom et al. Nov 1999 A
5988817 Mizushima et al. Nov 1999 A
5995856 Manheimer et al. Nov 1999 A
5995866 Lemelson Nov 1999 A
6006126 Cosman Dec 1999 A
6032070 Flock et al. Feb 2000 A
6056692 Schwartz May 2000 A
6061583 Ishihara et al. May 2000 A
6083486 Weissleder et al. Jul 2000 A
6101036 Bloom Aug 2000 A
6113536 Aboul-Hosn et al. Sep 2000 A
6122042 Wunderman et al. Sep 2000 A
6132379 Patacsil et al. Oct 2000 A
6135599 Fang Oct 2000 A
6141985 Cluzeau et al. Nov 2000 A
6142650 Brown et al. Nov 2000 A
6149061 Massieu et al. Nov 2000 A
6149644 Xie Nov 2000 A
6171301 Nelson et al. Jan 2001 B1
6178340 Svetliza Jan 2001 B1
6179260 Ohanian Jan 2001 B1
6230046 Crane et al. May 2001 B1
6240309 Yamashita et al. May 2001 B1
6251073 Imran et al. Jun 2001 B1
6263227 Boggett et al. Jul 2001 B1
6272376 Marcu et al. Aug 2001 B1
6301375 Choi Oct 2001 B1
6305804 Rice et al. Oct 2001 B1
6314311 Williams et al. Nov 2001 B1
6334850 Amano et al. Jan 2002 B1
6353753 Flock et al. Mar 2002 B1
6424858 Williams Jul 2002 B1
6436655 Bull et al. Aug 2002 B1
6438396 Cook et al. Aug 2002 B1
6463309 Ilia Oct 2002 B1
6464646 Shalom et al. Oct 2002 B1
6523955 Eberl et al. Feb 2003 B1
6542246 Toida Apr 2003 B1
6556854 Sato et al. Apr 2003 B1
6556858 Zeman Apr 2003 B1
6599247 Stetten Jul 2003 B1
6631286 Pfeiffer et al. Oct 2003 B2
6648227 Swartz et al. Nov 2003 B2
6650916 Cook et al. Nov 2003 B2
6671540 Hochman Dec 2003 B1
6689075 West Feb 2004 B2
6690964 Bieger et al. Feb 2004 B2
6702749 Paladini et al. Mar 2004 B2
6719257 Greene et al. Apr 2004 B1
6755789 Stringer et al. Jun 2004 B2
6777199 Bull et al. Aug 2004 B2
6782161 Barolet et al. Aug 2004 B2
6845190 Smithwick et al. Jan 2005 B1
6882875 Crowley Apr 2005 B1
6889075 Marchitto et al. May 2005 B2
6913202 Tsikos et al. Jul 2005 B2
6923762 Creaghan, Jr. Aug 2005 B1
6980852 Jersey-Willuhn et al. Dec 2005 B2
7092087 Kumar et al. Aug 2006 B2
7113817 Winchester, Jr. et al. Sep 2006 B1
7158660 Gee, Jr. et al. Jan 2007 B2
7158859 Wang et al. Jan 2007 B2
7204424 Yavid et al. Apr 2007 B2
7225005 Kaufman et al. May 2007 B2
7227611 Hull et al. Jun 2007 B2
7239909 Zeman Jul 2007 B2
7247832 Webb Jul 2007 B2
7280860 Ikeda et al. Oct 2007 B2
7283181 Allen et al. Oct 2007 B2
7302174 Tan et al. Nov 2007 B2
7333213 Kempe Feb 2008 B2
D566283 Brafford et al. Apr 2008 S
7359531 Endoh et al. Apr 2008 B2
7376456 Marshik-Geurts et al. May 2008 B2
7428997 Wiklof et al. Sep 2008 B2
7431695 Creaghan Oct 2008 B1
7448995 Wiklof et al. Nov 2008 B2
7532746 Marcotte et al. May 2009 B2
7545837 Oka Jun 2009 B2
7559895 Stetten et al. Jul 2009 B2
7579592 Kaushal Aug 2009 B2
7608057 Woehr et al. Oct 2009 B2
7699776 Walker et al. Apr 2010 B2
7708695 Akkermans et al. May 2010 B2
7792334 Cohen et al. Sep 2010 B2
7846103 Cannon, Jr. et al. Dec 2010 B2
7848103 Zhan Dec 2010 B2
7904138 Goldman et al. Mar 2011 B2
7904139 Chance Mar 2011 B2
7925332 Crane et al. Apr 2011 B2
7966051 Xie et al. Jun 2011 B2
8032205 Mullani Oct 2011 B2
8078263 Zeman et al. Dec 2011 B2
8187189 Jung et al. May 2012 B2
8199189 Kagenow et al. Jun 2012 B2
8320998 Sato Nov 2012 B2
8336839 Boccoleri et al. Dec 2012 B2
8364246 Thierman Jan 2013 B2
8467855 Yasui Jun 2013 B2
8480662 Stolen et al. Jul 2013 B2
8494616 Zeman Jul 2013 B2
8498694 McGuire, Jr. et al. Jul 2013 B2
8509495 Xu et al. Aug 2013 B2
8537203 Seibel et al. Sep 2013 B2
8548572 Crane Oct 2013 B2
8630465 Wieringa et al. Jan 2014 B2
8649848 Crane et al. Feb 2014 B2
20010006426 Son et al. Jul 2001 A1
20010056237 Cane et al. Dec 2001 A1
20020016533 Marchitto et al. Feb 2002 A1
20020111546 Cook Aug 2002 A1
20020118338 Kohayakawa Aug 2002 A1
20020188203 Smith et al. Dec 2002 A1
20030018271 Kimble Jan 2003 A1
20030037375 Riley et al. Feb 2003 A1
20030052105 Nagano et al. Mar 2003 A1
20030120154 Sauer et al. Jun 2003 A1
20030125629 Ustuner Jul 2003 A1
20030156260 Putilin et al. Aug 2003 A1
20040015062 Ntziachristos et al. Jan 2004 A1
20040015158 Chen et al. Jan 2004 A1
20040022421 Endoh et al. Feb 2004 A1
20040046031 Knowles et al. Mar 2004 A1
20040087862 Geng May 2004 A1
20040171923 Kalafut et al. Sep 2004 A1
20040222301 Willins et al. Nov 2004 A1
20040237051 Clauson Nov 2004 A1
20050017924 Utt et al. Jan 2005 A1
20050033145 Graham et al. Feb 2005 A1
20050043596 Chance Feb 2005 A1
20050047134 Mueller et al. Mar 2005 A1
20050085732 Sevick-Muraca et al. Apr 2005 A1
20050085802 Gruzdev et al. Apr 2005 A1
20050113650 Pacione et al. May 2005 A1
20050131291 Floyd et al. Jun 2005 A1
20050135102 Gardiner et al. Jun 2005 A1
20050141069 Wood et al. Jun 2005 A1
20050143662 Marchitto et al. Jun 2005 A1
20050146765 Turner et al. Jul 2005 A1
20050154303 Walker et al. Jul 2005 A1
20050157939 Arsenault et al. Jul 2005 A1
20050161051 Pankratov et al. Jul 2005 A1
20050168980 Dryden et al. Aug 2005 A1
20050174777 Cooper et al. Aug 2005 A1
20050175048 Stern et al. Aug 2005 A1
20050187477 Serov et al. Aug 2005 A1
20050215875 Khou Sep 2005 A1
20050265586 Rowe et al. Dec 2005 A1
20050281445 Marcotte et al. Dec 2005 A1
20060007134 Ting Jan 2006 A1
20060020212 Xu et al. Jan 2006 A1
20060025679 Viswanathen et al. Feb 2006 A1
20060052690 Sirohey et al. Mar 2006 A1
20060058683 Chance Mar 2006 A1
20060081252 Wood Apr 2006 A1
20060100523 Ogle et al. May 2006 A1
20060103811 May May 2006 A1
20060122515 Zeman et al. Jun 2006 A1
20060129037 Kaufman et al. Jun 2006 A1
20060129038 Zelenchuk et al. Jun 2006 A1
20060151449 Warner, Jr. et al. Jul 2006 A1
20060173351 Marcotte et al. Aug 2006 A1
20060184040 Keller et al. Aug 2006 A1
20060206027 Malone Sep 2006 A1
20060232660 Nakajima et al. Oct 2006 A1
20060253010 Brady et al. Nov 2006 A1
20060271028 Altshuler et al. Nov 2006 A1
20060276712 Stothers Dec 2006 A1
20070015980 Numada et al. Jan 2007 A1
20070016079 Freeman et al. Jan 2007 A1
20070070302 Govorkov et al. Mar 2007 A1
20070115435 Rosendaal May 2007 A1
20070129634 Hickey et al. Jun 2007 A1
20070176851 Willey et al. Aug 2007 A1
20070238957 Yared Oct 2007 A1
20080045841 Wood et al. Feb 2008 A1
20080147147 Griffiths et al. Jun 2008 A1
20080194930 Harris et al. Aug 2008 A1
20080214940 Benaron Sep 2008 A1
20090018414 Toofan Jan 2009 A1
20090082629 Dotan Mar 2009 A1
20090171205 Kharin et al. Jul 2009 A1
20100051808 Zeman et al. Mar 2010 A1
20100061598 Seo Mar 2010 A1
20100087787 Woehr et al. Apr 2010 A1
20100177184 Berryhill et al. Jul 2010 A1
20100312120 Meier Dec 2010 A1
20110275932 Leblond et al. Nov 2011 A1
20130147916 Bennett et al. Jun 2013 A1
20140039309 Harris et al. Feb 2014 A1
20140046291 Harris et al. Feb 2014 A1
20140194747 Kruglick Jul 2014 A1
Foreign Referenced Citations (40)
Number Date Country
2289149 May 1976 FR
1298707 May 1970 GB
1298707 Feb 1972 GB
1507329 Apr 1978 GB
S60-108043 Jun 1985 JP
04-042944 Feb 1992 JP
07-255847 Oct 1995 JP
08-023501 Jan 1996 JP
08023501 Jan 1996 JP
08-164123 Jun 1996 JP
2000-316866 Nov 2000 JP
2000316866 Nov 2000 JP
2002-328428 Nov 2002 JP
2002328428 Nov 2002 JP
2002345953 Dec 2002 JP
2002-345953 Dec 2002 JP
2004-237051 Aug 2004 JP
2004237051 Aug 2004 JP
2004237051 Aug 2004 JP
2004329786 Nov 2004 JP
2004-329786 Nov 2004 JP
2006-102360 Apr 2006 JP
2003-0020152 Mar 2003 KR
20030020152 Mar 2003 KR
WO 9422370 Oct 1994 WO
WO 1994 22370 Oct 1994 WO
WO 0639925 Dec 1996 WO
WO 9639925 Dec 1996 WO
WO 1996 39925 Dec 1996 WO
WO 1998 26583 Jun 1998 WO
WO 9826583 Jun 1998 WO
WO 9948420 Sep 1999 WO
WO 1999 48420 Sep 1999 WO
WO 0182786 Nov 2001 WO
WO 2001-82786 Nov 2001 WO
WO 03009750 Feb 2003 WO
WO 2003-009750 Feb 2003 WO
WO 2005-053773 Jun 2005 WO
WO 2005053773 Jun 2005 WO
WO 2007-078447 Jul 2007 WO
Non-Patent Literature Citations (7)
Entry
Nikbin, Darius, “IPMS Targets Colour Laser Projectors,” Optics & Laser Europe, Mar. 2006, Issue 137, p. 11
http://www.wikihow.com/See-Blood-Weins-in-Your-Hand-With-a- Flashlight “How to See Blood Veins in Your Hand With a Flashlight”.
Wiklof, Chris, “Display Technology Spawns Laser Camera,” LaserFocusWorld, Dec. 1, 2004, vol. 40, Issue 12, PennWell Corp., USA.
Nikbin, Darius, “IPMS Targets Colour Laser Projectors,” Optics & Laser Europe, Mar. 1006, Isue 137, p. 11.
http://sciencegeekgirl.wordpress.com/category/science-myths/page/2/ Myth 7: Blood is Blue.
http://www.exploratorium.edu/sports/hnds_up/hands6.html “Hands Up! To Do & Notice: Getting the Feel of Your Hand”.
http://www.wikihow.com/See-Blook-Veins-in-Your-Hand-With-a- Flashlight “How to See Blood Veins in Your Hand With a Flashlight”.
Related Publications (1)
Number Date Country
20150105648 A1 Apr 2015 US
Provisional Applications (1)
Number Date Country
61278948 Oct 2009 US
Continuations (1)
Number Date Country
Parent 12925166 Oct 2010 US
Child 14053775 US
Continuation in Parts (5)
Number Date Country
Parent 11478322 Jun 2006 US
Child 12925166 US
Parent 11700729 Jan 2007 US
Child 11478322 US
Parent 11807359 May 2007 US
Child 11700729 US
Parent 12215713 Jun 2008 US
Child 11807359 US
Parent 11823862 Jun 2007 US
Child 12215713 US