Device for detecting and illuminating the vasculature using an FPGA

Information

  • Patent Grant
  • 10568518
  • Patent Number
    10,568,518
  • Date Filed
    Thursday, August 17, 2017
  • Date Issued
    Tuesday, February 25, 2020
Abstract
A laser-based vascular illumination system utilizing an FPGA for detecting vascular positions, processing an image of such vascular positions, and projecting the image thereof onto the body of a patient.
Description
BACKGROUND
SUMMARY

A laser-based vascular illumination system utilizing an FPGA for detecting vascular positions, processing an image of such vascular positions, and projecting the image thereof onto the body of a patient.





BRIEF DESCRIPTION


FIG. 1 shows a block diagram of a system for detecting and illuminating the vasculature in a patient.



FIG. 2 shows the signal processing flow of the FPGA.



FIG. 3 shows the internal bus architecture of the FPGA.



FIG. 4 shows details of the vein processing.



FIG. 5 shows the vein processing at the boundary of the image frames.



FIG. 6 shows further detail of the vein processing at the boundary of the image frames.



FIG. 7 shows a 2-D Moving Window Sum Generator.



FIG. 8 shows an X-sum generator.





DETAILED DESCRIPTION


FIG. 1 shows a block diagram of a system for detecting and illuminating the vasculature in a patient.


The system shown in the block diagram of FIG. 1 is used for detecting the location of veins on a patient and illuminating the veins.


The disclosures of U.S. patent application Ser. No. 12/804,506, now issued as U.S. Pat. No. 8,463,364, are incorporated herein by reference.


In a preferred embodiment, FIGS. 30-47 of application Ser. No. 12/804,506 illustrate an assembly of a housing that may be used in the present invention. In the present invention, circuit boards 43, 44 and 15 of application Ser. No. 12/804,506 may be modified to contain the circuitry described by the block diagram in FIG. 1. The remainder of the device in FIGS. 30-47 can remain substantially the same.


In FIG. 1, an FPGA 1 (field programmable gate array) is configured to control a red laser drive 2, which in turn drives a red laser 3. The output of the red laser 3 is controlled in a manner so as to illuminate the detected veins. A red laser feedback 4 detects the output of the red laser 3 and sends the information to the FPGA 1. Accordingly, a closed loop is formed whereby the FPGA 1 can both drive the red laser 3 and receive feedback as to the state of the red laser 3.
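
By way of illustration only, the closed loop described above can be sketched as a simple proportional control step in which the measured feedback is compared against a power setpoint and the drive level is adjusted accordingly. The function and constant names below are assumptions made for the sketch and do not appear in the disclosure.

# Minimal sketch of the closed-loop laser power control described above.
# All names and constants are illustrative assumptions, not from the patent.

KP = 0.5             # proportional gain (assumed)
SETPOINT = 0.80      # desired normalized laser output (assumed)
DRIVE_MIN, DRIVE_MAX = 0.0, 1.0

def update_laser_drive(drive_level: float, feedback: float) -> float:
    """One control step: nudge the drive toward the power setpoint."""
    error = SETPOINT - feedback
    new_drive = drive_level + KP * error
    # Clamp to the legal drive range so the laser is never over-driven.
    return max(DRIVE_MIN, min(DRIVE_MAX, new_drive))

# Example: feedback reads low, so the drive is increased slightly.
print(update_laser_drive(0.70, 0.75))  # -> 0.725

The same structure applies to the IR laser loop described next; only the setpoint differs, since the IR power is chosen for vein detection rather than visible illumination.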


FPGA 1 outputs data to an IR laser drive 5 which in turn drives an IR laser 6. The output of the IR laser 6 is controlled to output an intensity of IR light, aimed at the area of the body where veins are located, sufficient to detect the veins. An IR laser feedback 7 detects the output of the IR laser 6 and sends the information to the FPGA 1. Accordingly, a closed loop is formed whereby the FPGA 1 can both drive the IR Laser 6 and receive feedback as to the IR laser 6 state.


FPGA 1 communicates with both an x-mirror drive 8 and a y-mirror drive 9 to drive an x-mirror 10 and a y-mirror 11 in such a manner that a raster pattern is formed on the patient when the red laser 3 and the IR laser 6 are coaxially projected thereon. X-mirror feedback 12 and y-mirror feedback 13 detect the positions of the x-mirror 10 and the y-mirror 11, respectively, and communicate such information to the FPGA 1.
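
As a rough sketch of how such a raster pattern can be driven, the x-mirror sweeps the beam quickly back and forth (so alternate lines run in opposite directions) while the y-mirror steps slowly down the frame. The frame dimensions and normalized drive values below are assumptions for illustration, not parameters from the patent.

# Sketch of raster-pattern mirror drive waveforms (illustrative only).
import numpy as np

LINES_PER_FRAME = 480     # assumed
SAMPLES_PER_LINE = 640    # assumed

def raster_waveforms():
    t = np.arange(LINES_PER_FRAME * SAMPLES_PER_LINE)
    line = t // SAMPLES_PER_LINE
    pos = t % SAMPLES_PER_LINE
    # Even lines sweep left-to-right, odd lines sweep back (bidirectional scan).
    x = np.where(line % 2 == 0, pos, SAMPLES_PER_LINE - 1 - pos)
    x = x / (SAMPLES_PER_LINE - 1)      # normalized x-mirror angle
    y = line / (LINES_PER_FRAME - 1)    # normalized y-mirror angle, slow ramp
    return x, y

x_drive, y_drive = raster_waveforms()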


Top photodiode 23 and bottom photodiode 22 receive the light from the IR laser 6 reflected off the patient and convert the light into analog signals, which are provided to top FE 25 and bottom FE 24, and then to top ADC 27 and bottom ADC 26, respectively. The top FE 25 and the bottom FE 24 are front end circuits that provide analog filtering, gain control and thresholding of the analog signals. The top ADC 27 and bottom ADC 26 are analog-to-digital converters that convert the analog signals to digital representations thereof to be communicated to the FPGA 1. Control lines are provided from the FPGA 1 to the top FE 25 and the bottom FE 24 to set parameters such as, for example, gain control and analog filtering.


From a mechanical standpoint, the red laser 3 and the IR laser 6 are coaxially aligned and projected off of the x-mirror 10 and the y-mirror 11 to form a pattern, such as, for example, a raster pattern on the patient. The IR laser 6 reflects off the patient and is received by the top photodiode 23 and the bottom photodiode 22. The reflected IR light contains information as to the location of the veins (IR light is absorbed by the blood in the veins, and therefore the amount of reflected IR light is lower when the IR laser 6 is aimed at a vein). The FPGA 1 time sequentially receives the signals from the top ADC 27 and the bottom ADC 26 and can form two partial and/or full frame images of the reflected IR light (hereinafter a top channel data and a bottom channel data, wherein the top channel data is received from the top ADC 27 and the bottom channel data is received from the bottom ADC 26). The FPGA 1 processes one or both of the partial and/or full images to detect and enhance the image of the veins. The enhanced image is time sequentially projected by the red laser 3 onto the patient.
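
The vein detection principle in the preceding paragraph (veins absorb IR, so the reflected intensity drops over a vein) can be illustrated with a short sketch that rebuilds a frame from the time-sequential sample stream and flags pixels that are noticeably darker than their local surroundings. The frame geometry, window size, and threshold are assumptions for the sketch; the patent's actual vein processing is the subject of FIGS. 4-8.

# Illustrative sketch, not the patent's algorithm: rebuild a frame from the
# time-sequential ADC samples and flag "vein" pixels whose reflectance falls
# well below a moving-window average along the scan line.
import numpy as np

LINES, WIDTH = 480, 640   # assumed frame geometry

def samples_to_frame(samples: np.ndarray) -> np.ndarray:
    frame = samples.reshape(LINES, WIDTH).astype(float)
    frame[1::2] = frame[1::2, ::-1]   # un-reverse the right-to-left sweeps
    return frame

def detect_veins(frame: np.ndarray, window: int = 15, drop: float = 0.9) -> np.ndarray:
    kernel = np.ones(window) / window
    local_avg = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, frame)
    # IR absorbed by blood makes vein pixels darker than the local average.
    return frame < drop * local_avg   # boolean vein mask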


A CPLD 18 is provided for controlling an LCD 19, which displays user information related to the operating status of the device. The CPLD 18 also controls an audio 20 output to provide audible tones to the user. Finally, the CPLD 18 controls the switches 21 on the unit for turning the unit on and off, as well as for selecting user modes and entering data.


A microprocessor PIC MCU 17 is provided for receiving and monitoring the IR laser feedback 7 signal, the red laser feedback 4 signal, the x-mirror feedback 12 signal and the y-mirror feedback 13 signal. Since these signals are also provided to the FPGA 1, redundant monitoring of the signals is provided by the PIC MCU 17. This is particularly important when regulations require redundant monitoring of the laser power and mirror movement to comply with safety requirements. The PIC MCU 17 also monitors the device power management 14, the Li-ion battery management 15 circuitry and the Li-ion fuel gauge 16.
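
A minimal sketch of this redundancy check, assuming illustrative limits and a cross-check tolerance that are not taken from the patent, is the following: the PIC MCU compares its independently sampled feedback value against a safety limit and against the FPGA's reading, and a disagreement or over-limit value would shut the lasers down.

# Hedged sketch of redundant safety monitoring (illustrative values only).
LASER_POWER_LIMIT = 1.0        # normalized maximum, assumed
CROSS_CHECK_TOLERANCE = 0.05   # allowed MCU/FPGA disagreement, assumed

def redundant_check(mcu_reading: float, fpga_reading: float) -> bool:
    """Return True if the reading is within limits and consistent with the FPGA."""
    within_limit = mcu_reading <= LASER_POWER_LIMIT
    consistent = abs(mcu_reading - fpga_reading) <= CROSS_CHECK_TOLERANCE
    return within_limit and consistent

# A real device would disable the laser drives when the check fails.
if not redundant_check(0.82, 0.80):
    raise RuntimeError("feedback mismatch or over-limit: disable lasers")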



FIG. 2 shows an example of the signal processing flow of the FPGA.



FIG. 2 shows an embodiment of the signal processing algorithm of the FPGA of FIG. 1. As described with reference to FIG. 1, the image of the reflected IR laser 6 is time sequentially stored in the FPGA 1 as top channel data 30T and bottom channel data 30B.


The x-mirror 10 oscillates about a single axis to move the laser beam from the IR laser 6 to form a line. The beam moves first in one direction and then back in the other direction. It is critical that the left-to-right image data be in convergence with the right-to-left data. The top line correlator 31T measures the shift in the convergence of the top channel data 30T and supplies the information to the mirror convergence control 34. Similarly, the bottom line correlator 31B measures the shift in the convergence of the bottom channel data 30B and supplies the information to the mirror convergence control 34. The mirror convergence control 34 can adjust the control signals provided from the FPGA 1 to the x-mirror drive 8 so as to converge the data.
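
One way to picture the line correlator is as a cross-correlation between a forward (left-to-right) scan line and the flipped return (right-to-left) line: the offset of the correlation peak is the convergence shift that the mirror convergence control 34 drives toward zero. The sketch below is an assumption about how such a measurement could be made and is not taken from the disclosure.

# Illustrative line-correlator sketch: estimate the pixel shift between a
# forward scan line and the reversed return line from the correlation peak.
import numpy as np

def convergence_shift(forward_line: np.ndarray, reverse_line: np.ndarray) -> int:
    a = forward_line - forward_line.mean()
    b = reverse_line[::-1] - reverse_line.mean()   # flip the return sweep
    corr = np.correlate(a, b, mode="full")
    # Offset of the correlation peak from the zero-lag position.
    return int(np.argmax(corr)) - (len(a) - 1)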


A top histogram 32T receives the top channel data 30T and generates a histogram based upon an entire frame of the top channel data 30T. Similarly, a bottom histogram 32B receives the bottom channel data 30B and generates a histogram based upon an entire frame of the bottom channel data 30B. The histograms contain information describing the characteristics of the images, including but not limited to contrast and intensity levels. The top histogram 32T and the bottom histogram 32B are provided to exposure control 35. Exposure control 35 communicates appropriate signals to the IR laser drive 5 to adjust the power of the IR laser 6 on a frame-by-frame basis until the histograms indicate appropriate images. The exposure control 35 also communicates with the top FE 25 and bottom FE 24 to adjust parameters such as setting thresholds and setting electrical gain.
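
A frame-by-frame exposure adjustment of this kind can be sketched as follows; the target intensity band and step size are assumed values, not figures from the patent.

# Sketch of histogram-based exposure control (assumed targets and step).
import numpy as np

TARGET_LOW, TARGET_HIGH = 100, 160   # assumed 8-bit intensity band
STEP = 0.02                          # assumed per-frame power adjustment

def adjust_exposure(frame: np.ndarray, ir_power: float) -> float:
    hist, _ = np.histogram(frame, bins=256, range=(0, 256))
    mean_level = (hist * np.arange(256)).sum() / hist.sum()
    if mean_level < TARGET_LOW:
        ir_power += STEP     # image too dark: raise IR laser power
    elif mean_level > TARGET_HIGH:
        ir_power -= STEP     # image too bright: lower IR laser power
    return min(max(ir_power, 0.0), 1.0)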


A top vein processing 33T block receives the top channel data 30T, performs image processing to detect vein patterns, and provides the enhanced vein image to fused vein projection 36. Similarly, a bottom vein processing 33B block receives the bottom channel data 30B, performs image processing to detect vein patterns, and provides the enhanced vein image to fused vein projection 36. The fused vein projection 36 forms a single image and communicates the image to the alpha blended projection 38. The fused vein projection 36 can form the single image by merging the images from the top vein processing 33T and the bottom vein processing 33B. Alternatively, the fused vein projection 36 can simply select the best image received from the top vein processing 33T and the bottom vein processing 33B.
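
Both fusion strategies mentioned above can be sketched briefly; the merge rule (per-pixel maximum) and the contrast score (standard deviation) are illustrative assumptions rather than the patent's specific method.

# Sketch of the two fusion options: merge the enhanced images, or select the
# one with the better contrast.
import numpy as np

def fuse_merge(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    return np.maximum(top, bottom)   # keep the stronger vein response per pixel

def fuse_select_best(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    # Standard deviation used here as a crude contrast score.
    return top if top.std() >= bottom.std() else bottom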


Alpha channel 37 forms an image that contains graphical data, such as text or characters. Alpha channel 37 and fused vein projection 36 are provided to alpha blended projection 38, which drives the IR laser drive 5 to display an image which is the combination of the fused vein projection 36 and the alpha channel 37.
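
The combination of the fused vein image with the alpha channel graphics can be pictured as standard alpha compositing, shown below as an assumption about the blending step rather than the patent's exact formulation.

# Sketch of the alpha-blended projection: per-pixel alpha of 1 shows the
# graphics (text/characters), 0 shows the fused vein image.
import numpy as np

def alpha_blend(vein_image: np.ndarray, graphics: np.ndarray,
                alpha: np.ndarray) -> np.ndarray:
    return alpha * graphics + (1.0 - alpha) * vein_image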



FIG. 3 shows an example of the internal bus architecture of the FPGA.



FIG. 4 shows details of the top vein processing 33T and bottom vein processing 33B.



FIG. 5 shows the vein processing at the boundary of the image frames.



FIG. 6 shows further detail of the vein processing at the boundary of the image frames.



FIG. 7 shows the 2-D Moving Window Sum Generator.



FIG. 8 shows an X-sum generator.

Claims
  • 1. An image capture device configured to detect, process, and project an image using a field programmable gate array (FPGA), said image capture device comprising: a first laser configured to output a beam of light at an infrared wavelength; a second laser configured to output a beam of light at a visible red wavelength; a combiner configured to combine said beams of light from said first and second lasers into a co-axial beam of light; an x-direction mirror configured to reflect said coaxial beam of light, and to be pivotable about a first axis, in a first direction and in a second direction; an x-direction mirror driver configured to drive said x-direction mirror to oscillate about said first axis, to cyclically reflect said coaxial beam of light in a line, in both said first and second directions; a y-direction mirror configured to reflect said line of light received from said x-direction mirror, and to be pivotable about a second axis, in a third direction and a fourth direction; a y-direction mirror driver configured to drive said y-direction mirror to oscillate about said second axis; a FPGA configured to control said x-direction mirror driver and said y-direction mirror driver to control said oscillations about said first and second axes to form a pattern of said red and infrared wavelengths of light; a feedback means configured for detecting a position of said x-direction mirror and said y-direction mirror, and for signaling said positions to said FPGA; a photodiode configured to receive the image formed from said infrared light, said photodiode further configured to convert the received image into an analog signal; wherein said second laser driver is further configured to receive said analog signal, and to drive said red laser to project the image using said analog signal; a line correlator configured to measure a shift in convergence between said line of light in said first direction and said second direction, for each said oscillation of said X-direction mirror; a mirror convergence control configured to receive said measured shift in convergence from said line correlator, and to adjust said control of said first mirror driver by said FPGA, for said line in said first direction to converge with said line in said second direction.
  • 2. The image capture device according to claim 1, further comprising: a second feedback means configured for detecting said output of said first and second lasers, and for signaling said detected output to said FPGA, for said FPGA to control said output of said first and second lasers.
  • 3. The image capture device according to claim 2, further comprising: an analog-to-digital converter configured to receive said analog signal from said photodiode, and to convert said analog signal into a digital image signal, and to communicate said digital image signal to said FPGA; wherein said FPGA is configured to receive and process said digital image signal and to output a processed image signal; and wherein said second laser driver is further configured to receive said processed image signal from said FPGA, and to drive said red laser to project said processed image using said x-direction mirror and said y-direction mirror.
  • 4. The image capture device according to claim 3, wherein said photodiode comprises a top photodiode and a bottom photodiode each configured to receive a full frame of the image and to output a respective said analog signal.
  • 5. The image capture device according to claim 4, further comprising: a top front end circuit and a bottom front end circuit, each configured to respectively receive said analog signals of said top and bottom photodiodes; said top and bottom front end circuits configured to control analog filtering, gain, and threshold of said respective analog signals; wherein said analog-to-digital converter comprises a top analog-to-digital converter (ADC) and a bottom ADC configured to respectively receive said analog signals from said top and bottom front end circuits, and to convert said analog signals into respective digital image signals, and to communicate said respective digital image signals to said FPGA; said FPGA further configured to receive each of said respective digital image signals from said top and bottom ADC, and to perform imaging processing within each of said respective images, to form respective enhanced images; wherein said FPGA is further configured to form a single enhanced image signal from said respective enhanced images; and wherein said second laser driver is further configured to receive said single enhanced image from said FPGA, and to drive said red laser to project said enhanced single image using said x-direction mirror and said y-direction mirror.
  • 6. The image capture device according to claim 5, wherein said single enhanced image is formed from: a merged signal formed by merging said respective digital image signals; or a selected best image signal selected from either of said respective digital image signals.
  • 7. The image capture device according to claim 5, further comprising: a top histogram generator and a bottom histogram generator, each of said top and bottom histogram generators configured to generate a respective histogram of characteristics of said analog signals from said top and bottom photodiodes; and wherein said FPGA is further configured to receive and use said respective histograms to signal said infrared laser driver to adjust power to said infrared laser on a frame by frame basis until said histograms indicate a proper image.
  • 8. The image capture device according to claim 7, wherein said histogram characteristics comprise contrast and intensity levels.
  • 9. The image capture device according to claim 7, further comprising a CPLD configured to control an LCD to display an operating status of said system thereon.
  • 10. The image capture device according to claim 9, wherein said CPLD is further configured to control an audio output to provide audible tones to a user.
  • 11. The image capture device according to claim 10, wherein said CPLD is further configured to control one or more switches on said system for turning on and off said system, for selecting one or more user modes, and for entering data therein.
  • 12. The image capture device according to claim 2, further comprising a microprocessor configured to redundantly receive and monitor said feedback signal for said first and second lasers, and said feedback signal for said X-direction mirror and Y-direction mirror, in conjunction with said FPGA.
  • 13. An image capture device comprising: means for outputting and scanning a beam of light comprising an infrared wavelength and a visible red wavelength in a first and a second direction, for forming respective lines in said first and second directions, and for scanning said lines in a third direction and a fourth direction; a field programmable gate array (FPGA) configured to control said means for outputting and scanning to control said scanning in said first and second directions, and to control said scanning in said third and fourth directions, to form a pattern; a photodiode configured to receive an image formed from said scanned pattern of said infrared wavelength of light, said photodiode further configured to convert the image into an analog signal; wherein said means for outputting and scanning is further configured to receive said analog signal, and to scan said image using said visible red wavelength; a line correlator configured to measure a shift in convergence between said respective lines in said first and second directions; a mirror convergence control configured to receive said measured shift in convergence from said line correlator, and to adjust said control of said means for outputting and scanning by said FPGA, for said line in said first direction to converge with said line in said second direction.
  • 14. The image capture device according to claim 13, further comprising: a first feedback means configured for detecting a position of said x-direction mirror and said y-direction mirror, and for signaling said positions to said FPGA; a second feedback means configured for detecting said output of said means for outputting and scanning, and for signaling said detected output to said FPGA, for said FPGA to control said output of said means for outputting and scanning; an analog-to-digital converter configured to receive said analog signal from said photodiode, and to convert said analog signal into a digital image signal, and to communicate said digital image signal to said FPGA; wherein said FPGA is configured to receive and process said digital image signal and to output a processed image signal; and wherein said means for outputting and scanning is further configured to receive said processed image from said FPGA, and to scan said processed image using said red wavelength.
  • 15. The image capture device according to claim 13, wherein said photodiode comprises a top photodiode and a bottom photodiode each configured to receive a full frame of said image and to output a respective said analog signal.
  • 16. The image capture device according to claim 15, further comprising: a top front end circuit and a bottom front end circuit, each configured to respectively receive said analog signals of said top and bottom photodiodes; said top and bottom front end circuits configured to control analog filtering, gain, and threshold of said respective analog signals; wherein said analog-to-digital converter comprises a top analog-to-digital converter (ADC) and a bottom ADC configured to respectively receive said analog signals from said top and bottom front end circuits, and to convert said analog signals into respective digital image signals, and to communicate said respective digital image signals to said FPGA; said FPGA further configured to receive each of said respective digital image signals from said top and bottom ADC, and to perform imaging processing within each of said respective images, to form respective enhanced images; wherein said FPGA is further configured to form a single enhanced digital image from said respective enhanced images; and wherein said means for outputting and scanning is further configured to receive said single enhanced image from said FPGA, and to scan said single enhanced image using said visible red wavelength.
  • 17. The image capture device according to claim 16, wherein said single enhanced image comprises: a merged signal formed by merging said respective digital image signals; or a selected best image signal selected from either of said respective digital image signals.
  • 18. The image capture device according to claim 16, further comprising: a top histogram generator and a bottom histogram generator, each of said top and bottom histogram generators configured to generate a respective histogram of characteristics of said analog signals from said top and bottom photodiodes; and wherein said FPGA is further configured to receive and use said respective histograms to adjust power to said means for outputting and scanning on a frame by frame basis until said histograms indicate a proper image.
  • 19. The image capture device according to claim 18, wherein said histogram characteristics comprise contrast and intensity levels.
  • 20. The image capture device according to claim 13, further comprising a microprocessor configured to redundantly monitor and control said means for outputting and scanning to control said scanning in said first and second directions, and said scanning in said third and fourth directions, to form a raster pattern.
  • 21. An image capture device comprising: means for outputting and scanning a beam of light comprising an infrared wavelength and a visible red wavelength in a first and a second direction, for forming respective lines in said first and second directions, and for scanning said lines in a third direction and a fourth direction; an electronic component configured to control said means for outputting and scanning to control said scanning in said first and second directions, and to control said scanning in said third and fourth directions, to form a pattern; a photodiode configured to receive an image formed from said scanned pattern of said infrared wavelength of light, said photodiode further configured to convert the image into an analog signal; wherein said means for outputting and scanning is further configured to receive said analog signal, and to scan said image using said visible red wavelength; a line correlator configured to measure a shift in convergence between said respective lines in said first and second directions; a mirror convergence control configured to receive said measured shift in convergence from said line correlator, and to adjust said control of said means for outputting and scanning by said electronic component, for said line in said first direction to converge with said line in said second direction.
  • 22. The image capture device according to claim 21, further comprising: a feedback means configured for detecting said output of said means for outputting and scanning, and for signaling said detected output to said electronic component, for said electronic component to control said output of said means for outputting and scanning; an analog-to-digital converter configured to receive said analog signal from said photodiode, and to convert said analog signal into a digital image signal, and to communicate said digital image signal to said electronic component; wherein said electronic component is configured to receive and process said digital image signal and to output a processed image signal; and wherein said means for outputting and scanning is further configured to receive said processed image from said electronic component, and to scan said processed image using said red wavelength.
  • 23. The image capture device according to claim 22, wherein said photodiode comprises a top photodiode and a bottom photodiode each configured to receive a full frame of said image and to output a respective said analog signal.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 14/723,674, filed on May 28, 2015, which is a continuation of U.S. application Ser. No. 13/957,767, filed on Aug. 2, 2013, now issued as U.S. Pat. No. 9,072,426, which claims priority on U.S. Provisional Application Ser. No. 61/678,726, filed on Aug. 2, 2012, with the disclosures of each incorporated herein by reference.

US Referenced Citations (257)
Number Name Date Kind
3136310 Meltzer Jun 1964 A
3349762 Kapany Oct 1967 A
3511227 Stern May 1970 A
3527932 Thomas Sep 1970 A
3818129 Yamamoto Jun 1974 A
3984629 Gorog Oct 1976 A
4030209 Dreidling Jun 1977 A
4057784 Tafoya Nov 1977 A
4109647 Stern Aug 1978 A
4162405 Chance Jul 1979 A
4182322 Miller Jan 1980 A
4185808 Donohoe et al. Jan 1980 A
4213678 Pomerantzeff Jul 1980 A
4265227 Ruge May 1981 A
4312357 Andersson et al. Jan 1982 A
4315318 Kato Feb 1982 A
4321930 Jobsis et al. Mar 1982 A
4393366 Hill Jul 1983 A
4495949 Stoller Jan 1985 A
4502075 DeForest et al. Feb 1985 A
4510938 Jobsis Apr 1985 A
4536790 Kruger Aug 1985 A
4565968 Macovski Jan 1986 A
4567896 Barnea Feb 1986 A
4576175 Epstein Mar 1986 A
4586190 Tsuji Apr 1986 A
4590948 Nilsson May 1986 A
4596254 Adrian Jun 1986 A
4619249 Landry Oct 1986 A
4669467 Willet Jun 1987 A
4697147 Moran Sep 1987 A
4699149 Rice Oct 1987 A
4703758 Omura Nov 1987 A
4766299 Tierney et al. Aug 1988 A
4771308 Tejima et al. Sep 1988 A
4780919 Harrison Nov 1988 A
4799103 Mucherheide Jan 1989 A
4817622 Pennypacker et al. Apr 1989 A
4846183 Martin Jul 1989 A
4862894 Fujii Sep 1989 A
4899756 Sonek Feb 1990 A
4901019 Wedeen Feb 1990 A
4926867 Kanda May 1990 A
RE33234 Landry Jun 1990 E
4938205 Nudelman Jul 1990 A
5074642 Hicks Dec 1991 A
5088493 Giannini Feb 1992 A
5103497 Hicks Apr 1992 A
5146923 Dhawan Sep 1992 A
5174298 Dolfi Dec 1992 A
5184188 Bull Feb 1993 A
5214458 Kenai May 1993 A
5222495 Clarke Jun 1993 A
5261581 Harden Nov 1993 A
5293873 Fang Mar 1994 A
5339817 Nilsson Aug 1994 A
5371347 Plesko Dec 1994 A
5406070 Edgar et al. Apr 1995 A
5418546 Nakagakiuchi et al. May 1995 A
5423091 Lange Jun 1995 A
5436655 Hiyama Jul 1995 A
D362910 Creaghan Oct 1995 S
5455157 Hinzpeter et al. Oct 1995 A
5494032 Robinson Feb 1996 A
5497769 Gratton Mar 1996 A
5504316 Bridgelall et al. Apr 1996 A
5519208 Esparza et al. May 1996 A
5541820 McLaughlin Jul 1996 A
5542421 Erdman Aug 1996 A
5598842 Ishihara et al. Feb 1997 A
5603328 Zucker et al. Feb 1997 A
5608210 Esparza et al. Mar 1997 A
5610387 Bard et al. Mar 1997 A
5625458 Alfano Apr 1997 A
5631976 Bolle et al. May 1997 A
5655530 Messerschmidt Aug 1997 A
5678555 O'Connell Oct 1997 A
5716796 Bull Feb 1998 A
5719399 Alfano et al. Feb 1998 A
5747789 Godik May 1998 A
5756981 Roustaei et al. May 1998 A
5758650 Miller Jun 1998 A
5772593 Hakamata Jun 1998 A
5787185 Clayden Jul 1998 A
5814040 Nelson Sep 1998 A
5836877 Zavislan Nov 1998 A
5847394 Alfano et al. Dec 1998 A
5860967 Zavislan et al. Jan 1999 A
5929443 Alfano et al. Jul 1999 A
5946220 Lemelson Aug 1999 A
5947906 Dawson, Jr. et al. Sep 1999 A
5966204 Abe Oct 1999 A
5969754 Zeman Oct 1999 A
5982553 Bloom et al. Nov 1999 A
5988817 Mizushima et al. Nov 1999 A
5995856 Manheimer et al. Nov 1999 A
5995866 Lemelson Nov 1999 A
6006126 Cosman Dec 1999 A
6032070 Flock et al. Feb 2000 A
6056692 Schwartz May 2000 A
6061583 Ishihara et al. May 2000 A
6101036 Bloom Aug 2000 A
6122042 Wunderman Sep 2000 A
6132379 Patacsil Oct 2000 A
6135599 Fang Oct 2000 A
6141985 Cluzeau et al. Nov 2000 A
6142650 Brown et al. Nov 2000 A
6149644 Xie Nov 2000 A
6171301 Nelson Jan 2001 B1
6178340 Svetliza Jan 2001 B1
6230046 Crane et al. May 2001 B1
6240309 Yamashita May 2001 B1
6251073 Imran et al. Jun 2001 B1
6263227 Boggett et al. Jul 2001 B1
6301375 Choi Oct 2001 B1
6305804 Rice Oct 2001 B1
6314311 Williams et al. Nov 2001 B1
6334850 Amano et al. Jan 2002 B1
6353753 Flock Mar 2002 B1
6424858 Williams Jul 2002 B1
6436655 Bull Aug 2002 B1
6438396 Cook et al. Aug 2002 B1
6463309 Ilia Oct 2002 B1
6464646 Shalom et al. Oct 2002 B1
6523955 Eberl Feb 2003 B1
6542246 Toida Apr 2003 B1
6556854 Sato et al. Apr 2003 B1
6556858 Zeman Apr 2003 B1
6599247 Stetten Jul 2003 B1
6631286 Pfeiffer Oct 2003 B2
6648227 Swartz et al. Nov 2003 B2
6650916 Cook et al. Nov 2003 B2
6689075 West Feb 2004 B2
6690964 Bieger et al. Feb 2004 B2
6702749 Paladini et al. Mar 2004 B2
6719257 Greene et al. Apr 2004 B1
6755789 Stringer Jun 2004 B2
6777199 Bull Aug 2004 B2
6782161 Barolet et al. Aug 2004 B2
6845190 Smithwick Jan 2005 B1
6882875 Crowley Apr 2005 B1
6889075 Marchitto et al. May 2005 B2
6913202 Tsikos et al. Jul 2005 B2
6923762 Creaghan Aug 2005 B1
6980852 Jersey-Willuhn et al. Dec 2005 B2
7092087 Kumar Aug 2006 B2
7113817 Winchester Sep 2006 B1
7158660 Gee et al. Jan 2007 B2
7158859 Wang Jan 2007 B2
7225005 Kaufman et al. May 2007 B2
7227611 Hull et al. Jun 2007 B2
7239909 Zeman Jul 2007 B2
7247832 Webb Jul 2007 B2
7280860 Ikeda et al. Oct 2007 B2
7283181 Allen Oct 2007 B2
7302174 Tan et al. Nov 2007 B2
7333213 Kempe Feb 2008 B2
D566283 Brafford et al. Apr 2008 S
7359531 Endoh et al. Apr 2008 B2
7376456 Marshik-Geurts May 2008 B2
7431695 Creaghan Oct 2008 B1
7532746 Marcotte et al. May 2009 B2
7545837 Oka Jun 2009 B2
7559895 Stetten Jul 2009 B2
7579592 Kaushal Aug 2009 B2
7608057 Woehr et al. Oct 2009 B2
7699776 Walker et al. Apr 2010 B2
7708695 Akkermans May 2010 B2
7792334 Cohen Sep 2010 B2
7846103 Cannon, Jr. Dec 2010 B2
7904138 Goldman Mar 2011 B2
7904139 Chance Mar 2011 B2
7925332 Crane et al. Apr 2011 B2
7966051 Xie Jun 2011 B2
8032205 Mullani Oct 2011 B2
8078263 Zeman et al. Dec 2011 B2
8187189 Jung et al. May 2012 B2
8199189 Kagenow et al. Jun 2012 B2
8320998 Sato Nov 2012 B2
8336839 Boccoleri et al. Dec 2012 B2
8364246 Thierman Jan 2013 B2
8467855 Yasui Jun 2013 B2
8494616 Zeman Jul 2013 B2
8498694 McGuire, Jr. et al. Jul 2013 B2
8509495 Xu et al. Aug 2013 B2
8548572 Crane et al. Oct 2013 B2
8630465 Wieringa Jan 2014 B2
8649848 Crane et al. Feb 2014 B2
20010006426 Son Jul 2001 A1
20010056237 Cane Dec 2001 A1
20020016533 Marchitto Feb 2002 A1
20020118338 Kohayakawa Aug 2002 A1
20020188203 Smith Dec 2002 A1
20030018271 Kimble Jan 2003 A1
20030052105 Nagano Mar 2003 A1
20030120154 Sauer Jun 2003 A1
20030125629 Ustuner Jul 2003 A1
20030156260 Putilin Aug 2003 A1
20040015158 Chen et al. Jan 2004 A1
20040022421 Endoh et al. Feb 2004 A1
20040046031 Knowles et al. Mar 2004 A1
20040171923 Kalafut et al. Sep 2004 A1
20040222301 Willins et al. Nov 2004 A1
20050017924 Utt et al. Jan 2005 A1
20050033145 Graham et al. Feb 2005 A1
20050043596 Chance Feb 2005 A1
20050047134 Mueller et al. Mar 2005 A1
20050085802 Gruzdev Apr 2005 A1
20050113650 Pacione et al. May 2005 A1
20050131291 Floyd et al. Jun 2005 A1
20050135102 Gardiner et al. Jun 2005 A1
20050141069 Wood et al. Jun 2005 A1
20050143662 Marchitto et al. Jun 2005 A1
20050146765 Turner Jul 2005 A1
20050154303 Walker Jul 2005 A1
20050157939 Arsenault et al. Jul 2005 A1
20050161051 Pankratov et al. Jul 2005 A1
20050168980 Dryden et al. Aug 2005 A1
20050174777 Cooper et al. Aug 2005 A1
20050175048 Stern et al. Aug 2005 A1
20050187477 Serov Aug 2005 A1
20050215875 Khou Sep 2005 A1
20050265586 Rowe et al. Dec 2005 A1
20050281445 Marcotte et al. Dec 2005 A1
20060007134 Ting Jan 2006 A1
20060020212 Xu Jan 2006 A1
20060025679 Viswanathan et al. Feb 2006 A1
20060052690 Sirohey et al. Mar 2006 A1
20060081252 Wood Apr 2006 A1
20060100523 Ogle May 2006 A1
20060103811 May et al. May 2006 A1
20060122515 Zeman Jun 2006 A1
20060129037 Kaufman et al. Jun 2006 A1
20060129038 Zelenchuk et al. Jun 2006 A1
20060151449 Warner Jul 2006 A1
20060173351 Marcotte et al. Aug 2006 A1
20060184040 Keller et al. Aug 2006 A1
20060206027 Malone Sep 2006 A1
20060232660 Nakajima et al. Oct 2006 A1
20060253010 Brady et al. Nov 2006 A1
20060271028 Altshuler et al. Nov 2006 A1
20070016079 Freeman et al. Jan 2007 A1
20070070302 Govorkov Mar 2007 A1
20070115435 Rosendaal May 2007 A1
20070176851 Wiley Aug 2007 A1
20080045841 Wood et al. Feb 2008 A1
20080147147 Griffiths et al. Jun 2008 A1
20080194930 Harris et al. Aug 2008 A1
20090018414 Toofan Jan 2009 A1
20090171205 Kharin Jul 2009 A1
20100051808 Zeman et al. Mar 2010 A1
20100061598 Seo Mar 2010 A1
20100087787 Woehr et al. Apr 2010 A1
20100177184 Berryhill et al. Jul 2010 A1
20100312120 Meier Dec 2010 A1
20140039309 Harris et al. Feb 2014 A1
20140046291 Harris et al. Feb 2014 A1
Foreign Referenced Citations (22)
Number Date Country
2289149 May 1976 FR
1298707 Dec 1972 GB
1507329 Apr 1978 GB
S60-108043 Jun 1985 JP
04-042944 Feb 1992 JP
07-255847 Oct 1995 JP
08023501 Jan 1996 JP
08-164123 Jun 1996 JP
2000316866 Nov 2000 JP
2002 328428 Nov 2002 JP
2002345953 Dec 2002 JP
2004 237051 Aug 2004 JP
2004329786 Nov 2004 JP
20030020152 Nov 2004 KR
WO 1994 22370 Oct 1994 WO
WO 1996 39925 Dec 1996 WO
WO 9826583 Jun 1998 WO
WO 9948420 Sep 1999 WO
WO 2001 82786 Nov 2001 WO
WO 2003 009750 Feb 2003 WO
WO 2005053773 Jun 2005 WO
WO 2007078447 Jul 2007 WO
Non-Patent Literature Citations (5)
Entry
Wiklof, Chris, “Display Technology Spawns Laser Camera,” LaserFocusWorld, Dec. 1, 2004, vol. 40, Issue 12, PennWell Corp., USA.
Nikbin, Darius, “IPMS Targets Colour Laser Projectors,” Optics & Laser Europe, Mar. 2006, Issue 137, p. 11.
http://sciencegeekgirl.wordpress.com/category/science-myths/page/2/ "Myth 7: Blood is Blue".
http://www.exploratorium.edu/sports/hnds_up/hands6.html "Hands Up! To Do & Notice: Getting the Feel of Your Hand".
http://www.wikihow.com/See-Blood-Veins-in-Your-Hand-With-a-Flashlight “How to See Blood Veins in Your Hand With a Flashlight”.
Related Publications (1)
Number Date Country
20180070832 A1 Mar 2018 US
Provisional Applications (1)
Number Date Country
61678726 Aug 2012 US
Continuations (2)
Number Date Country
Parent 14723674 May 2015 US
Child 15679277 US
Parent 13957767 Aug 2013 US
Child 14723674 US