System, method and arrangement which can use spectral encoding heterodyne interferometry techniques for imaging

Information

  • Patent Number
    7,859,679
  • Date Filed
    Wednesday, May 31, 2006
  • Date Issued
    Tuesday, December 28, 2010
Abstract
Systems, arrangements and methods for obtaining three-dimensional imaging data are provided. For example, a broadband light source can provide a particular radiation. A first electro-magnetic radiation can be focused and diffracted, and then provided to at least one sample to generate a spectrally-encoded line. A second electro-magnetic radiation may be provided to a reference, which may include a double-pass rapidly-scanning optical delay, where the first and second electro-magnetic radiations can be based on the particular radiation. An interference between a third electro-magnetic radiation (associated with the first electro-magnetic radiation) and a fourth electro-magnetic radiation (associated with the second electro-magnetic radiation) can be detected. The spectrally-encoded line may be scanned over the sample in a direction approximately perpendicular to the line. Image data containing three-dimensional information can then be obtained based on the interference. The exemplary imaging methods and systems can be used in a small fiber optic or endoscopic probe.
Description
FIELD OF THE INVENTION

The present invention relates to optical imaging and, more particularly, to systems, methods and arrangements that can use spectral encoding heterodyne interferometry techniques for imaging at least one portion of a sample.


BACKGROUND OF THE INVENTION

Three-dimensional (“3D”) endoscopy can assist with a variety of minimally invasive procedures by providing clinicians with depth information. Achieving depth-resolved imaging having a large, three-dimensional field of view can be difficult when small diameter flexible imaging probes such as, e.g., borescopes, laparoscopes, and endoscopes are utilized. The use of confocal imaging through a fiber-bundle using a high numerical aperture lens may be one technique that can be used to address this problem. Such technique is described in, e.g., Y. S. Sabharwal et al., “Slit-scanning confocal microendoscope for high-resolution in vivo imaging,” Appl. Opt. 38, 7133 (1999). A 3D field of view for such devices, however, may be limited to less than a few millimeters due to a small clear aperture of the objective lens and a low f-number that may be required for high-resolution optical sectioning.


Other techniques such as, for example, stereo imaging and structured illumination have also been proposed for obtaining 3D endoscopic images. Such techniques are described in, e.g., M. Chan et al., “Miniaturized three-dimensional endoscopic imaging system based on active stereovision,” Appl. Opt. 42, 1888 (2003); and D. Karadaglic et al., “Confocal endoscope using structured illumination,” Photonics West 2003, Biomedical Optics, 4964-34, respectively. These techniques may, however, require more components to construct a probe than would be required for confocal imaging that is performed using a fiber bundle. This additional hardware can increase the size, cost, and complexity of such devices.


Spectrally-encoded endoscopy (“SEE”) techniques can utilize a broadband light source and a diffraction grating to spectrally encode reflectance across a transverse line within a sample. For example, a two-dimensional image can be formed by slowly scanning this spectrally-encoded line. This technique can be performed using a single optical fiber, thereby enabling imaging through a flexible probe having a small diameter. In particular, SEE images can have a larger number of resolvable points, and may be relatively free from pixelation artifacts, as compared with images obtained using fiber-bundle endoscopes.


When combined with interferometry techniques and systems, SEE can provide three-dimensional images. Depth-resolved imaging can be achieved, e.g., by incorporating a SEE probe into a sample arm of a Michelson interferometer. Using such an arrangement, two-dimensional (“2D”) speckle patterns can be recorded using a charge-coupled device (“CCD”) camera at multiple longitudinal locations of a reference mirror. Subsequently, depth information can be extracted by comparing interference signals obtained at consecutive reference mirror positions. When using this technique, the reference mirror should be held stationary to within an optical wavelength while a single image (or line) is being acquired, in order to avoid a loss of fringe visibility. Scanning a reference mirror that is positioned with such accuracy over multiple discrete depths can be very difficult at the high rates required for real-time volumetric imaging.


OBJECTS AND SUMMARY OF THE INVENTION

One of the objects of the present invention is to overcome certain deficiencies and shortcomings of the prior art systems (including those described herein above), and to provide exemplary SEE techniques, systems and arrangements that are capable of generating three-dimensional image data associated with a sample. Exemplary embodiments of the present invention can provide methods, systems and arrangements that are capable of providing high-speed volumetric imaging of a sample. Exemplary embodiments of these systems and arrangements can be provided within the confines of a fiber optic probe or an endoscopic probe.


In certain exemplary embodiments of the present invention, a system can be provided that includes a light source or another electro-magnetic radiation generating arrangement. The light source can be a broadband source capable of providing the electro-magnetic radiation. The exemplary embodiment of the system can include a beam splitter configured to separate radiation from the light source into a first radiation and a second radiation. The system can be configured to direct the first radiation toward a sample. The first radiation can pass through a lens-grating arrangement (that can include a diffraction grating and a lens) to focus, modify and/or direct the first radiation. The lens-grating system can be configured to direct a spectrally-encoded line associated with the first radiation towards the sample. A scanning mechanism can also be provided that is configured to effectuate the scanning of the line over at least a portion of the sample in a direction that is approximately perpendicular to the line. A third radiation can be generated based on interactions between at least a portion of the spectrally-encoded line and the sample. The lens-grating arrangement and/or the scanning mechanism may be provided, e.g., in a probe. The probe may include an endoscope and/or a catheter.


The exemplary embodiment of the system can further include a rapidly-scanning optical delay (“RSOD”) arrangement, where the second radiation can be configured to pass through the RSOD arrangement and possibly be affected thereby to generate a fourth radiation. A detection arrangement can also be provided that is configured to detect an interference between the third and fourth radiations. This detection arrangement can include, e.g., a charge-coupled device that is capable of generating raw data based on the interference.


A processing arrangement such as, e.g., a computer and/or a software arrangement executable by the processing arrangement, can be provided that is/are configured to generate the image data based on the detected interference between the third and fourth radiations. The processing arrangement and/or the software arrangement can be configured to apply, for example, a Fourier transform to the raw data to generate the image data. A display arrangement can also be provided to display the images of at least one portion of the sample based on the image data. These images can optionally be displayed in real time, e.g., while the first radiation is being directed towards the sample.


In further exemplary embodiments of the present invention, a method can be provided for generating three-dimensional image data of at least a portion of the sample. A particular radiation can be provided which may include a first radiation directed to the sample and a second radiation directed to a reference. For example, the first radiation can be directed through a lens and a diffraction grating to provide a spectrally-encoded line directed towards the sample. This line can be scanned over at least a portion of the sample in a direction approximately perpendicular to the line. A third radiation can be produced based on an interaction between the first radiation and the sample. A fourth radiation can be generated by directing the second radiation through a rapidly-scanning optical delay.


An interference can then be detected between the third radiation and the fourth radiation. This interference can be used to generate three-dimensional image data that characterizes at least one portion of the sample. The image data can be used to display images of the sample on a display.


These and other objects, features and advantages of the present invention will become apparent upon reading the following detailed description of embodiments of the invention, when taken in conjunction with the appended claims.





BRIEF DESCRIPTION OF THE DRAWINGS

Further objects, features and advantages of the invention will become apparent from the following detailed description taken in conjunction with the accompanying figures showing illustrative embodiments of the invention, in which:



FIG. 1 is a block diagram of an exemplary embodiment of a time-domain spectrally-encoded imaging system according to the present invention;



FIG. 2 is a diagram of an exemplary technique that may be used to extract both transverse and depth information from an interference trace using a short-time Fourier transform in accordance with certain exemplary embodiments of the present invention;



FIG. 3A shows an exemplary image of a fingertip obtained using a method, system and arrangement in accordance with exemplary embodiments of the present invention;



FIG. 3B shows an exemplary image of the fingertip shown in FIG. 3A in which depth information is superimposed using contour lines;



FIG. 4A is an image of a surface of a quarter dollar coin obtained using a method, system and arrangement in accordance with exemplary embodiments of the present invention;



FIG. 4B is an image of a surface of a dime placed 2.4 mm in front of the quarter dollar coin shown in FIG. 4A;



FIG. 4C is an exemplary two-dimensional integrated image of the two coins shown in FIGS. 4A and 4B, which was obtained using a method, system and arrangement in accordance with exemplary embodiments of the present invention; and



FIG. 4D is a depth-resolved image of the two coins shown in FIG. 4C, in which surface features closer to the lens are brighter than those further away; and



FIG. 5 is a flow diagram of an exemplary method in accordance with exemplary embodiments of the present invention.





Throughout the figures, the same reference numerals and characters, unless otherwise stated, are used to denote like features, elements, components or portions of the illustrated embodiments. Moreover, while the subject invention will now be described in detail with reference to the figures, it is done so in connection with the illustrative embodiments. It is intended that changes and modifications can be made to the described embodiments without departing from the true scope and spirit of the subject invention as defined by the appended claims.


DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS OF INVENTION

A block diagram of a system configured to acquire image data for 3D images in accordance with exemplary embodiments of the present invention is shown in FIG. 1. For example, a light source 100 or other source of electro-magnetic radiation can be provided (which may generate a light or other electro-magnetic radiation that has a broad bandwidth such as, e.g., a titanium-sapphire laser) which may be coupled to an input port of, e.g., a single-mode fiber optic 50/50 splitter 110 or another type of optical splitter. A compact lens-grating arrangement may be provided that can include a lens 120 (e.g., a lens having f=40 mm and a beam diameter of 0.5 mm) adapted to focus a beam of light, and a transmission grating 130 having, e.g., 1000 lines/mm (Holographix LLC) to diffract the beam and form a spectrally-encoded line (along an x-axis) on a surface of a sample 140. A galvanometric optical scanner 150 can be provided for, e.g., slow scanning of the line along a y-axis.


This exemplary system can provide a spatial transverse resolution of, e.g., approximately 80 microns. The image may include 80 transverse resolvable points, with each transverse spot capable of being illuminated using a bandwidth of, e.g., 1.9 nm. The overall power provided to the sample may be about 4 mW. A double-pass rapidly-scanning optical delay (“RSOD”) 160 can be used to control a group delay of the reference arm light. The RSOD 160 may be scanned over a distance of about 1.5 mm at a rate of about 1000 scans per second. An interference signal can be recorded as a function of time by a detector 170, and then demodulated and displayed in real time using a computer 180.


Spatial resolution, ranging depth and visualization can be improved according to the exemplary embodiments of the present invention by using, for example, a broader bandwidth source and an extended-range optical delay line. Such arrangements are described, e.g., in K. K. M. B. D. Silva, A. V. Zvyagin, and D. D. Sampson, “Extended range, rapid scanning optical delay line for biomedical interferometric imaging,” Electron. Lett. 35, 1404 (1999).


An illustration of an exemplary technique according to the present invention for encoding both transverse and depth dimensions using broad-spectrum illumination is shown in FIG. 2. For example, a sample 200 can include three discrete scattering points in an x-z plane. An interference signal 210 that may be recorded as a function of time by scanning the sample 200 can contain three interference traces 220. Each interference trace can represent depth information which may be characterized by a corresponding delay, Δti=Δzi/vg, where Δzi is a depth location of a corresponding scatterer and vg is a group-delay velocity. A transverse location can correspond to a carrier frequency, 2vp/λi, where vp can represent a phase velocity and λi may be a wavelength corresponding to the location of scatterer i. (For example, in the sample containing three scatterers shown in FIG. 2, i=1 to 3.)
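

The mapping described above can be illustrated numerically, as in the following sketch: each scatterer i is assigned a delay Δti=Δzi/vg and a carrier frequency 2vp/λi. The group and phase velocities used below are assumed placeholder values (vg is loosely based on the approximately 1.5 mm scan acquired in about 1 ms by the RSOD 160 described above), and the scatterer depths and wavelengths are likewise only illustrative.

```python
# Illustrative mapping of scatterer depth and encoding wavelength to delay and
# carrier frequency, per dt_i = dz_i / v_g and f_i = 2 * v_p / lambda_i.
# v_g and v_p are assumed values, not taken from the description.

v_g = 1.5    # group-delay velocity (m/s); ~1.5 mm scanned in ~1 ms (assumed)
v_p = 0.2    # phase velocity (m/s); assumed, sets the carrier frequencies

scatterers = [          # (depth offset in m, encoding wavelength in m) -- illustrative
    (0.2e-3, 780e-9),
    (0.7e-3, 800e-9),
    (1.2e-3, 820e-9),
]

for i, (dz, wavelength) in enumerate(scatterers, start=1):
    dt = dz / v_g                       # delay of interference trace i within the scan
    f_carrier = 2.0 * v_p / wavelength  # carrier frequency encoding the transverse position
    print(f"scatterer {i}: delay = {dt * 1e3:.2f} ms, carrier = {f_carrier / 1e3:.0f} kHz")
```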


The width of each trace, Ti, can determine a depth resolution, and may be expressed as Ti=0.44Nxλi²/(vgΔλ), where Δλ represents a total bandwidth and Nx is a number of resolvable points along the spectrally-encoded line. A two-dimensional data set 230 (corresponding to locations in x- and z-axes) can be obtained by applying a short-time Fourier transform (“STFT”) to the interference data 220 using a Gaussian window centered at Δti and having a width of Ti. The frequency distribution corresponding to a given delay Δti can provide spatial information at a corresponding depth, Δzi.
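

The depth resolution implied by this expression can be checked with the parameters of the exemplary system described above (Nx=80 transverse points with approximately 1.9 nm of bandwidth per point); the 800 nm center wavelength in the following sketch is an assumption (typical of a titanium-sapphire source) and is not stated explicitly in the description.

```python
# Rough check of the depth resolution dz = v_g * T_i = 0.44 * N_x * lambda_0**2 / dLambda.
# N_x and the per-point bandwidth follow the exemplary system described above;
# the 800 nm center wavelength is an assumed value.

N_x = 80                          # transverse resolvable points
d_lambda_total = N_x * 1.9e-9     # total source bandwidth (m), ~152 nm
lambda_0 = 800e-9                 # assumed center wavelength (m)

dz = 0.44 * N_x * lambda_0 ** 2 / d_lambda_total
print(f"total bandwidth ~ {d_lambda_total * 1e9:.0f} nm, depth resolution ~ {dz * 1e6:.0f} um")
# prints ~152 nm and ~148 um, consistent with the ~145 um resolution reported below
```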


Alternatively or additionally, a depth-integrated transverse image can be obtained by applying a frequency transform to part or all of a set of interference data simultaneously, or by summing individual depth-resolved images. The frequency transform may be, e.g., a Fourier transform, a short-time Fourier transform, or a Wigner transform. Volumetric data can be obtained by scanning the spectrally encoded line transversely across the sample 200.
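

One possible implementation of this windowed-transform processing is sketched below. It is provided for illustration only: the stand-in trace, window centers and window width are assumptions rather than values taken from the description, apart from the roughly 4000 samples per scan and ten depth windows mentioned later in connection with FIGS. 3A and 3B.

```python
import numpy as np

def stft_depth_slices(trace, window_centers, window_width):
    """Gaussian-windowed short-time Fourier transform of one interference trace.

    trace          : 1-D interference signal recorded during one reference scan
    window_centers : sample indices of the Gaussian window centers (one per depth bin)
    window_width   : 1/e half-width of the Gaussian window, in samples
    Returns an array with one magnitude spectrum per depth bin; the frequency axis
    corresponds to the wavelength-encoded transverse (x) position.
    """
    n = np.arange(len(trace))
    rows = []
    for center in window_centers:
        window = np.exp(-((n - center) / window_width) ** 2)
        rows.append(np.abs(np.fft.rfft(trace * window)))
    return np.array(rows)

# Illustrative use: ~4000 samples per scan divided into 10 depth windows.
trace = np.random.randn(4000)             # stand-in for a recorded interference trace
centers = np.linspace(200, 3800, 10)      # assumed window centers (the delays dt_i)
xz_slice = stft_depth_slices(trace, centers, window_width=200.0)

# A depth-integrated transverse line can be obtained by summing the depth-resolved rows.
depth_integrated = xz_slice.sum(axis=0)
```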


The exemplary detection technique according to the present invention described herein can be analogous to a technique which may be used in conventional optical coherence tomography (“OCT”). Conventional OCT techniques are described, e.g., in D. Huang et al., Science 254, 1178 (1991). Exemplary OCT techniques can utilize a broadband light source to obtain a high resolution in an axial direction which may be, e.g., less than about 10 μm. To perform three-dimensional imaging using the conventional OCT techniques, a probe beam is generally scanned in two dimensions, which can require a fast beam-scanning mechanism. In contrast, spectrally-encoded endoscopy techniques can utilize the spectral bandwidth to obtain both transverse and axial resolution simultaneously, and may thereby utilize only one slow-axis scan to acquire three-dimensional data sets. For a given source bandwidth, this additional transverse resolution is obtained at the expense of axial resolution, since each transverse point is illuminated by only a fraction (approximately 1/Nx) of the total bandwidth.


If an exemplary shot-noise limited detection technique is utilized and a source having a uniformly flat spectrum is used, a signal-to-noise ratio (“SNR”) associated with a spatial point having a reflectivity R can be expressed as:







SNR = [2·(Pr/Nx)·R·(Ps/Nx)] / (2hνPrB) = 2RPsτ / (hνNx²Nz),





where Pr denotes a total reference arm power, Ps denotes a total sample power, τ represents a line scan period, hν denotes the photon energy (Planck's constant multiplied by the optical frequency), B denotes a sampling bandwidth, which may be written as B=Nz/(2τ), and Nz indicates a number of axial resolvable points. The SNR given by this expression is inversely proportional to the square of the number of transverse resolvable points, since only a fraction of the reference arm power (i.e., Pr/Nx) interferes with light returning from a single transverse location.
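

For illustration only, this expression can be evaluated with numbers consistent with the exemplary system described earlier (about 4 mW delivered to the sample, 1000 line scans per second, 80 transverse points and approximately 10 axial points); the reflectivity R and the 800 nm wavelength used to compute the photon energy hν are assumed values.

```python
import math

# SNR = 2 * R * P_s * tau / (h * nu * N_x**2 * N_z), evaluated with illustrative numbers.
# P_s, tau, N_x and N_z follow the exemplary system described above; R and the
# 800 nm wavelength used for the photon energy h*nu are assumptions.

h = 6.626e-34             # Planck constant (J*s)
nu = 3.0e8 / 800e-9       # optical frequency (Hz) at an assumed 800 nm center wavelength

P_s = 4e-3                # total power delivered to the sample (W)
tau = 1e-3                # line scan period (s), i.e. ~1000 scans per second
N_x, N_z = 80, 10         # transverse and axial resolvable points
R = 1e-4                  # assumed reflectivity of a single spatial point

snr = 2 * R * P_s * tau / (h * nu * N_x ** 2 * N_z)
print(f"SNR ~ {10 * math.log10(snr):.0f} dB")   # ~47 dB for these assumed values
```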


Exemplary images of a fingertip acquired using an exemplary 3D spectrally-encoded technique in accordance with certain exemplary embodiments of the present invention are shown in FIGS. 3A and 3B. The frame size of these exemplary images is approximately 15×9 mm. Three-dimensional image data was obtained at a rate of 2.5 frames per second. Each frame in the images of FIGS. 3A and 3B includes a resolution of 200 points (along a spatially scanned axis)×80 points (along a wavelength-encoded axis)×10 points (indicating depth within the sample). The depth resolution was approximately 145 μm.


For example, a two-dimensional (depth-integrated) image 300 of FIG. 3A can be obtained by acquiring about 4000 points per scan, and applying a Fourier transform to these data. Each scan can be divided into about ten time windows that may be transformed separately to extract three-dimensional information. The three-dimensional data can also be presented as a contour map 310 as shown in FIG. 3B. Further, a false-color image can be generated and superimposed onto a two-dimensional image to provide an additional three-dimensional visualization.


In biological tissues, a single-scattered signal emerging from a particular depth within a tissue sample can have a significantly lower intensity than a signal scattered from near the tissue surface. Based on this characteristic of scattered signals, it is likely that the largest frequency component of each STFT may correspond to a surface height or depth within the tissue.
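

Under this assumption, a surface profile could be estimated by selecting, for each transverse position, the depth bin having the largest STFT magnitude. The sketch below shows one hypothetical way of doing so using the xz_slice array from the earlier STFT sketch; the depth spacing is an assumed value.

```python
import numpy as np

def surface_from_stft(xz_slice, depth_step):
    """Estimate the surface depth at each transverse position from STFT magnitudes.

    xz_slice   : array of shape (n_depth_bins, n_transverse_points), e.g. the output
                 of the STFT sketch above
    depth_step : axial spacing between depth bins (m); assumed here
    Returns the depth of the brightest bin in each column, which should track the
    tissue surface when scattering from near the surface dominates.
    """
    surface_bins = np.argmax(xz_slice, axis=0)   # brightest depth bin per transverse point
    return surface_bins * depth_step

# Example with the illustrative slice computed earlier (10 bins, ~145 um apart):
# heights = surface_from_stft(xz_slice, depth_step=145e-6)
```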


Three-dimensional image data can be obtained from samples having a depth range larger than, e.g., the 1.5 mm depth provided directly by the RSOD 160 shown in FIG. 1. A greater range of depths can be resolved by obtaining two or more volumetric data sets, where each set can be acquired using a different reference arm path length.


In exemplary embodiments of the present invention, certain components of the system may be provided in a small size in the form of a probe that can be introduced into a body. For example, the lens-grating arrangement and/or the scanning mechanism may be provided in a capsule or other enclosure or housing that can be included with or introduced into a body using a catheter and/or an endoscope. A waveguide can be used to direct at least part of the radiation generated by the light source to the lens-grating arrangement, the reference, and/or the sample. The waveguide can include, for example, a single-mode optical fiber, a multi-mode optical fiber, and/or a multiple-clad optical fiber.


As an example of this extended-range acquisition, the surface of a dime placed about 2.4 mm in front of a quarter dollar coin was imaged as shown in FIGS. 4A-4D using a method, arrangement and system in accordance with certain exemplary embodiments of the present invention. For example, a lens having a focal length of 65 mm was used to provide a larger field of view and depth of focus. Two volumetric data sets were obtained by calculating the STFT for each of two locations of the RSOD double-pass mirror. Each set of image data included 200 horizontal lines, captured at a rate of 5 volume sets per second, and was processed and displayed on a computer screen at a rate of 2.5 frames per second.


Images of the two coins shown in FIGS. 4A-4D are provided under various conditions. A first image 400 in FIG. 4A includes a scale bar having a length of 1 mm, which also corresponds to the images shown in FIGS. 4B-4D. Although the surfaces of both imaged coins are within the focal depth of the lens, the dime is not seen in the first image 400 of FIG. 4A because of the limited scanning range of the RSOD. After adjusting the optical path length of the reference arm by stepping the RSOD 160 double pass mirror of FIG. 1 by 2.4 mm, the surface of the dime can be visualized, as shown in a second image 410 in FIG. 4B.


The two volumetric data sets used to form the first and second images 400, 410 can be combined to obtain a depth-integrated two-dimensional third image 420 of FIG. 4C and an extended-range depth resolved fourth image 430 of FIG. 4D. The surface height in the resolved fourth image 430 can be represented by a gray scale lookup table, where depth locations closer to the lens have higher pixel intensity. Thus, the image of the dime appears brighter, whereas recesses in the lower quarter dollar coin appear the darkest in this image. Other exemplary image processing techniques may be used to provide additional displays of the three-dimensional image data obtained using the exemplary methods and systems described herein.
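

A hypothetical sketch of this extended-range processing is provided below: two volumetric data sets acquired at reference-arm positions offset by 2.4 mm are stacked along the depth axis, and the recovered surface height is mapped to pixel brightness so that surfaces closer to the lens appear brighter, as in the fourth image 430. The array layout and the depth spacing are assumptions made only for illustration.

```python
import numpy as np

def merge_and_render(volume_near, volume_far, depth_step, offset=2.4e-3):
    """Combine two volumetric data sets acquired at different reference path lengths
    and render the surface height as a grayscale image (closer surfaces brighter).

    volume_near, volume_far : arrays of shape (n_y, n_depth, n_x) holding STFT
                              magnitudes for the two reference mirror positions (assumed layout)
    depth_step              : axial spacing of the depth bins within one volume (m)
    offset                  : reference path-length step between the two volumes (m)
    """
    merged = np.concatenate([volume_near, volume_far], axis=1)   # stack along the depth axis

    # Absolute depth of every bin in the merged volume.
    depths = np.concatenate([
        np.arange(volume_near.shape[1]) * depth_step,
        offset + np.arange(volume_far.shape[1]) * depth_step,
    ])

    # Surface = brightest depth bin at each (y, x); closer depths map to brighter pixels.
    surface_depth = depths[np.argmax(merged, axis=1)]
    span = max(np.ptp(surface_depth), 1e-12)
    return 1.0 - (surface_depth - surface_depth.min()) / span   # 1.0 = closest to the lens
```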


An exemplary flow diagram of a method 500 according to exemplary embodiments of the present invention is shown in FIG. 5. A particular radiation can be provided that can include a first and a second electro-magnetic radiation (step 510). The particular radiation can be provided by, e.g., a broadband light source or a laser. The radiation can include a plurality of wavelengths that are provided simultaneously, or it can optionally include one or more wavelengths that vary in time. The first and second radiations can be provided, e.g., by directing the particular radiation through an optical arrangement such as a beam splitter.


A spectrally dispersed line of radiation can be generated that is associated with the first radiation (step 520). This line can be generated, e.g., by directing the first radiation through a lens-grating arrangement which can include, for example, a diffraction grating and a lens that can be configured to focus and/or direct the first radiation. The spectrally dispersed line can be generated all at once or, alternatively, different portions of the line can be generated sequentially when using a light source having at least one wavelength that varies with time.


The spectrally dispersed line can be directed toward a portion of a sample to be imaged (step 530). The line may also be scanned in a direction that can be approximately perpendicular to the line (step 540) using an arrangement such as, e.g., a galvanometric optical scanner or the like, which can provide coverage of a region of the sample to be imaged.


The second radiation can be directed to an optical delay arrangement (step 550) or other arrangement such as, e.g., a RSOD, which is capable of affecting the second radiation in a controlled time-dependent manner. A signal associated with the first and second radiations may then be detected (step 560). This signal can be, e.g., an interference which can be obtained by combining the second radiation (after it has been directed to the optical delay arrangement) and electro-magnetic radiation generated by an interaction between the first radiation and a portion of the sample being imaged (step 570). Three-dimensional image data can then be generated that is associated with the signal using a processing arrangement or computer. The data can be generated, e.g., by applying a Fourier transform to the signal and/or demodulating the signal. One or more images can then be displayed using the image data (step 580). Optionally, the image can be displayed in real time.


The foregoing merely illustrates the principles of the invention. Various modifications and alterations to the described embodiments will be apparent to those skilled in the art in view of the teachings herein. Indeed, the arrangements, systems and methods according to the exemplary embodiments of the present invention can be used with any OCT system, OFDI system, SD-OCT system or other imaging systems, and for example with those described in International Patent Application PCT/US2004/029148, filed Sep. 8, 2004, U.S. patent application Ser. No. 11/266,779, filed Nov. 2, 2005, and U.S. patent application Ser. No. 10/501,276, filed Jul. 9, 2004, the disclosures of which are incorporated by reference herein in their entireties. It will thus be appreciated that those skilled in the art will be able to devise numerous systems, arrangements and methods which, although not explicitly shown or described herein, embody the principles of the invention and are thus within the spirit and scope of the present invention. In addition, to the extent that the prior art knowledge has not been explicitly incorporated by reference herein above, it is explicitly being incorporated herein in its entirety. All publications referenced herein above are incorporated herein by reference in their entireties.

Claims
  • 1. A system comprising: at least one first arrangement configured to provide a particular radiation which includes at least one first electro-magnetic radiation directed to at least one sample and at least one second electro-magnetic radiation directed to a reference arrangement, wherein at least one of the at least one first radiation or the at least one second radiation comprises a plurality of wavelengths, and wherein the at least one first arrangement is configured to spectrally disperse the at least one first electro-magnetic radiation along at least one portion of the at least one sample; and at least one second arrangement configured to generate data based on the at least one second electromagnetic radiation, wherein the data is axial data that is associated with at least one portion of the at least one sample which is located in a direction that is axial with respect to a direction of the at least one first electro-magnetic radiation.
  • 2. The system according to claim 1, wherein the reference arrangement comprises an optical delay arrangement.
  • 3. The system according to claim 2, wherein the optical delay arrangement is a rapidly-scanning optical delay.
  • 4. The system according to claim 1, wherein the data is further associated with at least one portion of the at least one sample which is located in a direction that is transverse with respect to a direction of the at least one first electro-magnetic radiation.
  • 5. The system according to claim 4, wherein the data is further associated with at least one of a two-dimensional image or a three-dimensional image of at least a portion of the at least one sample.
  • 6. The system according to claim 1, wherein the at least one sample is an anatomical structure.
  • 7. The system according to claim 6, wherein at least a portion of the anatomical structure is provided below a surface of skin.
  • 8. The system according to claim 1, wherein the at least one first arrangement comprises a diffraction grating.
  • 9. The system according to claim 8, wherein the at least one first arrangement further comprises a lens.
  • 10. The system according to claim 9, wherein the at least one first arrangement is further configured to generate a line of radiation on at least a portion of the at least one sample.
  • 11. The system according to claim 10, wherein the at least one first arrangement further comprises at least one scanning arrangement configured to scan the line of radiation in a direction approximately perpendicular to the line.
  • 12. The system according to claim 1, wherein the at least one second arrangement comprises an optical detector.
  • 13. The system according to claim 12, wherein the optical detector includes a charge-coupled device.
  • 14. The system according to claim 12, wherein the optical detector is configured to generate a signal based on the at least one first electromagnetic radiation and the at least one second electromagnetic radiation, and wherein the at least one second arrangement is configured to generate a time-frequency transform of the signal.
  • 15. The system according to claim 14, wherein the time-frequency transform is at least one of a short-time Fourier transform, or a Wigner transform.
  • 16. The system according to claim 1, further comprising a processing arrangement configured to provide at least one image based on the data.
  • 17. The system according to claim 16, wherein the at least one processing arrangement is configured to provide the at least one image in real time.
  • 18. The system according to claim 1, wherein the at least one first electromagnetic radiation is provided through a waveguide arrangement.
  • 19. The system according to claim 18, wherein the waveguide arrangement is at least one of a single-mode optical fiber, a multi-mode optical fiber, or a multiple-clad optical fiber.
  • 20. The system according to claim 18, wherein the at least one first arrangement is provided in a probe.
  • 21. The system according to claim 20, wherein the probe comprises at least one of an endoscope or a catheter.
  • 22. A method for generating three-dimensional image data comprising: providing a particular radiation which includes at least one first electro-magnetic radiation and at least one second electro-magnetic radiation; directing the at least one first electro-magnetic radiation to at least one sample, wherein the at least one first radiation comprises at least one of a plurality of wavelengths and the at least one first electromagnetic radiation is spectrally dispersed on the at least one sample; directing the at least one second electro-magnetic radiation to a reference arrangement; detecting a signal associated with the at least one second electro-magnetic radiation; and generating image data based on the signal, wherein the image data is axial data which is further associated with at least one portion of the at least one sample which is located in a direction that is axial with respect to a direction of the at least one first electro-magnetic radiation.
  • 23. The method of claim 22, wherein the reference arrangement includes an optical delay arrangement.
  • 24. The method of claim 22, wherein the at least one first electromagnetic radiation is provided in the form of a line, and further comprising scanning the line in a direction approximately perpendicular to the line.
  • 25. The method of claim 22, wherein generating the image data comprises generating a time-frequency transform of the signal.
  • 26. The method of claim 25, wherein the time-frequency transform is at least one of a Fourier transform, a short-time Fourier transform, or a Wigner transform.
  • 27. The method of claim 22, further comprising displaying at least one image based on the image data.
  • 28. The method of claim 27, wherein the at least one image is displayed in real time.
  • 29. The method according to claim 23, wherein the optical delay arrangement is a rapidly-scanning optical delay.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application is based upon and claims the benefit of priority from U.S. patent application Ser. No. 60/686,518, filed May 31, 2005, the entire disclosure of which is incorporated herein by reference.

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH

Development of the present invention was supported in part by the U.S. Government under National Science Foundation grant BES-0086709. Thus, the U.S. Government may have certain rights in the invention.

Related Publications (1)
Number Date Country
20060270929 A1 Nov 2006 US
Provisional Applications (1)
Number Date Country
60686518 May 2005 US