Imaging system for combined full-color reflectance and near-infrared imaging

Information

  • Patent Grant
  • Patent Number
    10,779,734
  • Date Filed
    Tuesday, May 2, 2017
  • Date Issued
    Tuesday, September 22, 2020
Abstract
An imaging system for acquisition of NIR and full-color images includes a light source providing visible light and NIR light to an area under observation, such as living tissue, and a camera having one or more image sensors configured to separately detect blue reflectance light, green reflectance light, and combined red reflectance light/detected NIR light returned from the area under observation. A controller in signal communication with the light source and the camera is configured to control the light source to continuously illuminate the area under observation with temporally continuous blue/green illumination light and with red illumination light and NIR excitation light. At least one of the red illumination light and NIR excitation light is switched on and off periodically in synchronism with the acquisition of red and NIR light images in the camera.
Description
FIELD OF THE INVENTION

The invention is directed to medical imaging, in particular to a system and method for obtaining visible light images and near infrared light images from an area under observation, such as living tissue, and in particular for use in endoscopy.


BACKGROUND OF THE INVENTION

Near-infrared (NIR) imaging has been described in the literature for various clinical applications. Typically such an imaging modality utilizes a contrast agent (e.g. indocyanine green) that absorbs and/or fluoresces in the NIR. Such contrast agents may be conjugated to targeting molecules (e.g. antibodies) for disease detection. The contrast agents may be introduced into tissue intravenously or subcutaneously to image tissue structure and function (e.g. flow of blood/lymph/bile in vessels) that is not easily seen with standard visible light imaging technology.


Independently of the clinical application, endoscopic NIR imaging devices typically include multiple imaging modes as a practical feature. For example, endoscopists utilize visible spectrum color for both visualization and navigation, and an endoscopic imaging device that offers NIR imaging typically provides a concurrent color image. Such concurrent imaging devices can be realized, for example, as follows:

    • One conventional configuration utilizes spectral separation of the visible and the NIR light, with full color and NIR image signals acquired using separate sensors for the different color (e.g. red, green, and blue) and NIR spectral bands or a single color sensor with an integrated filter with filter elements transparent to the different spectral bands (e.g. red, green, blue and NIR). Thus, such multi-modality color and NIR imaging devices provide dedicated sensors or sensor pixels for each of the two imaging modes. Disadvantageously, this either increases the number of image sensors in multi-sensor implementations or compromises image resolution when, on a single sensor, specific sensor pixels are dedicated to NIR imaging while others are utilized for color imaging.
    • Another conventional configuration utilizes a single monochrome image sensor for sequential imaging of the visible and NIR light. The object is hereby sequentially illuminated with light in the red, green, blue and NIR spectral bands, with separate image frames being acquired for each spectral band and composite color and NIR images being generated from the acquired image frames. However, this approach, where image frames are acquired sequentially at different times, can generate objectionable motion artifacts (i.e. color fringing and “rainbow effects”) in the composite color and NIR images. These artifacts can be mitigated by increasing the acquisition or frame rate beyond, for example, 15 frames/second (fps), to 90 fps or even 180 fps. Because of the high data transfer rate, high frame rates are difficult to implement for high definition images (e.g. 2 million pixels) or images having a large dynamic range (>10 bits), thus limiting image size and/or resolution.


It would therefore be desirable to provide a system and a method for simultaneous acquisition of full-color visible light and NIR light images, which obviates the aforementioned disadvantages and does not compromise image resolution and/or introduce objectionable motion artifacts.


SUMMARY OF THE INVENTION

According to one aspect of the invention, a method for acquisition of NIR images and full-color images includes the steps of illuminating an area under observation with continuous blue/green light, and illuminating the area under observation with red light and NIR light, wherein at least one of the red light and NIR light is switched on and off periodically. The blue, green, red and NIR light returning from the area under observation is directed to one or more sensors which are configured to separately detect the blue light, the green light, and the combined red light/NIR light. The red light spectral component and the NIR light spectral component are determined separately from image signals of the combined red light/NIR light, in synchronism with the switched red and NIR light. A full-color reflectance image of the area under observation is rendered and displayed from the blue, green, and red light, and an NIR image is likewise rendered and displayed from the NIR light.


According to another aspect of the invention, an imaging system for acquisition of NIR and full-color images includes a light source providing visible light and NIR light to an area under observation, a camera having one or more image sensors configured to separately detect blue and green light, and combined red and NIR light returned from the area under observation, and a controller in signal communication with the light source and the camera. The controller is configured to control the light source to continuously illuminate tissue with blue/green light and to illuminate the area under observation with red light and NIR light, wherein at least one of the red light and NIR light are switched on and off periodically in synchronism with the acquisition of the red and NIR images in the camera.


The controller is further configured to determine, from sensor signals representing the combined red light and NIR light, the red light spectral component and the NIR light spectral component separately. The imaging system further includes a display receiving image signals corresponding to the blue light, the green light, and the separately determined red light spectral component and rendering therefrom a full-color visible light image of the area under observation. The display also receives the separately determined NIR light spectral component and renders therefrom an NIR image of the area under observation.


The video imaging system may use a three-sensor color camera configured to continuously image the blue and green wavebands and intermittently image the red waveband, thus providing continuous, high-quality luma information and sufficiently continuous, complete chroma information to produce high-quality video images of the area under observation, such as living tissue. In such a configuration, the red image sensor can be time-multiplexed to acquire both red and NIR images (i.e. the red image sensor alternately, and in rapid succession, images both red light for the color information required for the color image and NIR light for the image information required for the NIR image). Such time-multiplexing may be coupled to (and synchronized with) the illumination source used to provide the NIR illumination (excitation for fluorescence) and the red light for color imaging. Image processing is then utilized to separate and process the resulting image signals appropriately.
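A minimal sketch of this time-multiplexing, assuming the two-red-fields-per-NIR-field schedule of the FIG. 4 embodiment (the `demultiplex` helper and the scalar field values are illustrative, not from the patent; in practice each field is a full pixel array):

```python
# Illustrative demultiplexing of the shared red/NIR sensor field stream.
SCHEDULE = ["R", "R", "NIR"]  # illumination pattern, repeating every three fields

def demultiplex(fields):
    """Split the red-sensor field sequence into red and NIR streams.

    Fields acquired during the other modality's illumination are recorded
    as None and must later be filled in by temporal interpolation.
    """
    red, nir = [], []
    for i, field in enumerate(fields):
        if SCHEDULE[i % len(SCHEDULE)] == "R":
            red.append(field)
            nir.append(None)
        else:
            red.append(None)
            nir.append(field)
    return red, nir

red, nir = demultiplex([10, 11, 80, 12, 13, 81])
print(red)  # [10, 11, None, 12, 13, None]
print(nir)  # [None, None, 80, None, None, 81]
```

Synchronizing the schedule with the light source, as described above, is what allows each acquired field to be attributed unambiguously to red or NIR illumination.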


Embodiments of the invention may include one or more of the following features. The area under observation may be alternatingly illuminated with red light and NIR light, wherein the duration of red light may be different from, preferably longer than, the duration of illumination with NIR light. The illumination may be switched at video field or frame rates.


Fields captured by the image sensor and lacking the red light spectral component or the NIR light spectral component may be interpolated from temporally adjacent image fields that include a corresponding red light spectral component or NIR light spectral component. In one embodiment, the NIR light spectral component obtained in the absence of red light may be subtracted from the combined red light/NIR light to obtain the separate red light spectral component. This is advantageous in particular when the detected NIR signal has an intensity comparable to that of the red signal.


In one embodiment, the light source may include an illuminator emitting a substantially constant intensity of visible light and NIR light over a continuous spectral range, and a plurality of movable filters disposed between the illuminator and the area under observation for transmitting temporally continuous blue/green light and temporally discontinuous red light and NIR light.


In another embodiment, the light source may include an illuminator emitting a substantially constant intensity of visible light and NIR light over a continuous spectral range, first dichroic means for separating the visible light and NIR light into blue/green and red light and NIR light, shutter means for transforming the separated red light and NIR light into temporally discontinuous red light and discontinuous NIR light, and second dichroic means for combining the blue/green light, the temporally discontinuous red light and the temporally discontinuous NIR light for transmission to the area under observation.


In yet another embodiment, the light source may include a first illuminator emitting a substantially constant intensity of green and blue light, a second illuminator producing switched red light, a third illuminator producing switched NIR excitation light, and dichroic means for combining the switched red light and the switched NIR light with the green and blue light for transmission to the area under observation. The switched red light and the NIR light may be produced by interrupting a continuous intensity light beam of the red light and the NIR light by a shutter or chopper. Alternatively, the switched red light and the NIR light may be produced by electrically switching the second illuminator and the third illuminator on and off.


The image sensors may employ an interlaced scan or a progressive scan.


The imaging system may include an endoscope.





BRIEF DESCRIPTION OF THE DRAWINGS

The following figures depict certain illustrative embodiments of the invention which are to be understood as illustrative of the invention and not as limiting in any way.



FIG. 1 shows an endoscopic system according to one embodiment of the invention;



FIGS. 2a-2d show various exemplary embodiments of a multimode light source to be used with the endoscopic system of FIG. 1;



FIG. 3a shows an exemplary dichroic prism employed by a 3-sensor color camera;



FIG. 3b shows the optical transmission ranges for the spectral components separated by the dichroic prism of FIG. 3a;



FIG. 3c shows the optical transmission range of a notch filter that blocks excitation light from entering the camera;



FIG. 4 shows a timing diagram of a first embodiment for continuous illumination with green/blue light and alternating illumination with red/NIR light;



FIG. 5 shows a timing diagram of a second embodiment for continuous illumination with green/blue light and alternating illumination with red/NIR light;



FIG. 6 shows a timing diagram of a third embodiment for continuous illumination with green/blue/NIR light and alternating illumination with red light;



FIG. 7 shows an exemplary CMOS sensor having stacked imaging layers and the corresponding spectral sensitivity of these layers; and



FIG. 8 shows four stacked imaging layers of an exemplary sensor.





DESCRIPTION OF CERTAIN ILLUSTRATED EMBODIMENTS

Color video images are generally obtained with three-sensor color cameras where separate red, green and blue image sensors provide simultaneous contiguous arrays of red, green and blue pixel information. Full color video images are generated by combining the image information from all three sensors. Color fidelity (i.e. a true color rendition) is extremely important in medical imaging applications and all three sensors are used to provide complete color information.


To understand the relative importance of color and spatial information in video images of human tissue, however, it is useful to consider information in such video images in terms of luma and chroma. Luma refers to the brightness information in the image and it is this information that provides the spatial detail that enables the viewer to recognize shapes. The spatial and temporal resolution of luma is consequently crucial to the perception of video image quality. Chroma refers to the color information in the video image. It is a property of human vision that fine detail variations in the chroma of image features are not easily perceived and that such variations are consequently less critical than fine detail variations in luma, in an overall assessment of image quality. It is for this reason that video encoding of chroma information is often sub-sampled.


In video images of human tissue obtained with visible light, the structural details of the tissue are largely contained in the blue and green wavelength regions of the imaged light. Blue and green light tends to be reflected from the tissue surface, whereas red light tends to be highly scattered within the tissue. As a consequence, there is very little fine structural detail in the red light that reaches the red image sensor. It is also known from color science that human vision receives most of the spatial information from the green portion of the visible spectrum—i.e. green light information contributes disproportionately to the luma. The standard formula for calculating luma from gamma-corrected color components is Y′=0.2126 R′+0.7152 G′+0.0722 B′. For this reason, spatial and/or temporal interpolation of the red component of video images of human tissue does not significantly affect perception of fine detail in those images.
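The relative weighting in this luma formula (the Rec. 709 coefficients) can be illustrated with a few lines of Python; the `rgb_to_luma` helper is illustrative, not part of the patent:

```python
def rgb_to_luma(r, g, b):
    """Rec. 709 luma from gamma-corrected (R', G', B') components in [0, 1]."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Green dominates the luma; red contributes about a fifth, blue the least.
print(rgb_to_luma(1.0, 1.0, 1.0))  # white -> 1.0 (the coefficients sum to 1)
print(rgb_to_luma(0.0, 1.0, 0.0))  # pure green -> 0.7152
```

Since red carries only about 21% of the luma, interpolating the red channel in time, as the embodiments below do, perturbs the perceived spatial detail far less than interpolating green would.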


Like red light, NIR light tends to be scattered in tissue, causing NIR image features to be diffusely, rather than sharply, defined. Furthermore, because the NIR image highlights areas of interest (i.e. the areas in which the contrast agent is localized), but does not provide the overall visualization or navigational information, it is desirable for an NIR endoscopic imaging device to provide a continuous color image and either a superimposed or side-by-side display of the NIR image information. In such a display the NIR light would also contribute less to the spatial information presented to the observer.



FIG. 1 shows schematically an exemplary embodiment of an NIR endoscopic imaging system 10. The system includes a multimode light source 11 that provides both visible and NIR illumination and is connected to an endoscope 12 by way of an illumination guide, for example a fiber optic cable 17, suitable for transmission of both color and NIR illumination. A color camera 13, illustrated here as having three different sensors 34, 36, 38 (see FIG. 3a) for blue, green and red/NIR imaging, respectively, is mounted to the endoscope image guide, and a camera controller 14 is connected to the camera 13 and the light source 11 for controlling and synchronizing illumination and image acquisition. Controller 14 can also process the acquired visible and NIR images for display on a monitor 15 connected to the controller 14, for example, by a cable 19. Images can be acquired in real time at selectable frame rates, such as video rates.



FIGS. 2a-2d show schematic diagrams of exemplary embodiments of various light sources 11. The illustrated light sources are constructed to supply, in normal color imaging mode, visible illumination light with a substantially continuous spectral distribution. The light source may be an arc lamp, a halogen lamp, one or more solid state sources (e.g. LEDs, semiconductor lasers) or any combination thereof, and may be spectrally filtered or shaped (e.g. with bandpass filters, IR filters, etc.). The continuous spectrum may be produced as primary colors (RGB) either concurrently or sequentially, for example, using a rotating filter wheel.


Light sources to be used with systems according to the present invention, described in detail below, are configured to provide continuous, uninterrupted illumination in the blue and green parts of the visible spectrum and discontinuous red and/or NIR light. The blue and green parts of the visible spectrum may be optically filtered from the emission produced by a continuous source or produced directly by a narrow-band source (e.g. blue and green LEDs). The red and NIR light may also be produced by an arc lamp, a halogen lamp, a solid state source (e.g., red and NIR LEDs or lasers), or any combination thereof.


Turning now to FIG. 2a, in one embodiment a light source 11a includes an illuminator 202 producing visible and NIR light emission, a collimating lens 204, and a filter wheel or reciprocating filter holder 208 that alternatingly transmits red and NIR light and continuously transmits green and blue light. Alternatively, a tunable electro-optic or acousto-optic filter may be used. The filtered light is focused by lens 206 onto light guide 17.


Another embodiment of a light source 11b is schematically illustrated in FIG. 2b. The light source 11b includes an illuminator 202 producing visible and NIR light emission and a collimating lens 204. A dichroic mirror 212 transmits green/blue light and reflects red/NIR light to another dichroic mirror 214 which transmits NIR light to NIR mirror 215 and reflects red light, or vice versa. The green/blue light can be further bandpass-filtered by filter 213. The reflected red and NIR light is chopped, for example, by chopper wheels 219a, 219b (which can be combined into a single chopper wheel) to produce temporally discontinuous illumination, which is then reflected by mirrors 216, 217 and combined with the green/blue light by dichroic mirror 218. The combined light is then focused by lens 206 onto light guide 17, as before.


In another embodiment of a light source 11c schematically illustrated in FIG. 2c, an illuminator 202a produces green and blue light emission which is collimated by a collimating lens 204a. Likewise, separate illuminators 202b, 202c produce respective red and NIR light emissions which are collimated by corresponding collimating lenses 204b and 204c. As in the embodiment of FIG. 2b, the red and NIR light is chopped, for example, by chopper wheels 219a, 219b (which may also be combined into a single chopper wheel) to produce temporally discontinuous illumination, which is then combined with the green/blue illumination by dichroic mirrors 222, 228. The combined light is then focused by lens 206 onto light guide 17, as before.


In yet another embodiment of a light source 11d schematically illustrated in FIG. 2d, an illuminator 202a produces green and blue light emission which is collimated by a collimating lens 204a, as before. However, unlike in the embodiment of FIG. 2c, the separate illuminators 202d, 202e are here switched electrically to produce red and NIR light emissions with controlled timing. For example, the red and NIR light sources 202d, 202e may be solid state light sources, such as LEDs or semiconductor lasers, which can be rapidly turned on and off with suitable, preferably electronic, switches. As described above with reference to FIG. 2c, the red and NIR illumination is collimated by corresponding collimating lenses 204b and 204c and combined with the green/blue illumination by dichroic mirrors 222, 228. The combined light is then focused by lens 206 onto light guide 17, as before.


The alternating red and NIR illumination is synchronized with the image acquisition of the three-sensor camera such that red and NIR images are acquired by the camera synchronously with the red and NIR illumination of the endoscope.



FIG. 3a shows in more detail the three-sensor camera 13 of FIG. 1, in particular the optical beam splitter used to direct red/NIR, green, and blue light to the three different image sensors 34, 36 and 38, respectively. For NIR fluorescence applications, the camera preferably also includes an excitation band blocking filter 32. The beam splitter may be made, for example, of a plurality of dichroic prisms, cube splitters, plate splitters or pellicle splitters. FIG. 3b shows the spectral composition of the light received from the endoscope according to FIG. 3a. FIG. 3c illustrates the spectral composition of the light transmitted through the excitation band blocking filter 32 implemented as a notch filter 31 which blocks transmission of excitation light, while transmitting the other wavelengths in the visible and NIR spectral range. The transmission characteristic of this filter 32 may be designed to also block undesired NIR wavelengths interfering with the visible spectrum that may degrade the color image.



FIG. 4 shows a timing diagram for a first exemplary embodiment of a simultaneous color and NIR imaging mode using, for example, a three-sensor camera. In this embodiment, the camera sensors utilize an interlaced read-out format which represents an advantageous combination of spatial and temporal resolution for smooth display of motion. Any of the light sources illustrated in FIGS. 2a-2d can be used with this embodiment. The light source provides continuous blue/green illumination and alternating red and NIR illumination. Half-frames are alternatingly exposed on the image sensors, i.e., a first field (half-frame) with even lines alternating with a second field (half-frame) with odd lines. In the timing diagram of FIG. 4 depicting a full frame rate of 30 fps, one field period (16.7 ms) provides NIR illumination, followed by two field periods (33.3 ms) of red illumination. Stated differently, the sample or tissue is illuminated with full-spectrum color (RGB) during two field periods (33.3 ms) and with GB and NIR during a third field period. For reconstructing the full-color visible image, the missing red information is interpolated between the fields adjacent to the field with the NIR illumination. The blue and green image information is always available, thereby providing optimum and continuous luma information. The NIR image is generated from every sixth field in each half frame, wherein the missing lines are spatially interpolated. When the fluorescence field is displayed, the image is updated every three fields, with the displayed image interpolated between even and odd lines.
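The temporal interpolation of the missing red information described above can be sketched in a few lines of Python (a simplified, one-dimensional illustration; `interpolate_missing` and the scalar field values are hypothetical stand-ins for per-pixel image data):

```python
def interpolate_missing(stream):
    """Fill None entries (fields acquired without red illumination) by
    averaging the nearest acquired values before and after -- a simple
    stand-in for the temporal interpolation between adjacent fields.
    """
    out = list(stream)
    for i, value in enumerate(stream):
        if value is not None:
            continue
        prev = next((stream[j] for j in range(i - 1, -1, -1)
                     if stream[j] is not None), None)
        nxt = next((stream[j] for j in range(i + 1, len(stream))
                    if stream[j] is not None), None)
        neighbors = [x for x in (prev, nxt) if x is not None]
        out[i] = sum(neighbors) / len(neighbors) if neighbors else None
    return out

# Red fields 10 and 12 bracket a field with NIR illumination (no red data):
print(interpolate_missing([10, None, 12]))  # [10, 11.0, 12]
```

Because blue and green are acquired in every field, only this red gap-filling is needed to reconstruct the full-color image at the full field rate.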


In all the figures, the term “IR” is used instead of or interchangeably with “NIR.”


Once the color and NIR image data have been processed, the signal is outputted to a video monitor and may be displayed as two separate, simultaneous views (one color and one fluorescence) or as combined color and fluorescence image signals (e.g. by assigning the fluorescence signal a color that contrasts with the naturally occurring colors in the tissue).



FIG. 5 shows a timing diagram for a second exemplary embodiment of a simultaneous color and NIR imaging mode. In this embodiment, the camera sensors utilize a progressive scan sensor read-out format wherein a complete frame (G/B/R alternating with G/B/NIR) is read out during each field period. Any of the light sources illustrated in FIGS. 2a-2d can be used with this embodiment. The light source provides continuous blue/green illumination and alternating red and NIR illumination. In the timing diagram of FIG. 5, one field period (16.7 ms) provides NIR illumination, followed by one field period (16.7 ms) of red illumination. Stated differently, the sample or tissue is illuminated with full-spectrum color (RGB) during one field period (16.7 ms) and with G/B and NIR during the alternate field period. In this case, a full visible spectrum color image is available at every pixel, in every other frame. In the alternate frames, the blue and green information is acquired directly, whereas the red information is interpolated between adjacent frames. Unlike in the embodiment of FIG. 4, no spatial interpolation is required. Further image processing and display can be implemented in a manner similar to that described in previous embodiments.



FIG. 6 shows a timing diagram for a third exemplary embodiment, wherein both the green/blue illumination and the NIR illumination are continuous, while only the red illumination is modulated. As in the embodiment of FIG. 4, half-frames are alternatingly exposed on the image sensors, i.e., a first field (half-frame) with even lines alternating with a second field (half-frame) with odd lines. In the timing diagram of FIG. 6 depicting a full frame rate of 30 fps, one field period (16.7 ms) provides (NIR+GB) illumination (red illumination switched off), followed by two field periods (33.3 ms) of (NIR+RGB) illumination. If the NIR image signal is small compared to the red reflected signal, it will not significantly affect the overall visible (RGB) image, so that the color image may be generated by conventional color image processing without correction. Otherwise, the NIR contribution obtained in the red image channel when the red illumination is switched off may be subtracted from the (NIR+R) image data by spatial and temporal interpolation to obtain the red image signal, as shown in the second-to-last line in the timing diagram of FIG. 6. Alternatively, sensors with a progressive scan image sensor readout similar to those illustrated in FIG. 5 could be used with RGB and (RGB+IR) image acquisition in alternate frames.


In yet another exemplary embodiment (not illustrated in the drawings), the green/blue illumination as well as the red illumination are continuous, whereas the NIR illumination is modulated. This timing scheme can be best applied if the red and NIR image signals have approximately the same magnitude. In this embodiment, the light source provides uninterrupted illumination with full visible spectrum and intermittent illumination with NIR light. The timing diagram is essentially the same as that depicted in FIG. 6, with the NIR and the red illumination interchanged. The intermittent NIR illumination is synchronized to coincide with every 3rd field with interlaced cameras and with every other field in progressive scan cameras. For every field in which NIR illumination is provided, the red image sensor will acquire a (R+NIR) image signal. The NIR image signal can be extracted from the (R+NIR) image signal by interpolation of the red signal value from the appropriate preceding and subsequent “red only” image fields and subtracting the red image signal from the (R+NIR) signal. Since the red and NIR image signals are of similar magnitude, such interpolation and subtraction will provide a reasonably accurate NIR image signal value. The color image is processed by using the acquired and interpolated values for the red image signal in combination with the blue and green image signals. The resulting color and NIR image information can then be displayed or recorded as described before.
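The interpolate-and-subtract recovery of the NIR signal described above can be sketched as follows (a scalar Python illustration; `extract_nir` and the numeric values are hypothetical, standing in for per-pixel image data):

```python
def extract_nir(mixed, red_estimate):
    """Recover the NIR contribution from a combined (R + NIR) field by
    subtracting the red value interpolated from the adjacent red-only
    fields; clamp at zero since a light signal cannot be negative.
    """
    return max(mixed - red_estimate, 0)

# Red-only fields before and after read 40 and 44; the (R + NIR) field
# between them reads 72.
red_estimate = (40 + 44) / 2           # temporal interpolation -> 42.0
print(extract_nir(72, red_estimate))   # estimated NIR signal -> 30.0
```

As the paragraph above notes, this estimate is reasonably accurate only when the red and NIR signals are of similar magnitude, since the interpolation error in the red estimate carries over directly into the extracted NIR value.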


In any of the aforementioned embodiments, the NIR endoscopic imaging system can also be operated such that the light source provides continuous illumination with either the full visible spectrum or the NIR spectrum, and the camera acquires the corresponding color image or NIR (absorbance or fluorescence) image in a continuous fashion to provide high spatial resolution. The resulting video image of either individual illumination/imaging mode—color or NIR—can be subsequently displayed and/or recorded.


By implementing color and NIR imaging as described in the aforementioned embodiments, it is possible to acquire and display full-color visible light and NIR light images at video rates without compromising image resolution and/or introducing objectionable motion artifacts. Furthermore, should any residual color fringing occur as a consequence of sharp edges moving rapidly across the visual field (e.g. with the discontinuous acquisition of red or NIR images), these relatively minor effects can be mitigated by temporal interpolation of the missing (red/NIR) video fields with minimum additional processing time.


While the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. For example, instead of using separate image sensors for G/B and R/NIR, or a single color sensor for RGB images and NIR fluorescence images, a single direct three-color RGB image sensor with a stacked pixel design, implemented in CMOS technology and commercially available from Foveon, Inc., San Jose, Calif., may be used. Such a sensor is schematically illustrated in FIG. 7. It will be understood that this sensor design can be extended to four colors by adding an NIR-sensitive layer. The red, green, blue and NIR images are hereby acquired at different depths in the image sensor. With a 4-layer sensor, such as a sensor having layers 802, 804, 806, 808 shown in FIG. 8, multiplexing of the red and NIR illumination would be unnecessary. However, with a 3-layer sensor, the red and NIR illumination would still need to be multiplexed, as described above for a 3-sensor conventional camera. An appropriate barrier filter to block the NIR excitation light would also be required for fluorescence imaging applications.


While the invention has been illustrated and described in connection with currently preferred embodiments shown and described in detail, it is not intended to be limited to the details shown since various modifications and structural changes may be made without departing in any way from the spirit and scope of the present invention. The embodiments were chosen and described in order to explain the principles of the invention and practical application to thereby enable a person skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.


What is claimed as new and desired to be protected by Letters Patent is set forth in the appended claims and includes equivalents of the elements recited therein:

Claims
  • 1. A medical imaging system for acquisition of NIR images and full-color images comprising: a light source configured to provide visible light and NIR excitation light to a sample area; and a camera having an image sensor, the image sensor comprising: a barrier filter to block NIR excitation light, and sensor pixels arranged in a stacked array, the sensor pixels including: first sensor pixels located at a first depth in the image sensor, the first sensor pixels configured to detect blue reflectance light, second sensor pixels located at a second depth in the image sensor that is different from the first depth, the second sensor pixels configured to detect green reflectance light, third sensor pixels located at a third depth in the image sensor that is different from the first and second depths, the third sensor pixels configured to detect red reflectance light, and fourth sensor pixels located at a fourth depth in the image sensor that is different from the first, second, and third depths, the fourth sensor pixels configured to detect NIR fluorescence light received from the sample area.
  • 2. The imaging system of claim 1, wherein the image sensor is a CMOS sensor.
  • 3. The imaging system of claim 1, wherein the system is configured to generate NIR images and full-color images of the sample area.
  • 4. The imaging system of claim 1, wherein the visible light provided by the light source comprises blue illumination light, green illumination light, and red illumination light, the blue illumination light being reflected from the tissue as blue reflectance light, the green illumination light being reflected from the tissue as green reflectance light, and the red illumination light being reflected from the tissue as red reflectance light.
  • 5. The imaging system of claim 4, comprising a controller in signal communication with the light source and the camera, the controller being configured to: control the light source to illuminate the area under observation with the blue illumination light continuously and illuminate the area under observation with the red illumination light and the NIR illumination light, wherein at least one of the red illumination light and NIR illumination light is switched on and off periodically according to a predetermined timing scheme; simultaneously acquire a first image signal corresponding to the blue illumination light, and a second image signal corresponding to the red illumination light and the NIR illumination light; and determine the red reflectance light and detected NIR light from the second image signal, based on the predetermined timing scheme.
  • 6. The imaging system of claim 5, wherein the predetermined timing scheme includes alternating the red illumination light and NIR illumination light.
  • 7. The imaging system of claim 1, wherein the light source comprises an illuminator configured to emit a substantially constant intensity of visible light and NIR light over a continuous spectral range, and a plurality of filters disposed between the illuminator and the area under observation for transmitting temporally continuous blue light and temporally discontinuous red light and discontinuous NIR light.
  • 8. The imaging system of claim 1, wherein the light source comprises one or more solid state sources.
  • 9. The imaging system of claim 1, wherein the blue, green, and red illumination light are produced by blue, green, and red LEDs, respectively.
  • 10. The imaging system of claim 1, wherein the imaging system is configured as an endoscope.
  • 11. The imaging system of claim 1, wherein the NIR light detected by the camera is fluorescent light.
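For illustration only (not part of the claimed subject matter): claims 5 and 6 recite a controller that alternates red illumination and NIR excitation according to a predetermined timing scheme, then separates the red reflectance and NIR components from the single combined image signal based on that scheme. A minimal sketch of such demultiplexing is shown below; the frame-parity convention, channel names, and hold-last-value reconstruction are assumptions chosen for the example, not details drawn from the patent.

```python
def demultiplex(frames, red_on_even=True):
    """Split a combined red/NIR frame sequence into red and NIR streams.

    frames: per-frame intensities from the shared red/NIR sensor pixels.
    The light source alternates red illumination and NIR excitation each
    frame, so even-indexed frames carry red reflectance and odd-indexed
    frames carry NIR fluorescence (or vice versa, per red_on_even).
    Missing samples are filled by holding the most recent value so the
    two output streams stay temporally aligned, frame for frame.
    """
    red, nir = [], []
    last_red = last_nir = 0
    for i, value in enumerate(frames):
        is_red_frame = (i % 2 == 0) == red_on_even
        if is_red_frame:
            last_red = value   # red LED on, NIR excitation off
        else:
            last_nir = value   # NIR excitation on, red LED off
        red.append(last_red)
        nir.append(last_nir)
    return red, nir

# Example: alternating captures (red, nir, red, nir, ...)
combined = [10, 3, 12, 4, 11, 5]
red, nir = demultiplex(combined)
print(red)  # [10, 10, 12, 12, 11, 11]
print(nir)  # [0, 3, 3, 4, 4, 5]
```

Because the blue (and green) channels are illuminated continuously, only the shared red/NIR channel needs this time-division separation, which is what allows a single stacked sensor to deliver both a full-color image and an NIR image.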
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 14/873,842, filed Oct. 2, 2015, which is a continuation of U.S. application Ser. No. 12/933,512, filed Nov. 24, 2010, now U.S. Pat. No. 9,173,554, which is the U.S. national phase application of PCT/US2009/037506, having an international filing date of Mar. 18, 2009, which claims the benefit of U.S. Provisional Application No. 61/037,514, filed Mar. 18, 2008, each of which is incorporated herein by reference in its entirety.

US Referenced Citations (399)
Number Name Date Kind
1290744 Hollander Jan 1919 A
2453336 Orser Nov 1948 A
2857523 Corso Oct 1958 A
3215029 Woodcock Nov 1965 A
3582178 Boughton et al. Jun 1971 A
3671098 Rotter Jun 1972 A
3749494 Hodges Jul 1973 A
3790248 Kellow Feb 1974 A
3931593 Marshall Jan 1976 A
3970373 Pledger Jul 1976 A
3971068 Gerhardt et al. Jul 1976 A
4037866 Price Jul 1977 A
4066330 Jones Jan 1978 A
4115812 Akatsu Sep 1978 A
4149190 Wessler et al. Apr 1979 A
4158504 de Ponteves et al. Jun 1979 A
4200801 Schuresko Apr 1980 A
4260217 Traeger et al. Apr 1981 A
4318395 Tawara Mar 1982 A
4355325 Nakamura et al. Oct 1982 A
4378571 Handy Mar 1983 A
4449535 Renault May 1984 A
4471766 Terayama Sep 1984 A
4532918 Wheeler Aug 1985 A
4556057 Hiruma et al. Dec 1985 A
4575632 Lange Mar 1986 A
4597630 Brandstetter et al. Jul 1986 A
4611888 Prenovitz et al. Sep 1986 A
4638365 Kato Jan 1987 A
4656508 Yokota Apr 1987 A
4660982 Okada Apr 1987 A
4688905 Okamura Aug 1987 A
4717952 Kohayakawa et al. Jan 1988 A
4742388 Cooper et al. May 1988 A
4768513 Suzuki Sep 1988 A
4786813 Svanberg et al. Nov 1988 A
4799104 Hosoya et al. Jan 1989 A
4806005 Schneider et al. Feb 1989 A
4821117 Sekiguchi Apr 1989 A
4837625 Douziech et al. Jun 1989 A
4852985 Fujihara et al. Aug 1989 A
4856495 Tohjoh et al. Aug 1989 A
4885634 Yabe Dec 1989 A
4895145 Joffe et al. Jan 1990 A
4930516 Alfano et al. Jun 1990 A
4930883 Salzman Jun 1990 A
4951135 Sasagawa et al. Aug 1990 A
4953539 Nakamura et al. Sep 1990 A
4954897 Ejima et al. Sep 1990 A
4974936 Ams et al. Dec 1990 A
5001556 Nakamura et al. Mar 1991 A
5007408 Ieoka Apr 1991 A
5028128 Onuki Jul 1991 A
5034888 Uehara et al. Jul 1991 A
5041852 Misawa et al. Aug 1991 A
5115308 Onuki May 1992 A
5121220 Nakamoto Jun 1992 A
5128803 Sprafke Jul 1992 A
5132837 Kitajima Jul 1992 A
5134662 Bacus et al. Jul 1992 A
5159398 Maekewa et al. Oct 1992 A
5165079 Schulz-Hennig Nov 1992 A
5205280 Dennison, Jr. et al. Apr 1993 A
5208651 Buican May 1993 A
5214503 Chiu et al. May 1993 A
5225883 Carter et al. Jul 1993 A
5255087 Nakamura et al. Oct 1993 A
5278642 Danna et al. Jan 1994 A
5282082 Espie et al. Jan 1994 A
5295017 Brown Mar 1994 A
RE34622 Ledley May 1994 E
5365057 Morley et al. Nov 1994 A
5371355 Wodecki Dec 1994 A
5377686 O'Rourke et al. Jan 1995 A
5379756 Pileski et al. Jan 1995 A
5408263 Kikuchi et al. Apr 1995 A
5410363 Capen et al. Apr 1995 A
5419323 Kittrell et al. May 1995 A
5420628 Poulsen et al. May 1995 A
5421337 Richards-Kortum et al. Jun 1995 A
5424841 Van Gelder et al. Jun 1995 A
5426530 Copenhaver et al. Jun 1995 A
5430476 Häfele et al. Jul 1995 A
5481401 Kita et al. Jan 1996 A
5485203 Nakamura et al. Jan 1996 A
5490015 Umeyama et al. Feb 1996 A
5507287 Palcic et al. Apr 1996 A
5515449 Tsuruoka et al. May 1996 A
5535052 Jörgens Jul 1996 A
5536236 Yabe et al. Jul 1996 A
5557451 Copenhaver et al. Sep 1996 A
5585846 Kim Dec 1996 A
5590660 MacAulay et al. Jan 1997 A
5596654 Tanaka Jan 1997 A
5646680 Yajima Jul 1997 A
5647368 Zeng et al. Jul 1997 A
5647840 D'Amelio et al. Jul 1997 A
5667472 Finn et al. Sep 1997 A
5677724 Takizawa et al. Oct 1997 A
5682567 Spruck et al. Oct 1997 A
5689354 Orino Nov 1997 A
5695049 Bauman Dec 1997 A
5697373 Richards-Kortum et al. Dec 1997 A
5713364 DeBaryshe et al. Feb 1998 A
5729382 Morita et al. Mar 1998 A
5749830 Kaneko et al. May 1998 A
5769792 Palcic et al. Jun 1998 A
5772355 Ross et al. Jun 1998 A
5772580 Utsui et al. Jun 1998 A
5827190 Palcic et al. Oct 1998 A
5833617 Hayashi Nov 1998 A
5838001 Minakuchi et al. Nov 1998 A
5840017 Furuswaba et al. Nov 1998 A
5852498 Youvan et al. Dec 1998 A
5891016 Utsui et al. Apr 1999 A
5897269 Ross et al. Apr 1999 A
5971918 Zanger Oct 1999 A
5973315 Saldana et al. Oct 1999 A
5984861 Crowley Nov 1999 A
5986271 Lazarev et al. Nov 1999 A
5986642 Ueda et al. Nov 1999 A
5990996 Sharp Nov 1999 A
5999240 Sharp et al. Dec 1999 A
6002137 Hayashi Dec 1999 A
6004263 Nakaichi et al. Dec 1999 A
6008889 Zeng et al. Dec 1999 A
6021344 Lui et al. Feb 2000 A
6028622 Suzuki Feb 2000 A
6030339 Tatsuno et al. Feb 2000 A
6059719 Yamamoto et al. May 2000 A
6059720 Furusawa et al. May 2000 A
6061591 Freitag et al. May 2000 A
6069689 Zeng et al. May 2000 A
6070096 Hayashi May 2000 A
6095982 Richards-Kortum et al. Aug 2000 A
6099466 Sano et al. Aug 2000 A
6110106 MacKinnon et al. Aug 2000 A
6120435 Eino Sep 2000 A
6147705 Krauter et al. Nov 2000 A
6148227 Wagnières et al. Nov 2000 A
6161035 Furusawa Dec 2000 A
6181414 Raz et al. Jan 2001 B1
6192267 Scherninski et al. Feb 2001 B1
6212425 Irion et al. Apr 2001 B1
6226126 Conemac May 2001 B1
6258576 Richards-Kortum et al. Jul 2001 B1
D446524 Bontly et al. Aug 2001 S
6280378 Kazuhiro et al. Aug 2001 B1
6293911 Imaizumi et al. Sep 2001 B1
6315712 Rovegno Nov 2001 B1
6332092 Deckert et al. Dec 2001 B1
6364829 Fulghum Apr 2002 B1
6364831 Crowley Apr 2002 B1
D456809 Schieffers May 2002 S
6419628 Rudischhauser et al. Jul 2002 B1
6422994 Kaneko et al. Jul 2002 B1
6462770 Cline et al. Oct 2002 B1
6510338 Irion et al. Jan 2003 B1
6526213 Ilenda et al. Feb 2003 B1
6529239 Dyck et al. Mar 2003 B1
6529768 Hakamata Mar 2003 B1
6537211 Wang et al. Mar 2003 B1
6544102 Schäfer et al. Apr 2003 B2
6571119 Hayashi May 2003 B2
6596996 Stone et al. Jul 2003 B1
6603552 Cline et al. Aug 2003 B1
6639664 Haan et al. Oct 2003 B2
6652452 Seifert et al. Nov 2003 B1
6750971 Overbeck et al. Jun 2004 B2
6772003 Kaneko et al. Aug 2004 B2
6773392 Kikuchi et al. Aug 2004 B2
6786865 Dhindsa Sep 2004 B2
6821245 Cline et al. Nov 2004 B2
6826424 Zeng et al. Nov 2004 B1
6898458 Zeng et al. May 2005 B2
6899675 Cline et al. May 2005 B2
6922583 Perelman et al. Jul 2005 B1
6958862 Joseph Oct 2005 B1
6960165 Ueno et al. Nov 2005 B2
7043291 Sendai May 2006 B2
D524985 Lukan et al. Jul 2006 S
D524987 Lukan et al. Jul 2006 S
7150552 Weidel Dec 2006 B2
7179222 Imaizumi et al. Feb 2007 B2
7235045 Wang et al. Jun 2007 B2
7236815 Richards-Kortum et al. Jun 2007 B2
7253894 Zeng et al. Aug 2007 B2
7324674 Ozawa et al. Jan 2008 B2
7333270 Pochapsky et al. Feb 2008 B1
7341557 Cline et al. Mar 2008 B2
7385772 Forkey et al. Jun 2008 B2
7420151 Fengler et al. Sep 2008 B2
7479990 Imaizumi et al. Jan 2009 B2
D599799 Di Bari et al. Sep 2009 S
D603408 Fitch Nov 2009 S
D606544 Di Bari et al. Dec 2009 S
7697975 Zeng Apr 2010 B2
7704206 Suzuki et al. Apr 2010 B2
7722534 Cline et al. May 2010 B2
7777191 Olcott et al. Aug 2010 B2
7798955 Ishihara et al. Sep 2010 B2
7811229 Sugimoto Oct 2010 B2
7928352 Toda Apr 2011 B2
8035067 Toda Oct 2011 B2
D653811 BenZion Feb 2012 S
8140147 Maynard et al. Mar 2012 B2
8285015 Demos Oct 2012 B2
8337400 Mizuyoshi Dec 2012 B2
8361775 Flower Jan 2013 B2
D677258 Mistkawi Mar 2013 S
8408269 Fengler et al. Apr 2013 B2
8408772 Li Apr 2013 B2
D682277 Tasselli et al. May 2013 S
8448867 Liu et al. May 2013 B2
8473035 Frangioni Jun 2013 B2
8498695 Westwick et al. Jul 2013 B2
D692004 Man Oct 2013 S
8630698 Fengler et al. Jan 2014 B2
8721532 Takei et al. May 2014 B2
8736748 Takita May 2014 B2
8759243 Coffy et al. Jun 2014 B2
8773756 Tesar et al. Jul 2014 B2
8790253 Sunagawa et al. Jul 2014 B2
8830339 Velarde et al. Sep 2014 B2
D719574 Alegiani et al. Dec 2014 S
8961403 Cline et al. Feb 2015 B2
D723563 Alegiani Mar 2015 S
8979301 Moore Mar 2015 B2
D726186 Jenkins et al. Apr 2015 S
D734339 Zhou et al. Jul 2015 S
9125552 Dunki-Jacobs et al. Sep 2015 B2
9143746 Westwick et al. Sep 2015 B2
9173554 Fengler et al. Nov 2015 B2
9282305 Kikuchi Mar 2016 B2
9294691 Ooki Mar 2016 B2
9295392 Douplik et al. Mar 2016 B2
9386909 Fengler et al. Jul 2016 B2
9435496 Moore Sep 2016 B2
9577012 Ooki Feb 2017 B2
9642532 Fengler et al. May 2017 B2
D791137 Wang et al. Jul 2017 S
9814378 Moore Nov 2017 B2
D815928 Rummel et al. Apr 2018 S
D826234 Zhou et al. Aug 2018 S
D835284 Barker et al. Dec 2018 S
D835285 Barker et al. Dec 2018 S
20010016679 Futatsugi et al. Aug 2001 A1
20010028458 Xiao Oct 2001 A1
20010049473 Hayashi Dec 2001 A1
20020013937 Ostanevich et al. Jan 2002 A1
20020016533 Marchitto et al. Feb 2002 A1
20020021355 Utsui et al. Feb 2002 A1
20020035330 Cline et al. Mar 2002 A1
20020076480 Hsieh et al. Jun 2002 A1
20020138008 Tsujita et al. Sep 2002 A1
20020143243 Geordakoudi et al. Oct 2002 A1
20020148902 Schlieffers Oct 2002 A1
20020155619 Kurihara et al. Oct 2002 A1
20020156380 Feld et al. Oct 2002 A1
20020161282 Fulghum Oct 2002 A1
20020161283 Sendai Oct 2002 A1
20020161284 Tanaka Oct 2002 A1
20020168096 Hakamata et al. Nov 2002 A1
20020175993 Ueno et al. Nov 2002 A1
20020177778 Averback et al. Nov 2002 A1
20020186478 Watanabe et al. Dec 2002 A1
20020196335 Ozawa Dec 2002 A1
20030002036 Haan et al. Jan 2003 A1
20030042493 Kazakevich Mar 2003 A1
20030080193 Ryan et al. May 2003 A1
20030117491 Avni et al. Jun 2003 A1
20030135092 Cline et al. Jul 2003 A1
20030153811 Muckner Aug 2003 A1
20030191368 Wang et al. Oct 2003 A1
20030229270 Suzuki et al. Dec 2003 A1
20040006276 Demos et al. Jan 2004 A1
20040010183 Dhindsa Jan 2004 A1
20040020990 Haven et al. Feb 2004 A1
20040021859 Cunningham Feb 2004 A1
20040037454 Ozawa et al. Feb 2004 A1
20040044275 Hakamata Mar 2004 A1
20040046865 Ueno et al. Mar 2004 A1
20040133073 Berci et al. Jul 2004 A1
20040134990 Fitch et al. Jul 2004 A1
20040143162 Krattiger et al. Jul 2004 A1
20040148141 Tsujita et al. Jul 2004 A1
20040149998 Henson et al. Aug 2004 A1
20040156124 Okada Aug 2004 A1
20040186351 Imaizumi et al. Sep 2004 A1
20040218115 Kawana et al. Nov 2004 A1
20040225222 Zeng Nov 2004 A1
20040245350 Zeng Dec 2004 A1
20040263643 Imaizumi et al. Dec 2004 A1
20050027166 Matsumoto et al. Feb 2005 A1
20050096505 Imaizumi et al. May 2005 A1
20050140270 Henson et al. Jun 2005 A1
20050143627 Cline et al. Jun 2005 A1
20050154319 Cline et al. Jul 2005 A1
20050171440 Maki et al. Aug 2005 A1
20050182291 Hirata Aug 2005 A1
20050182321 Frangioni Aug 2005 A1
20050203421 Zeng et al. Sep 2005 A1
20050225656 Ihama Oct 2005 A1
20050256373 Bar-Or et al. Nov 2005 A1
20050273011 Hattery et al. Dec 2005 A1
20050280783 Yamasaki et al. Dec 2005 A1
20050288593 Geordakoudi et al. Dec 2005 A1
20060002141 Ouderkirk et al. Jan 2006 A1
20060004292 Beylin Jan 2006 A1
20060017913 Kawamata et al. Jan 2006 A1
20060089554 Ishihara et al. Apr 2006 A1
20060094109 Trainer May 2006 A1
20060146322 Komachi et al. Jul 2006 A1
20060149133 Sugimoto et al. Jul 2006 A1
20060155166 Takahashi et al. Jul 2006 A1
20060211915 Takeuchi et al. Sep 2006 A1
20060215406 Thrailkill Sep 2006 A1
20060217594 Ferguson Sep 2006 A1
20060241496 Fengler et al. Oct 2006 A1
20060247537 Matsumoto Nov 2006 A1
20060250696 McGuire Nov 2006 A1
20060258910 Stefanchik et al. Nov 2006 A1
20070041195 Chen Feb 2007 A1
20070091634 Sakurada Apr 2007 A1
20070177152 Tearney et al. Aug 2007 A1
20070203413 Frangioni Aug 2007 A1
20070213593 Nakaoka Sep 2007 A1
20070229309 Tomita et al. Oct 2007 A1
20080021274 Bayer et al. Jan 2008 A1
20080024868 Okamura Jan 2008 A1
20080027280 Fengler et al. Jan 2008 A1
20080039697 Morishita Feb 2008 A1
20080074752 Chaves et al. Mar 2008 A1
20080177140 Cline et al. Jul 2008 A1
20080208006 Farr Aug 2008 A1
20080217411 Ledwith et al. Sep 2008 A1
20080246920 Buczek et al. Oct 2008 A1
20090012361 MacKinnon et al. Jan 2009 A1
20090021739 Tsujita et al. Jan 2009 A1
20090036734 Dunki-Jacobs et al. Feb 2009 A1
20090040754 Brukilacchio et al. Feb 2009 A1
20090052185 Toriyama et al. Feb 2009 A1
20090114799 Maeda May 2009 A1
20090114803 Yamaguchi May 2009 A1
20090122135 Matsui May 2009 A1
20090122152 Yamaguchi et al. May 2009 A1
20090124854 Yamaguchi et al. May 2009 A1
20090153797 Allon et al. Jun 2009 A1
20090181339 Liang et al. Jul 2009 A1
20090201577 LaPlante et al. Aug 2009 A1
20090218405 Joseph et al. Sep 2009 A1
20090290149 Roth Nov 2009 A1
20100065641 Liu et al. Mar 2010 A1
20100087741 Douplik et al. Apr 2010 A1
20100094136 Nakaoka et al. Apr 2010 A1
20100110168 Avni et al. May 2010 A1
20100110393 Chen et al. May 2010 A1
20100121146 Sugimoto May 2010 A1
20100125164 LaBombard May 2010 A1
20100155487 Liu et al. Jun 2010 A1
20100157039 Sugai Jun 2010 A1
20100168588 Matsumoto et al. Jul 2010 A1
20100198010 Cline et al. Aug 2010 A1
20100208487 Li Aug 2010 A1
20100277817 Durell Nov 2010 A1
20100308116 Sani et al. Dec 2010 A1
20110032350 Kikuchi et al. Feb 2011 A1
20110073658 Vassura et al. Mar 2011 A1
20110235017 Iwasaki Sep 2011 A1
20110244506 Sutter et al. Oct 2011 A1
20110270092 Kang et al. Nov 2011 A1
20110290889 Tamburini et al. Dec 2011 A1
20120006897 Barkan et al. Jan 2012 A1
20120044462 Kaji Feb 2012 A1
20120150046 Watson et al. Jun 2012 A1
20120256002 O'Donnell et al. Oct 2012 A1
20120319645 O'Donnell et al. Dec 2012 A1
20130008964 Hawley et al. Jan 2013 A1
20130237762 Fengler et al. Sep 2013 A1
20140071328 Miesak Mar 2014 A1
20140078378 Demers et al. Mar 2014 A1
20140139893 Sugiyama et al. May 2014 A1
20140187967 Wood et al. Jul 2014 A1
20140194687 Fengler et al. Jul 2014 A1
20150184811 Moore Jul 2015 A1
20150230698 Cline et al. Aug 2015 A1
20150320296 Morita Nov 2015 A1
20150381909 Butte et al. Dec 2015 A1
20160041098 Hirawake et al. Feb 2016 A1
20160044253 Dainty et al. Feb 2016 A1
20160100763 Fengler et al. Apr 2016 A1
20160249019 Westwick et al. Aug 2016 A1
20160360956 Moore Dec 2016 A1
20170064257 Westwick et al. Mar 2017 A1
20170064258 Westwick et al. Mar 2017 A1
20170142314 Moore et al. May 2017 A1
20170167980 Dimitriadis et al. Jun 2017 A1
20170209050 Fengler et al. Jul 2017 A1
20170354392 Fengler et al. Dec 2017 A1
Foreign Referenced Citations (126)
Number Date Country
101726980 Jun 2010 CN
101828139 Sep 2010 CN
201974160 Sep 2011 CN
19535114 Mar 1996 DE
19608027 Sep 1996 DE
0512965 Nov 1992 EP
0672379 Sep 1995 EP
0774865 May 1997 EP
0792618 Sep 1997 EP
0671706 Jun 1999 EP
1374755 Jan 2004 EP
1883337 Feb 2008 EP
2051603 Apr 2009 EP
2859837 Apr 2015 EP
2671405 Jul 1992 FR
S60-246733 Dec 1985 JP
S61-159936 Jul 1986 JP
H01-135349 May 1989 JP
03-97439 Apr 1991 JP
03-97441 Apr 1991 JP
03-97442 Apr 1991 JP
05-115435 May 1993 JP
06-125911 May 1994 JP
H07-155285 Jun 1995 JP
H07-155286 Jun 1995 JP
H07-155290 Jun 1995 JP
H07-155291 Jun 1995 JP
H07-155292 Jun 1995 JP
H07-204156 Aug 1995 JP
H07-222712 Aug 1995 JP
H07-250804 Oct 1995 JP
H07-250812 Oct 1995 JP
H07-327913 Dec 1995 JP
H08-126605 May 1996 JP
08-140928 Jun 1996 JP
08-140929 Jun 1996 JP
H08-224208 Sep 1996 JP
H08-224209 Sep 1996 JP
H08-224210 Sep 1996 JP
H08-224240 Sep 1996 JP
H08-252218 Oct 1996 JP
H09-19408 Jan 1997 JP
09-066023 Mar 1997 JP
09-070384 Mar 1997 JP
H10-127563 May 1998 JP
H10-151104 Jun 1998 JP
10-225427 Aug 1998 JP
H10-201700 Aug 1998 JP
H10-201707 Aug 1998 JP
H10-225426 Aug 1998 JP
H10-243915 Sep 1998 JP
H10-243920 Sep 1998 JP
H10-308114 Nov 1998 JP
H10-309281 Nov 1998 JP
H10-309282 Nov 1998 JP
H10-321005 Dec 1998 JP
H10-328129 Dec 1998 JP
H11-47079 Feb 1999 JP
11-089789 Apr 1999 JP
H11-104059 Apr 1999 JP
H11-104060 Apr 1999 JP
H11-104061 Apr 1999 JP
H11-104070 Apr 1999 JP
H11-155812 Jun 1999 JP
H11-113839 Jul 1999 JP
H11-244220 Sep 1999 JP
H11-332819 Dec 1999 JP
2000-504968 Apr 2000 JP
2000-245693 Sep 2000 JP
2000-354583 Dec 2000 JP
2001-78205 Mar 2001 JP
2002-000560 Jan 2002 JP
2002-049302 Feb 2002 JP
2002-244122 Aug 2002 JP
2003-045210 Feb 2003 JP
2004-024611 Jan 2004 JP
2004-094043 Mar 2004 JP
2004-163902 Jun 2004 JP
2004-520105 Jul 2004 JP
2004-247156 Sep 2004 JP
2004-289545 Oct 2004 JP
2004-292722 Oct 2004 JP
2005-010315 Jan 2005 JP
2005-058618 Mar 2005 JP
2005-058619 Mar 2005 JP
2005-058620 Mar 2005 JP
2005-080819 Mar 2005 JP
2005-081079 Mar 2005 JP
2005-149996 Jun 2005 JP
2005-292404 Oct 2005 JP
2006-073767 Mar 2006 JP
2006-087764 Apr 2006 JP
2006-525494 Nov 2006 JP
2007-029453 Feb 2007 JP
2007-072392 Mar 2007 JP
2007-089840 Apr 2007 JP
2010-107751 May 2010 JP
2010-117442 May 2010 JP
2010-524194 Jul 2010 JP
2011-500921 Jan 2011 JP
2011-072424 Apr 2011 JP
2011-169819 Sep 2011 JP
2011-528918 Dec 2011 JP
5231625 Jul 2013 JP
5859578 Feb 2016 JP
99592 Nov 2010 RU
WO-199304648 Mar 1993 WO
WO-199413191 Jun 1994 WO
WO-199526673 Oct 1995 WO
WO-199824360 Jun 1998 WO
WO-199901749 Jan 1999 WO
WO-199953832 Oct 1999 WO
WO-200042910 Jul 2000 WO
WO-200054652 Sep 2000 WO
WO-2002007587 Jan 2002 WO
WO-200250518 Jun 2002 WO
WO-2003059159 Jul 2003 WO
WO-2003059159 Jul 2003 WO
WO-2006116847 Nov 2006 WO
WO-2007081707 Jul 2007 WO
WO-2008011722 Jan 2008 WO
WO-2008071240 Jun 2008 WO
WO-2009033021 Mar 2009 WO
WO-2013160279 Oct 2013 WO
WO-2014176375 Oct 2014 WO
WO-2016055837 Apr 2016 WO
Non-Patent Literature Citations (161)
Entry
US 6,692,429 B1, 02/2004, Imaizumi et al. (withdrawn)
R.F. Lyon & P.M. Hubel, “Eyeing the Camera: Into the Next Century”, 10 Color and Imaging Conference Final Program & Proceedings 349-355 (2002).
Australian Examination Report No. 1 dated Jun. 28, 2018 for Australian Application No. 2016351730 filed on Nov. 10, 2016, five pages.
European Decision to Grant dated Jul. 12, 2018 for EP Application No. 12754208.2 filed Oct. 4, 2013, two pages.
European Decision to Grant dated May 25, 2018 for EP Patent Application No. 13180297.7 filed Aug. 13, 2013, two pages.
Indian Office Action dated Jun. 26, 2018 for Indian Patent Application No. 8678/DELNP/2013 filed on Mar. 8, 2012, five pages.
International Preliminary Report on Patentability dated May 24, 2018 for International Application No. PCT/CA2016/051315 filed on Nov. 10, 2016, nine pages.
U.S. Non Final Office Action dated Jun. 5, 2018, for U.S. Appl. No. 14/860,687, filed Sep. 21, 2015, eighteen pages.
U.S. Non Final Office Action dated Jun. 8, 2018, for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, thirteen pages.
U.S. Non Final Office Action dated May 25, 2018, for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, eleven pages.
Australian Office Action dated May 10, 2019 for Australian Patent Application No. 2016351730 filed Nov. 10, 2016, ten pages.
Canadian Office Action dated Feb. 19, 2019 for CA Patent Application No. 2,998,920 filed Mar. 16, 2018, four pages.
Chinese Office Action dated Sep. 26, 2018 for Chinese Patent Application No. 2018092001857100, filed on Sep. 4, 2017, nineteen pages.
European Notice of Allowance dated Mar. 18, 2019 for EP Patent Application No. 09819758.5, filed on May 4, 2011, seven pages.
International Preliminary Report on Patentability dated Dec. 27, 2018 for International Patent Application No. PCT/CA2017/050734 filed on Jun. 14, 2017, six pages.
U.S. Final Office Action dated Jan. 11, 2019 for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, twelve pages.
U.S. Final Office Action dated Jan. 14, 2019 for U.S. Appl. No. 14/860,687, filed Sep. 21, 2015, sixteen pages.
U.S. Final Office Action dated Jan. 22, 2019 for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, twelve pages.
U.S. Non Final Office Action dated Apr. 3, 2019 for U.S. Appl. No. 15/416,876, filed Jan. 26, 2017, thirteen pages.
U.S. Non Final Office Action dated Aug. 15, 2018 for U.S. Appl. No. 15/348,664, filed Nov. 10, 2016, eleven pages.
U.S. Non Final Office Action dated Feb. 5, 2019 for U.S. Appl. No. 15/623,100, filed Jun. 14, 2017, ten pages.
U.S. Appl. No. 15/810,911, filed Nov. 13, 2017. (Copy not submitted herewith pursuant to the waiver of 37 C.F.R. § 1.98(a)(2)(iii) issued by the Office on Sep. 21, 2004).
U.S. Restriction Requirement dated Feb. 7, 2019 for U.S. Appl. No. 29/562,795, filed Apr. 28, 2016, seven pages.
Australian Notice of Allowance dated Jun. 26, 2019 for Patent Application No. 2016351730 filed on Nov. 10, 2016, three pages.
Brazilian Office Action dated Aug. 5, 2019, for Patent Application No. BR1120130229977, filed Mar. 8, 2012, 4 pages (including English translation).
Canadian Office Action dated Nov. 5, 2019, for Canadian Patent Application No. 3027592, filed on Jun. 14, 2017, four pages.
European Extended Search Report dated Oct. 16, 2019, for Patent Application No. 17743524.5, filed Jan. 26, 2017, 4 pages.
European Extended Search Report dated May 7, 2019, for Patent Application No. 16863277.6, filed Nov. 10, 2016, 3 pages.
Japanese Office Action dated Jul. 12, 2019, for Patent Application No. 2018-51661, filed Nov. 10, 2016, 21 pages (including English translation).
Sensitization (photography), definition from Wikipedia, original language German, 6 pages (Machine Translation).
U.S. Final Office Action dated Jul. 25, 2019 for U.S. Appl. No. 15/416,876, filed Jan. 26, 2017, 13 pages.
U.S. Non-Final Office Action dated Sep. 27, 2019, for U.S. Appl. No. 29/562,795, filed Apr. 28, 2016, 6 pages.
European Notice of Allowance dated Feb. 28, 2018 for EP Patent Application No. 12754208.2 filed Oct. 4, 2013, six pages.
European Notice of Allowance dated Mar. 6, 2018 for EP Patent Application No. 13180297.7 filed Aug. 13, 2013, seven pages.
Indian Office Action dated Jan. 31, 2018 for Indian Patent Application No. 6532/DELNP/2010 filed on Sep. 16, 2010, five pages.
Japanese Notice of Allowance dated Apr. 2, 2018 for Japanese Patent Application No. 2017-018858 filed on Feb. 3, 2017, six pages.
Alfano, R.R. et al. (Oct. 1987). “Fluorescence Spectra From Cancerous and Normal Human Breast and Lung Tissues,” IEEE Journal of Quantum Electronics QE-23(10):1806-1811.
Andersson-Engels, S. et al. (Mar. 1989). “Tissue Diagnostics Using Laser Induced Fluorescence,” Ber. Bunsenges Physical Chemistry 93(3):335-342.
Bhunchet, E. et al. (Apr. 2002). “Fluorescein Electronic Endoscopy: A Novel Method for Detection of Early Stage Gastric Cancer Not Evident to Routine Endoscopy,” Gastrointestinal Endoscopy 55(4):562-571.
Dawson, J.B. et al. (Jul. 1980). “A Theoretical and Experimental Study of Light Absorption and Scattering by In Vivo Skin,” Phys. Med. Biol. 25(4):695-709.
Georgakoudi, I et al. (2003). “Quantitative Characterization of Biological Tissue Using Optical Spectroscopy,” in Chapter 31 of Biomedical Photonics Handbook, Tuan Vo-Dinh (ed.), CRC Press, New York, thirty three pages.
Georgakoudi, I et al. (Apr. 2005). “Characterization of Dysplastic Tissue Morphology and Biochemistry in Barrett's Esophagus using Diffuse Reflectance and Light Scattering Spectroscopy,” Techniques in Gastrointestinal Endoscopy 7(2):100-105.
Hubel, P.M. et al. (2004). “Spatial Frequency Response of Color Image Sensors: Bayer Color Filters and Foveon X3,” Proceedings of SPIE 5301:402-406.
Hung, J. et al. (1991). “Autofluorescence of Normal and Malignant Bronchial Tissue,” Lasers in Surgery and Medicine 11(2):99-105.
Török, B. et al. (May 1996). “Simultane digitale Indocyaningrün—und Fluoreszeinangiographie (Simultaneous Digital ICG and Fluorescein Angiography),” Klin Monatsbl Augenheilkd 208(5):333-336, (with English Translation of the Introduction).
Canadian Examiner's Report for Registration of an Industrial Design dated Feb. 1, 2017 for Canadian Application No. 171282, filed on Oct. 27, 2016, two pages.
Chinese Notice of Allowance dated Jun. 19, 2017 for Chinese Application No. 201280022284.3, filed on Nov. 7, 2013, four pages.
Chinese Office action dated Jul. 29, 2016 for application No. 2012800222843 filed on Mar. 8, 2012, eight pages.
Chinese Office action dated Nov. 24, 2015 for application No. 2012800222843 filed on Mar. 8, 2012, sixteen pages.
Chinese Third Office Action dated Mar. 14, 2017 for Chinese Patent Application No. 201280022284.3, filed on Nov. 7, 2013, seven pages.
European Communication Pursuant to Article 94(3) EPC dated Apr. 13, 2017, filed on Oct. 4, 2013, five pages.
European Communication pursuant to Rules 70(2) and 70a(2) EPC and Reference to Rule 39(1) EPC dated Jan. 23, 2017 for European Application No. 16186321.2 filed on Aug. 30, 2016, two pages.
European Communication under Rule 71(3) EPC dated Nov. 25, 2016 for EP Application No. 08706262.6 filed on Aug. 21, 2009, eight pages.
European Decision to Grant a European Patent Pursuant to Article 97(1) EPC dated Jun. 22, 2017, for EP Application No. 08706262.6 filed on Aug. 21, 2009, two pages.
European Extended Search Report dated Jul. 17, 2014, for EP Application No. 09721252.6 filed Mar. 18, 2009; eleven pages.
European Extended Search Report dated Sep. 20, 2013, for EP Application No. 08706262.6 filed on Jan. 23, 2008, five pages.
European Invitation Pursuant to Article 94(3) and Rule 71(1) EPC dated Apr. 6, 2017, for EP Application No. 09819758.5, filed on May 4, 2011, five pages.
European Office Action dated Dec. 3, 2015, for EP Application No. 08706262.6 filed on Jan. 23, 2008; fifteen pages.
European Office Action dated Nov. 19, 2015, for EP Application No. 07 785 001.4, filed on Jul. 30, 2007, four pages.
European Office Action dated Nov. 3, 2015 for EP Patent Application No. 12754208.2 filed Oct. 4, 2013, four pages.
European Office Action dated Sep. 29, 2015, for EP Application No. 09721252.6 filed Mar. 18, 2009; five pages.
European Search Report and Written Opinion dated Dec. 21, 2016 for European Application No. 16186321.2 filed on Aug. 30, 2016, nine pages.
European Supplemental Search Report dated Jan. 24, 2012, for European Patent Application No. 07785001.4 filed on Jul. 30, 2007, seven pages.
European Supplemental Search Report dated Oct. 1, 2014 for EP Application No. 12754208.2 filed on Mar. 8, 2012, five pages.
European Supplemental Search Report dated Oct. 9, 2013, for European Patent Application No. 06721854.5, filed on May 4, 2005, six pages.
Extended European Search Report dated Jan. 24, 2012 for EP Application No. 07 785 001.4, filed on Jul. 30, 2007, seven pages.
International Preliminary Report on Patentability dated Feb. 3, 2009, for International Application No. PCT/CA2007/001335 filed on Jul. 30, 2007, five pages.
International Preliminary Report on Patentability dated Nov. 6, 2007, for International Application No. PCT/CA2006/000669, filed on Apr. 27, 2006, nine pages.
International Preliminary Report on Patentability dated Sep. 21, 2010, for International Application No. PCT/US2009/037506, filed on Mar. 18, 2009, seven pages.
International Search Report and written Opinion dated Apr. 24, 2017, for International Application No. PCT/CA2017/050083, filed on Jan. 26, 2017, seven pages.
International Search Report and Written Opinion dated Sep. 18, 2017, for International Application No. PCT/CA2017/050734, filed on Jun. 14, 2017, eight pages.
International Search Report and Written Opinion of the International Searching Authority dated Feb. 10, 2017, for International Application No. PCT/CA2016/051315 filed on Nov. 10, 2016, thirteen pages.
International Search Report dated Aug. 3, 2006, for International Application No. PCT/CA2006/000669, filed on Apr. 27, 2006, three pages.
International Search Report dated Aug. 3, 2012, for International Application No. PCT/IB2012/000601, filed on Mar. 8, 2012, three pages.
International Search Report dated Dec. 7, 2007, for International Application No. PCT/CA2007/001335, filed on Jul. 30, 2007, two pages.
International Search Report dated Jan. 21, 2002, for International Application No. PCT/US2001/022198, filed on Jul. 13, 2001, three pages.
International Search Report dated Jul. 22, 2009, for International Application No. PCT/US09/37506, filed on Mar. 18, 2009, two pages.
International Search Report dated May 13, 2008 for International Application No. PCT/CA2008/00015, filed on Jan. 8, 2008, one page.
Invitation to Pay additional Fees and, where Applicable, Protest Fee, dated Dec. 22, 2016 for International Application No. PCT/CA2016/051315, filed on Nov. 10, 2016, two pages.
Japanese Final Office Action dated Aug. 2, 2013, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, four pages.
Japanese Office Action dated Dec. 8, 2017 for Japanese Patent Application No. 2017-018858 filed on Feb. 3, 2017, six pages.
Japanese Notice of Allowance dated Jan. 5, 2017 in Japanese Patent Application No. 2015-238784, filed on Dec. 7, 2015, six pages.
Japanese Notice of Allowance dated Nov. 17, 2017, for Japanese Patent Application No. 2016-253736 filed on Dec. 27, 2016, six pages.
Japanese Notice of Allowance dated Nov. 28, 2016 for Japanese Patent Application No. 2015-245598, filed on Mar. 8, 2012, six pages.
Japanese Office Action dated Apr. 20, 2012, issued in counterpart Japanese Application No. 2011-500921, filed Mar. 18, 2009, four pages.
Japanese Office Action dated Apr. 3, 2015 in Japanese Application No. 2013-058356 filed Mar. 18, 2009, four pages.
Japanese Office Action dated Feb. 17, 2012, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, six pages.
Japanese Office Action dated Jul. 22, 2014 for Japanese Patent Application No. 2013-557187 filed Mar. 8, 2012, seven pages.
Japanese Office Action dated Mar. 9, 2015 for Japanese Patent Application No. 2013-557187, filed Mar. 8, 2012, five pages.
Japanese Office Action dated Nov. 11, 2011, for Japanese Patent Application No. 2009-521077, filed on Jul. 30, 2007, four pages.
Japanese Office Action dated Sep. 14, 2012, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, seven pages.
Japanese Office Action dated Sep. 19, 2014, for Japanese Patent Application No. 2013-246636, filed on Apr. 27, 2006, six pages.
Japanese Office Action dated Dec. 26, 2012 for Japanese Patent Application No. 2011-500921, filed on Mar. 18, 2009, two pages.
Japanese Patent Office Action dated May 26, 2014 in Japanese Patent Application No. 2013-058356, filed on Mar. 18, 2009, three pages.
Korean Decision of Refusal Action dated Aug. 30, 2016 for Korean Patent Application No. 10-2015-7033310, filed on Mar. 8, 2012, seven pages.
Korean Decision on the Trial Against Final Rejection from the Intellectual Property Tribunal (IPT) dated Sep. 25, 2017, for Korean Patent Application No. 2013-7026479, filed on Oct. 7, 2013, seventeen pages.
Korean Notice of Allowance dated Jan. 2, 2017 for Korean Patent Application No. 10-2015-7033310, filed on Nov. 20, 2015, three pages.
Korean Office Action dated Aug. 20, 2015 for Korean Patent Application No. 10-2013-7026479, filed on Mar. 8, 2012, three pages.
Korean Office Action dated Dec. 8, 2015 for Korean Patent Application No. 10-2015-7033310, filed on Mar. 8, 2012, seven pages.
Korean Office Action dated Jun. 27, 2017 for Korean Patent Application No. 2017-7008654, filed on Mar. 29, 2017, ten pages.
Korean Notice of Allowance dated Dec. 13, 2017 for Korean Patent Application No. 10-2017-7008654, filed on Mar. 29, 2017, three pages.
Russian Office Action (Decision to Grant) dated Aug. 19, 2016 for Russian Patent Application No. 2013144845/07, filed on Mar. 8, 2012, thirteen pages.
U.S. Final Office Action dated Apr. 24, 2015 for U.S. Appl. No. 12/933,512, filed Nov. 24, 2010, nineteen pages.
U.S. Final Office Action dated Aug. 10, 2017, for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, twelve pages.
U.S. Final Office Action dated Aug. 11, 2017, for U.S. Appl. No. 14/860,687, filed Sep. 21, 2015, seventeen pages.
U.S. Final Office Action dated Aug. 7, 2017, for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, eleven pages.
U.S. Final Office Action dated Feb. 27, 2017 for U.S. Appl. No. 15/247,419 filed Aug. 25, 2016, ten pages.
U.S. Final Office Action dated Jul. 23, 2008, for U.S. Appl. No. 11/122,267, filed May 4, 2005, six pages.
U.S. Final Office Action dated Jun. 18, 2015, for U.S. Appl. No. 14/154,177, filed Jan. 13, 2014, eight pages.
U.S. Final Office Action dated Jun. 5, 2014, for U.S. Appl. No. 12/761,462, filed Apr. 16, 2010, fourteen pages.
U.S. Final Office Action dated Mar. 22, 2016 for U.S. Appl. No. 14/873,842, filed Oct. 2, 2015, eighteen pages.
U.S. Final Office Action dated May 11, 2011, for U.S. Appl. No. 11/412,715, filed Apr. 26, 2006, eight pages.
U.S. Final Office Action dated May 21, 2012, for U.S. Appl. No. 11/964,330, filed Dec. 26, 2007, twelve pages.
U.S. Final Office Action dated Nov. 24, 2009, for U.S. Appl. No. 11/009,965, filed Dec. 10, 2004, fourteen pages.
U.S. Non Final Office Action dated Apr. 2, 2009, for U.S. Appl. No. 11/009,965, filed Dec. 10, 2004, thirteen pages.
U.S. Non Final Office Action dated Aug. 16, 2013, for U.S. Appl. No. 12/761,462, filed Apr. 16, 2010, ten pages.
U.S. Non Final Office Action dated Aug. 16, 2013, for U.S. Appl. No. 12/761,523, filed Apr. 16, 2010, nine pages.
U.S. Non Final Office Action dated Dec. 10, 2010, for U.S. Appl. No. 11/412,715, filed Apr. 26, 2006, ten pages.
U.S. Non Final Office Action dated Dec. 14, 2011, for U.S. Appl. No. 11/412,715, filed Apr. 26, 2006, eight pages.
U.S. Non Final Office Action dated Feb. 1, 2017, for U.S. Appl. No. 14/860,687, filed Sep. 21, 2015, sixteen pages.
U.S. Non Final Office Action dated Feb. 3, 2010, for U.S. Appl. No. 11/626,308, filed Jan. 23, 2007, eleven pages.
U.S. Non Final Office Action dated Jan. 2, 2008, for U.S. Appl. No. 11/122,267, filed May 4, 2005, five pages.
U.S. Non Final Office Action dated Jan. 20, 2016, for U.S. Appl. No. 14/629,473, filed Feb. 23, 2015, fifteen pages.
U.S. Non Final Office Action dated Jan. 26, 2017, for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, seventeen pages.
U.S. Non Final Office Action dated Jan. 27, 2017, for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, fifteen pages.
U.S. Non Final Office Action dated Jul. 17, 2003, for U.S. Appl. No. 09/905,642, filed Jul. 13, 2001, six pages.
U.S. Non Final Office Action dated Jul. 2, 2013 for U.S. Appl. No. 12/933,512, filed Nov. 24, 2010, twelve pages.
U.S. Non Final Office Action dated Jun. 1, 2007, for U.S. Appl. No. 10/899,648, filed Jul. 26, 2004, seven pages.
U.S. Non Final Office Action dated Jun. 20, 2008, for U.S. Appl. No. 11/009,398, filed Dec. 10, 2004, fifteen pages.
U.S. Non Final Office Action dated Jun. 23, 2010, for U.S. Appl. No. 11/009,965, filed Dec. 10, 2004, fifteen pages.
U.S. Non Final Office Action dated Jun. 27, 2014 for U.S. Appl. No. 13/415,561, filed Mar. 8, 2012, fourteen pages.
U.S. Non Final Office Action dated Jun. 9, 2011, for U.S. Appl. No. 11/830,323, filed Jul. 30, 2007, five pages.
U.S. Non Final Office Action dated May 18, 2004, for U.S. Appl. No. 10/050,601, filed Jan. 15, 2002, eight pages.
U.S. Non Final Office Action dated Nov. 23, 2009, for U.S. Appl. No. 11/969,974, filed Jan. 7, 2008, seven pages.
U.S. Non Final Office Action dated Nov. 5, 2014, for U.S. Appl. No. 13/930,225, filed Jun. 28, 2013, six pages.
U.S. Non Final Office Action dated Oct. 23, 2013 for U.S. Appl. No. 13/415,561, filed Mar. 8, 2012, ten pages.
U.S. Non Final Office Action dated Oct. 5, 2016 for U.S. Appl. No. 15/247,419, filed Aug. 25, 2016, eight pages.
U.S. Non Final Office Action dated Oct. 7, 2011, for U.S. Appl. No. 11/964,330, filed Dec. 26, 2007, ten pages.
U.S. Non Final Office Action dated Sep. 12, 2014, for U.S. Appl. No. 14/154,177, filed on Jan. 13, 2014, four pages.
U.S. Non Final Office Action dated Sep. 6, 2016 for U.S. Appl. No. 14/873,842, filed Oct. 2, 2015, seven pages.
U.S. Non Final Office Action with Restriction Requirement dated Mar. 4, 2011, for U.S. Appl. No. 11/830,323, filed Jul. 30, 2007, nine pages.
U.S. Notice of Allowance dated Dec. 30, 2016, for U.S. Appl. No. 14/873,842, filed Oct. 2, 2015, eleven pages.
U.S. Notice of Allowance dated Apr. 7, 2004, for U.S. Appl. No. 09/905,642, filed Jul. 13, 2001, six pages.
U.S. Notice of Allowance dated Aug. 26, 2004, for U.S. Appl. No. 10/050,601, filed Jan. 15, 2002, eight pages.
U.S. Notice of Allowance dated Aug. 6, 2015, for U.S. Appl. No. 13/853,656, filed Mar. 29, 2013, seven pages.
U.S. Notice of Allowance dated Dec. 10, 2012, for U.S. Appl. No. 11/964,330, filed Dec. 26, 2007, seven pages.
U.S. Notice of Allowance dated Feb. 25, 2010, for U.S. Appl. No. 11/969,974, filed Jan. 7, 2008, four pages.
U.S. Notice of Allowance dated Jan. 2, 2008, for U.S. Appl. No. 10/899,648, filed Jul. 26, 2004, three pages.
U.S. Notice of Allowance dated Jul. 10, 2017 for U.S. Appl. No. 15/247,419 filed Aug. 25, 2016, eight pages.
U.S. Notice of Allowance dated Jun. 25, 2015, for U.S. Appl. No. 12/933,512, filed Nov. 24, 2010, fourteen pages.
U.S. Notice of Allowance dated Mar. 22, 2013, for U.S. Appl. No. 11/964,330, filed Dec. 26, 2007, eight pages.
U.S. Notice of Allowance dated Mar. 28, 2016, for U.S. Appl. No. 13/853,656, filed Mar. 29, 2013, eight pages.
U.S. Notice of Allowance dated May 18, 2015, for U.S. Appl. No. 13/930,225, filed Jun. 28, 2013, nine pages.
U.S. Notice of Allowance dated Nov. 23, 2015, for U.S. Appl. No. 13/853,656, filed Mar. 29, 2013, seven pages.
U.S. Notice of Allowance dated Oct. 10, 2014, for U.S. Appl. No. 12/761,462, filed Apr. 16, 2010, ten pages.
U.S. Notice of Allowance dated Oct. 5, 2007, for U.S. Appl. No. 10/899,648, filed Jul. 26, 2004, six pages.
U.S. Notice of Allowance dated Sep. 10, 2013, for U.S. Appl. No. 11/412,715, filed Apr. 26, 2006, eight pages.
U.S. Notice of Allowance dated Sep. 14, 2012, for U.S. Appl. No. 11/830,323, filed Jul. 30, 2007, eight pages.
U.S. Supplemental Notice of Allowability dated Mar. 10, 2005, for U.S. Appl. No. 10/050,601, filed Jan. 15, 2002, five pages.
Written Opinion of the International Searching Authority dated Aug. 3, 2006, for International Application No. PCT/CA2006/000669, filed on Apr. 27, 2006, eight pages.
Written Opinion of the International Searching Authority dated Dec. 7, 2007, for International Application No. PCT/CA2007/001335, filed on Jul. 30, 2007, four pages.
Related Publications (1)
Number Date Country
20170273567 A1 Sep 2017 US
Provisional Applications (1)
Number Date Country
61037514 Mar 2008 US
Continuations (2)
Number Date Country
Parent 14873842 Oct 2015 US
Child 15584405 US
Parent 12933512 US
Child 14873842 US