Imaging system with a single color image sensor for simultaneous fluorescence and color video endoscopy

Abstract
An endoscopic video system and method using a camera with a single color image sensor, for example a CCD color image sensor, for fluorescence and color imaging and for simultaneously displaying the images acquired in these imaging modes at video rates in real time is disclosed. The tissue under investigation is illuminated continuously with fluorescence excitation light and is further illuminated periodically using visible light outside of the fluorescence excitation wavelength range. The illumination sources may be conventional lamps using filters and shutters, or may include light-emitting diodes mounted at the distal tip of the endoscope.
Description
BACKGROUND OF THE INVENTION

The invention is directed to methods and systems for simultaneous real-time fluorescence and color video endoscopy at close to video frame rates. The invention is also directed to high-efficiency illumination sources and to methods and systems for controlling temporal and spectral output of these light sources.


Medical endoscopy is increasingly employing specialized optical imaging techniques, such as fluorescence (i.e. autofluorescence and photodynamic) endoscopy, narrow band imaging and other techniques, for improved visualization and for the detection and diagnosis of diseases. Endoscopic imaging systems that provide specialized imaging modes typically also operate in a conventional color, or white-light, endoscopy mode. Embodiments of endoscopic imaging systems incorporating both color and fluorescence imaging modes have been disclosed, for example, in U.S. Pat. No. 6,462,770 B1, U.S. Pat. No. 6,821,245 B1, and U.S. Pat. No. 6,899,675 B2.


In conventional white-light endoscopy, hereinafter also referred to as color imaging mode, light in the visible spectral range is used to illuminate the tissue surface under observation. Light reflected by the tissue passes through a suitable lens system and is incident on an image sensor built into or attached to the endoscope. The electrical signals from the image sensor are processed into a full color video image which can be displayed on a video monitor or stored in a memory. In fluorescence endoscopy, fluorescence excitation light excites fluorophors in the tissue, which emit fluorescence light at an emission wavelength which is typically greater than the excitation wavelength. Fluorescence light from the tissue passes through a suitable lens system and is incident on the image sensor. The electrical signals from the image sensor are processed into a fluorescence video image which can be displayed on a video monitor, either separately from or together with the color video image, or stored in a memory.


The fluorescence excitation and emission wavelengths depend upon the type of fluorophors being excited. In the case of exogenously applied fluorophors, the band of excitation wavelengths may be located anywhere in the range from the ultraviolet (UV) to the near infra-red (NIR) and the emission wavelength band anywhere from the visible to the NIR. For fluorophors endogenous to tissue, the bands of excitation and emission wavelengths are more limited (excitation from the UV to the green part of the visible spectrum, emission from the blue-green to the NIR).


In a conventional fluorescence/white-light endoscopic imaging system, the system can be switched between color and fluorescence modes either automatically or with a hand- or foot-operated external switch. Both the illumination and imaging characteristics of the endoscopic imaging system may require adjustment when switching the operation of the system from one mode to the other. For example, gain adjustments and additional image processing (e.g., pixel binning, time averaging, etc.) may be required because the image signal in color imaging mode tends to be substantially greater than the image signal from endogenous (tissue) fluorescence. Although switching between imaging modes with an automated device is not difficult, additional time may be required to complete the endoscopic procedure because areas of interest are examined sequentially in each mode.


It would therefore be desirable to provide an endoscopic imaging system capable of acquiring and displaying images in both conventional color (“white-light”) and fluorescence imaging modes simultaneously. It would further be desirable to employ high-efficiency illumination sources that can be easily controlled over the spectral range of interest for endoscopy.


SUMMARY OF THE INVENTION

The invention disclosed herein describes an endoscopic video system and method using a single color image sensor for fluorescence and color imaging and for simultaneously displaying the images acquired in these imaging modes at video rates. The color imager may include a CCD color image sensor. The endoscopic video system has no moving parts.


According to one aspect of the invention, tissue is illuminated continuously with fluorescence excitation light and is further illuminated periodically using visible light outside of the fluorescence excitation wavelength range. The method furthermore utilizes an excitation light blocking filter which substantially blocks the excitation light while allowing the blue, green and red components of the illumination light to pass to the color image sensor. In one embodiment, the single color image sensor may be disposed in the tip of the endoscope, in which case the excitation light blocking filter is mounted in or on the tip of the video endoscope.


With the method of the invention, fluorescence images are acquired during a time period when only the excitation light is supplied as illumination, while color images are acquired during a time period when the combination of both excitation light and visible light outside of the excitation wavelength range is supplied as illumination. The image fields are read out from the single CCD color image sensor in an interlaced fashion and processed to produce corresponding full-frame fluorescence and white-light images. Real-time fluorescence and white-light images of the tissue are then produced by subtracting from each full-frame combined fluorescence and white-light image the corresponding fluorescence image on a pixel-by-pixel basis.
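
As a rough illustration of this pixel-by-pixel subtraction, the following Python sketch (using NumPy; the function and array names are hypothetical and not part of the disclosure) recovers a white-light frame from a combined frame and the corresponding fluorescence frame:

```python
import numpy as np

def extract_white_light(combined: np.ndarray, fluorescence: np.ndarray) -> np.ndarray:
    """Subtract the fluorescence contribution from a combined
    fluorescence/white-light frame, pixel by pixel (illustrative sketch)."""
    # Use a wider signed type so the subtraction cannot wrap around.
    diff = combined.astype(np.int32) - fluorescence.astype(np.int32)
    # Clamp residual negative values (noise) and restore 8-bit depth.
    return np.clip(diff, 0, 255).astype(np.uint8)
```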


In one embodiment, the illumination light may be switched on for one cycle and switched off for two cycles, wherein a different image field of the combined tissue fluorescence and white-light image is read out during each of the two cycles when the illumination light is switched off, and a different image field of the tissue fluorescence image is read out during each of the cycles when the illumination light is switched on. A cycle may have a duration of 1/60 second. Four full frame white-light images and two full frame fluorescence images may be generated every six cycles.
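
For concreteness, a minimal Python sketch of this one-cycle-on/two-cycles-off cadence follows; the 1/60 s cycle is taken from the text above, while the function and variable names are illustrative assumptions:

```python
def illumination_schedule(num_cycles: int):
    """Yield (cycle, illumination_on, field_parity) tuples for the pattern
    in which the illumination light is on for one cycle out of three.
    Each cycle is one field period (1/60 s); illustrative sketch only."""
    for n in range(num_cycles):
        illumination_on = (n % 3 == 0)                  # on 1 cycle, off 2
        field_parity = "even" if n % 2 == 0 else "odd"  # interlaced readout
        yield n, illumination_on, field_parity

# Example: print the six-cycle repeating pattern.
for cycle in illumination_schedule(6):
    print(cycle)
```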


The image data can be interpolated during cycles when no actual image data are available. For example, during a cycle where no full frame white-light image is produced, an interpolated full frame white-light image may be computed from two adjacent full frame white-light images. Likewise, the fluorescence signals may be interpolated between sequential fluorescence frames before being subtracted from the white-light image signals.
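
A sketch of such temporal interpolation, assuming simple linear blending between the two adjacent frames (the midpoint weighting is an assumption; the text does not prescribe a particular interpolation formula):

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                      weight: float = 0.5) -> np.ndarray:
    """Estimate a missing frame from its two temporal neighbours by
    linear interpolation; weight = 0.5 gives the midpoint."""
    blend = ((1.0 - weight) * prev_frame.astype(np.float32)
             + weight * next_frame.astype(np.float32))
    return blend.astype(prev_frame.dtype)
```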


In yet another embodiment, pixel values of adjacent rows of the CCD color image sensor are added pixel-by-pixel to form summed row pixel values and the summed values are read out in an interlaced fashion.


In one embodiment, a high-resolution video image may be generated by computing a luma image of the combined full-frame fluorescence and white-light image signals and colorizing the luma image based on a ratio of red reflectance to fluorescence signals to produce a superimposed fluorescence/color image for display. Processing an image based on the luma data enhances the attainable spatial resolution. A change in tissue pathology, as indicated by a change in the fluorescence signal from that tissue, can be represented as a change in color in the video image.


Further features and advantages of the present invention will be apparent from the following description of preferred embodiments and from the claims.





BRIEF DESCRIPTION OF THE DRAWINGS

The following figures depict certain illustrative embodiments of the invention in which like reference numerals refer to like elements. These depicted embodiments are to be understood as illustrative of the invention and not as limiting in any way.



FIG. 1 shows a schematic block diagram of an exemplary fluorescence endoscopy video system with a single distal color image sensor;



FIG. 2 shows the camera of FIG. 1 with an excitation light blocking filter;



FIG. 3 shows a schematic block diagram of a first exemplary embodiment of an illumination source according to the invention;



FIG. 4 shows a schematic block diagram of a second exemplary embodiment of an illumination source according to the invention;



FIGS. 5A-5C show a filter arrangement on a CMGY image sensor (FIG. 5A) and interlaced readout (FIGS. 5B-5C) with summing on chip;



FIGS. 6A-6C show a filter arrangement on a CMGY image sensor (FIG. 6A) and interlaced readout (FIGS. 6B-6C) without summing on chip;



FIG. 7 shows a timing diagram for excitation light and imaging light exposure;



FIG. 8 shows a timing diagram for reading fluorescence and color image information from the color sensor;



FIG. 9 shows a schematic block diagram of a process according to the invention for extracting fluorescence and color images; and



FIGS. 10A and 10B illustrate an LED assembly configured as a multi-wavelength illumination source for endoscopy.





DETAILED DESCRIPTION

In conventional white-light (color imaging) endoscopy, broadband visible light is used to illuminate the tissue under observation. Historically, endoscopes used for white light endoscopy have incorporated fiberoptic light guides to transmit light from lamps to provide this type of illumination. In fluorescence endoscopy, fluorophors in the tissue are excited by illumination with a shorter wavelength light and the resulting fluorescence emission is detected at Stokes-shifted longer wavelengths. The fluorophors may be either endogenous to the tissue (i.e., naturally present) or exogenous (e.g., dyes applied to enhance contrast for diagnostic or other imaging purposes). Since the fluorescence process tends to be rather inefficient, the intensity of the shorter wavelength excitation light is typically several orders of magnitude greater than the intensity of the resulting fluorescence emission. As such, both direct visualization and imaging of emissions from fluorophors require the use of a barrier filter that blocks transmission of the reflected shorter wavelength excitation light and prevents the excitation light from overwhelming the eye or image sensor used to observe/detect the emitted fluorescence. A certain minimum level of excitation light intensity is also required to provide the desired quality of (optical or electronic) image signal. The desired amount of excitation light will depend on the type and concentration of fluorophors to be excited, the distance to the tissue and the size of the area being visualized/imaged, the sensitivity of the eye/image sensor and similar related factors. As a result, particularly in the case of natural (i.e., endogenous) tissue fluorescence, endoscopic imaging systems operating in fluorescence mode typically employ powerful arc lamps or lasers to excite fluorophors as well as highly sensitive cameras to image fluorescence emissions from these fluorophors.



FIG. 1 is a block diagram of a fluorescence endoscopy video system 1 in accordance with one embodiment of the present invention. The system includes a multi-mode light source 12 that generates light for obtaining color and fluorescence images. The use of the light source for obtaining different kinds of images will be described in further detail below. Light from the light source 12 is supplied to an illumination guide 16 of an endoscope 10, which then illuminates a tissue sample 200 that is to be imaged.


As also shown in FIG. 1, the system includes a camera 100, for example, a solid-state camera based on a CCD or CMOS sensor chip, which in the exemplary embodiment is located at the distal or insertion end of the endoscope 10. Alternatively, although not illustrated, the camera 100 may also be positioned at another location, such as the proximal end of the endoscope 10. In the depicted embodiment, the light from the tissue is directly captured by the camera 100, and the operation of the system is similar to video endoscopes currently on the market (such as the Olympus CF-240L).


A processor/controller 14 controls the camera 100 and the light source 12, which will be described in more detail below, and produces video signals that are displayed on a video monitor 18. The processor/controller 14 communicates with the camera 100 by wire or other signal communication devices that are routed within the endoscope, such as optical fiber. Alternatively, communication between the processor/controller 14 and the camera 100 can be conducted over a wireless link. Clinically relevant information about the health of the tissue under observation may be contained in the intensity of the fluorescence emission within a specific wavelength range.


For autofluorescence endoscopy (endoscopy using endogenous fluorophors), such information is contained in the green wavelength range of the emitted fluorescence. It has been observed that green fluorescence is increasingly suppressed as the tissue becomes increasingly diseased. However, the red fluorescence signal does not vary with the disease state of the tissue and can hence be used to distinguish between intensity variations in the green fluorescence emission due to the disease state of the tissue and intensity variations due to imaging artifacts, such as shadows or geometry effects (e.g., imaging distance). A single multicolor image in which the color is indicative of the health of the examined tissue can be formed by combining the image information from a wavelength range that varies with the disease state of the tissue (green fluorescence) with the image information from a wavelength range that does not vary with the disease state (red fluorescence).



FIG. 2 shows schematically an exemplary embodiment of camera 100 with color image sensor 22 and light collimating optics 26. Positioned between the tissue 200 and color image sensor 22 is an excitation light blocking filter 24 which blocks reflected excitation light from reaching image sensor 22, while allowing imaging light and fluorescence light to pass. The advantage of this configuration is that all imaging is performed and controlled by the same imaging optics. In an alternative embodiment, the excitation light blocking filter 24 may be placed distal of the light collimating optics 26, and in some embodiments may be disposed on the outside of the distal tip of the endoscope, for example, when converting a white-light imaging endoscope into an imaging/fluorescence endoscope. An externally mounted excitation light blocking filter is described in, for example, commonly assigned U.S. application Ser. No. 11/412,715.


The white light/fluorescence video endoscopy system of the invention operates by illuminating the sample with either excitation light alone or with a combination of excitation light and illumination light in a wavelength range or in wavelength ranges outside the spectral range of the excitation spectrum. The light source for excitation light and illumination light can be, for example, an arc lamp, a solid state light emitter such as one or more diode lasers or light emitting diodes, or any other light source emitting light in a suitable wavelength range. The light source can be a single light source, wherein a portion of the light is filtered out to provide excitation light, and another portion of the light is filtered out to provide illumination light. Alternatively, different light sources can be provided for excitation light and illumination light, respectively. The illumination light is timed, either by using an external shutter 37 or, if light sources with a rapid response are used, by turning the light sources on and off.



FIG. 3 shows in more detail a first embodiment of a multi-mode light source 30 for simultaneously illuminating a tissue sample 200 with continuous fluorescence excitation light and switched illumination light. Light source 30 includes a first light source 31, for example, an arc lamp, and a collimating lens 33 for producing a high intensity, preferably collimated spectral output Ssource which includes an excitation wavelength range. A bandpass filter 34 filters out spectral components outside the excitation wavelength range Sexcitation and allows only spectral components within the excitation wavelength range Sexcitation to pass. Light source 30 further includes a second light source 32, for example, a halogen lamp, for producing a preferably collimated spectral output Sillumination with a high intensity in an imaging wavelength range covering, for example, the visible spectral range. Light source 32 may be switched with timing signals produced by processor/controller 14, for example, by placing a mechanical or electronic shutter 37 between second light source 32 and dichroic mirror 35 or by controlling the electric current supplied to light source 32. The combined collimated excitation/imaging light is focused by lens 36 onto the input face 37 of an optical fiber illumination guide 16 with a suitable numerical aperture (NA).



FIG. 4 shows a second embodiment of a multi-mode light source 40 for simultaneously illuminating a tissue sample 200 with continuous fluorescence excitation light and switched illumination light. Light source 40 includes an excitation/illumination light source 31, for example, an arc lamp, and a collimating lens 33 for producing a high intensity, preferably collimated spectral output Ssource which includes an excitation wavelength range Sexcitation. A dichroic mirror 41 reflects the spectral illumination component Sillumination and passes the excitation wavelength range Sexcitation which may be additionally narrow-band filtered by bandpass filter 34. The light component reflected off a first dichroic mirror 41 is then reflected by mirror 42, passes through a shutter 45 (mechanical or electronic) and is then further reflected by mirror 43 and reflected at second dichroic mirror 44 to become collinear with the excitation light passing through filter 34. As before, the combined collimated excitation/imaging light is focused by lens 36 onto the input face 37 of an optical fiber illumination guide 16 with a suitable numerical aperture (NA). This embodiment takes advantage of the fact that a suitable arc lamp can emit over a wavelength range which covers both the excitation light spectrum and the illumination light spectrum. The shutter 45 may be switched by timing signals produced by processor/controller 14.


Suitable filters, for example, a low-pass filter to block excitation light and/or a high-pass filter to block unswitched illumination light, may be placed along the optical paths.


In operation, when the switched light source 32 is off (or the shutter 45 is closed), only excitation light illuminates the tissue 200, for example, through the endoscope illumination guide 16. The reflected excitation light is blocked from reaching the color image sensor by the excitation light blocking filter 24, while tissue fluorescence light passes through the excitation light blocking filter 24 and reaches the color image sensor 22 for fluorescence light detection.


When the illumination light source 32 is switched on (or the shutter 45 is open), the combined light from the illumination light source 32 and the excitation light source 31 is coupled into the endoscope illumination guide 16 and illuminates the tissue 200. The reflected excitation light (and any residual light from the switched light source at that wavelength) is blocked as before by the excitation light blocking filter 24, while the combination of both tissue fluorescence and reflected illumination light (“white light”) is imaged by the color image sensor 22.



FIGS. 5A-5C show an exemplary arrangement of spectral filter elements disposed on the pixels of a CMGY image sensor (FIG. 5A) and an interlaced readout (FIGS. 5B-5C) with on-chip summing of pixels from adjacent rows. The first half-frame in the embodiment depicted in FIGS. 5A, 5B, 5C is here composed of the sum of lines 1 and 2; 3 and 4; 5 and 6; and so on, whereas the second half-frame is composed of the sum of lines 2 and 3; 4 and 5; and so on. FIG. 6A shows the same filter arrangement as in FIG. 5A, but with a different interlaced readout (FIGS. 6B-6C) without on-chip summing. The first half-frame in the embodiment depicted in FIGS. 6A, 6B, 6C is composed of lines 1; 3; 5; and so on, whereas the second half-frame is composed of the lines 2; 4; and so on.
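
The two readout modes can be modeled in a few lines of Python (NumPy; the 0-based row indices here correspond to the 1-based line numbers in the figures, an even number of sensor rows is assumed, and all names are illustrative):

```python
import numpy as np

def fields_with_summing(sensor: np.ndarray):
    """Interlaced readout with on-chip row summing: the first field sums
    line pairs (1,2), (3,4), ..., the second sums (2,3), (4,5), ..."""
    first = sensor[0::2].astype(np.int32) + sensor[1::2].astype(np.int32)
    second = sensor[1:-1:2].astype(np.int32) + sensor[2::2].astype(np.int32)
    return first, second

def fields_without_summing(sensor: np.ndarray):
    """Plain interlaced readout: lines 1, 3, 5, ... form the first field
    and lines 2, 4, 6, ... form the second field."""
    return sensor[0::2], sensor[1::2]
```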


Most video endoscopes and endoscopic video cameras currently use CCD image sensors with CMGY color filters since these tend to provide the highest quality color images.



FIG. 7 shows a timing diagram according to the invention for operating the exemplary endoscope system. As can be seen from curve (a) in the diagram, the excitation light source 31 is turned on at time T0, irradiating the tissue continuously with fluorescence excitation light. Conversely, as depicted by curve (b), the illumination light source 32 is periodically switched on and off (or shutter 37 or 45 is opened and closed) with a duty factor of 33%, i.e. the illumination light source is turned on at times T1, T4, T7, . . . (i.e., at times T1+3*n with n=0, 1, 2 . . . ) for one field period and turned off again at times T2, T5, T8, . . . , T2+3*n for two field periods, which include times T3, T6, T9, . . . , T3+3*n. In the depicted example, a field period has a duration of 1/60 s=16.7 ms.


As mentioned above, the exemplary image sensor is read out in an interlaced fashion, so that even lines and odd lines are read alternatingly, with or without summation on the chip. An image with full vertical resolution is then generated in the video processor/controller 14 by combining two sequential interlaced fields to form a full video frame for the fluorescence image and for the combined fluorescence/white-light image.
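
A simple "weave" de-interlacing sketch in Python shows how two sequential fields combine into a full vertical resolution frame (names illustrative):

```python
import numpy as np

def weave_fields(even_field: np.ndarray, odd_field: np.ndarray) -> np.ndarray:
    """Interleave an even field and an odd field into one full frame."""
    rows, cols = even_field.shape
    frame = np.empty((2 * rows, cols), dtype=even_field.dtype)
    frame[0::2] = even_field  # even lines of the full frame
    frame[1::2] = odd_field   # odd lines of the full frame
    return frame
```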



FIG. 8 describes in more detail the temporal illumination and readout pattern of the interlaced CCD image as a function of time.


Before the image acquisition begins in the depicted example at time T1, the CCD is illuminated only with fluorescence excitation light. The even fields acquired in the time interval preceding T1 contain fluorescence-only data which are read out at T1. At the same time, the illumination light is turned on, so that the CCD is now illuminated with fluorescence excitation light and illumination light between the times T1 and T2.


The illumination light is turned off at time T2, in the present example after 16.7 ms, and the image data representing “color-plus-fluorescence” are read out for the odd field at T2 and for the even field at T3. The CCD is illuminated from T2 until T4 with fluorescence light only and acquires a new fluorescence signal. It should be noted that the fluorescence signal is acquired during two field periods, whereas the added illumination light is acquired only during one field period, which provides an improved signal over other methods, where the fluorescence signal and the illumination signal are acquired with the same duty cycle.


The image signals from the color image sensor acquired alternatingly during “fluorescence-only” and “color-plus-fluorescence” measurements are supplied to processor/controller 14 which stores and processes the image signals to form the desired images for display. The processor/controller 14 may be, for example, a processor selected from the Texas Instruments C64XX family of image processors. The processing of a specific field depends on whether the field is to be used to generate a fluorescence image or a color (white light) image. The processor/controller 14 may also synchronize the operation of the switched illumination light source with the image acquisition, as described above.


The exposure and read-out scheme described above generates from the combination of odd and even fields a full frame of fluorescence image information every six field time periods. In the depicted example, each field time period is 16.7 ms. In other words, the full frame fluorescence image is completely updated every tenth of a second. During the same six (6) field periods, four fields (two even fields and two odd fields) of color image information are generated, and these even- and odd-line fields are suitably combined and processed to generate four (4) full vertical resolution color video frames. As seen in column 6 of FIG. 8, the display signal written into buffer memory still includes the fluorescence signal component, which is then subtracted to yield the color image signal. The transformation of image data from the CMGY image space to the RGB image space for display is conventional and will not be described further.


Because during six (6) field periods the image data contain only two (2) fields of color information, rather than the three (3) full video frames of a standard interlaced video signal, the image data may advantageously be interpolated between sequential data points. In this way, the image quality can be improved by providing a smooth transition between frames, so that the final color video image is perceived by the human eye as having substantially the same field update rate as a normal video signal.


Once the image signals in Column 6 of FIG. 8 are transferred to the image processor 14, the color (white-light) images and the fluorescence images are separated on a frame-by-frame basis. The color information is extracted from these frames (i.e. the contribution from the fluorescence signal is removed) by subtracting pixel-by-pixel a fluorescence signal value from each “color-plus-fluorescence” frame. Advantageously, the subtracted fluorescence signal values are interpolated from the preceding stored “fluorescence-only” frame and the “fluorescence-only” frame following the “color-plus-fluorescence” frame being processed. This causes a delay of at most two frames, in the present example 66.7 ms, between image acquisition and display.
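
Combining the interpolation and subtraction steps, a hedged Python sketch of this color extraction follows; the midpoint weighting between the preceding and following fluorescence-only frames is an assumption:

```python
import numpy as np

def extract_color_frame(combined: np.ndarray,
                        fluo_before: np.ndarray,
                        fluo_after: np.ndarray) -> np.ndarray:
    """Remove the fluorescence contribution from a combined frame using a
    fluorescence estimate interpolated from the surrounding
    fluorescence-only frames (illustrative sketch)."""
    fluo_estimate = 0.5 * (fluo_before.astype(np.float32)
                           + fluo_after.astype(np.float32))
    color = combined.astype(np.float32) - fluo_estimate
    return np.clip(color, 0, 255).astype(np.uint8)
```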


After the fluorescence contribution is subtracted, the color balance of the remaining image signals may still need to be corrected for proper white balance. This correction may be performed using conventional image processing and color-space transformation methods by using a compensation matrix or similar processing techniques, which convert the image signal from one color space to another. The processing of fluorescence image fields is somewhat less complex because the fluorescence image data do not include image data from other sources. Accordingly, fluorescence image data produced in multiple, non-overlapping spectral ranges may be processed and displayed as a real color or false color image (for example, green fluorescence from fluorescein may be displayed as green and IR fluorescence from ICG may be displayed as red), in the same fashion as white light color images are processed and displayed on a video monitor. Using this type of fluorescence imaging display for autofluorescence or endogenous tissue fluorescence imaging, areas of tissue in which the green fluorescence is suppressed due to abnormal pathology will appear red since the red fluorescence is proportionally less suppressed.
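
White balancing by a compensation matrix might look as follows in Python; the 3x3 coefficients below are placeholders, since real values would come from calibration of the particular sensor and light source:

```python
import numpy as np

# Placeholder compensation matrix; real coefficients come from calibration.
COMPENSATION_MATRIX = np.array([
    [ 1.20, -0.10, -0.10],
    [-0.05,  1.10, -0.05],
    [-0.08, -0.12,  1.20],
], dtype=np.float32)

def white_balance(rgb: np.ndarray) -> np.ndarray:
    """Apply the 3x3 compensation matrix to every pixel of an
    (H, W, 3) RGB image after the fluorescence subtraction."""
    corrected = rgb.astype(np.float32) @ COMPENSATION_MATRIX.T
    return np.clip(corrected, 0, 255).astype(np.uint8)
```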


The processor/controller circuit 14 can carry out inter-image computation for superimposing a fluorescence image and a white-light image on video monitor 18. An operator can therefore view the fluorescence image and the white-light image simultaneously, without introducing a perceptible time delay between them. Consequently, for example, the location of a lesion can be readily viewed with high precision, which is very useful for diagnosis.



FIG. 9 illustrates schematically a process flow which may be performed, for example, by processor/controller 14, to extract fluorescence images and reflectance images, and to correct image intensity and color information for improving spatial resolution and for simultaneously displaying fluorescence/reflectance images.


The depicted process assumes that the excitation light, labeled (A) in FIG. 9, is emitted in the blue/UV spectral range, for example, for exciting fluorescence in fluorescein, which is detected in the green spectral range. However, other fluorescent dyes such as ICG which has excitation/fluorescence wavelengths in the red/IR spectral range can also be used, and the present invention is not limited to particular fluorescent materials or wavelengths. Illumination light is emitted at wavelengths outside the excitation light wavelengths and is shown together with the excitation light in FIG. 9 as (B).


When the tissue is illuminated with excitation light only, e.g., during the time interval between T0 and T1 (FIG. 7), a fluorescence spectrum (C) is detected by color sensor 22. When the tissue is illuminated with excitation light plus illumination light, e.g., in the time interval between T1 and T2 (FIG. 7), a fluorescence+color image spectrum (D) is detected by color sensor 22. The fluorescence spectrum (C) is then subtracted from the fluorescence+color image spectrum (D) to produce the spectral response of the color image (E). This color image can then be displayed at 92.


Advantageously, the “luma” component of the fluorescence+color image is extracted, shown as (F). Luma refers to the brightness in an image, i.e., the non-gamma-corrected “black and white” or achromatic portion of the image. Stated differently, luma represents the achromatic image without any color, while the chroma components represent the color information. The luma component can be used for extracting more accurate spatial information from the image data.


In one embodiment, the red reflectance signal (G) is extracted from the color image frames. A ratio of fluorescence to red reflectance for spatially corresponding pixels in the fluorescence and color video frames is calculated at 94 on a pixel-by-pixel basis, and the value of that ratio is used to determine the color (chroma) of the display pixel at that same location. The color of a display pixel is assigned such that ratio values that indicate suppressed green fluorescence and abnormal pathology are rendered in a contrasting color to pixels in which the ratio values are characteristic of normal green fluorescence values indicating normal tissue pathology. Although the color (chroma) of the display pixels is based upon a ratio of fluorescence to reflectance signal for that pixel, the brightness (luma) of each display pixel may simply be taken as the brightness (luma) of the corresponding color video frame pixel. Because the color, or white-light, video fields are updated at near video rates (i.e. 4 times in a 6 field period, see FIGS. 7 and 8), the resulting fluorescence/reflectance image brightness defining the luma will also be updated at that rate. Conversely, the chroma portion of the fluorescence/reflectance image will be updated somewhat more gradually (due to the less frequent field update rate of the fluorescence image signals). However, the human eye is less sensitive to changes in color than to changes in brightness, so that the slower fluorescence field update rate will be less objectionable in the image display and can still be regarded as a real-time image. The luma image (F) can then be colored according to the chroma information derived from the red reflectance (G).
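
A compact Python sketch of this luma/chroma scheme is given below; the Rec. 601 luma weights, the single ratio threshold, and the red/green rendering are all assumptions for illustration, as the text does not fix particular coefficients:

```python
import numpy as np

def colorize_luma(white_light: np.ndarray, fluorescence: np.ndarray,
                  ratio_threshold: float = 1.0) -> np.ndarray:
    """Take display brightness (luma) from the white-light frame and
    assign chroma from the fluorescence/red-reflectance ratio."""
    r = white_light[..., 0].astype(np.float32)
    g = white_light[..., 1].astype(np.float32)
    b = white_light[..., 2].astype(np.float32)
    luma = 0.299 * r + 0.587 * g + 0.114 * b               # Rec. 601 style weights
    ratio = fluorescence.astype(np.float32) / (r + 1e-6)   # fluorescence / red
    suppressed = ratio < ratio_threshold   # suppressed fluorescence: suspect tissue
    out = np.zeros(white_light.shape, dtype=np.float32)
    out[..., 0] = np.where(suppressed, luma, 0.0)  # suspect areas rendered red
    out[..., 1] = np.where(suppressed, 0.0, luma)  # normal areas rendered green
    return np.clip(out, 0, 255).astype(np.uint8)
```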


Normalizing a fluorescence image by a red light image is advantageous, because the color of mucosa inside a human body is dominated by hemoglobin, a pigment which predominantly absorbs light with wavelengths shorter than 600 nm. The reference image used for normalization should therefore represent reflected wavelengths of 600 nm or longer. The normalized fluorescence image can then be used as an accurate representation of the intensity of actual fluorescence or the degree of accumulation of an antibody labeled, for example, by indocyanine green (ICG). Normalization of a fluorescence image is not limited to normalization relative to a red light image. Alternatively, an image formed by infrared fluorescence components may be used for the normalization.


It should be mentioned that for removing excitation light, the excitation light blocking filter 24 in FIG. 2 may be replaced by a dichroic mirror which reflects the spectral components of the excitation light.


Recent developments in solid state lighting technology have given rise to the use of solid state devices, such as light-emitting diodes (LEDs) and lasers, as sources of endoscopic illumination which may eventually replace the lamps 31 and 32 in the multimode light source 12. Since LEDs are very compact, inexpensive, reliable, and have a long lifetime (on the order of 10,000 hours or longer, depending on the drive current), incorporation of this illumination technique in endoscopic medical equipment will lead to lower cost endoscopic light sources and hence also to less expensive endoscopes.


Solid state illumination sources, in particular LEDs, with emission wavelengths ranging from the deep UV to the infrared spectral range, have recently become available. These LEDs have several advantages which make them particularly suitable for endoscopy: they can be manufactured to have a narrow, controllable spectral emission range which may be tuned to the fluorescence excitation spectra of the fluorophors; they are very efficient at converting electric input power to optical output power; they can be rapidly switched on and off; and their power output can be adjusted by varying the electric current through the device, which facilitates control and timing of the spectral output of an LED-based illumination source.


Due to their small die size, LEDs may be disposed at or incorporated in the distal tip of an endoscope. For example, as shown schematically in FIGS. 10A and 10B, several LEDs mounted on a common carrier can provide both narrow-band shorter wavelength excitation light for fluorescence endoscopy and broader visible illumination light for white-light endoscopy. FIG. 10A is a schematic top view of an illumination assembly 110 with an excitation LED die 112, e.g. a UV LED, for providing excitation light, which is surrounded by blue (CWL 470 nm), green (CWL 525 nm), amber (CWL 590 nm) and red (CWL 630 nm) LED dies 114, 115, 116, 117 that provide illumination light. The indicated wavelengths are exemplary only and not intended to limit the scope of the invention. Also indicated are bonding pads 122 to electrically connect the LEDs to external wires (not shown). In general, more than the two indicated bonding pads may be provided. Each of the LEDs may be controlled individually.


In another embodiment not shown in the drawings, a so-called “white” LED which generates illumination light covering the visible spectral range can be employed instead of separate blue, green, amber, and red LEDs. “White” LEDs convert the blue or UV radiation emitted by the LED die to visible light by downconversion with a suitable phosphor. Both types of LEDs have recently become commercially available. Advantageously, the LEDs can be lensed for efficient directional illumination of the target tissue. The excitation LED may emit light in any spectral range suitable for exciting fluorescence in a dye, such as in the blue for fluorescein and in the near IR for ICG.


It will be understood that light emitted by the illumination LEDs should not contain spectral components in a wavelength range where dye fluorescence is excited. To prevent emission at excitation light wavelengths from reaching the tissue under examination, suitable cutoff or passband filters, for example notch filters, may be placed in the optical path of the separate color LEDs or the “white-light” LEDs of illumination assembly 110.


Although LEDs convert electric energy to optical energy very efficiently, they still generate a substantial amount of heat which may cause discomfort for the patient. These LEDs may therefore have to be cooled. As shown more clearly in FIG. 10B, the LEDs may be mounted on a heat sink 118 with a coolant inlet/outlet which can be connected to an external chiller. In general, devices for cooling the LEDs may include thermoelectric coolers, liquid-cooled heat exchangers, expansion coolers, microchannel coolers, thermo-siphon heat pipes, and the like.


The excitation light blocking filter 24 for the excitation light placed in front of the sensor may be designed to prevent transmission of blue or UV light produced by the white-light LED. Alternatively or in addition, the LED itself may be covered with a filter absorbing the blue or UV light from the LED dies.


A temperature sensor may be incorporated into the heat sink 118, or mounted in close vicinity to the LED array, for the purposes of


1. monitoring and adjusting the heat sink temperature, and


2. providing a safety mechanism by which a signal can be generated to reduce or interrupt the electrical power to the LEDs in the event of a failure in the heat sink cooling system.
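
A minimal sketch of such a safety cutoff, assuming a hypothetical LED driver interface and an illustrative temperature limit (neither is specified in the disclosure):

```python
class LedDriver:
    """Hypothetical stand-in for the real LED drive electronics."""
    def set_current_amps(self, amps: float) -> None:
        print(f"LED current set to {amps:.2f} A")

MAX_HEAT_SINK_TEMP_C = 45.0  # illustrative limit, not from the disclosure

def temperature_guard(temp_c: float, driver: LedDriver,
                      normal_current_amps: float = 0.35) -> None:
    """Interrupt LED power on over-temperature, as in item 2 above;
    otherwise keep the nominal drive current."""
    if temp_c > MAX_HEAT_SINK_TEMP_C:
        driver.set_current_amps(0.0)   # safety shutoff
    else:
        driver.set_current_amps(normal_current_amps)
```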


While the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. For example, although not illustrated in the drawings, the illumination sources, such as the arc lamp or halogen lamp, may be replaced with LEDs or lasers. Accordingly, the spirit and scope of the present invention are to be limited only by the following claims.

Claims
  • 1. A method for visualizing a tissue of a subject, the method comprising: illuminating the tissue with a white light and an excitation light that excites fluorophors in the tissue, wherein the fluorophors emit fluorescence light to create a fluorescence image; continuously acquiring fluorescence and white light reflectance images of the tissue; and displaying images of the tissue generated from the continuously acquired fluorescence and white light reflectance images at video frame rates on a display device, wherein generating the displayed images comprises: receiving a fluorescence image of the tissue and a white light reflectance image of the tissue that is formed from reflectance of the illuminated white light, wherein the fluorescence and the white light reflectance images have spatially corresponding pixels; calculating, for each of the spatially corresponding pixels in the fluorescence and reflectance images on a pixel-by-pixel basis, a ratio between a fluorescence signal for each pixel in the fluorescence image and an extracted color reflectance signal from the white light reflectance image for each pixel in the reflectance image; and generating an image of the tissue, wherein each pixel in the generated image has a brightness based on brightness of its corresponding pixel in the white light reflectance image, and wherein each pixel in the generated image is assigned a color based on the calculated ratio for its corresponding pixel in the fluorescence and reflectance images, wherein the assigned colors in the generated image comprise a first color indicating a first tissue characteristic and a second color indicating a second tissue characteristic.
  • 2. The method of claim 1, wherein the first and second colors are contrasting.
  • 3. The method of claim 1, wherein the first tissue characteristic is abnormal tissue pathology and the second tissue characteristic is normal tissue pathology.
  • 4. The method of claim 1, wherein the extracted color reflectance signal for each pixel is a red reflectance signal.
  • 5. The method of claim 1, wherein the calculated ratio is the ratio of the fluorescence signal to the extracted color reflectance signal.
  • 6. The method of claim 1, wherein the calculated ratio is the ratio of the extracted color reflectance signal to the fluorescence signal.
  • 7. The method of claim 1, wherein the reflectance image and the fluorescence image have been produced from a combined reflectance and fluorescence image.
  • 8. The method of claim 1, wherein a sensor used to receive the reflectance image is also used to receive the fluorescence image.
  • 9. The method of claim 1, wherein receiving the reflectance image and the fluorescence image comprises receiving a combined reflectance and fluorescence signal.
  • 10. The method of claim 1, wherein the receiving is performed using an endoscope.
  • 11. The method of claim 1, further comprising displaying images of the tissue generated from the continuously acquired fluorescence and white light reflectance images at video frame rates in real-time.
  • 12. A system for visualizing a tissue of a subject, the system comprising: a light source that provides fluorescence excitation light to excite fluorophors in the tissue, wherein the fluorophors emit fluorescence light to create a fluorescence image, and white light reflectance illumination light; a camera that continuously acquires white light reflectance images of the tissue that are formed from reflectance of white light from the white light reflectance illumination source and fluorescence images of the tissue, wherein the reflectance and fluorescence images have spatially corresponding pixels; and a processor in communication with the camera that continuously receives the fluorescence and reflectance images; calculates, for each of the spatially corresponding pixels in the fluorescence and reflectance images on a pixel-by-pixel basis, a ratio between a fluorescence signal for each pixel in the fluorescence image and an extracted color reflectance signal based on the white light reflectance image for each pixel in the reflectance image; and generates images of the tissue at video frame rates, wherein each pixel in the generated images has a brightness based on brightness of its corresponding pixel in the white light reflectance image, and wherein each pixel of the generated images is assigned a color based on the calculated ratio for its corresponding pixel in the fluorescence and reflectance images, wherein the assigned colors in the generated images comprise a first color indicating a first tissue characteristic and a second color indicating a second tissue characteristic.
  • 13. The system of claim 12, wherein the first and second colors are contrasting.
  • 14. The system of claim 12, wherein the first tissue characteristic is abnormal tissue pathology and the second tissue characteristic is normal tissue pathology.
  • 15. The system of claim 12, further comprising a display device that simultaneously displays the generated images of the tissue.
  • 16. The system of claim 12, wherein the extracted color reflectance signal is a red reflectance signal.
  • 17. The system of claim 12, wherein the calculated ratio is the ratio of the fluorescence signal to the extracted color reflectance signal.
  • 18. The system of claim 12, wherein the calculated ratio is the ratio of the extracted color reflectance signal to the fluorescence signal.
  • 19. The system of claim 12, wherein the camera comprises a sensor that is used to acquire the reflectance images and is also used to acquire the fluorescence images.
  • 20. The system of claim 12, wherein the camera is located at an insertion end of an endoscope.
  • 21. The system of claim 12, wherein the camera is located at a proximal end of an endoscope.
  • 22. The system of claim 19, wherein the sensor comprises a CMOS sensor chip.
  • 23. The system of claim 12, wherein the light source comprises a light-emitting diode that is switched on and off.
  • 24. A method for visualizing a tissue of a subject, the method comprising: administering a fluorescent dye to the subject; illuminating the tissue with a white light and exciting the fluorescent dye in the tissue to excite fluorophors in the tissue, wherein the fluorophors emit fluorescence light to create a fluorescence image; continuously acquiring fluorescence and white light reflectance images of the tissue; and displaying images of the tissue generated from the continuously acquired fluorescence and white light reflectance images at video frame rates on a display device, wherein generating the displayed images comprises: receiving a fluorescence image of the tissue and a reflectance image of the tissue that is formed from reflectance of the illuminated white light, wherein the fluorescence and reflectance images have spatially corresponding pixels; calculating, for each of the spatially corresponding pixels in the fluorescence and reflectance images on a pixel-by-pixel basis, a ratio between a fluorescence signal for each pixel in the fluorescence image and an extracted color reflectance signal for each pixel in the reflectance image; and generating an image of the tissue, wherein each pixel in the generated image has a brightness based on brightness of its corresponding pixel in the white light reflectance image, and wherein each pixel in the generated image is assigned a color based on the calculated ratio for its corresponding pixel in the fluorescence and reflectance images, wherein the assigned colors in the generated image comprise a first color indicating a first tissue characteristic and a second color indicating a second tissue characteristic.
  • 25. The method of claim 24, wherein the first and second colors are contrasting.
  • 26. The method of claim 24, wherein the first tissue characteristic is abnormal tissue pathology and the second tissue characteristic is normal tissue pathology.
  • 27. The method of claim 24, wherein the extracted color reflectance signal for each pixel is a red reflectance signal.
  • 28. The method of claim 24, wherein the calculated ratio is the ratio of the fluorescence signal to the extracted color reflectance signal.
  • 29. The method of claim 24, wherein the calculated ratio is the ratio of the extracted color reflectance signal to the fluorescence signal.
  • 30. The method of claim 24, wherein the reflectance image and the fluorescence image have been produced from a combined reflectance and fluorescence image.
  • 31. The method of claim 24, further comprising acquiring the reflectance image and the fluorescence image.
  • 32. The method of claim 31, wherein a sensor used to acquire the reflectance image is also used to acquire the fluorescence image.
  • 33. The method of claim 31, wherein acquiring the reflectance image and the fluorescence image comprises acquiring a combined reflectance and fluorescence signal.
  • 34. The method of claim 31, wherein the acquisition is performed using an endoscope.
  • 35. The method of claim 24, further comprising generating one or more fluorescence images by interpolation.
  • 36. The method of claim 24, further comprising generating one or more reflectance images by interpolation.
  • 37. The method of claim 24, wherein the dye comprises indocyanine green (ICG).
  • 38. The method of claim 24, wherein the dye is indocyanine green (ICG).
  • 39. The method of claim 1, further comprising administering a fluorescent dye to the subject comprising indocyanine green (ICG).
  • 40. The method of claim 39, wherein the dye is indocyanine green (ICG).
RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 13/930,225, filed Jun. 28, 2013, now U.S. Pat. No. 9,143,746, which is a continuation of U.S. application Ser. No. 11/964,330, filed Dec. 26, 2007, now U.S. Pat. No. 8,498,695, which claims the benefit of U.S. Provisional Application No. 60/876,597, filed Dec. 22, 2006, and U.S. Provisional Application No. 60/908,373, filed Mar. 27, 2007, the disclosures of all of which are incorporated herein by reference as if fully set forth herein.

US Referenced Citations (404)
Number Name Date Kind
1290744 Hollander Jan 1919 A
D62892 Dinkelspiel Aug 1923 S
2453336 Orser Nov 1948 A
2857523 Corso Oct 1958 A
3215029 Woodcock Nov 1965 A
3582178 Boughton et al. Jun 1971 A
3671098 Rotter Jun 1972 A
3749494 Hodges Jul 1973 A
3790248 Kellow Feb 1974 A
3931593 Marshall Jan 1976 A
3970373 Pledger Jul 1976 A
3971068 Gerhardt et al. Jul 1976 A
4037866 Price Jul 1977 A
4066330 Jones Jan 1978 A
4115812 Akatsu Sep 1978 A
4149190 Wessler et al. Apr 1979 A
4158504 de Ponteves et al. Jun 1979 A
4200801 Schuresko Apr 1980 A
4260217 Traeger et al. Apr 1981 A
4318395 Tawara Mar 1982 A
4355325 Nakamura et al. Oct 1982 A
4378571 Handy Mar 1983 A
4449535 Renault May 1984 A
4471766 Terayama Sep 1984 A
4532918 Wheeler Aug 1985 A
4556057 Hiruma et al. Dec 1985 A
4575632 Lange Mar 1986 A
4597630 Brandstetter et al. Jul 1986 A
4611888 Prenovitz et al. Sep 1986 A
4638365 Kato Jan 1987 A
4656508 Yokota Apr 1987 A
4660982 Okada Apr 1987 A
4688905 Okamura Aug 1987 A
4717952 Kohayakawa et al. Jan 1988 A
4742388 Cooper et al. May 1988 A
4768513 Suzuki Sep 1988 A
4786813 Svanberg et al. Nov 1988 A
4799104 Hosoya et al. Jan 1989 A
4806005 Schneider et al. Feb 1989 A
4821117 Sekiguchi Apr 1989 A
4837625 Douziech et al. Jun 1989 A
4852985 Fujihara et al. Aug 1989 A
4856495 Tohjoh et al. Aug 1989 A
4885634 Yabe Dec 1989 A
4895145 Joffe Jan 1990 A
4930516 Alfano et al. Jun 1990 A
4930883 Salzman Jun 1990 A
4951135 Sasagawa et al. Aug 1990 A
4953539 Nakamura et al. Sep 1990 A
4954897 Ejima et al. Sep 1990 A
4974936 Ams et al. Dec 1990 A
5001556 Nakamura et al. Mar 1991 A
5007408 Ieoka Apr 1991 A
5028128 Onuki Jul 1991 A
5034888 Uehara et al. Jul 1991 A
5041852 Misawa et al. Aug 1991 A
5115308 Onuki May 1992 A
5121220 Nakamoto Jun 1992 A
5128803 Sprafke Jul 1992 A
5132837 Kitajima Jul 1992 A
5134662 Bacus et al. Jul 1992 A
5159398 Maekewa et al. Oct 1992 A
5165079 Schulz-Hennig Nov 1992 A
5205280 Dennison, Jr. et al. Apr 1993 A
5208651 Bulcan May 1993 A
5214503 Chiu et al. May 1993 A
5225883 Carter et al. Jul 1993 A
5255087 Nakamura et al. Oct 1993 A
5278642 Danna et al. Jan 1994 A
5282082 Espie et al. Jan 1994 A
5295017 Brown Mar 1994 A
RE34622 Ledley May 1994 E
5365057 Morley et al. Nov 1994 A
5371355 Wodecki Dec 1994 A
5377686 O'Rourke et al. Jan 1995 A
5379756 Pileski et al. Jan 1995 A
5408263 Kikuchi et al. Apr 1995 A
5410363 Capen et al. Apr 1995 A
5419323 Kittrell et al. May 1995 A
5420628 Poulsen et al. May 1995 A
5421337 Richards-Kortum et al. Jun 1995 A
5424841 Van Gelder et al. Jun 1995 A
5426530 Copenhaver et al. Jun 1995 A
5430476 Hafele et al. Jul 1995 A
D362465 Gallenmore Sep 1995 S
5481401 Kita et al. Jan 1996 A
5485203 Nakamura et al. Jan 1996 A
5490015 Umeyama et al. Feb 1996 A
5507287 Palcic et al. Apr 1996 A
5515449 Tsuruoka et al. May 1996 A
5535052 Jörgens Jul 1996 A
5536236 Yabe et al. Jul 1996 A
5557451 Copenhaver et al. Sep 1996 A
5585846 Kim Dec 1996 A
5590660 MacAulay et al. Jan 1997 A
5596654 Tanaka Jan 1997 A
5646680 Yajima Jul 1997 A
5647368 Zeng et al. Jul 1997 A
5647840 D'Amelio et al. Jul 1997 A
5667472 Finn et al. Sep 1997 A
5677724 Takizawa et al. Oct 1997 A
5682567 Spruck et al. Oct 1997 A
5689354 Orino Nov 1997 A
5695049 Bauman Dec 1997 A
5697373 Richards-Kortum et al. Dec 1997 A
5713364 DeBaryshe et al. Feb 1998 A
5729382 Morita et al. Mar 1998 A
5749830 Kaneko et al. May 1998 A
5769792 Palcic et al. Jun 1998 A
5772355 Ross et al. Jun 1998 A
5772580 Utsui et al. Jun 1998 A
5827190 Palcic et al. Oct 1998 A
5833617 Hayashi Nov 1998 A
5838001 Minakuchi et al. Nov 1998 A
5840017 Furuswaba et al. Nov 1998 A
5852498 Youvan et al. Dec 1998 A
5891016 Utsui et al. Apr 1999 A
5897269 Ross et al. Apr 1999 A
5971918 Zanger Oct 1999 A
5973315 Saldana et al. Oct 1999 A
5984861 Crowley Nov 1999 A
5986271 Lazarev et al. Nov 1999 A
5986642 Lazarev et al. Nov 1999 A
5990996 Sharp Nov 1999 A
5999240 Sharp et al. Dec 1999 A
6002137 Hayashi Dec 1999 A
6004263 Nakaichi et al. Dec 1999 A
6008889 Zeng et al. Dec 1999 A
6021344 Lui et al. Feb 2000 A
6028622 Suzuki Feb 2000 A
6030339 Tatsuno et al. Feb 2000 A
6059719 Yamamoto et al. May 2000 A
6059720 Furusawa et al. May 2000 A
6061591 Freitag et al. May 2000 A
6069689 Zeng et al. May 2000 A
6070096 Hayashi May 2000 A
6095982 Richards-Kortum et al. Aug 2000 A
6099466 Sano et al. Aug 2000 A
6110106 MacKinnon et al. Aug 2000 A
6120435 Eino Sep 2000 A
6147705 Krauter et al. Nov 2000 A
6148227 Wagnieres et al. Nov 2000 A
6161035 Furusawa Dec 2000 A
6181414 Raz et al. Jan 2001 B1
6192267 Scherninski et al. Feb 2001 B1
6212425 Irion et al. Apr 2001 B1
6226126 Conemac May 2001 B1
6258576 Richards-Kortum et al. Jul 2001 B1
D446524 Bontly et al. Aug 2001 S
6280378 Kazuhiro et al. Aug 2001 B1
6293911 Imaizumi et al. Sep 2001 B1
6315712 Rovegno Nov 2001 B1
6332092 Deckert et al. Dec 2001 B1
6364829 Fulghum Apr 2002 B1
6364831 Crowley Apr 2002 B1
D456809 Schieffers May 2002 S
6419628 Rudischhauser et al. Jul 2002 B1
6422994 Kaneko et al. Jul 2002 B1
6462770 Cline et al. Oct 2002 B1
6510338 Irion et al. Jan 2003 B1
6526213 Ilenda et al. Feb 2003 B1
6529239 Dyck et al. Mar 2003 B1
6529768 Hakamata Mar 2003 B1
6537211 Wang et al. Mar 2003 B1
6544102 Schafer et al. Apr 2003 B2
6571119 Hayashi May 2003 B2
6596996 Stone et al. Jul 2003 B1
6603552 Cline et al. Aug 2003 B1
6639664 Haan et al. Oct 2003 B2
6652452 Seifert et al. Nov 2003 B1
6750971 Overbeck et al. Jun 2004 B2
6772003 Kaneko et al. Aug 2004 B2
6773392 Kikuchi et al. Aug 2004 B2
6786865 Dhindsa Sep 2004 B2
6821245 Cline et al. Nov 2004 B2
6826424 Zeng et al. Nov 2004 B1
6898458 Zeng et al. May 2005 B2
6899675 Cline et al. May 2005 B2
6922583 Perelman et al. Jul 2005 B1
6958862 Joseph Oct 2005 B1
6960165 Ueno et al. Nov 2005 B2
7043291 Sendai May 2006 B2
D524985 Lukan et al. Jul 2006 S
D524987 Lukan et al. Jul 2006 S
7150552 Weidel Dec 2006 B2
7179222 Imaizumi et al. Feb 2007 B2
7235045 Wang et al. Jun 2007 B2
7236815 Richards-Kortum et al. Jun 2007 B2
7253894 Zeng et al. Aug 2007 B2
7324674 Ozawa et al. Jan 2008 B2
7333270 Pochapsky et al. Feb 2008 B1
7341557 Cline et al. Mar 2008 B2
7385772 Forkey et al. Jun 2008 B2
7420151 Fengler et al. Sep 2008 B2
7479990 Imaizumi et al. Jan 2009 B2
D599799 Di Bari et al. Sep 2009 S
D603408 Fitch Nov 2009 S
D606544 Di Bari et al. Dec 2009 S
7697975 Zeng Apr 2010 B2
7704206 Suzuki et al. Apr 2010 B2
7722534 Cline et al. May 2010 B2
7777191 Olcott et al. Aug 2010 B2
7798955 Ishihara et al. Sep 2010 B2
7811229 Sugimoto Oct 2010 B2
7928352 Toda Apr 2011 B2
8035067 Toda Oct 2011 B2
D653811 BenZion Feb 2012 S
8140147 Maynard et al. Mar 2012 B2
8285015 Demos Oct 2012 B2
8337400 Mizuyoshi Dec 2012 B2
8361775 Flower Jan 2013 B2
D677258 Mistkawi Mar 2013 S
8408269 Fengler et al. Apr 2013 B2
8408772 Li Apr 2013 B2
D682277 Tasselli et al. May 2013 S
8448867 Liu et al. May 2013 B2
8473035 Frangioni Jun 2013 B2
8498695 Westwick et al. Jul 2013 B2
D692004 Man Oct 2013 S
8630698 Fengler et al. Jan 2014 B2
8721532 Takei et al. May 2014 B2
8736748 Takita May 2014 B2
8759243 Coffy et al. Jun 2014 B2
8773756 Tesar et al. Jul 2014 B2
8790253 Sunagawa et al. Jul 2014 B2
8830339 Velarde et al. Sep 2014 B2
D719574 Alegiani et al. Dec 2014 S
8961403 Cline et al. Feb 2015 B2
D723563 Alegiani Mar 2015 S
8979301 Moore Mar 2015 B2
D726186 Jenkins et al. Apr 2015 S
D734339 Zhou et al. Jul 2015 S
9125552 Dunki-Jacobs et al. Sep 2015 B2
9143746 Westwick et al. Sep 2015 B2
9173554 Fengler et al. Nov 2015 B2
9282305 Kikuchi Mar 2016 B2
9294691 Ooki Mar 2016 B2
9295392 Douplik et al. Mar 2016 B2
9386909 Fengler et al. Jul 2016 B2
9407838 Butte et al. Aug 2016 B2
9435496 Moore Sep 2016 B2
9577012 Ooki Feb 2017 B2
9642532 Fengler et al. May 2017 B2
D791137 Wang et al. Jul 2017 S
9814378 Moore Nov 2017 B2
D815928 Rummel et al. Apr 2018 S
D826234 Zhou et al. Aug 2018 S
D835284 Barker et al. Dec 2018 S
D835285 Barker et al. Dec 2018 S
10356334 Moore et al. Jul 2019 B2
20010016679 Futatsugi et al. Aug 2001 A1
20010028458 Xiao Oct 2001 A1
20010049473 Hayashi Dec 2001 A1
20020013937 Ostanevich et al. Jan 2002 A1
20020016533 Marchitto et al. Feb 2002 A1
20020021355 Utsui et al. Feb 2002 A1
20020035330 Cline et al. Mar 2002 A1
20020076480 Hsieh et al. Jun 2002 A1
20020138008 Tsujita et al. Sep 2002 A1
20020143243 Geordakoudi et al. Oct 2002 A1
20020148902 Schlieffers Oct 2002 A1
20020155619 Kurihara et al. Oct 2002 A1
20020156380 Feld et al. Oct 2002 A1
20020161282 Fulghum Oct 2002 A1
20020161283 Sendai Oct 2002 A1
20020161284 Tanaka Oct 2002 A1
20020168096 Hakamata et al. Nov 2002 A1
20020175993 Ueno et al. Nov 2002 A1
20020177778 Averback et al. Nov 2002 A1
20020186478 Watanabe et al. Dec 2002 A1
20020196335 Ozawa Dec 2002 A1
20030002036 Haan et al. Jan 2003 A1
20030042493 Kazakevich Mar 2003 A1
20030063398 Abe et al. Apr 2003 A1
20030080193 Ryan et al. May 2003 A1
20030117491 Avni et al. Jun 2003 A1
20030135092 Cline Jul 2003 A1
20030153811 Muckner Aug 2003 A1
20030158470 Wolters et al. Aug 2003 A1
20030191368 Wang et al. Oct 2003 A1
20030229270 Suzuki et al. Dec 2003 A1
20040006276 Demos et al. Jan 2004 A1
20040010183 Dhindsa Jan 2004 A1
20040020990 Haven et al. Feb 2004 A1
20040021859 Cunningham Feb 2004 A1
20040037454 Ozawa et al. Feb 2004 A1
20040044275 Hakamata Mar 2004 A1
20040046865 Ueno et al. Mar 2004 A1
20040133073 Berci et al. Jul 2004 A1
20040134990 Fitch et al. Jul 2004 A1
20040143162 Krattiger et al. Jul 2004 A1
20040148141 Tsujita et al. Jul 2004 A1
20040149998 Henson et al. Aug 2004 A1
20040156124 Okada Aug 2004 A1
20040186351 Imaizumi Sep 2004 A1
20040218115 Kawana et al. Nov 2004 A1
20040225222 Zeng et al. Nov 2004 A1
20040245350 Zeng Dec 2004 A1
20040263643 Imaizumi et al. Dec 2004 A1
20050027166 Matsumoto et al. Feb 2005 A1
20050096505 Imaizumi et al. May 2005 A1
20050140270 Henson et al. Jun 2005 A1
20050143627 Cline et al. Jun 2005 A1
20050154319 Cline et al. Jul 2005 A1
20050171440 Maki et al. Aug 2005 A1
20050182291 Hirata Aug 2005 A1
20050182321 Frangioni Aug 2005 A1
20050203421 Zeng et al. Sep 2005 A1
20050256373 Bar-Or Nov 2005 A1
20050273011 Hattery et al. Dec 2005 A1
20050280783 Yamasaki et al. Dec 2005 A1
20050288593 Georgakoudi et al. Dec 2005 A1
20060002141 Ouderkirk et al. Jan 2006 A1
20060004292 Beylin Jan 2006 A1
20060017913 Kawamata et al. Jan 2006 A1
20060089554 Ishihara et al. Apr 2006 A1
20060094109 Trainer May 2006 A1
20060146322 Komachi et al. Jul 2006 A1
20060149133 Sugimoto et al. Jul 2006 A1
20060155166 Takahashi et al. Jul 2006 A1
20060211915 Takeuchi et al. Sep 2006 A1
20060215406 Thrailkill Sep 2006 A1
20060217594 Ferguson Sep 2006 A1
20060241496 Fengler et al. Oct 2006 A1
20060247537 Matsumoto Nov 2006 A1
20060250696 McGuire Nov 2006 A1
20060258910 Stefanchik et al. Nov 2006 A1
20070041195 Chen Feb 2007 A1
20070091634 Sakurada Apr 2007 A1
20070177152 Tearney et al. Aug 2007 A1
20070203413 Frangioni Aug 2007 A1
20070213593 Nakaoka Sep 2007 A1
20070229309 Tomita et al. Oct 2007 A1
20080021274 Bayer et al. Jan 2008 A1
20080024868 Okamura Jan 2008 A1
20080027280 Fengler et al. Jan 2008 A1
20080039697 Morishita Feb 2008 A1
20080074752 Chaves et al. Mar 2008 A1
20080177140 Cline et al. Jul 2008 A1
20080208006 Farr Aug 2008 A1
20080217411 Ledwith et al. Sep 2008 A1
20080246920 Buczek et al. Oct 2008 A1
20090012361 MacKinnon et al. Jan 2009 A1
20090021739 Tsujita et al. Jan 2009 A1
20090036734 Dunki-Jacobs et al. Feb 2009 A1
20090040754 Brukilacchio et al. Feb 2009 A1
20090052185 Toriyama et al. Feb 2009 A1
20090114799 Maeda May 2009 A1
20090114803 Yamaguchi May 2009 A1
20090122135 Matsui May 2009 A1
20090122152 Yamaguchi et al. May 2009 A1
20090124854 Yamaguchi et al. May 2009 A1
20090153797 Allon et al. Jun 2009 A1
20090181339 Liang et al. Jul 2009 A1
20090201577 LaPlante et al. Aug 2009 A1
20090218405 Joseph et al. Sep 2009 A1
20090290149 Roth Nov 2009 A1
20100065641 Liu et al. Mar 2010 A1
20100087741 Douplik et al. Apr 2010 A1
20100094136 Nakaoka et al. Apr 2010 A1
20100110168 Avni et al. May 2010 A1
20100110393 Chen et al. May 2010 A1
20100121146 Sugimoto May 2010 A1
20100125164 LaBombard May 2010 A1
20100155487 Liu et al. Jun 2010 A1
20100157039 Sugai Jun 2010 A1
20100168588 Matsumoto et al. Jul 2010 A1
20100198010 Cline et al. Aug 2010 A1
20100208487 Li Aug 2010 A1
20100277817 Durell Nov 2010 A1
20100308116 Sani et al. Dec 2010 A1
20110032350 Kikuchi et al. Feb 2011 A1
20110073658 Vassura et al. Mar 2011 A1
20110235017 Iwasaki Sep 2011 A1
20110244506 Sutter et al. Oct 2011 A1
20110270092 Kang et al. Nov 2011 A1
20110290889 Tamburini et al. Dec 2011 A1
20120006897 Barkan et al. Jan 2012 A1
20120044462 Kaji Feb 2012 A1
20120150046 Watson et al. Jun 2012 A1
20120256002 O'Donnell et al. Oct 2012 A1
20120319645 O'Donnell et al. Dec 2012 A1
20130008964 Hawley et al. Jan 2013 A1
20130237762 Fengler et al. Sep 2013 A1
20140071328 Miesak Mar 2014 A1
20140078378 Demers et al. Mar 2014 A1
20140139893 Sugiyama et al. May 2014 A1
20140187967 Wood et al. Jul 2014 A1
20140194687 Fengler et al. Jul 2014 A1
20150184811 Moore Jul 2015 A1
20150230698 Cline et al. Aug 2015 A1
20150320296 Morita Nov 2015 A1
20150381909 Butte et al. Dec 2015 A1
20160041098 Hirawake et al. Feb 2016 A1
20160044253 Dainty et al. Feb 2016 A1
20160100763 Fengler et al. Apr 2016 A1
20160360956 Moore Dec 2016 A1
20170064257 Westwick et al. Mar 2017 A1
20170064258 Westwick et al. Mar 2017 A1
20170142314 Moore et al. May 2017 A1
20170167980 Dimitriadis et al. Jun 2017 A1
20170209050 Fengler et al. Jul 2017 A1
20170273567 Fengler et al. Sep 2017 A1
20170354392 Fengler et al. Dec 2017 A1
Foreign Referenced Citations (128)
Number Date Country
101726980 Jun 2010 CN
101828139 Sep 2010 CN
201974160 Sep 2011 CN
195 35 114 Mar 1996 DE
196 08 027 Sep 1996 DE
0 512 965 Nov 1992 EP
0 672 379 Sep 1995 EP
0 774 865 May 1997 EP
0 792 618 Sep 1997 EP
0 671 706 Jun 1999 EP
1 374 755 Jan 2004 EP
1 883 337 Feb 2008 EP
2 051 603 Apr 2009 EP
2 859 837 Apr 2015 EP
2 671 405 Jul 1992 FR
S-60-246733 Dec 1985 JP
S-61-159936 Jul 1986 JP
H-01-135349 May 1989 JP
03-97439 Apr 1991 JP
03-97441 Apr 1991 JP
03-97442 Apr 1991 JP
05-115435 May 1993 JP
06-125911 May 1994 JP
H-07-155285 Jun 1995 JP
H-07-155286 Jun 1995 JP
H-07-155290 Jun 1995 JP
H-07-155291 Jun 1995 JP
H-07-155292 Jun 1995 JP
H-07-204156 Aug 1995 JP
H-07-222712 Aug 1995 JP
H-07-250804 Oct 1995 JP
H-07-250812 Oct 1995 JP
H-07-327913 Dec 1995 JP
H-08-126605 May 1996 JP
08-140928 Jun 1996 JP
08-140929 Jun 1996 JP
H-08-224208 Sep 1996 JP
H-08-224209 Sep 1996 JP
H-08-224210 Sep 1996 JP
H-08-224240 Sep 1996 JP
H-08-252218 Oct 1996 JP
H-09-19408 Jan 1997 JP
09-066023 Mar 1997 JP
09-070387 Mar 1997 JP
H-10-127563 May 1998 JP
H-10-151104 Jun 1998 JP
10-201707 Aug 1998 JP
10-225427 Aug 1998 JP
H-10-201700 Aug 1998 JP
H-10-225426 Aug 1998 JP
H-10-243915 Sep 1998 JP
H-10-243920 Sep 1998 JP
H-10-308114 Nov 1998 JP
H-10-309281 Nov 1998 JP
H-10-309282 Nov 1998 JP
H-10-321005 Dec 1998 JP
H-10-328129 Dec 1998 JP
11-047079 Feb 1999 JP
11-089789 Apr 1999 JP
H-11-104059 Apr 1999 JP
H-11-104060 Apr 1999 JP
H-11-104061 Apr 1999 JP
H-11-104070 Apr 1999 JP
H-11-113839 Apr 1999 JP
H-11-155812 Jun 1999 JP
H-11-244220 Sep 1999 JP
H-11-332819 Dec 1999 JP
2000-504968 Apr 2000 JP
2000-245693 Sep 2000 JP
2000-354583 Dec 2000 JP
2001-78205 Mar 2001 JP
2002-000560 Jan 2002 JP
02-049302 Feb 2002 JP
2002-244122 Aug 2002 JP
2003-045210 Feb 2003 JP
2004-024611 Jan 2004 JP
2004-094043 Mar 2004 JP
2004-163902 Jun 2004 JP
2004-520105 Jul 2004 JP
2004-247156 Sep 2004 JP
2004-289545 Oct 2004 JP
2004-292722 Oct 2004 JP
2005-010315 Jan 2005 JP
2005-058618 Mar 2005 JP
2005-058619 Mar 2005 JP
2005-058620 Mar 2005 JP
2005-080819 Mar 2005 JP
2005-081079 Mar 2005 JP
2005-149996 Jun 2005 JP
2005-292404 Oct 2005 JP
2006-003103 Jan 2006 JP
2006-073767 Mar 2006 JP
2006-087764 Apr 2006 JP
2006-525494 Nov 2006 JP
2007-029453 Feb 2007 JP
2007-072392 Mar 2007 JP
2007-089840 Apr 2007 JP
2009-259703 Nov 2009 JP
2010-107751 May 2010 JP
2010-117442 May 2010 JP
2010-524194 Jul 2010 JP
2011-500921 Jan 2011 JP
2011-072424 Apr 2011 JP
2011-169819 Sep 2011 JP
2011-528918 Dec 2011 JP
5231625 Jul 2013 JP
2014-123941 Jul 2014 JP
5859578 Feb 2016 JP
99592 Nov 2010 RU
WO-9304648 Mar 1993 WO
WO-199413191 Jun 1994 WO
WO-9526673 Oct 1995 WO
WO-9824360 Jun 1998 WO
WO-9901749 Jan 1999 WO
WO-9953832 Oct 1999 WO
WO-0042910 Jul 2000 WO
WO-0054652 Sep 2000 WO
WO-2002007587 Jan 2002 WO
WO-200250518 Jun 2002 WO
WO-03059159 Jul 2003 WO
WO-2006116847 Nov 2006 WO
WO-2007081707 Jul 2007 WO
WO-2008011722 Jan 2008 WO
WO-2008071240 Jun 2008 WO
WO-2009033021 Mar 2009 WO
WO-2013160279 Oct 2013 WO
WO-2014176375 Oct 2014 WO
WO-2016055837 Apr 2016 WO
Non-Patent Literature Citations (173)
Entry
US 6,692,429 B1, 02/2004, Imaizumi et al. (withdrawn)
Dawson, J.B. et al. (Jul. 1980). “A Theoretical and Experimental Study of Light Absorption and Scattering by in Vivo Skin,” Phys. Med. Biol. 25(4):695-709.
Georgakoudi, I. et al. (2003). “Quantitative Characterization of Biological Tissue Using Optical Spectroscopy,” in Chapter 31 of Biomedical Photonics Handbook, Tuan Vo-Dinh (ed.), CRC Press, New York, thirty-three pages.
Georgakoudi, I. et al. (Apr. 2005). “Characterization of Dysplastic Tissue Morphology and Biochemistry in Barrett's Esophagus Using Diffuse Reflectance and Light Scattering Spectroscopy,” Techniques in Gastrointestinal Endoscopy 7(2):100-105.
Török, B. et al. (May 1996). “Simultane digitale Indocyaningrün- und Fluoreszeinangiographie (Simultaneous Digital ICG and Fluorescein Angiography),” Klin Monatsbl Augenheilkd 208(5):333-336, (with English Translation of the Introduction).
Canadian Examiner's Report for Registration of an Industrial Design dated Feb. 1, 2017 for Canadian Application No. 171282, filed on Oct. 27, 2016, two pages.
Chinese Office Action dated Jul. 29, 2016 for Chinese Patent Application No. 201280022284.3, filed on Mar. 8, 2012, eight pages.
Chinese Office Action dated Nov. 24, 2015 for Chinese Patent Application No. 201280022284.3, filed on Mar. 8, 2012, sixteen pages.
Chinese Third Office Action dated Mar. 14, 2017 for Chinese Patent Application No. 201280022284.3, filed on Nov. 7, 2013, seven pages.
European Communication pursuant to Rules 70(2) and 70a(2) EPC and Reference to Rule 39(1) EPC dated Jan. 23, 2017 for European Application No. 16186321.2 filed on Aug. 30, 2016, two pages.
European Communication under Rule 71(3) EPC dated Nov. 25, 2016 for EP Application No. 08706262.6 filed on Aug. 21, 2009, eight pages.
European Extended Search Report dated Jul. 17, 2014, for EP Application No. 09721252.6 filed Mar. 18, 2009; eleven pages.
European Extended Search Report dated Sep. 20, 2013, for EP Application No. 08706262.6 filed on Jan. 23, 2008, five pages.
European Office Action dated Dec. 3, 2015, for EP Application No. 08706262.6 filed on Jan. 23, 2008; fifteen pages.
European Office Action dated Nov. 3, 2015 for EP Patent Application No. 12754208.2 filed Oct. 4, 2013, four pages.
European Office Action dated Sep. 29, 2015, for EP Application No. 09721252.6 filed Mar. 18, 2009; five pages.
European Search Report and Written Opinion dated Dec. 21, 2016 for European Application No. 16186321.2 filed on Aug. 30, 2016, nine pages.
European Supplemental Search Report dated Oct. 1, 2014 for EP Application No. 12754208.2 filed on Mar. 8, 2012, five pages.
International Preliminary Report on Patentability dated Sep. 21, 2010, for International Application No. PCT/US2009/037506, filed on Mar. 18, 2009, seven pages.
International Search Report and Written Opinion dated Apr. 24, 2017, for International Application No. PCT/CA2017/050083, filed on Jan. 26, 2017, seven pages.
International Search Report and Written Opinion of the International Searching Authority dated Feb. 10, 2017, for International Application No. PCT/CA2016/051315 filed on Nov. 10, 2016, thirteen pages.
International Search Report dated Aug. 3, 2012, for International Application No. PCT/IB2012/000601, filed on Mar. 8, 2012, three pages.
International Search Report dated Jul. 22, 2009, for International Application No. PCT/US09/37506, filed on Mar. 18, 2009, two pages.
International Search Report dated May 13, 2008 for International Application No. PCT/CA2008/00015, filed on Jan. 8, 2008, one page.
Invitation to Pay Additional Fees and, Where Applicable, Protest Fee, dated Dec. 22, 2016 for International Application No. PCT/CA2016/051315, filed on Nov. 10, 2016, two pages.
Japanese Notice of Allowance dated Jan. 5, 2017 in Japanese Patent Application No. 2015-238784, filed on Dec. 7, 2015, six pages.
Japanese Notice of Allowance dated Nov. 28, 2016 for Japanese Patent Application No. 2015-245598, filed on Mar. 8, 2012, six pages.
Japanese Office Action dated Apr. 20, 2012, issued in counterpart Japanese Application No. 2011-500921, filed Mar. 18, 2009, four pages.
Japanese Office Action dated Apr. 3, 2015 in Japanese Application No. 2013-058356, filed Mar. 18, 2009, four pages.
Japanese Office Action dated Jul. 22, 2014 for Japanese Patent Application No. 2013-557187 filed Mar. 8, 2012, seven pages.
Japanese Office Action dated Mar. 9, 2015 for Japanese Patent Application No. 2013-557187, filed Mar. 8, 2012, five pages.
Japanese Office Action dated Dec. 26, 2012 for Japanese Patent Application No. 2011-500921, filed on Mar. 18, 2009, two pages.
Japanese Office Action dated May 26, 2014 in Japanese Patent Application No. 2013-058356, filed on Mar. 18, 2009, with Concise Explanation of the Relevance, three pages.
Korean Decision of Refusal dated Aug. 30, 2016 for Korean Patent Application No. 10-2015-7033310, filed on Mar. 8, 2012, seven pages.
Korean Notice of Allowance dated Jan. 2, 2017 for Korean Application No. 10-2015-7033310, filed on Nov. 20, 2015, three pages.
Korean Office Action dated Aug. 20, 2015 for Korean Patent Application No. 2013-7026479, filed on Mar. 8, 2012, three pages.
Korean Office Action dated Dec. 8, 2015 for Korean Patent Application No. 2015-7033310, filed on Mar. 8, 2012, seven pages.
Russian Office Action (Decision to Grant) dated Aug. 19, 2016 for Russian Patent Application No. 2013144845/07, filed on Mar. 8, 2012, thirteen pages.
U.S. Final Office Action dated Apr. 24, 2015 for U.S. Appl. No. 12/933,512, filed Nov. 24, 2010, nineteen pages.
U.S. Final Office Action dated Feb. 27, 2017 for U.S. Appl. No. 15/247,419, filed Aug. 25, 2016, ten pages.
U.S. Final Office Action dated Mar. 22, 2016 for U.S. Appl. No. 14/873,842, filed Oct. 2, 2015, eighteen pages.
U.S. Non-Final Office Action dated Feb. 3, 2010, for U.S. Appl. No. 11/626,308, filed Jan. 23, 2007, eleven pages.
U.S. Non-Final Office Action dated Jan. 26, 2017, for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, seventeen pages.
U.S. Non-Final Office Action dated Jan. 27, 2017, for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, fifteen pages.
U.S. Non-Final Office Action dated Jul. 2, 2013 for U.S. Appl. No. 12/933,512, filed Nov. 24, 2010, twelve pages.
U.S. Non-Final Office Action dated Jun. 27, 2014 for U.S. Appl. No. 13/415,561, filed Mar. 8, 2012, fourteen pages.
U.S. Non-Final Office Action dated Oct. 23, 2013 for U.S. Appl. No. 13/415,561, filed Mar. 8, 2012, ten pages.
U.S. Non-Final Office Action dated Oct. 5, 2016 for U.S. Appl. No. 15/247,419, filed Aug. 25, 2016, eight pages.
U.S. Non-Final Office Action dated Sep. 6, 2016 for U.S. Appl. No. 14/873,842, filed Oct. 2, 2015, seven pages.
U.S. Notice of Allowance dated Dec. 30, 2016, for U.S. Appl. No. 14/873,842, filed Oct. 2, 2015, eleven pages.
U.S. Notice of Allowance dated Jun. 25, 2015, for U.S. Appl. No. 12/933,512, filed Nov. 24, 2010, fourteen pages.
U.S. Appl. No. 15/416,876, filed Jan. 26, 2017, titled “Configurable Platform.”
U.S. Appl. No. 15/584,405, filed May 2, 2017, titled “Imaging System for Combined Full-Color Reflectance and Near-Infrared Imaging.”
Design U.S. Appl. No. 29/562,795, filed Apr. 28, 2016, titled “Device for Illumination and Imaging of a Target.”
Alfano, R.R. et al. (1987). “Fluorescence Spectra From Cancerous and Normal Human Breast and Lung Tissues,” IEEE Journal of Quantum Electronics QE-23(10):1806-1811.
Andersson-Engels, S. et al. (1989). “Tissue Diagnostics Using Laser Induced Fluorescence,” Ber. Bunsenges Physical Chemistry 93:335-342.
Bhunchet, E. et al. (2002). “Fluorescein Electronic Endoscopy: A Novel Method for Detection of Early Stage Gastric Cancer Not Evident to Routine Endoscopy,” Gastrointestinal Endoscopy 55(4):562-571.
European Office Action dated Nov. 19, 2015, for EP Application No. 07 785 001.4, filed on Jul. 30, 2007, four pages.
Extended European Search Report dated Jan. 24, 2012 for EP Application No. 07 785 001.4, filed on Jul. 30, 2007, seven pages.
Hung, J. et al. (1991). “Autofluorescence of Normal and Malignant Bronchial Tissue,” Lasers in Surgery and Medicine 11:99-105.
International Preliminary Report on Patentability dated Nov. 6, 2007, for International Application No. PCT/CA2006/000669, nine pages.
International Preliminary Report on Patentability dated Feb. 3, 2009, for International Application No. PCT/CA2007/001335, five pages.
International Search Report dated Aug. 3, 2006, for International Application No. PCT/CA2006/000669, three pages.
International Search Report dated Dec. 7, 2007, for International Application No. PCT/CA2007/001335, two pages.
International Search Report dated Jan. 21, 2002, for International Application No. PCT/US2001/022198, filed on Jul. 13, 2001, three pages.
Japanese Office Action dated Nov. 11, 2011, for Japanese Patent Application No. 2009-521077, filed on Jul. 30, 2007, four pages.
Japanese Office Action dated Feb. 17, 2012, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, six pages.
Japanese Office Action dated Sep. 14, 2012, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, seven pages.
Japanese Final Office Action dated Aug. 2, 2013, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, four pages.
Japanese Office Action dated Sep. 19, 2014, for Japanese Patent Application No. 2013-246636, filed on Apr. 27, 2006, six pages.
Supplemental European Search Report dated Jan. 24, 2012, for European Patent Application No. 07785001.4, six pages.
Supplemental European Search Report dated Oct. 9, 2013, for European Patent Application No. 06721854.5, six pages.
Written Opinion of the International Searching Authority dated Aug. 3, 2006, for International Application No. PCT/CA2006/000669, eight pages.
Written Opinion of the International Searching Authority dated Dec. 7, 2007, for International Application No. PCT/CA2007/001335, four pages.
Non-Final Office Action with Restriction Requirement dated Mar. 4, 2011, for U.S. Appl. No. 11/830,323, nine pages.
Non-Final Office Action dated Jun. 9, 2011, for U.S. Appl. No. 11/830,323, five pages.
Notice of Allowance dated Sep. 14, 2012, for U.S. Appl. No. 11/830,323, eight pages.
Notice of Allowance dated Aug. 6, 2015, for U.S. Appl. No. 13/853,656, seven pages.
Notice of Allowance dated Nov. 23, 2015, for U.S. Appl. No. 13/853,656, seven pages.
Notice of Allowance dated Mar. 28, 2016, for U.S. Appl. No. 13/853,656, eight pages.
Non-Final Office Action dated May 18, 2004, for U.S. Appl. No. 10/050,601, eight pages.
Notice of Allowance dated Mar. 10, 2005, for U.S. Appl. No. 10/050,601, five pages.
Notice of Allowance dated Aug. 26, 2004, for U.S. Appl. No. 10/050,601, eight pages.
Non-Final Office Action dated Apr. 2, 2009, for U.S. Appl. No. 11/009,965, thirteen pages.
Final Office Action dated Nov. 24, 2009, for U.S. Appl. No. 11/009,965, fourteen pages.
Non-Final Office Action dated Jun. 23, 2010, for U.S. Appl. No. 11/009,965, fifteen pages.
Non-Final Office Action dated Sep. 12, 2014, for U.S. Appl. No. 14/154,177, four pages.
Final Office Action dated Jun. 18, 2015, for U.S. Appl. No. 14/154,177, eight pages.
Non-Final Office Action dated Jun. 20, 2008, for U.S. Appl. No. 11/009,398, fifteen pages.
Non-Final Office Action dated Jan. 2, 2008, for U.S. Appl. No. 11/122,267, five pages.
Final Office Action dated Jul. 23, 2008, for U.S. Appl. No. 11/122,267, six pages.
Non-Final Office Action dated Dec. 10, 2010, for U.S. Appl. No. 11/412,715, ten pages.
Final Office Action dated May 11, 2011, for U.S. Appl. No. 11/412,715, eight pages.
Non-Final Office Action dated Dec. 14, 2011, for U.S. Appl. No. 11/412,715, eight pages.
Notice of Allowance dated Sep. 10, 2013, for U.S. Appl. No. 11/412,715, eight pages.
Non-Final Office Action dated Jan. 20, 2016, for U.S. Appl. No. 14/629,473, fifteen pages.
Non-Final Office Action dated Jun. 1, 2007, for U.S. Appl. No. 10/899,648, seven pages.
Notice of Allowance dated Oct. 5, 2007, for U.S. Appl. No. 10/899,648, six pages.
Notice of Allowance dated Jan. 2, 2008, for U.S. Appl. No. 10/899,648, three pages.
Non-Final Office Action dated Nov. 23, 2009, for U.S. Appl. No. 11/969,974, seven pages.
Notice of Allowance dated Feb. 25, 2010, for U.S. Appl. No. 11/969,974, four pages.
Non-Final Office Action dated Aug. 16, 2013, for U.S. Appl. No. 12/761,462, ten pages.
Final Office Action dated Jun. 5, 2014, for U.S. Appl. No. 12/761,462, fourteen pages.
Notice of Allowance dated Oct. 10, 2014, for U.S. Appl. No. 12/761,462, ten pages.
Non-Final Office Action dated Aug. 16, 2013, for U.S. Appl. No. 12/761,523, nine pages.
Non-Final Office Action dated Jul. 17, 2003, for U.S. Appl. No. 09/905,642, six pages.
Notice of Allowance dated Apr. 7, 2004, for U.S. Appl. No. 09/905,642, six pages.
Non-Final Office Action dated Oct. 7, 2011, for U.S. Appl. No. 11/964,330; ten pages.
Final Office Action dated May 21, 2012, for U.S. Appl. No. 11/964,330; twelve pages.
Notice of Allowance dated Dec. 10, 2012, for U.S. Appl. No. 11/964,330; seven pages.
Notice of Allowance dated Mar. 22, 2013, for U.S. Appl. No. 11/964,330; eight pages.
Non-Final Office Action dated Nov. 5, 2014, for U.S. Appl. No. 13/930,225; six pages.
Notice of Allowance dated May 18, 2015, for U.S. Appl. No. 13/930,225; nine pages.
Lyon, R.F. et al. (2002). “Eyeing the Camera: Into the Next Century,” 10th Color and Imaging Conference Final Program & Proceedings, pp. 349-355.
Chinese Notice of Allowance dated Jun. 19, 2017 for Chinese Application No. 201280022284.3, filed on Nov. 7, 2013, four pages.
European Communication Pursuant to Article 94(3) EPC dated Apr. 13, 2017, for EP Application No. 12754208.2, filed on Oct. 4, 2013, five pages.
European Decision to Grant a European Patent Pursuant to Article 97(1) EPC dated Jun. 22, 2017, for EP Application No. 08706262.6 filed on Aug. 21, 2009, two pages.
European Invitation Pursuant to Article 94(3) and Rule 71(1) EPC dated Apr. 6, 2017, for EP Application No. 09819758.5, filed on May 4, 2011, five pages.
International Search Report and Written Opinion dated Sep. 18, 2017, for International Application No. PCT/CA2017/050734, filed on Jun. 14, 2017, eight pages.
Japanese Notice of Allowance dated Nov. 17, 2017, for Japanese Patent Application No. 2016-253736 filed on Dec. 27, 2016, six pages.
Korean Decision on the Trial Against Final Rejection from the Intellectual Property Tribunal (IPT) dated Sep. 25, 2017, for Korean Patent Application No. 2013-7026479, filed on Oct. 7, 2013, seventeen pages.
Korean Notice of Allowance dated Dec. 13, 2017 for Korean Patent Application No. 10-2017-7008654, filed on Mar. 29, 2017, three pages.
Korean Office Action dated Jun. 27, 2017 for Korean Patent Application No. 2017-7008654, filed on Mar. 29, 2017, ten pages.
U.S. Final Office Action dated Aug. 10, 2017, for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, twelve pages.
U.S. Final Office Action dated Aug. 7, 2017, for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, eleven pages.
U.S. Non-Final Office Action dated Sep. 25, 2017, for U.S. Appl. No. 15/584,405, filed May 2, 2017, eight pages.
U.S. Notice of Allowance dated Jul. 10, 2017 for U.S. Appl. No. 15/247,419, filed Aug. 25, 2016, eight pages.
Hubel, P.M. et al. (2004). “Spatial Frequency Response of Color Image Sensors: Bayer Color Filters and Foveon X3,” Proceedings of SPIE 5301:402-406.
Australian Examination Report No. 1 dated Jun. 28, 2018 for Australian Application No. 2016351730 filed on Nov. 10, 2016, five pages.
Chinese First Office Action dated Sep. 26, 2018 for Chinese Patent Application No. 2018092001857100, filed on Sep. 4, 2017, nineteen pages.
European Decision to Grant dated Jul. 12, 2018 for EP Application No. 12754208.2 filed Oct. 4, 2013, two pages.
European Decision to Grant dated May 25, 2018 for EP Patent Application No. 13180297.7 filed Aug. 13, 2013, two pages.
European Notice of Allowance dated Feb. 28, 2018 for EP Patent Application No. 12754208.2 filed Oct. 4, 2013, six pages.
European Notice of Allowance dated Mar. 6, 2018 for EP Patent Application No. 13180297.7 filed Aug. 13, 2013, seven pages.
Indian Office Action dated Jan. 31, 2018 for Indian Patent Application No. 6532/DELNP/2010 filed on Sep. 16, 2010, five pages.
Indian Office Action dated Jun. 26, 2018 for Indian Patent Application No. 8678/DELNP/2013 filed on Mar. 8, 2012, five pages.
International Preliminary Report on Patentability dated May 24, 2018 for International Application No. PCT/CA2016/051315 filed on Nov. 10, 2016, nine pages.
Japanese Notice of Allowance dated Apr. 2, 2018 for Japanese Patent Application No. 2017-018858 filed on Feb. 3, 2017, six pages.
Japanese Office Action dated Dec. 8, 2017 for Japanese Patent Application No. 2017-018858 filed on Feb. 3, 2017, six pages.
U.S. Final Office Action dated Feb. 1, 2018, for U.S. Appl. No. 15/584,405, filed May 2, 2017, ten pages.
U.S. Non-Final Office Action dated Aug. 15, 2018 for U.S. Appl. No. 15/348,664, filed Nov. 10, 2016, eleven pages.
U.S. Non-Final Office Action dated Jun. 8, 2018, for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, thirteen pages.
U.S. Non-Final Office Action dated Jun. 8, 2018, for U.S. Appl. No. 15/584,405, filed May 2, 2017, eight pages.
U.S. Non-Final Office Action dated May 25, 2018, for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, eleven pages.
U.S. Appl. No. 15/810,911, filed Nov. 13, 2017. (Copy not submitted herewith pursuant to the waiver of 37 C.F.R. § 1.98(a)(2)(iii) issued by the Office on Sep. 21, 2004.).
Australian Office Action dated May 10, 2019 for Australian Patent Application No. 2016351730 filed on Nov. 10, 2016, ten pages.
Canadian Office Action dated Feb. 19, 2019 for CA Patent Application No. 2,998,920 filed on Mar. 16, 2018, four pages.
European Notice of Allowance dated Mar. 18, 2019 for EP Patent Application No. 09819758.5, filed on May 4, 2011, seven pages.
European Search Report dated Feb. 18, 2019 for EP Patent Application No. 18178620.3 filed on Jun. 19, 2018, eight pages.
International Preliminary Report on Patentability dated Dec. 27, 2018 for International Patent Application No. PCT/CA2017/050734 filed on Jun. 14, 2017, six pages.
U.S. Final Office Action dated Dec. 14, 2018 for U.S. Appl. No. 15/584,405, filed May 2, 2017, seven pages.
U.S. Final Office Action dated Jan. 11, 2019 for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, twelve pages.
U.S. Final Office Action dated Jan. 22, 2019 for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, twelve pages.
U.S. Non-Final Office Action dated Apr. 3, 2019 for U.S. Appl. No. 15/416,876, filed Jan. 26, 2017, thirteen pages.
U.S. Non-Final Office Action dated Feb. 5, 2019 for U.S. Appl. No. 15/623,100, filed Jun. 14, 2017, ten pages.
U.S. Appl. No. 16/441,493, filed Jun. 14, 2019. (Copy not submitted herewith pursuant to the waiver of 37 C.F.R. § 1.98(a)(2)(iii) issued by the Office on Sep. 21, 2004).
U.S. Restriction Requirement dated Feb. 7, 2019 for U.S. Appl. No. 29/562,795, filed Apr. 28, 2016, seven pages.
U.S. Non-Final Office Action dated Aug. 2, 2019 for U.S. Appl. No. 15/623,100, filed Jun. 14, 2017, twelve pages.
Australian Notice of Acceptance for Patent Application dated Jun. 26, 2019 for Patent Application No. 2016351730 filed on Nov. 10, 2016, three pages.
Brazilian Office Action dated Aug. 5, 2019, for Patent Application No. BR1120130229977, filed Mar. 8, 2012, four pages (including English translation).
Canadian Office Action dated Nov. 5, 2019, for Canadian Patent Application No. 3027592, filed on Jun. 14, 2017, four pages.
Chinese Notice of Allowance dated Jan. 13, 2020, for Patent Application No. 201710785223.7, filed Mar. 8, 2012, six pages (including English translation).
European Extended Search Report dated May 7, 2019, for Patent Application No. 16863277.6, filed Nov. 10, 2016, three pages.
European Extended Search Report dated Oct. 16, 2019, for Patent Application No. 17743524.5, filed Jan. 26, 2017, four pages.
Japanese Office Action dated Jan. 10, 2020, for Japanese Patent Application No. 2018-516161, filed Nov. 10, 2016, five pages (including English translation).
Japanese Office Action dated Jul. 12, 2019, for Japanese Patent Application No. 2018-516161, filed Nov. 10, 2016, twenty-one pages (including English translation).
Sensitization (photography), definition from Wikipedia, original language German, six pages (machine translation).
U.S. Final Office Action dated Jul. 25, 2019 for U.S. Appl. No. 15/416,876, filed Jan. 26, 2017, thirteen pages.
U.S. Non-Final Office Action dated Jan. 16, 2020, for U.S. Appl. No. 15/416,876, filed Jan. 26, 2017, thirteen pages.
U.S. Non-Final Office Action dated Aug. 21, 2019 for U.S. Appl. No. 15/584,405, filed May 2, 2017, six pages.
U.S. Non-Final Office Action dated Aug. 23, 2019 for U.S. Appl. No. 15/343,038, filed Nov. 3, 2016, fourteen pages.
U.S. Non-Final Office Action dated Aug. 23, 2019 for U.S. Appl. No. 15/343,034, filed Nov. 3, 2016, eighteen pages.
U.S. Non-Final Office Action dated Sep. 27, 2019, for U.S. Appl. No. 29/562,795, filed Apr. 28, 2016, six pages.
Related Publications (1)
Number Date Country
20160249019 A1 Aug 2016 US
Provisional Applications (2)
Number Date Country
60908373 Mar 2007 US
60876597 Dec 2006 US
Continuations (2)
Number Date Country
Parent 13930225 Jun 2013 US
Child 14860687 US
Parent 11964330 Dec 2007 US
Child 13930225 US