The present invention relates to medical imaging systems in general, and in particular to fluorescence endoscopy video systems.
Fluorescence endoscopy utilizes differences in the fluorescence response of normal tissue and tissue suspicious for early cancer as a tool in the detection and localization of such cancer. The fluorescing compounds or fluorophores that are excited during fluorescence endoscopy may be exogenously applied photo-active drugs that accumulate preferentially in suspicious tissues, or they may be the endogenous fluorophores that are present in all tissue. In the latter case, the fluorescence from the tissue is typically referred to as autofluorescence or native fluorescence. Tissue autofluorescence is typically due to fluorophores with absorption bands in the UV and blue portion of the visible spectrum and emission bands in the green to red portions of the visible spectrum. In tissue suspicious for early cancer, the green portion of the autofluorescence spectrum is significantly suppressed. Fluorescence endoscopy that is based on tissue autofluorescence utilizes this spectral difference to distinguish normal from suspicious tissue.
Since the concentration and/or quantum efficiency of the endogenous fluorophores in tissue is relatively low, the fluorescence emitted by these fluorophores is not typically visible to the naked eye. Fluorescence endoscopy is consequently performed by employing low light image sensors to acquire images of the fluorescing tissue through the endoscope. The images acquired by these sensors are most often encoded as video signals and displayed on a color video monitor. Representative fluorescence endoscopy video systems that image tissue autofluorescence are disclosed in U.S. Pat. No. 5,507,287, issued to Palcic et al.; U.S. Pat. No. 5,590,660, issued to MacAulay et al.; U.S. Pat. No. 5,827,190, issued to Palcic et al.; U.S. patent application Ser. No. 09/615,965; and U.S. patent application Ser. No. 09/905,642, all of which are herein incorporated by reference. Each of these is assigned to Xillix Technologies Corp. of Richmond, British Columbia, Canada, the assignee of the present application.
While the systems disclosed in the above-referenced patents are significant advances in the field of early cancer detection, improvements can be made. In particular, it is desirable to reduce the size, cost, weight, and complexity of the camera.
A fluorescence endoscopy video system in accordance with one aspect of the present invention includes an endoscopic light source that is capable of operating in multiple modes to produce either white light, reflectance light, fluorescence excitation light, or fluorescence excitation light with reference reflectance light. An endoscope incorporates a light guide for transmitting light to the tissue under observation and includes either an imaging guide or a compact camera disposed in the insertion portion of the endoscope for receiving light from the tissue under observation. A compact camera includes at least one low light imaging sensor that receives light from the tissue and is capable of operating in multiple imaging modes to acquire color or multi-channel fluorescence and reflectance images. The system further includes an image processor and system controller that digitizes, processes and encodes the image signals produced by the image sensor(s) as a color video signal and a color video monitor that displays the processed video images.
In accordance with another embodiment of the invention, a filter is placed at the distal end of a conventional endoscope in order to produce both autofluorescence and white light images from an image sensor. The filter blocks excitation light from reaching an image sensor, but passes some blue light so that both fluorescence images and color/white light images of tissue can be produced. In one embodiment of the invention, a light source that produces excitation light also produces a color corrected illumination light such that color images of the tissue can be white balanced.
The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
As shown in
A processor/controller 64 controls the multi-mode camera 100 and the light source 52, and produces video signals that are displayed on a video monitor 66. The processor/controller 64 communicates with the multi-mode camera 100 with wires or other signal carrying devices that are routed within the endoscope. Alternatively, communication between the processor/controller 64 and the camera 100 can be conducted over a wireless link.
The light from the arc lamp 70 is coupled to a light guide 54 of the endoscope 60 through appropriate optics 74, 76, and 78 for light collection, spectral filtering and focusing respectively. The light from the arc lamp is spectrally filtered by one of a number of optical filters 76A, 76B, 76C . . . that operate to pass or reject desired wavelengths of light in accordance with the operating mode of the system. As used herein, “wavelength” is to be interpreted broadly to include not only a single wavelength, but a range of wavelengths as well.
An intensity control 80 that adjusts the amount of light transmitted along the light path is positioned at an appropriate location between the arc lamp 70 and the endoscope light guide 54. In addition, a shutter mechanism 82 may be positioned in the same optical path in order to block any of the light from the lamp from reaching the light guide. A controller 86 operates an actuator 77 that moves the filters 76A, 76B or 76C into and out of the light path. The controller 86 also controls the position of the intensity control 80 and may control the operation of the shutter mechanism 82.
The transmission characteristics of filters 76A, 76B, 76C, . . . , the characteristics of the actuator 77 mechanism, and the time available for motion of the filters 76A, 76B, 76C, . . . , into and out of the light path, depend on the mode of operation required for use with the various camera embodiments. The requirements fall into two classes. If the light source shown in
A light source 52A of a second class is illustrated in
The transmission characteristics of light source filters, the characteristics of the filter actuator mechanism, and the time available for motion of the filters into and out of the light path, for the two different classes of light sources are described in more detail below in the context of the various camera embodiments.
Because fluorescence endoscopy is generally used in conjunction with white light endoscopy, each of the various embodiments of the multi-mode camera 100 described below may be used both for color (white light) and fluorescence/reflectance and/or fluorescence/fluorescence imaging. (For the purposes of the present invention the terms white light imaging and color imaging of tissue are considered to be synonymous.) These camera embodiments particularly lend themselves to incorporation within a fluorescence video endoscope due to their compactness and their ability to be implemented with no moving parts.
In a first embodiment, shown in
In
The low light image sensor 104 preferably comprises a charge coupled device with charge carrier multiplication (of the same type as the Texas Instruments TC253 or the Marconi Technologies CCD65), electron beam charge coupled device (EBCCD), intensified charge coupled device (ICCD), charge injection device (CID), charge modulation device (CMD), complementary metal oxide semiconductor image sensor (CMOS) or charge coupled device (CCD) type sensor. The monochrome image sensor 102 is preferably a CCD or a CMOS image sensor.
An alternative configuration of the camera 100B is shown in
The processor/controller 64 as shown in
Based on operator input, the processor/controller 64 also provides control functions for the fluorescence endoscopy video system. These control functions include providing control signals that control the camera gain in all imaging modes, coordinating the imaging modes of the camera and light source, and providing a light level control signal for the light source.
The reason that two separate images in different wavelength bands are acquired in the fluorescence imaging modes of the fluorescence endoscopy video systems described herein, and the nature of fluorescence/reflectance and fluorescence/fluorescence imaging, will now be explained. It is known that the intensity of the autofluorescence at certain wavelengths changes as tissues become increasingly abnormal (i.e., as they progress from normal to frank cancer). When visualizing images formed from such a band of autofluorescence wavelengths, however, it is not easy to distinguish changes in signal strength that are due to pathology from those that are due to imaging geometry and shadows. A second image may therefore be used as a reference signal with which the signal strength of the first fluorescence image can be “normalized”: in fluorescence/fluorescence imaging, the reference is a second fluorescence image acquired in a band of wavelengths in which the image signal is not significantly affected by tissue pathology; in fluorescence/reflectance imaging, the reference is a reflected light image, likewise acquired in a band of wavelengths not significantly affected by tissue pathology, consisting of light that has undergone scattering within the tissue (known as diffuse reflectance). Such normalization is described in two patents previously incorporated herein by reference: U.S. Pat. No. 5,507,287, issued to Palcic et al., describes fluorescence/fluorescence imaging, and U.S. Pat. No. 5,590,660, issued to MacAulay et al., describes fluorescence/reflectance imaging.
One technique for performing the normalization is to assign each of the two image signals a different display color, e.g., by supplying the image signals to different color inputs of a color video monitor. When displayed on a color video monitor, the two images are effectively combined to form a single image, the combined color of which represents the relative strengths of the signals from the two images. Since light originating from fluorescence within tissue and diffuse reflectance light which has undergone scattering within the tissue are both emitted from the tissue with a similar spatial distribution of intensities, the color of a combined image is independent of the absolute strength of the separate image signals, and will not change as a result of changes in the distance or angle of the endoscope 60 to the tissue sample 58, or changes in other imaging geometry factors. If, however, there is a change in the shape of the autofluorescence spectrum of the observed tissue that gives rise to a change in the relative strength of the two image signals, such a change will be represented as a change in the color of the displayed image. Another technique for performing the normalization is to calculate the ratio of the pixel intensities at each location in the two images. A new image can then be created wherein each pixel has an intensity and color related to the ratio computed. The new image can then be displayed by supplying it to a color video monitor.
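For illustration only, the following sketch (Python with NumPy, not part of the disclosed system) shows the two normalization approaches described above: routing the two image signals to separate color channels of a display image, and computing an explicit per-pixel ratio image. The array names and the particular channel assignments are assumptions made for the example.

```python
# Minimal sketch of the two normalization approaches described above.
# Assumes the fluorescence and reference images are equally sized 8-bit
# NumPy arrays; names and color assignments are illustrative only.
import numpy as np

def combine_as_colors(fluorescence, reference):
    """Display-style normalization: route each image to its own color channel.

    The hue of the combined image then tracks the ratio of the two signals
    rather than their absolute brightness.
    """
    rgb = np.zeros(fluorescence.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = reference       # e.g., reference image on the red input
    rgb[..., 1] = fluorescence    # e.g., fluorescence image on the green input
    return rgb

def ratio_image(fluorescence, reference, eps=1.0):
    """Explicit normalization: per-pixel ratio of fluorescence to reference."""
    f = fluorescence.astype(np.float32)
    r = reference.astype(np.float32)
    return f / (r + eps)          # eps avoids division by zero in dark pixels
```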
The mixture of colors with which normal tissue and tissue suspicious for early cancer are displayed depends on the gain applied to each of the two separate image signals. There is an optimal gain ratio for which tissue suspicious for early cancer in a fluorescence image will appear as a distinctly different color than normal tissue. This gain ratio is said to provide the operator with the best combination of sensitivity (ability to detect suspect tissue) and specificity (ability to discriminate correctly). If the gain applied to the reference image signal is too high compared to the gain applied to the fluorescence image signal, the number of tissue areas that appear suspicious, but whose pathology turns out to be normal, increases. Conversely, if the relative gain applied to the reference image signal is too low, sensitivity decreases and suspect tissue will appear like normal tissue. For optimal system performance, therefore, the ratio of the gains applied to the image signals must be maintained at all times. The control of the gain ratio is described in two patent applications previously incorporated herein by reference: U.S. patent application Ser. No. 09/615,965, and U.S. patent application Ser. No. 09/905,642.
In vivo spectroscopy has been used to determine which differences in tissue autofluorescence and reflectance spectra have a pathological basis. The properties of these spectra determine the particular wavelength bands of autofluorescence and reflected light required for the fluorescence/reflectance imaging mode, or the particular two wavelength bands of autofluorescence required for the fluorescence/fluorescence imaging mode. Since the properties of the spectra depend on the tissue type, the wavelengths of the important autofluorescence band(s) may depend on the type of tissue being imaged. The specifications of the optical filters described below are a consequence of these spectral characteristics, and are chosen to be optimal for the tissues to be imaged.
As indicated above, the filters in the light source and camera should be optimized for the imaging mode of the camera, the type of tissue to be examined and/or the type of pre-cancerous tissue to be detected. Although all of the filters described below can be made to order using standard, commercially available components, the appropriate wavelength range of transmission and degree of blocking outside of the desired transmission range for the described fluorescence endoscopy imaging modes are important to the proper operation of the system. The importance of other issues in the specification of such filters, such as the fluorescence properties of the filter materials and the proper use of anti-reflection coatings, is taken to be understood.
In this configuration, the filter blocks excitation light and green fluorescence light while transmitting red fluorescence light in the wavelength range of 590-750 nm or any subset of wavelengths in this range. When used in a fluorescence endoscopy video system with the light source filter 79A described above, the filter characteristics are such that any light outside of the wavelength range of 590-750 nm, or any desired subset of wavelengths in this range, contributes no more than 0.1% to the light transmitted by the filter.
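As a rough numerical illustration of how such a blocking specification might be checked, the sketch below (Python/NumPy) integrates the light transmitted inside and outside the 590-750 nm band and reports the out-of-band fraction. The tabulated source spectrum and transmission curve are placeholders, not measured filter data.

```python
# Rough check of the out-of-band blocking requirement, using a hypothetical
# flat source spectrum and an idealized filter transmission curve.
import numpy as np

wavelengths = np.arange(400, 801)                    # nm, 1 nm steps
source = np.ones_like(wavelengths, dtype=float)      # placeholder source spectrum
in_band = (wavelengths >= 590) & (wavelengths <= 750)
transmission = np.where(in_band, 0.9, 1e-5)          # placeholder filter curve

transmitted = source * transmission
out_of_band_fraction = transmitted[~in_band].sum() / transmitted.sum()
print(f"out-of-band contribution: {out_of_band_fraction:.4%}")   # should be < 0.1%
```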
The operation of an embodiment of the fluorescence endoscopy video system will now be described. The cameras 100A as shown in
The processor/controller 64 also protects the sensitive low light image sensor 104 during color imaging by decreasing the gain of the amplification stage of the sensor. The light reflected by the tissue 58 is collected and transmitted by the endoscope image guide 56 to the camera where it is projected through beamsplitter 106 onto the monochrome image sensor 102, or the light is directly projected through the camera beamsplitter 106 onto the monochrome image sensor 102 if the sensor is located within the insertion portion of the endoscope. The image projected during each of red, green, and blue illuminations is transduced by the monochrome image sensor 102 and the resulting image signals are transmitted to the processor/controller 64.
Based on the brightness of the images captured, the processor/controller 64 provides a control signal to the multi-mode light source 52 to adjust the intensity control 80 and thereby adjust the level of light output by the endoscope light guide 54. The processor/controller 64 may also send a control signal to the camera 100A, 100B or 100C to adjust the gain of the monochrome image sensor 102.
The processor/controller 64 interpolates the images acquired during sequential periods of red, green, and blue illumination to create a complete color image during all time periods, and encodes that color image as video signals. The video signals are connected to color video monitor 66 for display of the color image. All of the imaging operations occur at analog video display rates (30 frames per second for NTSC format and 25 frames per second for PAL format).
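A minimal sketch of assembling a displayable color frame from the sequential red, green, and blue fields follows (Python/NumPy). The "most recent field per channel" strategy is a simplification of the interpolation described above, and the class and method names are illustrative only.

```python
# Sketch of combining fields acquired under sequential red, green, and blue
# illumination into a single color frame; a simplification for illustration.
import numpy as np

class SequentialColorAssembler:
    def __init__(self, height, width):
        # Most recently acquired monochrome field for each illumination color.
        self.latest = {"red": np.zeros((height, width), np.uint8),
                       "green": np.zeros((height, width), np.uint8),
                       "blue": np.zeros((height, width), np.uint8)}

    def update(self, channel, field):
        """Store the monochrome field acquired during one illumination period."""
        self.latest[channel] = field

    def current_color_frame(self):
        """Combine the most recent red, green, and blue fields into one frame."""
        return np.dstack([self.latest["red"],
                          self.latest["green"],
                          self.latest["blue"]])
```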
When switching to the fluorescence/reflectance imaging mode, the processor/controller 64 provides a control signal to the multi-mode light source 52 to indicate that it should be operating in fluorescence/reflectance mode. In response to this signal, the light source filter wheel 79 stops rotating and the light source 52 selects and positions the appropriate blue optical filter 79A continuously into the optical path between the arc lamp 70 and the endoscope light guide 54. This change from sequentially changing filters to a static filter occurs in a period of approximately one second. Filter 79A transmits only those wavelengths of light that will induce the tissue 58 under examination to fluoresce. All other wavelengths of light are substantially blocked as described above. The filtered light is then projected into the endoscope light guide 54 and transmitted to the tip of the endoscope 60 to illuminate the tissue 58.
As part of setting the system in the fluorescence/reflectance mode, the processor/controller 64 also increases the gain of the amplification stage of the low light image sensor 104. The fluorescence emitted and excitation light reflected by the tissue 58 are either collected by the endoscope image guide 56 and projected through the camera beamsplitter 106 onto the low light image sensor 104 and the image sensor 102, or are collected and directly projected through the camera beamsplitter 106 onto the low light image sensor 104 and the image sensor 102 at the insertion tip of the endoscope 60. Spectral filter 118 limits the light transmitted to the low light image sensor 104 to either green or red autofluorescence light only and substantially blocks the light in the excitation wavelength band. The autofluorescence image is transduced by the low light image sensor 104. The reference reflected excitation light image is transduced by the monochrome image sensor 102 and the resulting image signals are transmitted to the processor/controller 64.
Based on the brightness of the transduced images, the processor/controller 64 may provide a control signal to the multi-mode light source 52 to adjust the intensity control 80 and thereby adjust the level of light delivered to the endoscope 60. The processor/controller 64 may also send control signals to the cameras 100A, 100B or 100C to adjust the gains of the low light image sensor 104 and the monochrome image sensor 102, in order to maintain constant image brightness while keeping the relative gain constant.
After being processed, the images from the two sensors are encoded as video signals by processor/controller 64. The fluorescence/reflectance image is displayed by applying the video signals to different color inputs on the color video monitor 66.
In order for the combined image to have optimal clinical meaning, for a given proportion of fluorescence to reference light signals emitted by the tissue and received by the system, a consistent proportion must also exist between the processed image signals that are displayed on the video monitor. This implies that the (light) signal response of the fluorescence endoscopy video system is calibrated. One suitable calibration technique is described in two patent applications previously incorporated herein by reference: U.S. patent application Ser. No. 09/615,965, and U.S. patent application Ser. No. 09/905,642.
The cameras 100A, 100B, 100C can be operated in a variation of the fluorescence/reflectance mode to simultaneously obtain fluorescence images and reflectance images with red, green, and blue illumination. The operation of the system is similar to that described previously for color imaging, so only the points of difference from the color imaging mode will be described.
In this variation of the fluorescence/reflectance mode, instead of changing from sequential red, green, and blue illumination to static blue illumination when switching from color imaging to fluorescence/reflectance imaging, the multi-mode light source 52 provides the same sequential illumination utilized in the color imaging mode, for all imaging modes. Capture and display of the light reflected by the tissue is similar to that described previously for the color imaging mode. However, in addition to the reflectance images captured in that mode, the gain of the amplification stage of the low light image sensor 104 is adjusted to a value that makes it possible to capture autofluorescence images during blue illumination. During red and green illumination, the gain of the amplification stage of the low light sensor is decreased to protect the sensor while the image sensor 102 captures reflectance images.
In this modified fluorescence/reflectance mode, the camera captures both reflectance and fluorescence images during the blue illumination period, in addition to reflected light images during the red and green illumination periods. As for the color imaging mode, the reflectance images are interpolated and displayed on the corresponding red, green and blue channels of a color video monitor to produce a color image. Like the previously described fluorescence/reflectance mode, a fluorescence/reflectance image is produced by overlaying the fluorescence image and one or more of the reflectance images displayed in different colors on a color video monitor.
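The gain switching described above can be summarized as a simple per-period schedule, sketched below for illustration only; the numeric gain values are placeholders rather than actual system settings.

```python
# Illustrative gain schedule for the modified fluorescence/reflectance mode:
# the low light sensor gain is raised only during the blue (excitation)
# illumination period and reduced during red and green periods to protect
# the sensor. Values are placeholders, not system settings.
LOW_LIGHT_GAIN = {"blue": 100.0, "green": 1.0, "red": 1.0}

def low_light_gain_for(illumination_period: str) -> float:
    """Return the low light sensor gain to apply for one illumination period."""
    return LOW_LIGHT_GAIN[illumination_period]

for period in ["red", "green", "blue", "red", "green", "blue"]:
    print(period, low_light_gain_for(period))
```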
Since individual reflectance and fluorescence images are concurrently captured, both a color image and a fluorescence/reflectance image can be displayed simultaneously on the color video monitor. In this case, there is no need to utilize a separate color imaging mode. Alternatively, as described for the previous version of fluorescence/reflectance operation, only the fluorescence/reflectance image may be displayed during fluorescence/reflectance imaging and a color image displayed solely in the color imaging mode.
Yet another embodiment of this invention will now be described. All points of similarity with the first embodiment will be assumed understood and only points that differ will be described.
In this second embodiment, all aspects of the fluorescence endoscopy video system are similar to those of the first embodiment except for the camera and the light source. A camera 100D for this embodiment of a system is as shown in
Each of the pixel elements on the low light color sensor 103 is covered by an integrated filter, typically red, green or blue (RGB). These filters define the wavelength bands of fluorescence and reflectance light that reach the individual pixel elements. Alternatively, the filter mosaic may be of the cyan, magenta, yellow, green (CMYG) variety. All mosaic filters typically have considerable overlap between their respective pass bands, which can lead to considerable crosstalk when imaging dim autofluorescence light in the presence of intense reflected excitation light. Therefore, a separate filter 118 is provided to reduce the intensity of reflected excitation light to the same level as that of the autofluorescence light and, at the same time, pass autofluorescence light. In addition, some conversion and image processing may be applied to convert CMYG filter responses to responses in RGB space. The signals from color image sensors with CMYG filter mosaics are converted to RGB signals by matrix conversions that are based on the specific layout of the CMYG mosaic pattern and the image sensor read-out architecture. Such conversions are routinely performed in color video systems and are taken to be understood by those of ordinary skill in the art.
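A hedged sketch of such a CMYG-to-RGB matrix conversion follows. The matrix is derived from the idealized relations C = G + B, M = R + B, Y = R + G; an actual conversion matrix would depend on the specific mosaic layout and sensor read-out architecture, as noted above.

```python
# Illustrative CMYG-to-RGB conversion using an idealized matrix.
import numpy as np

# Rows: R, G, B. Columns: C, M, Y, G responses.
CMYG_TO_RGB = np.array([
    [-0.5, 0.5,  0.5, 0.0],   # R ~ (M + Y - C) / 2
    [ 0.0, 0.0,  0.0, 1.0],   # G taken directly from the green mosaic elements
    [ 0.5, 0.5, -0.5, 0.0],   # B ~ (C + M - Y) / 2
])

def cmyg_to_rgb(cmyg):
    """Convert an (N, 4) array of C, M, Y, G responses to (N, 3) RGB values."""
    return np.clip(cmyg @ CMYG_TO_RGB.T, 0.0, None)

print(cmyg_to_rgb(np.array([[0.7, 0.5, 0.6, 0.3]])))
```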
In this embodiment, the primary fluorescence and reference images are projected onto the same image sensor 103, but, because of the individual filters placed over each pixel, these different images are detected by separate sensor pixels. As a result, individual primary fluorescence and reference image signals can be produced by processor/controller 64 from the single image sensor.
In
The operation of a system based on camera 100D of
In the color imaging mode, the processor/controller 64 provides a control signal to the multimode light source 52 that it should be in white light mode. The light source selects and positions the appropriate optical filter 76A into the optical path between the arc lamp 70 and endoscope light guide 54. Given the presence of filter 118 in cameras 100D, 100E, which reduces the transmission of excitation light at blue wavelengths, the light source filter 76A should, in one embodiment, incorporate reduced transmission at red and green wavelengths or a slight peak in the blue (i.e., from 460-480 nm) to obtain a balanced color image at image sensor 103 with the proper proportions of red, green, and blue components.
Image signals from the color low light sensor 103 are processed by processor/controller 64. Standard techniques are utilized to produce a color image from a single color sensor. The image signals from pixels having the same filter characteristics are interpolated by processor/controller 64 to produce an image signal, related to the pass band of each element of the mosaic filter (e.g., red, green, and blue), at every pixel location. The resulting multiple images, which when combined produce a color image, are encoded by processor/controller 64 as video signals. The color image is displayed by connecting the video signals to the appropriate inputs of color video monitor 66.
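For illustration, the sketch below performs the kind of per-channel interpolation described above for an assumed RGGB mosaic, using simple neighborhood averaging (SciPy's ndimage.convolve). The mosaic layout and interpolation kernel are assumptions, not the system's actual algorithm.

```python
# Minimal demosaicing sketch: interpolate full R, G, B planes from a single
# RGGB mosaic image by averaging over the pixels that share each filter color.
import numpy as np
from scipy.ndimage import convolve

def demosaic_rggb(raw):
    h, w = raw.shape
    masks = {c: np.zeros((h, w), bool) for c in "rgb"}
    masks["r"][0::2, 0::2] = True
    masks["g"][0::2, 1::2] = True
    masks["g"][1::2, 0::2] = True
    masks["b"][1::2, 1::2] = True

    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    planes = []
    for c in "rgb":
        sparse = np.where(masks[c], raw.astype(float), 0.0)
        weight = convolve(masks[c].astype(float), kernel, mode="mirror")
        planes.append(convolve(sparse, kernel, mode="mirror") / np.maximum(weight, 1e-6))
    return np.dstack(planes)
```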
Processor/controller 64 also maintains the overall image brightness at a set level by monitoring the brightness of the image signal at each pixel and adjusting the intensity of the light source output and camera amplifier gains according to a programmed algorithm.
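One possible form of such a programmed algorithm is a simple proportional control loop, sketched below. The set point, step sizes, and the preference for adjusting light output before camera gain are assumptions made for the example, not the system's actual control law.

```python
# Illustrative brightness control step: nudge light source intensity first,
# then camera gain, toward a target mean brightness.
def adjust_exposure(mean_brightness, light_level, camera_gain,
                    target=128.0, light_step=0.05, gain_step=0.1):
    error = target - mean_brightness
    if abs(error) < 5:                         # dead band: close enough already
        return light_level, camera_gain
    if error > 0:                              # image too dark
        if light_level < 1.0:
            light_level = min(1.0, light_level + light_step)
        else:
            camera_gain = camera_gain + gain_step
    else:                                      # image too bright
        if camera_gain > 1.0:
            camera_gain = max(1.0, camera_gain - gain_step)
        else:
            light_level = max(0.0, light_level - light_step)
    return light_level, camera_gain
```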
When switching to the fluorescence/fluorescence imaging mode, processor/controller 64 provides a control signal to the multi-mode light source 52 to indicate that it should be in fluorescence/fluorescence mode. The light source 52 moves light source filter 76B into position in the light beam. Filter 76B transmits excitation light and blocks the transmission of light at the green and red fluorescence detection wavelengths, as described below. The characteristics of light source fluorescence excitation filter 76B and excitation filter 118, along with the mosaic filter elements on the color sensor 103, are such that the intensity of blue light at the color sensor is less than the intensities of red and green autofluorescence at the sensor, and are such that the ratio of the intensity of red autofluorescence to the intensity of green autofluorescence at the color sensor 103 has the appropriate value for optimal differentiation between normal and abnormal tissue. The fluorescence images are processed, as previously described for color imaging, by processor/controller 64 to produce separate images corresponding to each of the pass bands of the mosaic filter (e.g., RGB or CMYG). These separate images are encoded as video signals by processor/controller 64. A composite fluorescence/fluorescence image is displayed on the color video monitor 66 by applying the video signals from red and green image signals to different color inputs of the monitor. Alternatively, a composite image can be created in the image processor/controller 64 based on the relative intensities of the red and green image signals.
When switching to the fluorescence/reflectance imaging mode, processor/controller 64 provides a control signal to the multi-mode light source 52 to indicate that it should be in fluorescence/reflectance mode. The light source 52 moves light source filter 76C into position in the light beam. Filter 76C transmits both excitation light and reference light and blocks the transmission of light at fluorescence detection wavelengths, as described below. The characteristics of the light source filter 76C for fluorescence excitation and the reflectance illumination and the camera filter 118, along with the mosaic filter on the color sensor 103, as detailed below, are such that the intensity of reflected excitation light at the color sensor is comparable to the intensity of autofluorescence at the sensor, and should be such that the ratio of the intensity of autofluorescence to the intensity of reflected reference light at the color sensor 103 has the appropriate value. The fluorescence and reflectance images are processed, as previously described for color imaging, by processor/controller 64 to produce separate images corresponding to each of the pass bands of the mosaic filter (e.g., RGB or CMYG). These separate images are encoded as video signals by processor/controller 64. A composite fluorescence/reflectance image is displayed on color video monitor 66 by applying the video signals from the appropriate spectral bands (as discussed below) to different color inputs of the monitor or creating a composite image in the image processor/controller based on the relative intensities of the image signals received.
As indicated above, the filters in the light source and camera should be optimized for the imaging mode of the camera, the type of tissue to be examined and/or the type of pre-cancerous tissue to be detected, based on in vivo spectroscopy measurements. Although all of the filters described below can be made to order using standard, commercially available components, the appropriate wavelength range of transmission and degree of blocking outside of the desired transmission range for the described fluorescence endoscopy imaging modes are important to the proper operation of the system. The importance of other issues in the specification of such filters, such as the fluorescence properties of the filter materials and the proper use of anti-reflection coatings, is taken to be understood.
Filter characteristics for use in the fluorescence endoscopy video systems with a camera of the type shown in
The light transmitted in the 450-470 nm wavelength range (or a subset of that range) is adjusted, as part of the system design, to match the intensity of the reflected reference light projected onto the color image sensor to the requirements of the sensor and to provide the appropriate ratio of reference reflected light to fluorescence light, while maintaining sufficient fluorescence excitation. Of the light transmitted by this filter, less than 0.001% is in the fluorescence imaging wavelength range of 490-750 nm (or whatever desired subset of this range is specified as the transmission range of the primary fluorescence wavelength band).
A limitation of many fluorescence video endoscopy systems is that they typically require dedicated endoscopes. Fluorescence imaging with a dedicated fluorescence video endoscope is enabled by the use of an excitation barrier filter to block the strong excitation light used to excite the tissue fluorescence that these systems are intended to image. As described in the previous embodiments and shown in the corresponding figures, these barrier filters are built into the endoscope distal end (typically between the objective lens and the low light image sensor), and this built-in filter distinguishes these endoscopes from those used for conventional video endoscopy.
In an alternative embodiment of the present invention, fluorescence and color/white light images can be obtained by placing a barrier or blocking filter on the distal tip of a conventional video endoscope. This alternative embodiment uses an externally mounted filter in conjunction with a video endoscope that contains an image sensor sufficiently sensitive to image tissue fluorescence. This combination of a conventional video endoscope and an externally mounted barrier filter can be used with an appropriate endoscopic light source and video processor/controller to image in both color and fluorescence modes.
In order to allow the endoscope to obtain both fluorescence and white light images of the tissue 350, the endoscope 250 may be fitted with the distal end filter 202. In the embodiment shown, the filter 202 is included in a filter assembly 300 that does not obscure the illumination ports 252, 254, so that the illumination light can reach the tissue under examination. In addition, the filter assembly 300 does not interfere with other distal tip features, including but not limited to water/air nozzles, electrodes, confocal imaging ports, or the working channel 256, so that tools can still be routed through the endoscope. The filter assembly is made of appropriately inert and non-conductive materials, such as plastic, glass, or stainless steel, so as not to interfere with the particular material or electrical characteristics of the endoscope tip that may be required for use in conjunction with endotherapy techniques such as argon plasma coagulation (APC), electrocautery, cryotherapy, or photodynamic therapy (PDT).
The filter 202 is positioned in front of the imaging lens 258 and prevents excitation light from reaching the image sensor. In one embodiment, the filter removes the excitation light having wavelengths in the range of 370-460 nm or some subset of this range, but passes some blue light (e.g., >460 nm), and green and red light for use in white light imaging as described in previous embodiments and as will be further described below. Because most endoscopes have objective lenses with a wide field of view, the filter should block excitation light over a corresponding wide field of view and over the range of angles of incidence of light collected by the endoscope objective. The filter should also be thin enough not to introduce optical aberrations or interfere with the mechanical properties and maneuverability of the endoscope tip. Dye-based absorption filters that block the desired range of excitation light and operate over a wide field of view may therefore be preferred for this application. Specific examples of such filters include Kodak Wratten gel filters or dyed polycarbonate or other optically clear plastic or glass materials. For durability, the filter is preferably constructed in a manner similar to optically protective eyewear such as laser goggles. As shown in
In one fluorescence imaging mode, only fluorescence light is used to produce video images of the tissue. In another mode, the tissue is illuminated with the excitation light and some amount of reflectance light. As shown in
As shown in
One embodiment of a filter assembly 200 that is placed on the distal tip of an endoscope is shown in
The filter 202 is positioned in front of the imaging lens of the endoscope. As indicated above, the filter 202 operates to remove a substantial portion of reflected blue excitation light used during the fluorescence imaging mode. In the embodiment shown in
In one embodiment, the filter 202 is a 65 μm thick optical grade polycarbonate film dyed with solvent yellow 33 at a concentration of 0.8% by weight. The film is secured to the lens of the endoscope with a 50 μm thick layer of an optical adhesive. Other dyes, thicknesses and/or concentrations could be used, depending on the spectral characteristics of the light source used to provide the illumination and excitation light, the response of the imager and the particular fluorescence bands to be viewed.
The extruded film can be die cut and packaged as a kit with an amount of adhesive to allow owners of white light imaging endoscopes to convert their endoscopes into devices that perform both white light imaging and fluorescence imaging when used with a light source that generates excitation light for fluorescence imaging and a color corrected illumination light that compensates for the presence of the filter 202.
The spectra of the light when filtered by the UV, IR and distal blocking filter 202 are shown in
However, the light also contains spikes of green and yellow wavelengths at approximately 550 and 580 nanometers. In order to produce color balanced white light images, the amounts of blue, green and red light reaching the image sensor should be approximately the same. Therefore a color correction filter may be placed to filter the light from the light source during white light imaging.
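As a simple numerical illustration of this color-balance requirement, per-channel gains that would equalize the mean red, green, and blue responses can be computed as below; the channel means are placeholder values, not measured data.

```python
# Illustrative white-balance check: compute per-channel gains that would
# equalize the mean red, green, and blue responses at the sensor.
means = {"red": 90.0, "green": 140.0, "blue": 70.0}   # hypothetical channel means
reference = means["green"]
gains = {channel: reference / value for channel, value in means.items()}
print(gains)   # gains > 1 indicate channels the correction filter must favor
```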
The spectra 312 of the illumination light when filtered by the UV, IR blocking filter and color correction filter are shown in
Another embodiment of the present invention also utilizes a filter on the distal tip of a video endoscope. Some video endoscope systems do not employ the use of color image sensors and instead use sequential illumination in various spectral (e.g., blue, green, red) bands and then combine monochrome images to produce full color images. An external barrier filter 202 can be used in an endoscope of the type shown in
Briefly, in accordance with one embodiment of the invention, autofluorescence images are obtained by illuminating with an excitation light. Color images are obtained by sequential illumination with red, green and blue light, wherein the blue light either includes wavelengths or is limited to wavelengths not blocked by the filter 202.
The fluorescence endoscopy video systems described in the above embodiments have been optimized for imaging endogenous tissue fluorescence. They are not limited to this application, however, and may also be used for photo-dynamic diagnosis (PDD) applications. As mentioned above, PDD applications utilize photo-active drugs that preferentially accumulate in tissues suspicious for early cancer. Since effective versions of such drugs are currently in development stages, this invention does not specify the filter characteristics that are optimized for such drugs. With the appropriate light source and camera filter combinations, however, a fluorescence endoscopy video system operating in either fluorescence/fluorescence or fluorescence/reflectance imaging mode as described herein may be used to image the fluorescence from such drugs.
As will be appreciated, each of the embodiments of a camera for a fluorescence endoscopy video system described above, due to its simplicity, naturally lends itself to miniaturization and implementation in a fluorescence video endoscope, with the camera being incorporated into the insertion portion of the endoscope. The cameras can be utilized for both color imaging and fluorescence imaging, and in their most compact form contain no moving parts.
This application is a continuation of U.S. patent application Ser. No. 14/154,177, filed Jan. 13, 2014, which is a continuation of U.S. patent application Ser. No. 11/412,715, filed Apr. 26, 2006, now issued as U.S. Pat. No. 8,630,698, which is a continuation-in-part of U.S. patent application Ser. No. 11/122,267, filed May 4, 2005, which is a continuation-in-part of U.S. patent application Ser. No. 11/009,965, filed Dec. 10, 2004, which is a continuation of U.S. patent application Ser. No. 10/050,601, filed Jan. 15, 2002, now issued as U.S. Pat. No. 6,899,675, each of which is hereby incorporated by reference in its entirety, and the benefit of the filing dates of which are being claimed under 35 U.S.C. § 120.
Number | Name | Date | Kind |
---|---|---|---|
3215029 | Woodcock | Nov 1965 | A |
3257902 | Hopkins | Jun 1966 | A |
3971068 | Gerhardt et al. | Jul 1976 | A |
4037866 | Price | Jul 1977 | A |
4066330 | Jones | Jan 1978 | A |
4115812 | Akatsu | Sep 1978 | A |
4149190 | Wessler et al. | Apr 1979 | A |
4200801 | Schuresko | Apr 1980 | A |
4318395 | Tawara | Mar 1982 | A |
4355325 | Nakamura et al. | Oct 1982 | A |
4378571 | Handy | Mar 1983 | A |
4449535 | Renault | May 1984 | A |
4471766 | Terayama | Sep 1984 | A |
4532918 | Wheeler | Aug 1985 | A |
4556057 | Hiruma et al. | Dec 1985 | A |
4611888 | Prenovitz et al. | Sep 1986 | A |
4638365 | Kato | Jan 1987 | A |
4655557 | Takahashi | Apr 1987 | A |
4660982 | Okada | Apr 1987 | A |
4768513 | Suzuki | Sep 1988 | A |
4786813 | Svanberg et al. | Nov 1988 | A |
4821117 | Sekiguchi | Apr 1989 | A |
4837625 | Douziech et al. | Jun 1989 | A |
4856495 | Tohjoh et al. | Aug 1989 | A |
4895145 | Joffe | Jan 1990 | A |
4917457 | Iizuka | Apr 1990 | A |
4930516 | Alfano et al. | Jun 1990 | A |
4951135 | Sasagawa et al. | Aug 1990 | A |
4954897 | Ejima et al. | Sep 1990 | A |
4974936 | Ams et al. | Dec 1990 | A |
5001556 | Nakamura et al. | Mar 1991 | A |
5005960 | Heimbeck | Apr 1991 | A |
5007408 | Ieoka | Apr 1991 | A |
5034888 | Uehara et al. | Jul 1991 | A |
5134662 | Bacus et al. | Jul 1992 | A |
5142410 | Ono et al. | Aug 1992 | A |
5165079 | Schulz-Hennig | Nov 1992 | A |
5205280 | Dennison, Jr. et al. | Apr 1993 | A |
5206759 | Ono et al. | Apr 1993 | A |
5214503 | Chiu et al. | May 1993 | A |
5225883 | Carter et al. | Jul 1993 | A |
5255087 | Nakamura et al. | Oct 1993 | A |
5278642 | Danna et al. | Jan 1994 | A |
5305759 | Kaneko et al. | Apr 1994 | A |
5334191 | Poppas et al. | Aug 1994 | A |
5365057 | Morley et al. | Nov 1994 | A |
5371355 | Wodecki | Dec 1994 | A |
5377686 | O'Rourke et al. | Jan 1995 | A |
5379756 | Pileski et al. | Jan 1995 | A |
5403264 | Wohlers et al. | Apr 1995 | A |
5408263 | Kikuchi et al. | Apr 1995 | A |
5410363 | Capen et al. | Apr 1995 | A |
5419323 | Kittrell et al. | May 1995 | A |
5420628 | Poulsen et al. | May 1995 | A |
5421337 | Richards-Kortum et al. | Jun 1995 | A |
5424841 | Van Gelder et al. | Jun 1995 | A |
5430476 | Hafele et al. | Jul 1995 | A |
5460166 | Yabe et al. | Oct 1995 | A |
5461509 | Canzek | Oct 1995 | A |
5485203 | Nakamura et al. | Jan 1996 | A |
5490015 | Umeyama et al. | Feb 1996 | A |
5507287 | Palcic et al. | Apr 1996 | A |
5536236 | Yabe et al. | Jul 1996 | A |
5576882 | Kanamori | Nov 1996 | A |
5585846 | Kim | Dec 1996 | A |
5590660 | MacAulay et al. | Jan 1997 | A |
5596654 | Tanaka | Jan 1997 | A |
5646680 | Yajima | Jul 1997 | A |
5647368 | Zeng et al. | Jul 1997 | A |
5647840 | D'Amelio et al. | Jul 1997 | A |
5667472 | Finn et al. | Sep 1997 | A |
5684629 | Leiner | Nov 1997 | A |
5695049 | Bauman | Dec 1997 | A |
5697373 | Richards-Kortum et al. | Dec 1997 | A |
5697888 | Kobayashi et al. | Dec 1997 | A |
5713364 | DeBaryshe et al. | Feb 1998 | A |
5722962 | Garcia | Mar 1998 | A |
5749830 | Kaneko et al. | May 1998 | A |
5772355 | Ross et al. | Jun 1998 | A |
5772580 | Utsui et al. | Jun 1998 | A |
5827190 | Palcic et al. | Oct 1998 | A |
5833617 | Hayashi | Nov 1998 | A |
5852498 | Youvan et al. | Dec 1998 | A |
5891016 | Utsui et al. | Apr 1999 | A |
5892625 | Heimer | Apr 1999 | A |
5897269 | Ross et al. | Apr 1999 | A |
5910816 | Fontenot et al. | Jun 1999 | A |
5941815 | Chang | Aug 1999 | A |
5952768 | Strok et al. | Sep 1999 | A |
5971918 | Zanger | Oct 1999 | A |
5976146 | Ogawa et al. | Nov 1999 | A |
5984861 | Crowley | Nov 1999 | A |
5986271 | Lazarev et al. | Nov 1999 | A |
5986642 | Lazarev et al. | Nov 1999 | A |
5990996 | Sharp | Nov 1999 | A |
5999240 | Sharp et al. | Dec 1999 | A |
6002137 | Hayashi | Dec 1999 | A |
6004263 | Nakaichi et al. | Dec 1999 | A |
6008889 | Zeng et al. | Dec 1999 | A |
6021344 | Lui et al. | Feb 2000 | A |
6028622 | Suzuki | Feb 2000 | A |
6030339 | Tatsuno et al. | Feb 2000 | A |
6059719 | Yamamoto et al. | May 2000 | A |
6059720 | Furusawa et al. | May 2000 | A |
6061591 | Freitag et al. | May 2000 | A |
6069689 | Zeng et al. | May 2000 | A |
6070096 | Hayashi | May 2000 | A |
6095982 | Richards-Kortum et al. | Aug 2000 | A |
6099466 | Sano et al. | Aug 2000 | A |
6110103 | Donofrio | Aug 2000 | A |
6110106 | MacKinnon et al. | Aug 2000 | A |
6120435 | Eino | Sep 2000 | A |
6148227 | Wagnieres et al. | Nov 2000 | A |
6161035 | Furusawa | Dec 2000 | A |
6166496 | Lys et al. | Dec 2000 | A |
6192267 | Scherninski et al. | Feb 2001 | B1 |
6212425 | Irion et al. | Apr 2001 | B1 |
6258576 | Richards-Kortum et al. | Jul 2001 | B1 |
6280378 | Kazuhiro et al. | Aug 2001 | B1 |
6293911 | Imaizumi et al. | Sep 2001 | B1 |
6332092 | Deckert et al. | Dec 2001 | B1 |
6347010 | Chen et al. | Feb 2002 | B1 |
6351663 | Flower et al. | Feb 2002 | B1 |
6364829 | Fulghum | Apr 2002 | B1 |
6364831 | Crowley | Apr 2002 | B1 |
6388702 | Konomura et al. | May 2002 | B1 |
6419628 | Rudischhauser et al. | Jul 2002 | B1 |
6422994 | Kaneko et al. | Jul 2002 | B1 |
6433102 | Suzuki et al. | Aug 2002 | B1 |
6443976 | Flower et al. | Sep 2002 | B1 |
6461330 | Miyagi | Oct 2002 | B1 |
6490085 | Zobel | Dec 2002 | B1 |
6526213 | Ilenda et al. | Feb 2003 | B1 |
6527709 | Matsumoto | Mar 2003 | B2 |
6529768 | Hakamata | Mar 2003 | B1 |
6537211 | Wang et al. | Mar 2003 | B1 |
6544102 | Schafer et al. | Apr 2003 | B2 |
6571119 | Hayashi | May 2003 | B2 |
6603552 | Cline et al. | Aug 2003 | B1 |
6639664 | Haan et al. | Oct 2003 | B2 |
6772003 | Kaneko et al. | Aug 2004 | B2 |
6773392 | Kikuchi et al. | Aug 2004 | B2 |
6786865 | Dhindsa | Sep 2004 | B2 |
6821245 | Cline et al. | Nov 2004 | B2 |
6853485 | Hoogland | Feb 2005 | B2 |
6898458 | Zeng et al. | May 2005 | B2 |
6899675 | Cline et al. | May 2005 | B2 |
6907527 | Wu | Jun 2005 | B1 |
6911005 | Ouchi et al. | Jun 2005 | B2 |
6915154 | Docherty et al. | Jul 2005 | B1 |
6944493 | Alam et al. | Sep 2005 | B2 |
6958035 | Friedman et al. | Oct 2005 | B2 |
6960165 | Ueno et al. | Nov 2005 | B2 |
7033314 | Kamrava et al. | Apr 2006 | B2 |
7043291 | Sendai | May 2006 | B2 |
7235045 | Wang et al. | Jun 2007 | B2 |
7236815 | Richards-Kortum et al. | Jun 2007 | B2 |
7324674 | Ozawa et al. | Jan 2008 | B2 |
7341557 | Cline et al. | Mar 2008 | B2 |
7364574 | Flower | Apr 2008 | B2 |
7385772 | Forkey et al. | Jun 2008 | B2 |
7704206 | Suzuki et al. | Apr 2010 | B2 |
7722534 | Cline et al. | May 2010 | B2 |
7724430 | Kasai | May 2010 | B2 |
7733583 | Fujiwara | Jun 2010 | B2 |
7733584 | Kazakevich | Jun 2010 | B2 |
7798955 | Ishihara et al. | Sep 2010 | B2 |
7862504 | Kura et al. | Jan 2011 | B2 |
7918559 | Tesar | Apr 2011 | B2 |
8408269 | Fengler et al. | Apr 2013 | B2 |
8506555 | Ruiz Morales | Aug 2013 | B2 |
8630698 | Fengler et al. | Jan 2014 | B2 |
8780445 | Inoue | Jul 2014 | B2 |
8961403 | Cline et al. | Feb 2015 | B2 |
9241615 | Yoshida et al. | Jan 2016 | B2 |
9386909 | Fengler et al. | Jul 2016 | B2 |
9877654 | Tesar | Jan 2018 | B2 |
9918619 | Tesar | Mar 2018 | B2 |
9968244 | Cline et al. | May 2018 | B2 |
20010016679 | Futatsugi et al. | Aug 2001 | A1 |
20020001080 | Miller et al. | Jan 2002 | A1 |
20020057501 | Lei | May 2002 | A1 |
20020072736 | Tierney et al. | Jun 2002 | A1 |
20020087047 | Remijan et al. | Jul 2002 | A1 |
20020103439 | Zeng | Aug 2002 | A1 |
20020138008 | Tsujita | Sep 2002 | A1 |
20020161283 | Sendai | Oct 2002 | A1 |
20020161284 | Tanaka | Oct 2002 | A1 |
20020175993 | Ueno | Nov 2002 | A1 |
20020177778 | Averback et al. | Nov 2002 | A1 |
20020186478 | Watanabe et al. | Dec 2002 | A1 |
20030002036 | Haan et al. | Jan 2003 | A1 |
20030042493 | Kazakevich | Mar 2003 | A1 |
20030153811 | Muckner | Aug 2003 | A1 |
20030219383 | Weissleder et al. | Nov 2003 | A1 |
20030229270 | Suzuki | Dec 2003 | A1 |
20040010183 | Dhindsa | Jan 2004 | A1 |
20040021859 | Cunningham | Feb 2004 | A1 |
20040037454 | Ozawa | Feb 2004 | A1 |
20040046865 | Ueno | Mar 2004 | A1 |
20040054255 | Pilgrim et al. | Mar 2004 | A1 |
20040125445 | Hoogland | Jul 2004 | A1 |
20040133073 | Berci et al. | Jul 2004 | A1 |
20040142485 | Flower et al. | Jul 2004 | A1 |
20040143162 | Krattiger et al. | Jul 2004 | A1 |
20040148141 | Tsujita et al. | Jul 2004 | A1 |
20040156124 | Okada | Aug 2004 | A1 |
20040186383 | Rava et al. | Sep 2004 | A1 |
20040204671 | Stubbs et al. | Oct 2004 | A1 |
20040218115 | Kawana et al. | Nov 2004 | A1 |
20050027166 | Matsumoto | Feb 2005 | A1 |
20050075575 | Vo-Dinh | Apr 2005 | A1 |
20050096505 | Imaizumi | May 2005 | A1 |
20050143627 | Cline et al. | Jun 2005 | A1 |
20050152027 | Armstrong et al. | Jul 2005 | A1 |
20050154319 | Cline et al. | Jul 2005 | A1 |
20050182291 | Hirata | Aug 2005 | A1 |
20050256373 | Bar-Or et al. | Nov 2005 | A1 |
20050267331 | Secrest et al. | Dec 2005 | A1 |
20050275057 | Breen et al. | Dec 2005 | A1 |
20050288622 | Albrecht et al. | Dec 2005 | A1 |
20060017913 | Kawamata | Jan 2006 | A1 |
20060089554 | Ishihara et al. | Apr 2006 | A1 |
20060146322 | Komachi | Jul 2006 | A1 |
20060211915 | Takeuchi et al. | Sep 2006 | A1 |
20060217594 | Ferguson | Sep 2006 | A1 |
20060239921 | Mangat et al. | Oct 2006 | A1 |
20060241496 | Fengler et al. | Oct 2006 | A1 |
20060258910 | Stefanchik et al. | Nov 2006 | A1 |
20080021274 | Bayer et al. | Jan 2008 | A1 |
20080081948 | Weisenburgh et al. | Apr 2008 | A1 |
20080239070 | Westwick | Oct 2008 | A1 |
20080273247 | Kazakevich | Nov 2008 | A1 |
20090012361 | MacKinnon | Jan 2009 | A1 |
20090209813 | Lubowski et al. | Aug 2009 | A1 |
20090290236 | Wang et al. | Nov 2009 | A1 |
20090303317 | Tesar | Dec 2009 | A1 |
20100081988 | Kahle et al. | Apr 2010 | A1 |
20100125164 | LaBombard | May 2010 | A1 |
20100168588 | Matsumoto et al. | Jul 2010 | A1 |
20100198010 | Cline et al. | Aug 2010 | A1 |
20100277817 | Durell | Nov 2010 | A1 |
20110230719 | Katakura et al. | Sep 2011 | A1 |
20110249323 | Tesar et al. | Oct 2011 | A1 |
20120316394 | Yoshida et al. | Dec 2012 | A1 |
20130184591 | Tesar | Jul 2013 | A1 |
20130194667 | Inoue | Aug 2013 | A1 |
20140187859 | Leeuw et al. | Jul 2014 | A1 |
20140194687 | Fengler et al. | Jul 2014 | A1 |
20140343362 | Tesar | Nov 2014 | A1 |
20150230698 | Cline et al. | Aug 2015 | A1 |
20170266398 | Murray et al. | Sep 2017 | A1 |
Number | Date | Country |
---|---|---|
2 404 600 | Oct 2001 | CA |
200987662 | Dec 2007 | CN |
101115995 | Jan 2008 | CN |
201048936 | Apr 2008 | CN |
201085616 | Jul 2008 | CN |
102004309 | Apr 2011 | CN |
103091829 | May 2013 | CN |
195 35 114 | Mar 1996 | DE |
196 08 027 | Sep 1996 | DE |
0 512 965 | Nov 1992 | EP |
0 672 379 | Sep 1995 | EP |
0 774 685 | May 1997 | EP |
0 774 865 | May 1997 | EP |
0 792 618 | Sep 1997 | EP |
1 232 764 | Aug 2002 | EP |
1 374 755 | Jan 2004 | EP |
1 883 337 | Feb 2008 | EP |
2 028 519 | Feb 2009 | EP |
2 051 603 | Apr 2009 | EP |
2106739 | Oct 2009 | EP |
2 671 405 | Jul 1992 | FR |
S-60-246733 | Dec 1985 | JP |
S-61-159936 | Jul 1986 | JP |
H-01-135349 | May 1989 | JP |
H02-272513 | Nov 1990 | JP |
03-97439 | Apr 1991 | JP |
03-97441 | Apr 1991 | JP |
03-97442 | Apr 1991 | JP |
H03-136630 | Jun 1991 | JP |
H05-005101 | Jan 1993 | JP |
H-05-115435 | May 1993 | JP |
6-63164 | Mar 1994 | JP |
H06-094989 | Apr 1994 | JP |
06-125911 | May 1994 | JP |
H06-068702 | Sep 1994 | JP |
H-07-155285 | Jun 1995 | JP |
H-07-155286 | Jun 1995 | JP |
H-07-155290 | Jun 1995 | JP |
H-07-155291 | Jun 1995 | JP |
H-07-155292 | Jun 1995 | JP |
H07-184832 | Jul 1995 | JP |
H-07-204156 | Aug 1995 | JP |
H-07-222712 | Aug 1995 | JP |
H-07-250804 | Oct 1995 | JP |
H-07-250812 | Oct 1995 | JP |
H-07-327913 | Dec 1995 | JP |
H08-056894 | Mar 1996 | JP |
H08-094928 | Apr 1996 | JP |
H-08-126605 | May 1996 | JP |
08-140928 | Jun 1996 | JP |
08-140929 | Jun 1996 | JP |
H08-168465 | Jul 1996 | JP |
H-08-224208 | Sep 1996 | JP |
H-08-224209 | Sep 1996 | JP |
H-08-224210 | Sep 1996 | JP |
H-08-224240 | Sep 1996 | JP |
H08-228998 | Sep 1996 | JP |
H-08-252218 | Oct 1996 | JP |
09-066023 | Mar 1997 | JP |
09-070384 | Mar 1997 | JP |
H-10-127563 | May 1998 | JP |
H-10-151104 | Jun 1998 | JP |
H10-192297 | Jul 1998 | JP |
10-201707 | Aug 1998 | JP |
10-225427 | Aug 1998 | JP |
H-10-201700 | Aug 1998 | JP |
H-10-225426 | Aug 1998 | JP |
H-10-243915 | Sep 1998 | JP |
H-10-243920 | Sep 1998 | JP |
H10-262907 | Oct 1998 | JP |
H-10-308114 | Nov 1998 | JP |
H-10-309281 | Nov 1998 | JP |
H-10-309282 | Nov 1998 | JP |
H-10-328129 | Dec 1998 | JP |
11-047079 | Feb 1999 | JP |
11-089789 | Apr 1999 | JP |
H-11-104059 | Apr 1999 | JP |
H-11-104060 | Apr 1999 | JP |
H-11-104061 | Apr 1999 | JP |
H-11-104070 | Apr 1999 | JP |
H-11-113839 | Apr 1999 | JP |
H-11-155812 | Jun 1999 | JP |
H-11-244220 | Sep 1999 | JP |
H-11-332819 | Dec 1999 | JP |
2000-504968 | Apr 2000 | JP |
2000-245693 | Sep 2000 | JP |
2000-287915 | Oct 2000 | JP |
2000-354583 | Dec 2000 | JP |
2001-212245 | Aug 2001 | JP |
2002-244122 | Aug 2002 | JP |
2004-024611 | Jan 2004 | JP |
2004-057520 | Feb 2004 | JP |
2004-094043 | Mar 2004 | JP |
2004-163902 | Jun 2004 | JP |
2004-247156 | Sep 2004 | JP |
2004-292722 | Oct 2004 | JP |
2005-010315 | Jan 2005 | JP |
2005-058618 | Mar 2005 | JP |
2005-058619 | Mar 2005 | JP |
2005-058620 | Mar 2005 | JP |
2005-080819 | Mar 2005 | JP |
2005-081079 | Mar 2005 | JP |
2005-292404 | Oct 2005 | JP |
2007-143624 | Jun 2007 | JP |
2008-511341 | Apr 2008 | JP |
2009-048085 | Mar 2009 | JP |
2009-247566 | Oct 2009 | JP |
2010-526342 | Jul 2010 | JP |
2012-050618 | Mar 2012 | JP |
5089168 | Dec 2012 | JP |
2412800 | Feb 2011 | RU |
WO-199304648 | Mar 1993 | WO |
WO-199526673 | Oct 1995 | WO |
WO-199824360 | Jun 1998 | WO |
WO-199901749 | Jan 1999 | WO |
WO-199953832 | Oct 1999 | WO |
WO-200006013 | Feb 2000 | WO |
WO-200042910 | Jul 2000 | WO |
WO-200054652 | Sep 2000 | WO |
WO-200207587 | Jan 2002 | WO |
WO-2003059159 | Jul 2003 | WO |
WO-2005110196 | Nov 2005 | WO |
WO-2006116634 | Nov 2006 | WO |
WO-2006116847 | Nov 2006 | WO |
WO-2006119349 | Nov 2006 | WO |
WO-2008011722 | Jan 2008 | WO |
WO-2013021704 | Feb 2013 | WO |
WO-2014199236 | Dec 2014 | WO |
Entry |
---|
Alfano, R.R. et al. (Oct. 1987). “Fluorescence Spectra From Cancerous and Normal Human Breast and Lung Tissues,” IEEE Journal of Quantum Electronics QE-23(10):1806-1811. |
Andersson-Engels, S. et al. (1989). “Tissue Diagnostics Using Laser-Induced Fluorescence,” Ber. Bunsenges Physical Chemistry 93:335-342. |
Arregui, M.E. et al. (1996). “Visualization,” Chapter 35 in Principles of Laparoscopic Surgery, Springer-Verlag New York, NY, pp. 767-794. |
Bennett, J.M. et al. (Feb. 1965). “Infrared Reflectance and Emittance of Silver and Gold Evaporated in Ultrahigh Vacuum,” Applied Optics 4(2):221-224. |
Bhunchet, E. et al. (2002). “Fluorescein Electronic Endoscopy: A Novel Method for Detection of Early Stage Gastric Cancer Not Evident to Routine Endoscopy,” Gastrointestinal Endoscopy 55(4):562-571. |
Hung, J. et al. (1991). “Autofluorescence of Normal and Malignant Bronchial Tissue,” Lasers in Surgery and Medicine 11:99-105. |
Orfanidis, S. J. (Jun. 21, 2004) “Multilayer Structures,” Chapter 5 in Electromagnetic Waves and Antennas, thirty four pages. |
O'Shea, D. et al. (1995). “Aberration Curves in Lens Design,” Chapter 33 in Handbook of Optics Fundamentals, Techniques and Design, McGraw-Hill, Inc., pp. 33.1-33.6. |
Sherwinter, D.A. (Aug. 2013, e-published on Oct. 11, 2012). “A Novel Adaptor Converts a Laparoscope Into a High-Definition Rigid Sigmoidoscope,” Surgical Innovation 20(4):411-413. |
Tomkinson, T.H. et al. (Dec. 1, 1996). “Rigid Endoscopic Relay Systems: A Comparative Study,” Applied Optics 35:6674-6683. |
Canadian Office Action dated Dec. 30, 2015, for Canadian Patent Application No. 2,896,267, filed Sep. 22, 2014, four pages. |
Chinese First Office Action dated Aug. 3, 2016 for Chinese Application No. 201380073686.0 filed Dec. 24, 2013, sixteen pages. |
Chinese First Office Action dated Nov. 2, 2016 for Chinese Application No. 201480027537.5 filed May 15, 2014, 17 pages. (with English Translation). |
European Office Action dated Nov. 19, 2015, for EP Application No. 07 785 001.4, filed on Jul. 30, 2007, four pages. |
Final Office Action dated Aug. 8, 2016 for U.S. Appl. No. 14/278,833, filed May 15, 2014, seven pages. |
Final Office Action dated Dec. 7, 2015 for U.S. Appl. No. 13/585,824, filed Aug. 14, 2012, ten pages. |
Final Office Action dated Feb. 14, 2012, for U.S. Appl. No. 12/278,740, filed Dec. 10, 2008, 6 pages. |
Final Office Action dated Jul. 13, 2014 for U.S. Appl. No. 13/585,824, filed Aug. 14, 2012, nine pages. |
Final Office Action dated Mar. 3, 2016 for U.S. Appl. No. 14/140,370, filed Dec. 24, 2013, ten pages. |
Final Office Action dated Jul. 23, 2008, for U.S. Appl. No. 11/122,267, filed May 4, 2005, six pages. |
Final Office Action dated Jun. 18, 2015, for U.S. Appl. No. 14/154,177, filed Jan. 13, 2014, eight pages. |
Final Office Action dated Jun. 5, 2014, for U.S. Appl. No. 12/761,462, filed Apr. 16, 2010, fourteen pages. |
Final Office Action dated May 11, 2011, for U.S. Appl. No. 11/412,715, filed Apr. 26, 2006, eight pages. |
Final Office Action dated Nov. 24, 2009, for U.S. Appl. No. 11/009,965, filed Dec. 10, 2004, fourteen pages. |
International Preliminary Report on Patentability dated Feb. 3, 2009, for International Application No. PCT/CA2007/001335 filed Jul. 30, 2007, five pages. |
International Preliminary Report on Patentability dated Nov. 6, 2007, for International Application No. PCT/CA2006/000669 filed Apr. 27, 2006, nine pages. |
International Search Report and Written Opinion dated Sep. 22, 2014, for International Application No. PCT/IB2013/003243 filed on Dec. 24, 2013, six pages. |
International Search Report dated Aug. 3, 2006, for International Application No. PCT/CA2006/000669, three pages. |
International Search Report dated Dec. 7, 2007, for International Application No. PCT/CA2007/001335, two pages. |
International Search Report dated Feb. 26, 2008 for International Application No. PCT/US07/061810 filed Feb. 7, 2007, two pages. |
International Search Report dated Jan. 21, 2002, for International Application No. PCT/US2001/022198, filed on Jul. 13, 2001, three pages. |
Japanese Final Office Action dated Aug. 2, 2013, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, four pages. |
Japanese Office Action dated Feb. 17, 2012, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, six pages. |
Japanese Office Action dated Jul. 4, 2016 for Japanese Patent Application No. 2015-550160 filed Dec. 24, 2013, ten pages. |
Japanese Office Action dated Nov. 11, 2011, for Japanese Patent Application No. 2009-521077, filed on Jul. 30, 2007, four pages. |
Japanese Office Action dated Sep. 14, 2012, for Japanese Patent Application No. 2008-509275, filed on Apr. 27, 2006, seven pages. |
Japanese Office Action dated Sep. 19, 2014, for Japanese Patent Application No. 2013-246636, filed on Apr. 27, 2006, six pages. |
Korean Office Action dated Jul. 15, 2016 for Korean Patent Application No. 10-2015-7019659 filed Dec. 24, 2013, twelve pages. |
Non Final Office Action dated Dec. 31, 2015 for U.S. Appl. No. 14/278,833, filed May 15, 2014, seven pages. |
Non Final Office Action dated Jan. 29, 2014 for U.S. Appl. No. 13/585,824, filed Aug. 14, 2012, five pages. |
Non Final Office Action dated Mar. 25, 2015 for U.S. Appl. No. 13/585,824, filed Aug. 14, 2012, ten pages. |
Non Final Office Action dated Oct. 23, 2015 for U.S. Appl. No. 14/140,370, filed Dec. 24, 2013, eight pages. |
Non-Final Office Action dated Apr. 2, 2009, for U.S. Appl. No. 11/009,965, filed Dec. 10, 2004, thirteen pages. |
Non-Final Office Action dated Aug. 16, 2013, for U.S. Appl. No. 12/761,462, filed Apr. 16, 2010, ten pages. |
Non-Final Office Action dated Aug. 16, 2013, for U.S. Appl. No. 12/761,523, filed Apr. 16, 2010, nine pages. |
Non-Final Office Action dated Dec. 10, 2010, for U.S. Appl. No. 11/412,715, filed Apr. 26, 2006, ten pages. |
Non-Final Office Action dated Dec. 14, 2011, for U.S. Appl. No. 11/412,715, filed Apr. 26, 2006, eight pages. |
Non-Final Office Action dated Jan. 2, 2008, for U.S. Appl. No. 11/122,267, filed May 4, 2005, five pages. |
Non-Final Office Action dated Jan. 20, 2016, for U.S. Appl. No. 14/629,473, filed Feb. 23, 2015, fourteen pages. |
Non-Final Office Action dated Jul. 17, 2003, for U.S. Appl. No. 09/905,642, filed Jul. 13, 2001, six pages. |
Non-Final Office Action dated Jun. 1, 2007, for U.S. Appl. No. 10/899,648, filed Jul. 26, 2004, seven pages. |
Non-Final Office Action dated Jun. 14, 2011, for U.S. Appl. No. 12/278,740, filed Dec. 10, 2008, fifteen pages. |
Non-Final Office Action dated Jun. 20, 2008, for U.S. Appl. No. 11/009,398, filed Dec. 10, 2004, fifteen pages. |
Non-Final Office Action dated Jun. 23, 2010, for U.S. Appl. No. 11/009,965, filed Dec. 10, 2004, fourteen pages. |
Non-Final Office Action dated Jun. 9, 2011, for U.S. Appl. No. 11/830,323, filed Jul. 30, 2007, five pages. |
Non-Final Office Action dated May 18, 2004, for U.S. Appl. No. 10/050,601, filed Jan. 15, 2002, eight pages. |
Non-Final Office Action dated Nov. 23, 2009, for U.S. Appl. No. 11/969,974, filed Jan. 7, 2008, seven pages. |
Non-Final Office Action dated Sep. 27, 2016, for U.S. Appl. No. 14/140,370, filed Dec. 24, 2013, nine pages. |
Non-Final Office Action dated Sep. 12, 2014, for U.S. Appl. No. 14/154,177, filed Jan. 13, 2014, four pages. |
Non-Final Office Action with Restriction Requirement dated Mar. 4, 2011, for U.S. Appl. No. 11/830,323, filed Jul. 30, 2007, nine pages. |
Notice of Allowability dated Jan. 2, 2008, for U.S. Appl. No. 10/899,648, filed Jul. 26, 2004, three pages. |
Notice of Allowance dated Apr. 7, 2004, for U.S. Appl. No. 09/905,642, filed Jul. 13, 2001, six pages. |
Notice of Allowance dated Aug. 26, 2004, for U.S. Appl. No. 10/050,601, filed Jan. 15, 2002, four pages. |
Notice of Allowance dated Aug. 6, 2015, for U.S. Appl. No. 13/853,656, filed Mar. 29, 2013, seven pages. |
Notice of Allowance dated Feb. 25, 2010, for U.S. Appl. No. 11/969,974, filed Jan. 7, 2008, four pages. |
Notice of Allowance dated Mar. 28, 2016, for U.S. Appl. No. 13/853,656, filed Mar. 29, 2013, eight pages. |
Notice of Allowance dated Nov. 23, 2015, for U.S. Appl. No. 13/853,656, filed Mar. 29, 2013, seven pages. |
Notice of Allowance dated Oct. 10, 2014, for U.S. Appl. No. 12/761,462, filed Apr. 16, 2010, ten pages. |
Notice of Allowance dated Oct. 5, 2007, for U.S. Appl. No. 10/899,648, filed Jul. 26, 2004, six pages. |
Notice of Allowance dated Sep. 10, 2013, for U.S. Appl. No. 11/412,715, filed Apr. 26, 2006, eight pages. |
Notice of Allowance dated Sep. 14, 2012, for U.S. Appl. No. 11/830,323, filed Jul. 30, 2007, eight pages. |
Supplementary European Search Report dated Jan. 24, 2012 for EP Application No. 07 785 001.4, filed on Jul. 30, 2007, seven pages. |
Supplemental European Search Report dated Oct. 9, 2013, for European Patent Application No. 06721854.5 filed May 4, 2005, six pages. |
Supplemental Notice of Allowability dated Mar. 10, 2005, for U.S. Appl. No. 10/050,601, filed Jan. 15, 2002, five pages. |
Written Opinion of the International Searching Authority dated Aug. 3, 2006, for International Application No. PCT/CA2006/000669 filed Apr. 27, 2006, eight pages. |
Written Opinion of the International Searching Authority dated Dec. 7, 2007, for International Application No. PCT/CA2007/001335 filed Jul. 30, 2007, four pages. |
Non-Final Office Action dated Jul. 5, 2016, for U.S. Appl. No. 14/975,707, filed Dec. 18, 2015, four pages. |
Canadian Office Action dated Nov. 29, 2016, for Canadian Patent Application No. 2,896,267, filed Sep. 22, 2014, four pages. |
Final Office Action dated Oct. 6, 2016, for U.S. Appl. No. 14/629,473, filed Feb. 23, 2015, seventeen pages. |
U.S. Appl. No. 15/073,259, filed Mar. 17, 2016, thirty nine pages. |
Schott AG's Catalog for Optical Glasses. (Feb. 2016). “As the source of the Schott Table.” |
Canadian Office Action dated Jan. 4, 2017 for Canadian Application No. 2,911,861, filed on May 15, 2014, four pages. |
Chinese Office Action dated Aug. 21, 2017, for Chinese Application No. 201480027537.5, filed on May 15, 2014, seventeen pages. |
Chinese Second Office Action dated Mar. 9, 2017, for Chinese Application No. 201380073686.0, filed on Dec. 24, 2013, nineteen pages. |
European Communication pursuant to Rules 70(2) and 70a(2) EPC dated Aug. 4, 2016 for European Application No. 13871081.9 filed on Dec. 24, 2013, one page. |
European Extended Search Report dated Dec. 16, 2016 for EP Application No. 14810752.7 filed on Sep. 8, 2016, eight pages. |
Extended European Search Report dated Jul. 18, 2016 for European Application No. 13871081.9 filed on Dec. 24, 2013, seven pages. |
Japanese Notice of Allowance dated Sep. 4, 2017 for Japanese Patent Application No. 2016-513460, filed on May 15, 2014, six pages. |
Japanese Office Action dated Dec. 5, 2016 for Japanese Patent Application No. 2016-513460, filed on May 15, 2014, nine pages. |
Japanese Office Action dated Mar. 3, 2017 for Japanese Patent Application No. 2015-550160, filed on Jun. 24, 2015, nine pages. |
Korean Final Office Action dated Feb. 13, 2017, for Korean Patent Application No. 2015-7019659, filed on Jul. 20, 2015, twelve pages. |
Korean Notice of Allowance dated Aug. 14, 2017, for Korean Application No. 10-2015-7035138, filed on Dec. 10, 2015, three pages. |
Korean Office Action dated Jan. 6, 2017 for Korean Patent Application No. 10-2015-7035138 filed on Dec. 10, 2015, ten pages. |
Russian Office Action dated Dec. 29, 2016, for Russian Application No. 2015124802, filed on Dec. 24, 2013, thirteen pages. |
U.S. Final Office Action dated Jan. 23, 2017 for U.S. Appl. No. 14/140,370, filed Dec. 24, 2013, twelve pages. |
U.S. Non Final Office Action dated Dec. 22, 2016, for U.S. Appl. No. 13/585,824, filed Aug. 14, 2012, eight pages. |
U.S. Non-Final Office Action dated Oct. 3, 2017, for U.S. Appl. No. 15/073,259, filed Mar. 17, 2016, nine pages. |
U.S. Notice of Allowance dated Aug. 23, 2017, for U.S. Appl. No. 13/585,824, filed Aug. 14, 2012, five pages. |
U.S. Notice of Allowance dated Jul. 28, 2017, for U.S. Appl. No. 14/629,473, filed Feb. 23, 2015, ten pages. |
U.S. Notice of Allowance dated Jan. 5, 2017, for U.S. Appl. No. 14/278,833, filed May 15, 2014, seven pages. |
U.S. Notice of Allowance dated Sep. 15, 2017, for U.S. Appl. No. 13/585,824, filed Aug. 14, 2012, five pages. |
Canadian Office Action dated Feb. 5, 2018 for Canadian Patent Application No. 2,911,861 filed on Nov. 16, 2015, four pages. |
Chinese Third Office Action dated Mar. 8, 2018 for Chinese Application No. 201480027537.5 filed May 15, 2014, thirteen pages. |
U.S. Notice of Allowance dated Jan. 9, 2018, for U.S. Appl. No. 14/629,473, filed Feb. 23, 2015, nine pages. |
U.S. Appl. No. 15/844,206, filed Dec. 15, 2017. |
Number | Date | Country |
---|---|---|
20160270640 A1 | Sep 2016 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 14154177 | Jan 2014 | US |
Child | 14975707 | | US |
Parent | 11412715 | Apr 2006 | US |
Child | 14154177 | | US |
Parent | 10050601 | Jan 2002 | US |
Child | 11009965 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 11122267 | May 2005 | US |
Child | 11412715 | | US |
Parent | 11009965 | Dec 2004 | US |
Child | 11122267 | | US |