The invention is directed to medical imaging, and in particular to a system and method for obtaining visible light images and near-infrared light images from an area under observation, such as living tissue, particularly for use in endoscopy.
Near-infrared (NIR) imaging has been described in the literature for various clinical applications. Typically such an imaging modality utilizes a contrast agent (e.g. indocyanine green) that absorbs and/or fluoresces in the NIR. Such contrast agents may be conjugated to targeting molecules (e.g. antibodies) for disease detection. The contrast agents may be introduced into tissue intravenously or subcutaneously to image tissue structure and function (e.g. flow of blood/lymph/bile in vessels) that is not easily seen with standard visible light imaging technology.
Independently of the clinical application, endoscopic NIR imaging devices typically include multiple imaging modes as a practical feature. For example, endoscopists rely on a visible-spectrum color image for both visualization and navigation, so an endoscopic imaging device that offers NIR imaging typically also provides a concurrent color image. Such concurrent imaging devices can be realized, for example, as follows:
It would therefore be desirable to provide a system and a method for simultaneous acquisition of full-color visible light and NIR light images, which obviates the aforementioned disadvantages and does not compromise image resolution and/or introduce objectionable motion artifacts.
According to one aspect of the invention, a method for acquisition of NIR images and full-color images includes the steps of illuminating an area under observation with continuous blue/green light, and illuminating the area under observation with red light and NIR light, wherein at least one of the red light and NIR light is switched on and off periodically. The blue, green, red and NIR light returning from the area under observation is directed to one or more sensors which are configured to separately detect the blue light, the green light, and the combined red light/NIR light. The red light spectral component and the NIR light spectral component are determined separately from image signals of the combined red light/NIR light, in synchronism with the switched red and NIR light. A full-color reflectance image of the area under observation is rendered and displayed from the blue, green, and red light, and an NIR image is likewise rendered and displayed from the NIR light.
According to another aspect of the invention, an imaging system for acquisition of NIR and full-color images includes a light source providing visible light and NIR light to an area under observation, a camera having one or more image sensors configured to separately detect blue and green light, and combined red and NIR light, returned from the area under observation, and a controller in signal communication with the light source and the camera. The controller is configured to control the light source to continuously illuminate the area under observation with blue/green light and to illuminate the area under observation with red light and NIR light, wherein at least one of the red light and NIR light is switched on and off periodically in synchronism with the acquisition of the red and NIR images in the camera.
The controller is further configured to determine, from sensor signals representing the combined red light and NIR light, the red light spectral component and the NIR light spectral component separately. The imaging system further includes a display that receives image signals corresponding to the blue light, the green light, and the separately determined red light spectral component and renders therefrom a full-color visible light image of the area under observation. The display also receives the separately determined NIR light spectral component and renders therefrom an NIR image of the area under observation.
The video imaging system may use a three-sensor color camera configured to continuously image the blue and green wavebands and intermittently image the red waveband, thus providing continuous, high-quality luma information and sufficiently continuous, complete chroma information to produce high-quality video images of the area under observation, such as living tissue. In such a configuration, the red image sensor can be time-multiplexed to acquire both red and NIR images (i.e. the red image sensor alternately, and in rapid succession, images both red light for the color information required for the color image and NIR light for the image information required for the NIR image). Such time-multiplexing may be coupled to (and synchronized with) the illumination source that provides the NIR illumination (excitation for fluorescence) and the red light for color imaging. Image processing is then used to separate and process the resulting image signals appropriately.
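By way of illustration only, and not as the claimed implementation, the following sketch shows one way frames read from a time-multiplexed red/NIR sensor could be demultiplexed in software. The repeating 3:1 red/NIR illumination pattern, the function name, and the use of NumPy arrays are assumptions made for this example.

```python
# Minimal sketch of time-multiplexed red/NIR demultiplexing.
# Assumptions (not from the specification): frames arrive as NumPy arrays,
# and the illumination pattern is a fixed, repeating sequence such as
# ("red", "red", "red", "nir") that is known to the controller.
import numpy as np

def demultiplex(frames, pattern=("red", "red", "red", "nir")):
    """Split the red-sensor frame stream into red and NIR streams.

    `frames` is an iterable of 2-D arrays read from the red image sensor;
    the n-th frame was exposed under pattern[n % len(pattern)] illumination.
    """
    red_frames, nir_frames = [], []
    for n, frame in enumerate(frames):
        if pattern[n % len(pattern)] == "red":
            red_frames.append(frame)
        else:
            nir_frames.append(frame)
    return red_frames, nir_frames

# Example: 8 dummy 4x4 frames captured under the assumed 3:1 red/NIR pattern.
frames = [np.full((4, 4), n, dtype=np.uint16) for n in range(8)]
red, nir = demultiplex(frames)
print(len(red), len(nir))  # -> 6 2
```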
Embodiments of the invention may include one or more of the following features. The area under observation may be alternatingly illuminated with red light and NIR light, wherein the duration of illumination with red light may be different from, and is preferably longer than, the duration of illumination with NIR light. The illumination may be switched at video field or frame rates.
Fields captured by the image sensor and lacking the red light spectral component or the NIR light spectral component may be interpolated from temporally adjacent image fields that include a corresponding red light spectral component or NIR light spectral component. In one embodiment, the NIR light spectral component obtained in the absence of red light may be subtracted from the combined red light/NIR light to obtain the separate red light spectral component. This is advantageous in particular when the detected NIR signal has an intensity comparable to that of the red signal.
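A minimal sketch of the subtraction and temporal interpolation described above, assuming linear, co-registered sensor signals and an NIR-only field available adjacent to each combined red+NIR field; the function names and the NumPy representation are illustrative assumptions, not part of the specification.

```python
import numpy as np

def separate_red(combined_red_nir, nir_only):
    """Estimate the red component by subtracting an adjacent NIR-only field
    from the combined red+NIR field (assumes linear, co-registered signals)."""
    red = combined_red_nir.astype(np.int32) - nir_only.astype(np.int32)
    return np.clip(red, 0, None).astype(combined_red_nir.dtype)

def interpolate_missing(prev_field, next_field):
    """Fill a field lacking a given spectral component by averaging the
    temporally adjacent fields that contain it (simple linear interpolation)."""
    avg = (prev_field.astype(np.uint32) + next_field.astype(np.uint32)) // 2
    return avg.astype(prev_field.dtype)

# Toy example with 2x2 fields.
combined = np.array([[120, 90], [60, 200]], dtype=np.uint16)
nir = np.array([[20, 10], [5, 50]], dtype=np.uint16)
print(separate_red(combined, nir))  # -> [[100  80] [ 55 150]]
```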
In one embodiment, the light source may include an illuminator emitting a substantially constant intensity of visible light and NIR light over a continuous spectral range, and a plurality of movable filters disposed between the illuminator and the area under observation for transmitting temporally continuous blue/green light and temporally discontinuous red light and NIR light.
In another embodiment, the light source may include an illuminator emitting a substantially constant intensity of visible light and NIR light over a continuous spectral range, first dichroic means for separating the visible light and NIR light into blue/green and red light and NIR light, shutter means for transforming the separated red light and NIR light into temporally discontinuous red light and discontinuous NIR light, and second dichroic means for combining the blue/green light, the temporally discontinuous red light and the temporally discontinuous NIR light for transmission to the area under observation.
In yet another embodiment, the light source may include a first illuminator emitting a substantially constant intensity of green and blue light, a second illuminator producing switched red light, a third illuminator producing switched NIR excitation light, and dichroic means for combining the switched red light and the switched NIR light with the green and blue light for transmission to the area under observation. The switched red light and the NIR light may be produced by interrupting a continuous intensity light beam of the red light and the NIR light by a shutter or chopper. Alternatively, the switched red light and the NIR light may be produced by electrically switching the second illuminator and the third illuminator on and off.
The image sensors may employ an interlaced scan or a progressive scan.
The imaging system may include an endoscope.
The following figures depict certain illustrative embodiments of the invention which are to be understood as illustrative of the invention and not as limiting in any way.
FIGS. 2a-2d show various exemplary embodiments of a multimode light source to be used with the endoscopic system of FIG. 1;
FIG. 3a shows an exemplary dichroic prism employed by a 3-sensor color camera;
FIG. 3b shows the optical transmission ranges for the spectral components separated by the dichroic prism of FIG. 3a;
FIG. 3c shows the optical transmission range of a notch filter that blocks excitation light from entering the camera;
Color video images are generally obtained with three-sensor color cameras where separate red, green and blue image sensors provide simultaneous contiguous arrays of red, green and blue pixel information. Full color video images are generated by combining the image information from all three sensors. Color fidelity (i.e. a true color rendition) is extremely important in medical imaging applications and all three sensors are used to provide complete color information.
To understand the relative importance of color and spatial information in video images of human tissue, however, it is useful to consider information in such video images in terms of luma and chroma. Luma refers to the brightness information in the image and it is this information that provides the spatial detail that enables the viewer to recognize shapes. The spatial and temporal resolution of luma is consequently crucial to the perception of video image quality. Chroma refers to the color information in the video image. It is a property of human vision that fine detail variations in the chroma of image features are not easily perceived and that such variations are consequently less critical than fine detail variations in luma, in an overall assessment of image quality. It is for this reason that video encoding of chroma information is often sub-sampled.
In video images of human tissue obtained with visible light, the structural details of the tissue are largely contained in the blue and green wavelength regions of the imaged light. Blue and green light tends to be reflected from the tissue surface, whereas red light tends to be highly scattered within the tissue. As a consequence, there is very little fine structural detail in the red light that reaches the red image sensor. It is also known from color science that human vision receives most of the spatial information from the green portion of the visible spectrum—i.e. green light information contributes disproportionately to the luma. The standard formula for calculating luma from gamma-corrected color components is Y′=0.2126 R′+0.7152 G′+0.0722 B′. For this reason, spatial and/or temporal interpolation of the red component of video images of human tissue does not significantly affect perception of fine detail in those images.
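For illustration, the Rec. 709 luma calculation quoted above can be expressed in a few lines of code; the small example below merely shows how strongly the green component dominates the luma relative to the red component.

```python
def luma_rec709(r_prime, g_prime, b_prime):
    """Rec. 709 luma from gamma-corrected (primed) components, as in the text."""
    return 0.2126 * r_prime + 0.7152 * g_prime + 0.0722 * b_prime

# Green dominates: a full-scale green pixel contributes ~72% of full luma,
# whereas a full-scale red pixel contributes only ~21%.
print(luma_rec709(0.0, 1.0, 0.0))  # 0.7152
print(luma_rec709(1.0, 0.0, 0.0))  # 0.2126
```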
Similarly to red light, NIR light tends to be scattered in tissue, causing NIR image features to be diffusely, rather than sharply, defined. Furthermore, because the NIR image highlights areas of interest (i.e. the areas in which the contrast agent is localized) but does not provide overall visualization or navigational information, it is desirable for an NIR endoscopic imaging device to provide a continuous color image and either a superimposed or a side-by-side display of the NIR image information. In such a display the NIR light would also contribute less to the spatial information presented to the observer.
FIGS. 2a-2d show schematic diagrams of exemplary embodiments of various light sources 11. The illustrated light sources are constructed to supply, in normal color imaging mode, visible illumination light with a substantially continuous spectral distribution. The light source may be an arc lamp, a halogen lamp, one or more solid state sources (e.g. LEDs, semiconductor lasers), or any combination thereof, and may be spectrally filtered or shaped (e.g. with bandpass filters, IR filters, etc.). The continuous spectrum may be produced as primary colors (RGB) either concurrently or sequentially, for example, using a rotating filter wheel.
Light sources to be used with the system of the invention, and described in detail below, are configured to provide continuous, uninterrupted illumination in the blue and green parts of the visible spectrum and discontinuous red and/or NIR light. The blue and green parts of the visible spectrum may be optically filtered from the emission produced by a continuous source or produced directly by narrow-band sources (e.g. blue and green LEDs). The red and NIR light may also be produced by an arc lamp, a halogen lamp, a solid state source (e.g. red and NIR LEDs or lasers), or any combination thereof.
Turning now to FIG. 2a, an exemplary light source 11a is schematically illustrated.
Another embodiment of a light source 11b is schematically illustrated in FIG. 2b.
Another embodiment of a light source 11c is schematically illustrated in FIG. 2c.
Yet another embodiment of a light source 11d is schematically illustrated in FIG. 2d.
The alternating red and NIR illumination is synchronized with the image acquisition of the three-sensor camera such that red and NIR images are acquired by the camera synchronously with the red and NIR illumination of the endoscope.
FIG. 3a shows in more detail the three-sensor camera 13 of FIG. 1.
In all the figures, the term “IR” is used instead of or interchangeably with “NIR.”
Once the color and NIR image data have been processed, the signal is output to a video monitor and may be displayed as two separate, simultaneous views (one color and one fluorescence) or as a combined color and fluorescence image (e.g. by assigning the fluorescence signal a color that contrasts with the naturally occurring colors in the tissue).
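As a hypothetical illustration of the combined display option, the sketch below blends a pseudo-colored fluorescence signal onto the color image. The choice of a green overlay color, the threshold value, and the simple alpha blend are assumptions made for the example, not features recited in the specification.

```python
import numpy as np

def overlay_fluorescence(color_rgb, nir_image, color=(0, 255, 0), threshold=30):
    """Blend a pseudo-colored fluorescence signal onto the color image.
    The green overlay color and the linear alpha blend are illustrative
    choices; any color contrasting with natural tissue colors could be used."""
    # Alpha grows with fluorescence intensity above the (assumed) threshold.
    alpha = np.clip((nir_image.astype(np.float32) - threshold) / 255.0, 0.0, 1.0)
    overlay = np.zeros_like(color_rgb, dtype=np.float32)
    overlay[..., 0], overlay[..., 1], overlay[..., 2] = color
    blended = (1.0 - alpha[..., None]) * color_rgb + alpha[..., None] * overlay
    return blended.astype(np.uint8)

# Toy example: 2x2 color frame with a bright fluorescence pixel at (0, 0).
color = np.full((2, 2, 3), 180, dtype=np.uint8)
nir = np.array([[250, 0], [0, 0]], dtype=np.uint8)
print(overlay_fluorescence(color, nir)[0, 0])  # strongly green-tinted pixel
```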
In yet another exemplary embodiment (not illustrated in the drawings), the green/blue illumination as well as the red illumination are continuous, whereas the NIR illumination is modulated. This timing scheme is best applied if the red and NIR image signals have approximately the same magnitude. In this embodiment, the light source provides uninterrupted illumination with the full visible spectrum and intermittent illumination with NIR light. The timing diagram is essentially the same as that depicted in
In any of the aforementioned embodiments, the NIR endoscopic imaging system can also be operated such that the light source provides continuous illumination with either the full visible spectrum or the NIR spectrum and the camera acquires the corresponding color image or NIR (absorbance or fluorescence) image in a continuous fashion to provide high spatial resolution. The resulting video image of either individual illumination/imaging mode (color or NIR) can subsequently be displayed and/or recorded.
By implementing color and NIR imaging as described in the aforementioned embodiments, it is possible to acquire and display full-color visible light and NIR light images at video rates without compromising image resolution or introducing objectionable motion artifacts. Furthermore, should any residual color fringing occur as a consequence of sharp edges moving rapidly across the visual field (e.g. with the discontinuous acquisition of red or NIR images), this relatively minor effect can be mitigated by temporal interpolation of the missing (red/NIR) video fields with minimal additional processing time.
While the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. For example, instead of using separate image sensors for G/B and R/NIR, or a single color sensor for RGB images and NIR fluorescence images, a single direct three-color RGB image sensor with a stacked pixel design, implemented in CMOS technology and commercially available from Foveon, Inc., San Jose, Calif., may be used. Such a sensor is schematically illustrated in
While the invention has been illustrated and described in connection with currently preferred embodiments shown and described in detail, it is not intended to be limited to the details shown since various modifications and structural changes may be made without departing in any way from the spirit and scope of the present invention. The embodiments were chosen and described in order to explain the principles of the invention and practical application to thereby enable a person skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.
What is claimed as new and desired to be protected by Letters Patent is set forth in the appended claims and includes equivalents of the elements recited therein:
This application is filed under 35 U.S.C. §371 as a U.S. national phase application of PCT/US2009/037506, designating the United States and having an international filing date of Mar. 18, 2009, which claims the benefit of U.S. provisional patent application No. 61/037,514, filed on Mar. 18, 2008, the contents of which are incorporated herein by reference in their entirety.
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/US2009/037506 | Mar. 18, 2009 | WO | 00 | Nov. 24, 2010

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO 2009/117483 | Sep. 24, 2009 | WO | A
Number | Date | Country
---|---|---
2011/0063427 A1 | Mar. 2011 | US

Number | Date | Country
---|---|---
61/037,514 | Mar. 2008 | US