Endoscopic fluorescence imaging

Information

  • Patent Grant
  • 11832797
  • Patent Number
    11,832,797
  • Date Filed
    Monday, March 25, 2019
  • Date Issued
    Tuesday, December 5, 2023
Abstract
Systems and methods are configured for combined fluorescence imaging and white light imaging of tissue such as during surgical endoscopic procedures. A chip-on-tip type endoscope can be equipped with both white light and blue light LEDs. A single camera or dual cameras are configured with backside illuminated CMOS image sensor(s) to receive and process the white light and fluorescence images. The light sources, image sensors and image processing circuitry are configured to synchronously emit light and record pixels for visible white light and fluorescence frames alternately. Global or quasi-global shuttering can be used on the image sensor(s). A modified color filter array and other filters can be provided to enhance fluorescence imaging capabilities. Insufflating gas clears debris from the camera field of view and aids in moving the cannula distally through a body passageway.
Description
FIELD

This patent specification generally relates to a medical device for use in tissue examinations. More particularly, some embodiments relate to devices and methods for fluorescence imaging in medical applications such as visually detecting tissues such as tumors, nerves, and vessels during surgical procedures.


BACKGROUND

Endoscopic fluorescence imaging systems can be used to detect tissue such as tumors and vessels during surgical procedures. Infrared dyes can be used as tagging dyes for marking tissue. Some endoscopic fluorescence imaging systems are capable of acquiring high resolution images in the normal white light visible spectrum, while simultaneously acquiring and overlaying the infrared signal on top of normal visible spectrum images in order to provide a contrast to a surgeon while operating. Some systems are designed to detect unbound intravascular indocyanine green (ICG), an FDA approved NIR (near infrared) fluorescent dye. ICG is typically intravenously administered in high doses, and imaging is performed 30-60 minutes after injection. The intravascular fluorescent load achieved with this approach is high, and approved clinical imaging devices have adequate sensitivity for these applications. Examples of such systems include a fluorescent module incorporated into operating microscopes (OPMI Pentero Infrared 800, Carl Zeiss), as well as the SPY® and Pinpoint® systems (Novadaq), and the FluoBeam® 800 (Fluoptics) hand-held unit. While these systems may have adequate sensitivity for intravascular imaging, they may lack practical use in other applications such as targeted tumor-specific NIR fluorescence due to low sensitivity. In simultaneous visible and NIR capture imaging systems, one camera captures the image in the visible spectrum and a second camera captures the fluorescent image. This is achieved by splitting the incident light from the field into two channels using a beam-splitter. One beam transmits the NIR fluorescent light to one of the cameras, while the other beam of visible light passes through the beam splitter into the second camera. See, e.g., U.S. Pat. No. 8,961,403 B2.


U.S. Pat. No. 9,407,838 discusses a system suitable for lab analysis or high-end surgical applications that can simultaneously record a visible light image and an infrared light image from fluorescent dye. The discussed system simultaneously uses a laser for emitting excitation light for an infrared or near-infrared fluorophore and a visible light source. The discussed system also includes a notch beam splitter, a notch filter, a synchronization module, an image sensor for simultaneously detecting both visible light and infrared light, an image processing unit for image subtraction after capture, an image displaying unit, and light-conducting channels.


The subject matter described or claimed in this patent specification is not limited to embodiments that solve any specific disadvantages or that operate only in environments such as those described above. Rather, the above background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.


SUMMARY

According to some embodiments, a single-use, disposable cannula for multi-band imaging of a patient's bladder has a distal tip and comprises: a light source configured to illuminate said bladder with white light during first time intervals and with non-white excitation light causing fluorescence from selected tissue during second time intervals time-interleaved with said first time intervals; and an imaging structure configured to receive both said fluorescence and reflections of said white light from the bladder; wherein said imaging structure comprises a single, multi-pixel, backside-illuminated, two-dimensional light sensor array and a readout circuit electrically and physically integrated therewith into a circuit stack; wherein said imaging structure is configured to respond to said reflections to generate a white light image of the bladder and to said fluorescence to produce a fluorescence image of the bladder; and wherein the white light and fluorescence images are both spatially and temporally registered with each other.


According to some embodiments: the imaging structure includes respective color filters for pixels of said sensor array, causing some of said pixels to receive said fluorescence and others to receive said reflections; the cannula includes a global shutter configured to essentially concurrently read the pixels of said sensor array; the cannula includes a quasi-global shutter configured to essentially concurrently read only one or more selected subsets of the pixels of said sensor array; an agent introduced in the bladder preferentially induces cancerous tissue to emit said fluorescence; said second time intervals are longer than said first time intervals; the excitation light has a wavelength in the range 350-450 nanometers; the fluorescence has a wavelength in the range of 580-720 nanometers; and/or the sensor array receives only the reflected white light during said first time intervals and only the fluorescence during said second time intervals.


According to some embodiments, the cannula is combined with a handle to which the cannula is releasably secured through electrical and mechanical couplers, attached tool-free by hand and detached tool-free by hand, and said handle may further comprise a physically integral video screen coupled with said sensor array to receive and display said white light and fluorescence images.


According to some embodiments: the cannula further includes a shield keeping said excitation light from said sensor array; said white light and excitation light illuminate the same field of illumination in the bladder; and/or said imaging structure comprises a spatially repeating pattern of red, green, and blue pixel filters through which light from the bladder passes to reach said sensor array.


According to some embodiments, the cannula is enclosed in a sterile packaging and is a single-use, disposable unit.


According to some embodiments: the cannula further includes a source of a second, different excitation light causing fluorescence from the bladder during third time intervals interleaved in time with said first and second time intervals, and said imaging structure is configured to receive fluorescence from the bladder during the second and third time intervals and produce spatially registered white light and two fluorescence images.


According to some embodiments, the cannula comprises a distal portion at which said imaging structure is situated and a proximal portion, and further comprises an inlet port that is at the proximal portion of the cannula and is configured to receive insufflating gas, an outlet port that is at the distal portion of the cannula, and a gas conduit between said proximal and distal ports, wherein said distal port is configured to direct gas delivered thereto through said conduit in a flow over said imaging structure that clears a field of view of the imaging structure.


According to some embodiments, said distal port is configured to extend said gas flow in a distal direction to aid in moving said distal portion of the cannula through a patient's body passageway by helping dilate the passageway; said distal portion of the cannula comprises an outer shell that surrounds said imaging structure and said gas conduit comprises spacing between the outer shell and the imaging structure; said spacing radially surrounds the entire imaging structure; said spacing is only at one or more arcs around the imaging structure; said outer shell extends distally further than said imaging structure and comprises a lip bent radially inwardly to direct said gas flow over the imaging structure; and/or said lip extends over only a part of the circumference of said outer shell.


According to some embodiments, the cannula further includes a manually adjustable gas flow control having an input for receiving gas from a source and an output connected for gas flow to said input port; and/or further includes a pressure gauge showing gas pressure fed to said input port. The source of the insufflating gas can be a conventional threaded CO2 cartridge that contains less than 10 grams of carbon dioxide, thus helping keep an entire set of a cannula, handle and gas source easily hand-held and portable.


According to some embodiments, an endoscope comprises a single-use, disposable cannula for multi-band imaging of a patient's bladder and a reusable handle that is releasably connected to said cannula through electrical and mechanical connectors, wherein (a) said cannula has a distal tip that comprises: a light source configured to illuminate said bladder with white light during first time intervals and with non-white excitation light causing fluorescence from selected tissue during second time intervals time-interleaved with said first time intervals; and an imaging structure configured to receive both said fluorescence and reflections of said white light from the bladder and in response to produce a white light image of the bladder and a fluorescence image of the bladder that are both spatially and temporally registered with each other; an input port for insufflating gas, a distal outlet port for said gas at a distal portion of the cannula, and a gas conduit connecting the input and distal ports; wherein said distal port is configured to direct a flow of said gas over a field of view of said imaging structure to clear debris from said field of view; and (b) said handle comprises an integral display screen configured to display images taken with said imaging structure.


According to some embodiments, in said endoscope: said distal port comprises an outer shell that at least partly surrounds said imaging structure and directs said gas over a distal face of the imaging structure; said conduit comprises space between the outer shell and the imaging structure; and/or said distal port is further configured to direct gas flow distally of said distal tip of the cannula to aid in moving said distal tip through a patient's body passageway.


According to some embodiments, the endoscope further includes a gas flow controller having an input for gas and an output feeding gas to said distal port of the cannula; and/or said imaging structure comprises an imaging sensor and a processor in a stacked arrangement.


According to some embodiments, a method of imaging a patient's inner bladder wall comprises: illuminating said inner bladder wall with white light during first time intervals and with non-white excitation light causing fluorescence from selected tissue during second time intervals time-interleaved with said first time intervals; receiving both said fluorescence and reflections of said white light from the bladder wall at an imaging structure that is at a tip of a cannula, and in response producing a white light image of the bladder wall and a fluorescence image of the bladder wall that are both spatially and temporally registered with each other; and supplying insufflating gas at a distal port of said cannula to create a gas flow over a field of view of the imaging structure that aids in clearing debris from said field of view.


According to some embodiments, said supplying step further comprises creating a gas flow that extends distally of said distal tip of the cannula to aid in moving said distal tip through a patient's body passageway.


As used herein, the grammatical conjunctions “and”, “or” and “and/or” are all intended to indicate that one or more of the cases, objects or subjects they connect may occur or be present. In this way, as used herein the term “or” in all cases indicates an “inclusive or” meaning rather than an “exclusive or” meaning.


As used herein the terms “surgical” or “surgery” refer to any physical intervention on a patient's tissues and do not necessarily involve cutting a patient's tissues or closure of a previously sustained wound.





BRIEF DESCRIPTION OF THE DRAWINGS

To further clarify the above and other advantages and features of the subject matter of this patent specification, specific examples of embodiments thereof are illustrated in the appended drawings. It should be appreciated that these drawings depict only illustrative embodiments and are therefore not to be considered limiting of the scope of this patent specification or the appended claims. The subject matter hereof will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:



FIGS. 1A and 1B are diagrams illustrating aspects of a conventional CMOS image sensor;



FIG. 2 is a diagram showing aspects of a distal tip assembly of a multi-color band endoscope, according to some embodiments;



FIGS. 3A and 3B are diagrams illustrating aspects of a CMOS image sensor system having improved visualization of tissue using combined fluorescence and white light endoscopy, according to some embodiments;



FIGS. 4A and 4B are schematic diagrams illustrating aspects of a stacked CMOS image sensor system having improved visualization of tissue using combined fluorescence and white light endoscopy, according to some embodiments;



FIG. 5 is a diagram illustrating aspects of a filter configured to enhance capture of fluorescence light during combined fluorescence and white light endoscopy imaging, according to some embodiments;



FIGS. 6A and 6B are charts illustrating some aspects of the timing of illuminating light sources and sensor exposures for combined fluorescence and white light endoscopy imaging, according to some embodiments;



FIG. 7 is a cross sectional diagram illustrating aspects of a camera module configured for fluorescence endoscopy, according to some embodiments;



FIG. 8 is a diagram showing aspects of a distal tip assembly having dual camera modules, according to some other embodiments;



FIG. 9 is a perspective view of a handheld multi-color band endoscope, according to some embodiments;



FIG. 10 is a block diagram illustrating aspects of a multi-color band endoscope, according to some embodiments;



FIGS. 11 and 12 are diagrams of a portable endoscope with disposable cannula, according to some embodiments;



FIG. 13 is a perspective view illustrating aspects of a distal tip of an endoscopic cannula configured for using gas to insufflate a target organ, according to some embodiments;



FIGS. 14A and 14B are perspective and cross section diagrams illustrating further aspects of a distal tip of an endoscopic cannula configured for using gas to insufflate a target organ, according to some embodiments;



FIG. 15 is a perspective view illustrating further aspects of a distal tip of an endoscopic cannula configured for using gas to insufflate a target organ, according to some embodiments; and



FIG. 16 is a diagram showing a handheld surgical endoscope configured for using gas to insufflate a target organ being inserted in a tissue passageway, according to some embodiments.





DETAILED DESCRIPTION

A detailed description of examples of preferred embodiments is provided below. While several embodiments are described, the new subject matter described in this patent specification is not limited to any one embodiment or combination of embodiments described herein, but instead encompasses numerous alternatives, modifications, and equivalents. In addition, while numerous specific details are set forth in the following description in order to provide a thorough understanding, some embodiments can be practiced without some or all of these details. Moreover, for the purpose of clarity, certain technical material that is known in the related art has not been described in detail in order to avoid unnecessarily obscuring the new subject matter described herein. It should be clear that individual features of one or several of the specific embodiments described herein can be used in combination with features of other described embodiments or with other features. Further, like reference numbers and designations in the various drawings indicate like elements.



FIGS. 1A and 1B are diagrams illustrating aspects of a conventional CMOS image sensor. A color image is captured using a color filter array (CFA), where the sensor pixels are masked with red (R), green (G) and blue (B) filters. A 2×2 portion 100 of a common conventional CFA known as a Bayer filter is shown in FIG. 1A. As shown in FIG. 1B, each color band image is captured by a separate sub-group of pixels of the sensor pixel array. A full-color image, with intensities of all three primary colors represented by each pixel, is then synthesized by merging the separate sub-pixel groups using a de-mosaicing algorithm. As shown in FIG. 1A, the standard Bayer filter CFA has 2G 1R 1B or “RGBG”. With this arrangement, more pixels are allocated to G, because the human eye is most sensitive to green (around 550 nm) and most natural images have high green content.
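
The de-mosaicing step mentioned above can be illustrated with a short sketch. The following Python example is not part of the patent disclosure; it assumes a standard RGGB Bayer layout and plain bilinear interpolation, and reconstructs a full-color image from a single-channel raw mosaic.

```python
import numpy as np

def convolve2d_same(img, k):
    """Tiny 'same'-size 2D convolution so the sketch needs only numpy."""
    ph, pw = k.shape[0] // 2, k.shape[1] // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img)
    for dy in range(k.shape[0]):
        for dx in range(k.shape[1]):
            out += k[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def demosaic_bilinear(raw, pattern="RGGB"):
    """Minimal bilinear demosaic of a Bayer-mosaicked frame.

    raw     -- 2D float array of sensor values (one color sample per pixel)
    pattern -- assumed 2x2 CFA tiling; "RGGB" means raw[0,0]=R, raw[0,1]=G,
               raw[1,0]=G, raw[1,1]=B
    Returns an HxWx3 float image with all three colors at every pixel.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=float)
    masks = {c: np.zeros((h, w), dtype=bool) for c in "RGB"}
    layout = {"RGGB": [("R", 0, 0), ("G", 0, 1), ("G", 1, 0), ("B", 1, 1)]}[pattern]
    for color, dy, dx in layout:
        masks[color][dy::2, dx::2] = True
    # Spread each channel's sparse samples by normalized convolution; with this
    # kernel the result is exactly bilinear interpolation of the missing samples.
    kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
    for i, color in enumerate("RGB"):
        sparse = np.where(masks[color], raw, 0.0)
        num = convolve2d_same(sparse, kernel)
        den = convolve2d_same(masks[color].astype(float), kernel)
        rgb[..., i] = num / np.maximum(den, 1e-9)
    return rgb

if __name__ == "__main__":
    raw = np.random.rand(8, 8)           # stand-in for one mosaicked frame
    print(demosaic_bilinear(raw).shape)  # (8, 8, 3)
```

The same normalized-convolution approach carries over to the RGBR array of FIG. 3A discussed below; only the color masks change.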


In medical imaging, fluorescence techniques allow visualization of features that are invisible or are not easily visible under conventional white light. According to some embodiments, a video endoscope system is provided for fluorescence endoscopy. A protocol is also described to combine the fluorescence endoscopy with white light endoscopy. According to some embodiments, dual color band or multi-color band imaging systems are described herein that provide improved visualization of tissue using combined fluorescence and white light endoscopy.



FIG. 2 is a diagram showing aspects of a distal tip assembly of a multi-band or multi-color band endoscope, according to some embodiments. According to some embodiments, the endoscope 200 is disposable, or partially disposable after single-use. Endoscopes with disposable cannulas that include a distal tip with a camera are discussed in U.S. Patent Appl. Publ. No. 2016/0367119, and U.S. Pat. No. 9,895,048, which are incorporated herein by reference. Other examples of endoscopes with disposable cannulas are discussed in U.S. Pat. No. 9,622,646, also incorporated herein by reference. The distal tip assembly 204 includes a lens/sensor barrel 210 that includes a camera module having a color sensor with a color filter array, and electronics and circuitry as will be described in further detail, infra. Surrounding the lens/sensor barrel 210 are blue LEDs 232 and white LEDs 234. The blue LEDs 232 are configured to emit excitation light suitable for fluorescence endoscopy. In some examples, the blue LEDs 232 are configured to emit light at about 410 nm (violet-blue). The white LEDs 234 are configured to emit white light suitable for visible white light endoscopy. Below the lens/sensor barrel 210 is a port 220 that is configured to provide fluid (flowing either into or out of the patient) and/or provide an opening through which a tool or other device can pass (e.g. a needle). According to some embodiments, the port 220 is greater than 1.0 mm in its smallest dimension. According to some embodiments, the LEDs 232 and 234 are approximately 0.75 mm×0.25 mm. The outer wall 202, according to some embodiments, is at least 0.25 mm thick and is appropriately rounded to avoid any sharp edges. According to some embodiments, the lens/sensor barrel 210 and the LEDs 232 and 234 can be recessed so that a dome shaped cover can be placed so as to further ensure smoothness of the distal tip. According to some embodiments, the outer diameter of the tip assembly is 12 Fr to 15 Fr (4 mm to 5 mm), and preferably is no more than 6 mm as the outside diameter of outer wall 202. Note that although FIG. 2 shows a total of six LEDs (three white and three blue), in general, other numbers of LEDs may be provided according to factors such as desired lighting quality, endoscope size, and LED characteristics such as size and brightness. In some embodiments four or fewer LEDs can be provided and in some embodiments 10 or more LEDs can be provided. Furthermore, the number of white and blue LEDs does not have to be equal, but also will depend on various factors. Other light sources can be substituted, such as optic fibers that deliver light generated elsewhere.



FIGS. 3A and 3B are diagrams illustrating aspects of a CMOS image sensor system having improved visualization of tissue using combined fluorescence and white light endoscopy, according to some embodiments. In FIG. 3A a 2×2 portion 300 is shown of an RGBR color filter array. A color filter array having the shown pattern will have red resolution roughly 2 times greater than that of blue or green. As can be seen in FIG. 3B, when the RGBR color filter layer 302 is used, the sensor array 304 has enhanced sensitivity to fluorescence light coming from the tissues of interest during an endoscopic procedure. According to some embodiments, the sensor array 304 and RGBR color filter 302 are used in the camera module of endoscope 200 shown in FIG. 2.
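
To make the difference between the Bayer pattern of FIG. 1A and the RGBR pattern of FIG. 3A concrete, the following informal sketch tiles each 2×2 pattern over a sensor-sized array and compares the fraction of red-sampling pixels. The exact pixel positions within the RGBR tile are assumptions for illustration, not taken from the figures.

```python
import numpy as np

# Hypothetical 2x2 tiles; exact positions within each tile are assumptions.
BAYER_RGGB = np.array([["R", "G"],
                       ["G", "B"]])
RGBR       = np.array([["R", "G"],
                       ["B", "R"]])

def cfa_mask(tile, shape):
    """Tile a 2x2 CFA pattern over a sensor of the given (rows, cols) shape."""
    reps = (shape[0] + 1) // 2, (shape[1] + 1) // 2
    return np.tile(tile, reps)[:shape[0], :shape[1]]

if __name__ == "__main__":
    shape = (400, 400)                  # example array size from the description
    for name, tile in [("Bayer RGGB", BAYER_RGGB), ("RGBR", RGBR)]:
        mask = cfa_mask(tile, shape)
        frac_red = np.mean(mask == "R")
        print(f"{name}: red pixels = {frac_red:.0%} of the array")
    # Prints 25% for Bayer RGGB and 50% for the assumed RGBR tile, i.e. roughly
    # twice the red sampling density, matching the "roughly 2 times" statement.
```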


According to some embodiments, endoscope 200 can be used for differential imaging of a patient's bladder to determine the presence and characteristics of cancerous tissue. In this example of use, the color filter array 302 is configured to enhance sensitivity to pink-red light (about 610 nm wavelength) by selecting a pattern that preferentially passes pink-red to sensor array 304. The image provided by pink-red light preferentially shows cancerous tissue emitting fluorescent light in that color, typically due to the appropriate dye or other agent introduced in the patient at an appropriate time before imaging. Such dyes or other agents are known in the art of fluorescence medical imaging.


According to some embodiments, endoscope 200 can be used for differential imaging of a patient's nerves to assist in surgery where it may be desirable to identify nerve tissue so as to avoid damaging such tissue or to perform procedures on such tissue. In this example of use, the color filter array 302 is configured to enhance sensitivity to dark green light (about 510 nm wavelength) by selecting a pattern that preferentially passes dark green to sensor array 304. A filter such as in FIG. 3A but with a pattern RGBG can be used for the purpose. The image provided by dark green light preferentially shows nerve tissue emitting fluorescent light in that color, typically due to the appropriate dye or other agent introduced in the patient at an appropriate time before imaging. Such dyes or other agents are known in the art of fluorescence medical imaging.


According to some embodiments, light sources 234 can provide light other than white light, sources 232 can provide different color(s) excitation light, and color filter 302 can preferentially pass light in different color(s) to sensor 304, to thereby preferentially image selected tissue with light of a first wavelength range incident on sensor 304 and other selected tissue with light of a second wavelength range, different from the first range, incident on sensor 304.



FIGS. 3A and 3B illustrate one example of a filter array 302 but other examples of filter arrays that enhance sensing different colors at sensor 304 are possible. While using a white light image and a color image such as pink-red, or dark green are given as examples, other combinations of images may be useful for imaging patients' tissue, and fluorescent images in two or more colors may be useful. Typically, sensor array 304 is read in alternating or at least interspersed time intervals, for example alternating white image and color image time intervals. The resulting white and color images typically are shown composited, in geometric registration, so that the position of selected tissue such as cancerous tissue or nerve tissue can be visualized relative to surrounding anatomy, but separate white and color images can be displayed instead or in addition.
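
The compositing described above can be sketched informally as an alpha blend of a registered fluorescence frame onto the white-light frame. The example below is not from the patent; the threshold, highlight color, and opacity are illustrative choices only.

```python
import numpy as np

def composite(white_rgb, fluor_intensity, threshold=0.2,
              overlay_color=(1.0, 0.2, 0.6), alpha=0.6):
    """Blend a fluorescence frame onto a spatially registered white-light frame.

    white_rgb       -- HxWx3 white-light image, values in [0, 1]
    fluor_intensity -- HxW fluorescence image, values in [0, 1], already
                       registered pixel-for-pixel with white_rgb
    threshold       -- illustrative level above which tissue is "highlighted"
    overlay_color   -- illustrative pink-red highlight color
    alpha           -- maximum opacity of the highlight
    """
    weight = alpha * np.clip((fluor_intensity - threshold) / (1 - threshold), 0, 1)
    weight = weight[..., None]                       # HxWx1 for broadcasting
    overlay = np.array(overlay_color)[None, None, :]
    return (1 - weight) * white_rgb + weight * overlay

if __name__ == "__main__":
    white = np.random.rand(480, 640, 3)
    fluor = np.zeros((480, 640))
    fluor[200:240, 300:360] = 0.9                    # mock fluorescing region
    out = composite(white, fluor)
    print(out.shape, out.min() >= 0, out.max() <= 1)
```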



FIGS. 4A and 4B are schematic diagrams illustrating aspects of a stacked CMOS image sensor system having improved visualization of tissue using combined fluorescence and white light endoscopy, according to some embodiments. FIG. 4A depicts RGBR color filter 302 formed within an upper portion of sensor array 304, which includes backside-illuminated (BSI) CMOS photo diodes. The sensor array 304 is bonded directly to a logic circuit chip 430 that contains an image signal processor (ISP). FIG. 4B is a side view of the BSI CMOS image sensor (CIS) stack 400 containing the RGBR color filter array 302 and logic circuit chip 430. Using a 3D stacked image sensor 400 consisting of a BSI image sensor die face-to-face stacked on a logic die containing an image signal processor (ISP) has a number of advantages for combined fluorescence and white light endoscopy applications, one of which is the ability to use global shuttering and/or quasi-global shuttering. The stacked image sensor and ISP arrangement shown in FIGS. 4A and 4B allows for readout of the entire area of the pixel array simultaneously. The image can be captured simultaneously using all pixels. According to some embodiments, the image sensor/ISP architecture includes a memory structure and additional MOS transistors to provide additional functionality and increased read-out speed. Another benefit of using a stacked image sensor in endoscopy applications is that the fast read-out times allow for higher quality moving image capabilities, which are desirable for imaging during surgical procedures. Furthermore, the use of a stacked arrangement such as shown in FIGS. 4A and 4B allows for a more compact sensor chip since the circuit section of the chip does not take up additional sensor chip surface area. Having a compact sensor chip is beneficial in chip-on-tip endoscopy applications since it is desirable to reduce the frontal area occupied by the camera module.


According to some embodiments, stack 400 can be a stacked back-illuminated structure similar to a stacked BSI CMOS device from Sony in the family of devices offered under the trade designation Exmor RS sensors. Preferably, stack 400 is an integrated, single-chip device comprising individual photo-diodes, each under a respective color filter through which light from tissue passes before impinging on the photo-diode, metal wiring under the photodiodes, and image processing circuitry under the metal wiring to read the electrical output of the photo-diodes and convert it to image information to be sent for further processing and/or display. One example of such stack is designated Sony Exmor RS sensor stack IMX 398, which comprises a 12 MP sensor (4608×3456) and has a 6.4 mm diagonal dimension. The stack comprises a sensor layer and a logic layer that is behind the sensor layer and contains readout circuits to read out the electrical signals that the sensor pixels provide in response to light. For use in endoscope 200, the diagonal dimension of sensor stack 400 preferably is no more than 3 mm, more preferably no more than 2 mm, and even more preferably no more than 1.5 mm. This can be accomplished by reducing the number of pixels in the sensor, for example to a number sufficient to provide lower spatial resolution such as 400×400, or by shrinking the dimensions of sensor elements such as photo-diodes, or both.
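
As a rough, informal check of the diagonal sizes discussed above (not part of the patent disclosure), the sketch below relates array size and pixel pitch to the active-area diagonal; the 2.2 um and 3.0 um pitches are borrowed from the quasi-global-shutter example later in this description, and package or border area is ignored.

```python
import math

def sensor_diagonal_mm(cols, rows, pixel_pitch_um):
    """Active-area diagonal in millimetres for a given array size and pixel pitch."""
    width_mm = cols * pixel_pitch_um / 1000.0
    height_mm = rows * pixel_pitch_um / 1000.0
    return math.hypot(width_mm, height_mm)

if __name__ == "__main__":
    # 400x400 array at assumed pitches of 2.2 um and 3.0 um.
    print(f"{sensor_diagonal_mm(400, 400, 2.2):.2f} mm")  # ~1.24 mm, under the 1.5 mm target
    print(f"{sensor_diagonal_mm(400, 400, 3.0):.2f} mm")  # ~1.70 mm
```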


According to some embodiments, the stacked arrangement can include one or more additional filters 440 to further enhance combined fluorescence and white light endoscopy imaging. For example, filters 440 often include an infrared (IR) filter configured to suppress IR sensitivity, or a different color filter configured to suppress sensitivity to a different color as required for a specific tissue or specific use of endoscope 200. Filters 440 can also include a band pass filter that is configured to allow passage of wavelengths around 600 nm, or some other wavelength(s) that coincide with fluorescent light coming from the tissues of interest. Further examples of possible filters are shown and described with respect to FIG. 5, infra. According to some embodiments, the stacked arrangement can include further layers, such as a DRAM layer 450 (shown in FIG. 4B) that can be configured to store signals read at high speed temporarily, enabling even shorter read-out times. A stack 400 with a built-in DRAM layer can be similar in structure to, but preferably smaller than, a device offered by Sony under the designation Exmor RS IMX 400. In this case, the stack comprises a sensor layer, a DRAM layer, and a logic (readout) layer, stacked in the direction of the light that illuminates the sensor layer. A built-in DRAM layer for temporary storage of images need not be used when the read-out circuitry in stack 400 is sufficiently fast to send the image information from sensor array 304 essentially in real time to processing circuitry elsewhere, such as in handle 940 (FIG. 9) or in display 950 (FIG. 9), where the output of stack 400 can be converted into images for display and/or storage. According to some embodiments, the stacked CMOS image sensor system(s) shown in FIGS. 4A and 4B are used in the camera module of endoscope 200 shown in FIG. 2.



FIG. 5 is a diagram illustrating aspects of a filter configured to enhance capture of fluorescence light during combined fluorescence and white light endoscopy imaging, according to some embodiments. Shown in the main plot are curves 510, 512 and 514 which are pixel response curves for respective colors in an example of a CMOS sensor array. One or more filters, such as filter(s) 440 shown in FIG. 4B, can be configured to allow passage of light in region 520. According to some embodiments, wavelengths below about 420 nm (region 524) and above 700 nm (region 526) are heavily filtered out. According to some embodiments, the filter(s) can be further configured selectively to reduce light having wavelengths in region 522. Note that in practice the precise filtering characteristics and wavelengths blocked and/or passed will depend on the materials and structures used for filtering.


The reduction in region 522 has been found to be useful in reducing the amount of blue light “background” or “noise” that is recorded by the “red pixels” (the pixels associated with “R” in the color filter array). As can be seen in FIG. 5, a typical CMOS pixel response curve for red pixels, curve 514, has some non-zero response in the blue region (region 522). In particular, the cross-hatched area 516 indicates that the red pixels will pick up light in the region near the fluorescence excitation source light (e.g. from 350 nm to 450 nm). Since the excitation source light in fluorescence endoscopy can be several times brighter than the resultant fluorescence light, the excitation light can show up as background or noise in the red pixel image. It has been found that by selectively suppressing the blue light in the region 522, and especially near the wavelengths of the source light, the background noise can be greatly reduced in the red pixel image. The amount of reduction in 522 can be tailored to the particular characteristics of the sensor and image processing used, as well as the particular surgical application(s) the endoscope is directed to. The design of filter 440 should also account for the properties of the color filter array 302 (shown in FIGS. 3B, 4A and 4B). For example, in many cases the color filter array 302 will heavily filter out wavelengths below 420 nm. In such cases the filter 440 can be configured to heavily reduce infrared light greater than 700 nm, and selectively reduce blue light in the region 522 to reduce blue light background from the red pixels. According to some embodiments, some or all of the filtering aspects of filter 440 described herein can be integrated into the design of the color filter array 302. For example, for suppressing the blue light background recorded by the red pixels in the region 522, the described blue light suppression can be partially or fully accomplished by color filter array 302. Specifically, the red filter portions of filter 302 can be configured to suppress more light in region 522 than red filters in typical CFA designs (e.g. curve portion 516). According to some embodiments, filter 440 can be a liquid crystal tunable filter (LCTF) that can be selectively switched between being clear or nearly clear so it is essentially transparent and being in a state in which it blocks or significantly attenuates some wavelengths, as is described in further detail with respect to FIG. 7, infra.
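
The effect of suppressing region 522 on the red-pixel background can be illustrated numerically. The sketch below is not based on the measured curves of FIG. 5: the response and illumination spectra are synthetic stand-ins, and the 20x excitation-to-fluorescence brightness ratio is an assumption chosen only to show the trend.

```python
import numpy as np

# Wavelength grid (nm) and synthetic, illustrative curves -- NOT the measured
# curves of FIG. 5; they only mimic the qualitative shapes discussed above.
wl = np.arange(350.0, 751.0)                                     # 1 nm spacing
red_response = np.exp(-0.5 * ((wl - 610) / 45.0) ** 2)           # main red peak
red_response = red_response + 0.06 * np.exp(-0.5 * ((wl - 410) / 40.0) ** 2)  # blue-region leak (area 516)

# Assumed (illustrative) spectra reaching the sensor: reflected 410 nm
# excitation taken to be 20x brighter than the ~610 nm fluorescence.
excitation = 20.0 * np.exp(-0.5 * ((wl - 410) / 10.0) ** 2)
fluorescence = 1.0 * np.exp(-0.5 * ((wl - 610) / 20.0) ** 2)

def red_pixel_signal(blue_transmission):
    """Total and background charge on a red pixel for a given 350-450 nm transmission."""
    transmission = np.ones_like(wl)
    transmission[(wl >= 350) & (wl <= 450)] = blue_transmission
    total = np.sum((excitation + fluorescence) * transmission * red_response)  # 1 nm grid: sum ~ integral
    background = np.sum(excitation * transmission * red_response)
    return total, background

for t in (1.0, 0.1, 0.01):   # no extra filtering, 10x suppression, 100x suppression
    total, background = red_pixel_signal(t)
    print(f"blue transmission {t:>4}: excitation background = {background / total:.0%} of red-pixel signal")
```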



FIGS. 6A and 6B are charts illustrating some aspects of the timing of illuminating light sources and sensor exposures for combined fluorescence and white light endoscopy imaging, according to some embodiments. Four time intervals 610, 612, 614 and 616 are shown wherein white light sources (e.g. White LEDs 234 in FIG. 2) and blue light sources (e.g. Blue LEDs 232 in FIG. 2) are alternately activated. During the T-White intervals 610 and 614 the white light sources are energized, the global shutter allows passage of white light, and the pixels of the entire sensor array (e.g. of sensor 304 in FIGS. 4A and 4B) are read out. Image processing is performed for the white light frames. Using all of the pixels, the white light frames are processed, for example, using ISP circuitry 430 and possibly other processors. During the T-Blue intervals 612 and 616 the blue light sources are energized, the global shutter is set to filter out or significantly attenuate selected wavelengths such as in the blue region, and the R pixels of the sensor array (e.g. of sensor 304 in FIGS. 4A and 4B) are read out. Image processing is performed for the red color band frames. Using all of the “R” pixels, the red light frames are processed, for example, using ISP circuitry 430 and possibly other processors. Thus, illumination light (white or blue) is synchronized with the appropriate global shuttering and image processing. The alternately captured white light video and sub-band video (R pixels) present different information of the object being imaged. The white light band image presents a normal view of the object, while the sub-band light image presents additional information about the object. According to some embodiments, the two video types are overlaid by the system processor. In medical applications, displaying the composite video allows clinicians to locate a tissue of interest (e.g. a diseased tissue) because the sub-band image precisely “highlights” the tissue of interest within an ordinary white light background image. This type of overlaying or compositing is particularly useful in endoscopic surgical procedures.
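
The synchronization protocol described above can be summarized as a simple frame schedule that ties each interval's LED bank, shutter/filter state, exposure length, and processing path together. The following sketch is an informal model, not the patent's implementation: the hardware hooks are stubbed callables, and the 11 ms/22 ms durations are illustrative (the 2:1 ratio anticipates the longer T-Blue intervals discussed below).

```python
from dataclasses import dataclass
from itertools import cycle

@dataclass
class FramePlan:
    name: str           # "T-White" or "T-Blue"
    leds: str           # which LED bank to energize
    filter_state: str   # LCTF / shutter filtering state
    duration_ms: float  # exposure interval length
    process: str        # which processing path handles the readout

# Illustrative schedule: blue (fluorescence) intervals made 2x longer than
# white intervals, as suggested for weak fluorescence signals.
SCHEDULE = [
    FramePlan("T-White", leds="white", filter_state="clear",      duration_ms=11, process="full_rgb"),
    FramePlan("T-Blue",  leds="blue",  filter_state="block_blue", duration_ms=22, process="red_pixels_only"),
]

def run_frames(n_frames, set_leds, set_filter, expose_and_read, process_frame):
    """Drive the interleave; the four callables stand in for real hardware/ISP hooks."""
    for plan, _ in zip(cycle(SCHEDULE), range(n_frames)):
        set_leds(plan.leds)                      # energize the matching LED bank
        set_filter(plan.filter_state)            # synchronize shutter/filter state
        raw = expose_and_read(plan.duration_ms)  # global (or quasi-global) readout
        process_frame(raw, plan.process)         # white-light or red-band pipeline

if __name__ == "__main__":
    log = []
    run_frames(
        4,
        set_leds=lambda bank: log.append(("leds", bank)),
        set_filter=lambda state: log.append(("filter", state)),
        expose_and_read=lambda ms: f"raw({ms}ms)",
        process_frame=lambda raw, path: log.append(("process", raw, path)),
    )
    for entry in log:
        print(entry)
```

In an actual device this lock-step sequencing would be carried out by the camera/LED control and ISP blocks of FIG. 10 rather than by software like this; the sketch only shows which pieces of state must change together each interval.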


According to some embodiments, the time periods can alternate between white and a color but in other embodiments the periods can be interspersed in a different sequence, for example in the sequence T-white, T-blue, T-blue, T-white, T-blue, T-white, etc. In addition, two or more colors can be included in the same sequence of time intervals, for example to form a sequence T-white, T-blue, T-red, T-white, etc.


According to some embodiments, the image processing can be configured differently for the white and red color band frames to further enhance visualization. For example, during the T-Blue intervals (612 and 616) the relative weights of the three color channels are manipulated to enhance the visibility of the fluorescent tissue. The red channels can be boosted while the blue channels can be suppressed, by adjusting their relative gains. This will enhance the fluorescent tissue while reducing blue reflectance from the excitation light source (the blue LEDs).
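
A minimal sketch of the per-channel gain adjustment described above is given below; it is not from the patent, and the gain values are purely illustrative.

```python
import numpy as np

def apply_channel_gains(rgb, mode):
    """Scale R, G, B channels differently for white vs fluorescence frames.

    rgb  -- HxWx3 image, values in [0, 1]
    mode -- "T-White" leaves the frame unchanged; "T-Blue" boosts red and
            suppresses blue (gain values here are purely illustrative).
    """
    gains = {"T-White": (1.0, 1.0, 1.0),
             "T-Blue":  (1.8, 1.0, 0.3)}[mode]
    out = rgb * np.array(gains)[None, None, :]
    return np.clip(out, 0.0, 1.0)

if __name__ == "__main__":
    frame = np.random.rand(4, 4, 3)
    boosted = apply_channel_gains(frame, "T-Blue")
    print(boosted[..., 0].mean() >= frame[..., 0].mean())  # red boosted (True)
    print(boosted[..., 2].mean() <= frame[..., 2].mean())  # blue suppressed (True)
```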


According to some embodiments, in addition to alternating the white and blue light illumination, the duration of the blue (and/or other color(s)) and white light intervals can be manipulated to provide enhanced imaging for certain applications. For example, in some cases the fluorescence image may be too weak compared to the white image. In such cases the T-White and T-Blue interval timing can be configured such that the T-Blue intervals 612 and 616 are longer than the T-White intervals 610 and 614. In some examples, the T-Blue intervals can be 2 or more times longer than the T-White intervals so that the fluorescence image is enhanced over the white image.


According to some embodiments, quasi-global shuttering can be used in some cases instead of full global shuttering. Especially in the case of a lower-cost, smaller-sized pixel array, quasi-global shuttering can be used wherein the pixels in an entire column (or row) are read out in series, but all of the columns (or rows) are read out simultaneously. In this case each column (or row) has a dedicated analog to digital converter (ADC). For example, if the array size is 400×400 pixels (and the pixel size is about 2.2 um×2.2 um to 3.0 um×3.0 um) with a pixel read-out time of 5 µs, each column (or row) can be read out in about 2 ms, which leaves plenty of time for exposure (30+ ms for exposure at 30 frames per second). Note that in such quasi-global shuttering embodiments, the sensor readout in the T-Blue intervals (612 and 616) will include all of the pixel data, and the non-red pixel data can simply be ignored when forming the red color band frames.
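
The read-out budget quoted above can be checked with a few lines of arithmetic; the sketch below simply reuses the 400×400 array, 5 µs per-pixel read-out, and 30 frames-per-second figures from the paragraph above.

```python
def readout_budget(rows, pixel_readout_us, fps):
    """Per-frame timing for column-parallel (quasi-global) readout.

    One ADC per column reads its column's pixels in series, so the column
    readout time is rows * pixel_readout_us; the rest of the frame period
    remains available for exposure.
    """
    frame_period_ms = 1000.0 / fps
    readout_ms = rows * pixel_readout_us / 1000.0
    return readout_ms, frame_period_ms - readout_ms

if __name__ == "__main__":
    readout_ms, exposure_ms = readout_budget(400, 5.0, 30)
    print(f"readout  ~{readout_ms:.1f} ms per frame")   # ~2.0 ms
    print(f"exposure ~{exposure_ms:.1f} ms available")  # ~31.3 ms, i.e. the "30+ ms" above
```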



FIG. 7 is a cross sectional diagram illustrating aspects of a camera module configured for fluorescence endoscopy, according to some embodiments. The cross section shows a camera module and includes the lens/sensor barrel 210 which is also visible in FIG. 2. The outer body of the module is housed by camera module holder 702. The lens cover 710 overlies iris 712. One or more lens components are housed within holder 702, such as lens components 720 and 722 in the example shown. A BSI CMOS image sensor (CIS) stack 400 is shown bonded to circuit board 740 in this example. According to some embodiments, the CMOS sensor stack 400 includes an RGBR color filter array such as filter layer 302 shown in FIGS. 3B, 4A and 4B. One or more filters 440 are shown overlaid on the sensor stack 400. According to some embodiments, the filter(s) 440 can include a tunable filter, such as a liquid crystal tunable filter (LCTF), that uses electronically controlled liquid crystal (LC) elements to transmit a selectable wavelength of light and exclude or suppress others. The LCTF filter can be dynamically controlled by a system processor to be in one of two states that are synchronized with the timing of illumination, global shutter (or quasi-global) capture mode, and image processing, as shown in FIGS. 6A and 6B. During the T-White intervals (610 and 614 in FIGS. 6A and 6B) the LCTF filter is set to a “clear” state to allow passage of all colors of the spectrum. During the T-Blue intervals (612 and 616 in FIGS. 6A and 6B) the LCTF filter is set to a “blue” state which blocks most or all of the blue light from entering the sensor 304. In this way, the captured fluorescence light signal can be significantly enhanced by greatly reducing the undesirable background blue light or other spurious fluorescence or phosphorescence light while the white light images are unaffected.



FIG. 8 is a diagram showing aspects of a distal tip assembly having dual camera modules, according to some other embodiments. According to some embodiments, the endoscope portion 800 is disposable, or partially disposable after single-use, as in the case of endoscope portion 204 shown in FIG. 2. The distal tip assembly in this case includes two camera modules: a white camera module 810 and a blue camera module 812. The white camera module 810 includes a CMOS sensor having an ordinary color filter array (as shown in FIGS. 1A and 1B), while the blue camera module 812 is specially configured to image the fluorescent light. According to some embodiments, the blue camera module 812 has no color filter array, but rather has a pass band filter with a narrow pass band corresponding to the fluorescent light (e.g. 600 nm or 610 nm). According to some embodiments, a single stacked CIS chip structure 850 can be used for both camera modules 810 and 812. In such cases, the stacked CIS structure is configured to selectively read out the portion of the sensor structure belonging to the white camera module and blue camera module during the appropriate intervals. Surrounding the camera modules 810 and 812 are a number of blue LEDs 832 and white LEDs 834. The blue LEDs 832 are configured to emit excitation light suitable for fluorescence endoscopy. In some examples, the blue LEDs 832 are configured to emit light at about 410 nm. The white LEDs 834 are configured to emit white light suitable for visible white light endoscopy. Also visible is a port 820 that is configured to provide fluid (flowing either into or out of the patient), and/or provide an opening through which a tool or other device can pass (e.g. a needle). According to some embodiments, the port 820 has an inner diameter of 1.0 mm, and the outer diameter of the entire tip is less than 6 mm on its largest dimension.


According to some embodiments, the blue camera module 812 is optimized to maximize sensitivity to the fluorescence light, while minimizing interference from other light sources. Since in this example the CMOS sensor does not have a CFA which would cause loss of incoming light, a filter can be used to block the undesirable blue light from entering the image sensor. According to some embodiments, a combined filter or separate filter can also be used to block undesired spurious fluorescence not originated from the targeted tissue. A combined filter or separate filter can also be used to block undesired spurious phosphorescence light from entering the blue camera. According to some embodiments, a band pass filter can be used that blocks all the wavelengths below about 580 nm and in the infrared band. One or more filters can be placed either in front of the camera lens (e.g. between the outermost element and the iris), or outside the iris. Alternatively, or in addition, the filter(s) can form part of the sensor lens stack (such as shown with filter(s) 440 in FIG. 7).



FIG. 9 is a perspective view of a handheld multi-color band endoscope, according to some embodiments. The endoscope 200 includes an elongated cannula 920 with a distal tip 204 for inserting into a hollow organ or cavity of the body. Note that further details of tip 204 are shown in FIG. 2, supra. According to some embodiments, tip 204 includes a camera module and white and blue LEDs, as is shown in more detail in FIG. 2, and/or other colors. According to some embodiments, tip 204 can include dual camera modules such as shown in FIG. 8, supra. According to some embodiments, the distal end of the cannula 920 can also be slightly bent.


The endoscope 200 includes a handle portion 940 that is sized and shaped for easy grasping by the endoscope operator (e.g. doctor or other medical professional). According to some embodiments, the cannula 920 includes a fluid channel which is fluidly connected to a proximal fluid port (not shown) and port 220 (shown in FIG. 2). According to some embodiments, the channel within the cannula 920 can also be used as a working channel via a proximal opening (not shown). According to some embodiments, re-usable portions of endoscope 200 are removably mounted to enable some portions of endoscope 200 to be re-used while other portions are intended to be disposed of after single-use. For example, endoscope 200 can be configured with a connector between the handle portion 940 and the cannula 920 so the handle can be detached from cannula 920, as discussed for the endoscope shown in FIGS. 1 and 2 of said pending U.S. application Ser. No. 14/913,867. In this case the cannula 920, including the distal tip 204, can be disposed of after a single-use while the handle portion 940 (including electronics and a re-chargeable battery) and display module 950 can be re-used many times. Other configurations are possible. For example, in some embodiments the display module 950 includes much or all of the electronics and a re-chargeable battery, and the display module is configured to be removable from the handle portion 940. In this case the handle portion 940 and cannula 920 can be configured and intended to be disposed of after a single use while the display module 950 can be re-used many times. Display 950 can be removably mounted on the upper side of handle portion 940 as shown. By making some or all portions of the device 200 single-use, stringent decontamination and disinfection procedures, as well as the risk of cross-contamination and hospital acquired diseases, can be significantly lessened or avoided. Endoscope tip 204 alternatively can be used as a part of a non-disposable endoscope cannula, if made such that it can be suitably decontaminated between patients.


A distal portion of cannula 920 can be bent so tip 204 can point to the tissue to be imaged, as in the case of the tip portion of the cannula in FIGS. 1 and 2 of said application Ser. No. 14/913,867. When the tip portion is bent in this manner, the professional using endoscope 200 can rotate it around the long axis of cannula 920 such that tip 204 points to the desired tissue.


If desirable to change the orientation of tip 204 relative to the long axis of cannula 920, endoscope 200 can be provided with a facility to deflect tip 204 relative to cannula 920 before and/or during a patient examination. For example, a distal portion of cannula 920 can be made of a material that can be bent and retains its shape after being bent, so that a desired angle between tip 204 and the long axis of cannula 920 can be set before introducing cannula 920 into a patient. If desirable to change the angle while cannula 920 is in a patient, endoscope 200 can be provided with a tip deflecting mechanism, for example of the type discussed in U.S. Pat. No. 9,549,666 B2, which is hereby incorporated by reference. Deflection of tip 204 relative to the long axis of cannula 920 can be arranged so it can be in a single plane, e.g., left-right deflection using two cables to pull tip 204 to the left or to the right (or up-down), or it can be to any angle, using more cables or an arrangement similar to that in said U.S. Pat. No. 9,549,666 B2.


Endoscope portions such as 200 and 800 can be used in endoscopes having other configurations, for example in endoscopes that have straight cannulas, have different handles, and different image displays, and in endoscopes in which the cannula is re-usable rather than being a single-use, disposable cannula.



FIG. 10 is a block diagram illustrating aspects of a multi-color band endoscope, according to some embodiments. As shown, the LEDs (e.g. 232 and 234 shown in FIG. 2) and BSI CMOS image sensor (CIS) stack 400 (also shown in FIGS. 4B and 7) are located in the distal tip assembly 204. The camera controls, LED controls and image processing 1010 are handled elsewhere in the endoscope, such as in the handle and/or display module. In cases where a portion of the endoscope is configured for single-use and other portions are re-usable, the components 1010 can be located in the re-usable portion(s). Note that a field programmable gate array (FPGA) 1020 can be provided to handle some pre-processing such as de-mosaicing, gain modification, etc. The control bus 1012 can use, for example, the I2C protocol. The external interfaces 1014 can be used, for example, to receive user input such as from buttons and/or touch screen displays. Such input could be used, for example, to make image adjustments, lighting adjustments and/or select either an “all-white” or “all-blue” mode rather than the default combined white mode and blue mode overlaid image.



FIGS. 11 and 12 are diagrams of a portable endoscope with disposable cannula. Endoscope 1100 is similar or identical to endoscope 200 in many or all respects. Like endoscope 200, endoscope 1100 can be disposable or partially disposable after a single-use. Endoscope 1100 has a detachable cannula 1106 with at least one fluid channel between distal tip 1102 and fluid port 1104. As in the case of distal tip 204 of endoscope 200, distal tip 1102 of endoscope 1100 includes a camera module and LED light source(s) that are similar or identical to those shown and described for distal tip 204, supra. Endoscope 1100 also includes a display module 1107 and a handle portion 1108 that can be re-usable. The camera controls, LED controls and image processing 1010 shown in FIG. 10 can be included and located in handle portion 1108 and/or display module 1107.


In FIG. 11, the endoscope 1100 is shown configured for using a liquid (such as saline) to distend or insufflate a target organ such as a bladder. In FIG. 12, the endoscope 1100 is shown configured for using a gas (such as CO2) to distend or insufflate the target organ. In the case of liquids being used for distention/insufflation as shown in FIG. 11, either a hanging fluid bag 1130 or a fluid pump such as a large syringe 1120 can be connected via connector 1114 to tubing 1110. Tubing 1110 can be attached via connector 1112 to luer connector 1104 on endoscope 1100. The fluid can then be infused into the target organ via distal tip 1102 of endoscope 1100.


In FIG. 12, endoscope 1100 is shown configured for using gas, such as CO2, to distend or insufflate the target organ. In this case, a kit is used to connect a gas source, such as a single use cartridge 1240 of CO2 gas, to the fluid port 1104 of endoscope 1100. Cartridge 1240 can be a conventional threaded “soda charger” cartridge containing less than 10 grams of carbon dioxide. The kit includes a simple adaptor 1230 that connects to the cartridge 1240 on one end and to a valve or flow regulator 1232 on the other. The adaptor 1230 includes a luer connector to connect to endoscope 1100 via tubing 1210. According to some embodiments, adapter 1230 also includes a pressure meter 1234. Tubing 1210 includes luer connectors 1214 and 1212 for attachment to adapter 1230 and flow controller valve 1220. According to some embodiments, flow controller valve 1220 includes valve 1222 that can be configured to prevent back flow of gas from the endoscope towards tubing 1210. Also shown in FIG. 12 are additional gas (e.g. CO2) cartridges 1250. According to some embodiments, the entire kit shown in FIG. 12, including valve 1220, tubing 1210 and adapter 1230, can be relatively inexpensive to manufacture. According to some embodiments the total manufacturing cost of the kit, including gas cartridge 1240, is about 1.0 USD or less. The entire set of a cannula, handle, gas source and gas controller can be a self-contained endoscope that is conveniently hand-held, easily operated by a single person, and easily transported.


It has been found that white light and fluorescence endoscopy with gas insufflation can result in substantial advantages over liquid insufflation. For example, in endoscopy, insufflation using gas such as CO2 can result in widening the field of view (FOV) of the lens when compared to a fluid medium being used for distention. The increase in FOV can be attributed to a greater index of refraction mismatch between the lens material (e.g. glass or polymer) and gas (e.g. CO2) when compared to liquid (e.g. saline). It has been found in some cases that the FOV can be increased by 15%, 20%, 30% or greater by using gas instead of liquid.
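
The FOV widening attributed above to the refractive-index mismatch can be illustrated with a simplified flat-window Snell's-law model. The sketch below is not from the patent; the 90-degree design FOV is an arbitrary example, and real lens assemblies behave differently in detail.

```python
import math

N_CO2, N_SALINE = 1.000, 1.336   # approximate refractive indices

def fov_in_medium(full_fov_air_deg, n_medium):
    """Full field of view when the outermost surface faces a medium of index n.

    Simplified flat-window model: n_medium * sin(theta_medium) = sin(theta_air).
    This only shows the trend, not an actual lens design.
    """
    half_air = math.radians(full_fov_air_deg / 2.0)
    half_med = math.asin(math.sin(half_air) / n_medium)
    return 2.0 * math.degrees(half_med)

if __name__ == "__main__":
    design_fov = 90.0                         # illustrative design FOV in air, degrees
    fov_gas = fov_in_medium(design_fov, N_CO2)
    fov_sal = fov_in_medium(design_fov, N_SALINE)
    print(f"FOV in CO2:    {fov_gas:.0f} deg")
    print(f"FOV in saline: {fov_sal:.0f} deg")
    print(f"relative gain using gas: {(fov_gas / fov_sal - 1):.0%}")  # roughly 40% with these assumptions
```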


Using gas instead of liquid for insufflation can also keep the patient dry, which can reduce the risk of fluid damage to various devices used in the medical procedure. There can be significant cost savings using CO2 instead of liquid such as saline. Portability can be increased.


In the case of fluorescence imaging, a lack of fluid medium reduces the effect of optical scattering and of secondary undesired light sources such as phosphorescence. In a liquid medium such as water, absorption increases significantly from blue to red, which decreases the signal of desired fluorescence when using those wavelengths. In water, absorption increases to near 0.5 m⁻¹ around 600 nm. In comparison, CO2 gas is almost transparent to visible light. The decrease in photon absorption/increase in transmission of gas over liquid for the wavelengths of interest (e.g. blue light endoscopy) can lead to significant increases in the signal to noise ratio.
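
As a small, informal aid (not part of the patent disclosure), the Beer-Lambert relation below estimates the absorption loss implied by the ~0.5 m⁻¹ figure quoted above over an assumed working distance; CO2 absorption is treated as negligible and the distance is illustrative.

```python
import math

def transmitted_fraction(absorption_per_m, path_m):
    """Beer-Lambert transmission T = exp(-mu * L) through a uniform medium."""
    return math.exp(-absorption_per_m * path_m)

if __name__ == "__main__":
    path_cm = 5.0                # illustrative camera-to-tissue working distance
    mu_water_600nm = 0.5         # ~0.5 1/m near 600 nm (figure quoted above)
    mu_co2_visible = 0.0         # CO2 treated as essentially transparent here
    print(f"water: {transmitted_fraction(mu_water_600nm, path_cm / 100.0):.1%} transmitted at ~600 nm over {path_cm:.0f} cm")
    print(f"CO2:   {transmitted_fraction(mu_co2_visible, path_cm / 100.0):.1%} transmitted over the same path")
    # Absorption is only one of the liquid-medium losses discussed above;
    # scattering and spurious phosphorescence in the liquid also reduce the
    # usable fluorescence signal.
```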


One potential drawback of using gas rather than liquid for insufflation is that the endoscope lens can more easily become tainted with human body fluid such as sticky mucosa or blood. This may cause the resulting image to become blurry. According to some embodiments, when using liquid for distention/insufflation, the liquid flow or the liquid itself, sometimes in combination with a hydrophobic coating of the lens surface, is used to keep the lens relatively clean.


According to some embodiments, when using gas for distension/insufflation the distal tip is configured to direct the flow of the gas to have a pattern such that the lens surface can be kept clean. Examples of such designs are shown in FIGS. 13, 14A, 14B and 15.



FIG. 13 is a perspective view illustrating aspects of a distal tip of an endoscopic cannula configured for using gas to insufflate a target organ, according to some embodiments. The distal tip 1102 includes an outer shell 1304 which can be formed from part of the same material as the main portion of the elongated cannula (see, e.g. cannula 1106 in FIG. 12). According to some embodiments, the distal tip assembly 1102 can be formed separately and bonded to the end of cannula 1106. Blue LEDs 232, white LEDs 234, lens/sensor barrel 210, and port 220 can be similar or identical to those shown and described supra. The configuration shown in FIG. 13 further includes a gap or spacing 1310 that is in fluid communication with a lumen within cannula 1106 (shown in FIG. 12) which is in turn connected to a gas source at the proximal end via port 1104 (also shown in FIG. 12). According to some embodiments, the gap 1310 is shaped so as to allow gas flow tangentially along the distal surface of lens/sensor barrel 210, as illustrated by the dotted arrows. According to some embodiments, the gas flow can be further directed toward the port 220 (which can be configured to allow passage of a device or tool as described supra). By directing the gas tangentially across the LEDs, camera lens surface, and working channel port, the gas can be used to prevent collection of, and/or remove, human body fluid or material that might otherwise fully or partially occlude any of those components.


According to some embodiments, since the exit gas flow is directed towards the lens, LEDs and/or port, rather than distally toward the object to be visualized, intermittent higher gas pressure can be used to “blow away” or remove any sticky debris that may adhere to the surfaces of the lens, LEDs and/or port.



FIGS. 14A and 14B are perspective and cross section diagrams illustrating further aspects of a distal tip of an endoscopic cannula configured for using gas to insufflate a target organ, according to some embodiments. Distal tip 1402 can be similar to tip assemblies 1102 and 204 shown and described supra. Blue LEDs 232, white LEDs 234 and lens/sensor barrel 210 can be similar or identical to those shown and described supra. The distal tip 1402 includes an outer shell 1404 which can be formed from part of the same material as the main portion of the elongated cannula (see, e.g. cannula 1106 in FIG. 12). According to some embodiments, the distal tip assembly 1102 can be formed separately and bonded to the end of cannula 1106. A working channel 1422 can optionally be provided in the cannula and have a port formed on distal face 1412 similar to port 220 shown and described supra. As shown in FIG. 14B, the front face 1412 of the distal tip is recessed slightly from the outer rim 1406 by a distance d. It has been found that a curved shape of the rim or lip 1406 and the slight recession of the front face 1412 can be effective in creating a gas flow pattern that is effective in removing or preventing collection of any debris on the lens and/or LED surfaces. According to some embodiments, as in the case of the embodiment shown in FIG. 13, a higher gas pressure can be used to intermittently “blow away” or help remove sticky debris that may adhere to the surfaces of the lens, LEDs and/or port. Note that when designing the recessed distance “d” and the shape and size of the rim 1406, care should be taken not to overly restrict the field of view of the camera module.



FIG. 15 is a perspective view illustrating further aspects of a distal tip of an endoscopic cannula configured for using gas to insufflate a target organ, according to some embodiments. Distal tip 1502 can be similar to tip assemblies 1402, 1102, and 204 shown and described supra. Blue LEDs 232, white LEDs 234 and lens/sensor barrel 210 can be similar or identical to those shown and described supra. A separate device channel port and/or fluid port is not shown but can be included. The distal tip 1502 includes an outer shell 1504 which can be formed from the same material as the main portion of the elongated cannula (see, e.g., cannula 1106 in FIG. 12). In this example, two crescent-shaped gaps 1510 and 1511 are configured to create opposing gas flows, as shown by the dotted arrows. The resulting flow pattern can be effective in preventing the collection of, and removing, any sticky debris that may adhere to the surfaces of the lens and/or LEDs. The front face 1512 can be recessed and/or the rim or lip 1506 can be shaped to generate gas flow that is tangential to the surface of face 1512, as shown and described with respect to FIGS. 14A and 14B. In addition, the opposing gas flows from gaps 1510 and 1511 can be configured to form a pattern above the lens surface that extends distally from the cannula, which can aid insertion of the distal tip and cannula through a tissue passageway and/or into the target organ, as illustrated in FIG. 16. According to some embodiments, other numbers of gaps and/or different flow geometries can be used to create gas flow patterns that likewise reduce obstruction of the lens and LEDs and/or aid in insertion of the endoscope.



FIG. 16 is a diagram showing a handheld surgical endoscope configured for using gas to insufflate a target organ being inserted in a tissue passageway, according to some embodiments. The distal tip 1504 and cannula of a surgical endoscope such as shown and described supra are shown being inserted in a passageway 1610 within tissue 1600. As shown, the passageway 1610 is being dilated by the distal tip 1504. In this illustration, the insertion of tip 1504 is aided by a gas envelope or “dome” formed in front of (or distally from) tip 1504, as depicted by the dotted arrows. The gas ports on opposite sides of tip 1504, as shown in FIG. 15, are configured to form a flow pattern distally of the lens surface such that a gas envelope or “dome” is formed, which shields the lens surface as well as dilates the passageway 1610.


Although the foregoing has been described in some detail for purposes of clarity, it will be apparent that certain changes and modifications may be made without departing from the principles thereof. It should be noted that there are many alternative ways of implementing both the processes and apparatuses described herein. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the body of work described herein is not to be limited to the details given herein, which may be modified within the scope and equivalents of the appended claims.

Claims
  • 1. A single-use, disposable cannula for multi-band imaging of a patient's internal organ, said cannula comprising:
    an imaging structure at a distal portion of the cannula;
    a gas inlet port that is at a proximal portion of the cannula and is configured to receive insufflating gas, a gas outlet port at the distal portion of the cannula, and a gas conduit between said gas inlet port and gas outlet port, wherein said gas outlet port is configured to direct insufflating gas delivered thereto through said gas conduit in a flow over said imaging structure configured to clear the imaging structure;
    a source of insufflating gas under pressure selectively coupled to said gas inlet port to supply insufflating gas thereto;
    wherein said imaging structure at the distal portion of the cannula further comprises:
      a light source configured to illuminate said internal organ with white light from a white LED and with non-white light from a narrow-band, non-white LED, wherein the white LED and the non-white LED illuminate the internal organ from different angles;
      a multi-pixel, backside-illuminated, two-dimensional white light sensor array configured to image white light from the organ and to generate white light image data;
      a multi-pixel, backside-illuminated, two-dimensional non-white sensor array configured to image non-white light from the organ and to generate non-white light image data; and
      a readout circuit electrically and physically integrated with said white light sensor array and said non-white light sensor array into a circuit stack; and
    wherein the white light image data and the non-white light image data represent respective white and non-white images of the organ that are both spatially and temporally registered with each other.
  • 2. The cannula of claim 1, in which the white light sensor array and the non-white light sensor array comprise a single plane of pixels with RGB filters and a readout shutter configured to selectively read out all pixels to thereby produce said white light image data or only pixels responding primarily to non-white light to thereby produce said non-white light image data.
  • 3. The cannula of claim 1, in which the non-white light is present over the entire range of 350-450 nanometers.
  • 4. The cannula of claim 1, in which the non-white sensor array is configured to image light in the 600 or 610 nm wavelength range.
  • 5. The cannula of claim 1, in combination with a handle to which the proximal portion of the cannula is releasably secured through electrical and mechanical couplers tool-free by hand and detached tool-free by hand, said handle further comprising a physically integral video screen that is laterally offset from the cannula and is coupled with said imaging structure to receive and display said white light and non-white light images overlaid in spatial registration such that the non-white image highlights tissue of interest in a normal white image.
  • 6. The cannula of claim 1, in which said white light sensor array and said non-white light sensor array are side-by-side in the same plane.
  • 7. The cannula of claim 1, in which said white light and non-white light illuminate respective fields of illumination in the organ that at least partly overlap.
  • 8. The cannula of claim 1, further including a sterile packaging enclosing the cannula, wherein the cannula is a single-use, disposable unit.
  • 9. The cannula of claim 1, in which said gas outlet port is configured to direct gas delivered thereto through said conduit in a flow over said imaging structure that clears a field of view of the imaging portion.
  • 10. The cannula of claim 1, further including a handle to which the proximal portion of the cannula is releasably secured through electrical and mechanical couplers tool-free by hand and detached tool-free by hand, said handle further comprising a physically integral video screen coupled with said imaging structure to receive and display said white light and non-white light images, and further including a working channel for surgical tools that extends in a straight line from a proximal end of the handle to a distal portion of the cannula.
REFERENCE TO RELATED APPLICATIONS

This patent application claims the benefit of and incorporates by reference each of the following provisional applications: U.S. Prov. Ser. No. 62/647,817 filed Mar. 25, 2018; U.S. Prov. Ser. No. 62/654,295 filed Apr. 6, 2018; U.S. Prov. Ser. No. 62/671,445 filed May 15, 2018; and U.S. Prov. Ser. No. 62/816,366 filed Mar. 11, 2019. This patent application is a continuation (with added subject matter) of International Patent Application No. PCT/US17/53171 filed Sep. 25, 2017, which claims the benefit of and incorporates by reference each of the following provisional applications: U.S. Prov. Ser. No. 62/399,429 filed Sep. 25, 2016; U.S. Prov. Ser. No. 62/399,436 filed Sep. 25, 2016; U.S. Prov. Ser. No. 62/399,712 filed Sep. 26, 2016; U.S. Prov. Ser. No. 62/405,915 filed Oct. 8, 2016; U.S. Prov. Ser. No. 62/423,213 filed Nov. 17, 2016; U.S. Prov. Ser. No. 62/424,381 filed Nov. 18, 2016; U.S. Prov. Ser. No. 62/428,018 filed Nov. 30, 2016; U.S. Prov. Ser. No. 62/429,368 filed Dec. 2, 2016; U.S. Prov. Ser. No. 62/485,454 filed Apr. 14, 2017; U.S. Prov. Ser. No. 62/485,641 filed Apr. 14, 2017; U.S. Prov. Ser. No. 62/502,670 filed May 6, 2017; U.S. Prov. Ser. No. 62/550,188 filed Aug. 25, 2017; U.S. Prov. Ser. No. 62/550,560 filed Aug. 25, 2017; U.S. Prov. Ser. No. 62/550,581 filed Aug. 26, 2017; and U.S. Prov. Ser. No. 62/558,818 filed Sep. 14, 2017. All of the above-referenced non-provisional, provisional and international patent applications are hereby incorporated in this application by reference and are collectively referenced herein as “the commonly assigned incorporated applications.”

US Referenced Citations (236)
Number Name Date Kind
4854302 Allred, III Aug 1989 A
4979497 Matsura Dec 1990 A
5010876 Henley Apr 1991 A
5188093 Lafferty Feb 1993 A
5281214 Wilkins Jan 1994 A
5323767 Lafferty Jun 1994 A
5329936 Lafferty Jul 1994 A
5390117 Graf et al. Feb 1995 A
5486155 Muller Jan 1996 A
5549547 Cohen Aug 1996 A
5569163 Francis Oct 1996 A
5666561 Stephenson Sep 1997 A
5667472 Finn Sep 1997 A
5667476 Frassica et al. Sep 1997 A
5785644 Grabover Jul 1998 A
5860953 Snoke Jan 1999 A
5873814 Adair Feb 1999 A
5928137 Green Jul 1999 A
5935141 Weldon Aug 1999 A
5957947 Wattiez Sep 1999 A
6007531 Snoke Dec 1999 A
6007546 Snow Dec 1999 A
6017322 Snoke Jan 2000 A
6033378 Lundquist Mar 2000 A
6059719 Yamamato et al. May 2000 A
6095970 Hidaka Aug 2000 A
6174307 Daniel Jan 2001 B1
6210416 Chu Apr 2001 B1
6211904 Adair Apr 2001 B1
6221007 Green Apr 2001 B1
6221070 Tu et al. Apr 2001 B1
6261226 McKenna Jul 2001 B1
6280386 Alfano et al. Aug 2001 B1
6331174 Reinhard Dec 2001 B1
6387043 Yoon May 2002 B1
6398743 Halseth Jun 2002 B1
6507699 Lemoine Jan 2003 B2
6518823 Kawai Feb 2003 B1
6793882 Verschuur Sep 2004 B1
6917380 Tay Jul 2005 B1
7256446 Hu Aug 2007 B2
7428378 Warpakowski Sep 2008 B1
7507205 Borovsky Mar 2009 B2
7591799 Selkee Sep 2009 B2
7606609 Muranushi Oct 2009 B2
7780650 Frassica Aug 2010 B2
7798995 Yue Sep 2010 B2
7931616 Selkee Apr 2011 B2
7946981 Cubb May 2011 B1
8052609 Harhen Nov 2011 B2
8057464 Chen Nov 2011 B2
8187171 Irion May 2012 B2
8197398 Scholly Jun 2012 B2
8235975 Chen Aug 2012 B2
8361775 Flower Jan 2013 B2
8460182 Ouyang Jun 2013 B2
8523808 Selkee Sep 2013 B2
8696552 Whitman Apr 2014 B2
8803960 Sonnenschein Aug 2014 B2
8834357 Oskin Sep 2014 B2
8845522 McIntyre Sep 2014 B2
8952312 Blanqart Feb 2015 B2
8998844 Reed Apr 2015 B2
9649014 Ouyang May 2017 B2
9736342 Mueckl Aug 2017 B2
9895048 Ouyang Feb 2018 B2
10278563 Ouyang May 2019 B2
10292571 Ouyang May 2019 B2
20010007051 Nakashima Jul 2001 A1
20010049509 Sekine Dec 2001 A1
20030016284 Squilla Jan 2003 A1
20030023142 Grabover Jan 2003 A1
20030078476 Hill Apr 2003 A1
20030078502 Miyaki et al. Apr 2003 A1
20030151680 McDermott Aug 2003 A1
20030199735 Dickopp Oct 2003 A1
20040054254 Miyake Mar 2004 A1
20040054259 Hasegawa Mar 2004 A1
20040138558 Dunki-Jacobs Jul 2004 A1
20040162572 Sauer Aug 2004 A1
20050010178 Katz Jan 2005 A1
20050264687 Murayama Jan 2005 A1
20050049459 Hern Mar 2005 A1
20050085695 Sherner Apr 2005 A1
20050154262 Banik Jul 2005 A1
20050159646 Nordstrom Jul 2005 A1
20050177027 Hirata Aug 2005 A1
20050277874 Selkee Dec 2005 A1
20050277875 Selkee Dec 2005 A1
20060052710 Miura Mar 2006 A1
20060063976 Aizenfeld Mar 2006 A1
20060114986 Knapp Jun 2006 A1
20060152601 Parekh Jul 2006 A1
20060167340 Peas Jul 2006 A1
20060171693 Todd Aug 2006 A1
20060173245 Todd Aug 2006 A1
20060259124 Matsuoka Nov 2006 A1
20060287576 Tsuji Dec 2006 A1
20070060789 Uchimura Mar 2007 A1
20070081920 Murphy Apr 2007 A1
20070117437 Boehnlein May 2007 A1
20070129604 Hatcher Jun 2007 A1
20070162095 Kimmel Jul 2007 A1
20070167678 Moskowitz Jul 2007 A1
20070167868 Sauer Jul 2007 A1
20070173693 Refael Jul 2007 A1
20070188604 Miyamoto Aug 2007 A1
20070197875 Osaka Aug 2007 A1
20070210162 Keen Sep 2007 A1
20070225556 Ortiz Sep 2007 A1
20070238927 Ueno Oct 2007 A1
20080004642 Birk Jan 2008 A1
20080071144 Kimmel Mar 2008 A1
20080097550 Dicks Apr 2008 A1
20080108869 Sanders May 2008 A1
20080195125 Orbay Aug 2008 A1
20080195128 Orbay Aug 2008 A1
20080225410 Ning Sep 2008 A1
20080234547 Irion et al. Sep 2008 A1
20080255416 Gilboa Oct 2008 A1
20080262306 Kawai Oct 2008 A1
20080300456 Irion Dec 2008 A1
20090027489 Takemura Jan 2009 A1
20090065565 Lemoine Mar 2009 A1
20090076321 Suyama Mar 2009 A1
20090076328 Root Mar 2009 A1
20090080214 Watanabe Mar 2009 A1
20090105538 Van Dam Apr 2009 A1
20090118580 Sun May 2009 A1
20090118641 Van Dam May 2009 A1
20090149713 Niida Jul 2009 A1
20090225159 Schneider Sep 2009 A1
20090227897 Wendt Sep 2009 A1
20090286412 Ikeda Nov 2009 A1
20090287663 Takeuchi Nov 2009 A1
20100069834 Schultz Mar 2010 A1
20100094216 Yue Apr 2010 A1
20100095969 Schwartz Apr 2010 A1
20100101569 Kim Apr 2010 A1
20100121142 Ouyang May 2010 A1
20100157039 Sugai Jun 2010 A1
20100160914 Bastian Jun 2010 A1
20100168827 Schultz Jul 2010 A1
20100191051 Miyake Jul 2010 A1
20100191053 Garcia Jul 2010 A1
20100234736 Corl Sep 2010 A1
20100026201 Frangioni Oct 2010 A1
20110009694 Schultz Jan 2011 A1
20110034769 Adair Feb 2011 A1
20110037876 Talbert Feb 2011 A1
20110054446 Schultz Mar 2011 A1
20110092775 Deshmukh Apr 2011 A1
20110105839 Hoffman May 2011 A1
20110112622 Phan May 2011 A1
20110130627 McGrail Jun 2011 A1
20110211115 Tsai Sep 2011 A1
20110213206 Boutillette Sep 2011 A1
20110245602 Brannon Oct 2011 A1
20110288482 Farrell Nov 2011 A1
20120016191 Ito Jan 2012 A1
20120040305 Karazivan Feb 2012 A1
20120053515 Crank Mar 2012 A1
20120100729 Edidin Apr 2012 A1
20120165627 Yamamoto Jun 2012 A1
20120165916 Jordan Jun 2012 A1
20120178991 Clark Jul 2012 A1
20120226103 Gunday Sep 2012 A1
20120236138 Liu Sep 2012 A1
20120245242 Peiffer Sep 2012 A1
20120245418 Boulais Sep 2012 A1
20120253116 Sniffin Oct 2012 A1
20120259203 Devereux Oct 2012 A1
20120286020 Smith Nov 2012 A1
20120289858 Ouyang Nov 2012 A1
20130035553 Kongstorum Feb 2013 A1
20130046142 Remijan Feb 2013 A1
20130057667 McGrath May 2013 A1
20130150672 Fujitani Jun 2013 A1
20130172676 Levy Jul 2013 A1
20130225921 Liu Aug 2013 A1
20130253402 Badawi Sep 2013 A1
20130289559 Reid Oct 2013 A1
20130324973 Reed Dec 2013 A1
20130345514 Manion Dec 2013 A1
20140022649 Echhardt Jan 2014 A1
20140107416 Bimkrant Apr 2014 A1
20140111634 Mueckl Apr 2014 A1
20140154399 Weikart Jun 2014 A1
20140180007 Edidin Jun 2014 A1
20140213848 Moskowitz Jul 2014 A1
20140228635 Tuliakov Aug 2014 A1
20140275763 King Sep 2014 A1
20140296866 Salman Oct 2014 A1
20140323991 Tang Oct 2014 A1
20150005575 Kobayashi Jan 2015 A1
20150011830 Hunter Jan 2015 A1
20150018622 Tesar Jan 2015 A1
20150018710 Furlong Jan 2015 A1
20150088001 Lindvold et al. Mar 2015 A1
20150150441 Ouyang Jun 2015 A1
20150164313 Ouyang Jun 2015 A1
20150196197 Kienzle Jul 2015 A1
20150238251 Shikhman Aug 2015 A1
20150297311 Tesar Oct 2015 A1
20160007833 Huang Jan 2016 A1
20160073853 Venkatesan et al. Mar 2016 A1
20160077008 Takasu et al. Mar 2016 A1
20160174819 Ouyang Jun 2016 A1
20160334694 Liu Nov 2016 A1
20160367119 Ouyang Dec 2016 A1
20170086651 Sato Mar 2017 A1
20170188793 Ouyang Jul 2017 A1
20170188795 Ouyang Jul 2017 A1
20170215699 Ouyang Aug 2017 A1
20170295347 Schneider Oct 2017 A1
20170310858 Mueckl Oct 2017 A1
20180132700 Ouyang May 2018 A1
20180184892 Truckai Jul 2018 A1
20180235441 Huang Aug 2018 A1
20180256009 Ouyang Sep 2018 A1
20190000308 Duckett, III Jan 2019 A1
20190029497 Mirza Jan 2019 A1
20190110686 Kato Apr 2019 A1
20190142262 Inglis May 2019 A1
20190216325 Ouyang Jul 2019 A1
20190223691 Takatsuji Jul 2019 A1
20190246873 Lu Aug 2019 A1
20190246884 Lu et al. Aug 2019 A1
20190282073 Truckai Sep 2019 A1
20190320879 Langell Oct 2019 A1
20190374095 Lord Dec 2019 A1
20200204776 Themelis Jun 2020 A1
20200214739 Shi Jul 2020 A1
20200275827 Weise Sep 2020 A1
20210228806 Streeter Jul 2021 A1
20210401277 Ouyang Dec 2021 A1
Foreign Referenced Citations (17)
Number Date Country
102858275 Jan 2013 CN
1690512 Aug 2006 EP
2560589 Apr 2010 EP
3384879 Apr 2011 EP
2749258 Jul 2014 EP
3078354 Oct 2016 EP
2009148420 Jul 2009 JP
2011133792 Oct 2011 WO
2012060932 May 2012 WO
2014031192 Feb 2014 WO
2014065901 May 2015 WO
2016032729 Mar 2016 WO
2016040131 Mar 2016 WO
2016137838 Sep 2016 WO
2018136950 Jul 2018 WO
2019237003 Dec 2019 WO
Non-Patent Literature Citations (11)
Entry
Jul. 1, 2019 International Preliminary Report on Patentability in connection with PCT/US2017/053171 and dated Mar. 8, 2018 Article 34 Amendment and Demand for International Preliminary Examination Under Chapter II of PCT.
International Search Report and Written Opinion of PCT/US2016/18670, dated Jul. 12, 2016.
International Search Report and Written Opinion of PCT/US2018/014880, dated Jun. 6, 2018.
International Search Report and Written Opinion of PCT/US2018/065396, dated Feb. 24, 2017.
International Search Report and Written Opinion of PCT/US2021/050095 dated Dec. 17, 2021.
International Search Report and Written Opinion of PCT/US2019/036060 dated Aug. 27, 2019.
International Search Report and Written Opinion of PCT/US2017/053171 dated Dec. 5, 2017.
International Preliminary Report on Patentability of PCT/US2017/053171 completed on Jul. 1, 2019.
Extended European Search Report of European Patent Application No. EP19816177 completed Feb. 2, 2022.
International Search Report and Written Opinion of the International Searching Authority dated Dec. 5, 2017 in connection with PCT/US17/53171.
Mar. 6, 2018 Letter Accompanying Amendment under Article 34 and Demand For International Preliminary Examination under Chapter II of the PCT in connection with PCT/US17/53171.
Related Publications (1)
Number Date Country
20190216325 A1 Jul 2019 US
Provisional Applications (19)
Number Date Country
62647817 Mar 2018 US
62654295 Apr 2018 US
62671445 May 2018 US
62816366 Mar 2019 US
62399429 Sep 2016 US
62399436 Sep 2016 US
62399712 Sep 2016 US
62405915 Oct 2016 US
62423213 Nov 2016 US
62424381 Nov 2016 US
62428018 Nov 2016 US
62429368 Dec 2016 US
62485454 Apr 2017 US
62485641 Apr 2017 US
62502670 May 2017 US
62550188 Aug 2017 US
62550560 Aug 2017 US
62550581 Aug 2017 US
62558818 Sep 2017 US
Continuation in Parts (1)
Number Date Country
Parent PCT/US2017/053171 Sep 2017 US
Child 16363209 US