The present invention generally relates to a micro-electro-mechanical system (MEMS) that conveys light to and from a region of interest (ROI), and more specifically, to a system that is selectively used for both imaging the ROI, and for displaying an image in the ROI.
Conventional small-scale image acquisition systems, such as endoscopes and borescopes, typically sample an image plane using a bundle of optical fibers that correspond to pixels on a camera detector such as a charge-coupled device (CCD). Miniaturizing a system based on this approach is limited by a number of factors, including the overall diameter of the fiber bundle, the number of pixel detectors on the camera detector, and the diffractive properties of light beams. Reducing the diameter of a conventional acquisition device reduces the possible number of pixels, and thus reduces the resolution and/or field of view (FOV) of the device. However, a reduction in diameter and size would enable users to examine areas unreachable by currently designed endoscopes, reduce collateral damage to tissue, and enable integration of imaging with other functional devices, such as therapy devices.
Similarly, many small-scale image display systems, such as head-mounted displays (HMDs), beam light from an optical fiber onto deflectable mirrors or rotating polygonal mirrors to produce an image on an image plane. This approach also has many size limitations. For instance, light beams less than 3 millimeters (mm) in diameter are impractical for mirror-based displays, because mirror scanners and grating deflectors must be significantly larger than the light beam diameter to avoid clipping the beam or adding diffraction. Reducing the diameter of a conventional display device reduces the possible number of pixels, and thus reduces the resolution and/or FOV of the device. However, a reduction in diameter and size would enable construction of more comfortable HMDs and enable integration of display with other functional devices.
An older type of scanning image display system includes an electromechanical modulator. The modulator comprises a full-width array of closely spaced, fiber-like reflectors that deflect when a voltage potential is applied. The voltage potential is selectively applied to the reflectors in accordance with an image signal. This technique requires a very complicated circuit to control the overall deflection of the reflectors, and the overall size of the system is quite large.
As one practical application, the growth of minimally invasive medical procedures (MIMPs) has increased the demand for small-diameter systems that result in less tissue damage and trauma, faster recovery times, and lower risks to the patient. Typically, instruments used by practitioners of MIMPs include several different discrete systems for optical imaging, monitoring, maneuvering, sizing, diagnosis, biopsy, therapy, surgery, and non-visual monitoring/sensing. It would be preferable to combine the functions provided by these instruments in a single compact device to reduce the number of surgical ports that are currently required for a plurality of single-function tools. By employing an integrated multi-functional tool so that only one small port is used, the risks associated with repeatedly removing and inserting surgical tools can be dramatically reduced. Since most MIMPs require the practitioner to constantly monitor the procedure visually, optical imaging is considered a requirement for any fully integrated system for MIMPs. Thus, an appropriate multifunction instrument will most likely include an optical imaging system, and the imaging system should be compact so that it can be integrated with one or more diagnostic and/or therapeutic tools.
The current tools used for MIMPs cannot readily be integrated with an optical imaging system without increasing the size of the resultant instrument to an excessive degree. All currently available commercial optical imaging systems that include a maneuverable flexible shaft must maintain a certain size (diameter) in order to preserve image quality. As indicated above, currently available flexible scopes cannot be made smaller than this limit unless FOV or resolution is sacrificed. Also, currently available imaging systems typically use an external light source to generate light, and use an optical waveguide to direct the light to an ROI within a patient's body. Although imaging and some diagnostic capability can be integrated into existing scopes, such as standard tissue imaging in combination with fluorescence for early detection of cancers, the optical systems of current flexible scopes are not sufficiently small to provide integrated diagnoses and therapies at the levels of performance, size, and price that will be demanded in the future by medical practitioners.
Presently available flexible scope designs use either a bundle of optical fibers (optical waveguides) and/or one or more cameras having an array of detectors to capture an image. Thus, the diameter of these flexible scopes employed for remote imaging cannot be reduced to smaller than the image size. Even if one ignores additional optical fibers used for illumination of an ROI, the scope diameter is therefore limited by the individual pixel size of a camera or by the diameter of optical fibers used to acquire the image. Currently, the smallest pixel element is determined by the size of the end of an optical fiber, which has a minimum core diameter of about 4 μm. To propagate light through an optical fiber, a surrounding cladding layer is required, increasing the minimum pixel size to more than 5 μm in diameter. If a standard video graphics adapter (sVGA) image is desired (e.g., with a resolution of 640×480 pixels), then a minimum diameter required for just the imaging optical fiber is more than 3 mm. Therefore, to achieve scopes with less than 3 mm overall diameter using current technologies, resolution and/or FOV must be sacrificed by having fewer pixel elements. All commercially available scopes suffer from this fundamental tradeoff between high image quality and small size.
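The more-than-3-mm figure quoted above follows from simple geometry. The sketch below estimates the minimum imaging-bundle diameter for a 640×480 image; the 5.5 μm effective fiber pitch (standing in for "more than 5 μm" per fiber including cladding) and the 90% packing fraction are illustrative assumptions, not values taken from this disclosure.

```python
import math

# Back-of-the-envelope estimate of fiber-bundle diameter for a 640x480 image.
# One optical fiber per pixel; pitch and packing fraction are assumed values.
pixels = 640 * 480            # total number of fibers (one per pixel)
fiber_pitch = 5.5e-6          # meters per fiber, core plus cladding (assumed)
packing = 0.9                 # close-packing fraction of circular fibers (assumed)

# Diameter of a circle whose area holds all fibers at the given packing fraction.
bundle_diameter = fiber_pitch * math.sqrt(pixels / packing)
print(f"minimum imaging-bundle diameter ≈ {bundle_diameter * 1e3:.1f} mm")
```

Under these assumptions the result comes out slightly above 3 mm, consistent with the limit stated above, and excludes any additional fibers needed for illumination.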
Currently available scopes also suffer from poor control mechanisms. Some optical systems use an optical fiber and camera at the tip of a flexible scope to illuminate an ROI and acquire an image. The fiber and camera are manually controlled by the practitioner positioning the tip of the flexible scope. Other optical systems use a resonant fiber that is actuated into resonance with one or more nodes to produce a desired illumination spot. Although these systems actuate the fiber, they cannot precisely control the position of the fiber tip without adding material to the fiber scan system, increasing the diameter and/or rigid-tip length. Still other optical systems deflect or move mirrors to position the light beam rather than moving the waveguide. However, as discussed above, mirrors must be larger than the light beam diameter to avoid clipping the beam or adding diffraction. Thus, the mirrors must be larger than the waveguide, thereby increasing the overall size of the instrument.
Some microscopes actuate a cantilever waveguide for near-field imaging. However, near-field systems have a very limited FOV (e.g., typically less than 500 nanometers), and a light-emitting tip must be positioned within nanometers of the target. Near-field systems are based on emitting light through a microscopic aperture with dimensions smaller than the wavelength of visible light. The emitted light reflects off the closely positioned target and is detected before the light has time to diffract and dissipate. A near-field system may be useful for imaging individual cells or molecules, but is not suitable for most medical procedures and other dynamic applications, which require an FOV of at least a micron and cannot be dependent on precisely positioning a tip within nanometers of the target. Using larger wavelengths to provide a suitable FOV with a near-field system would still require a substantially larger imaging system, which could not be integrated into a multi-function instrument. As an alternative, some microscopes actuate a cantilever waveguide for confocal microscope imaging. However, simple confocal systems are limited to single-wavelength operation, which does not enable color imaging or display.
Thus, it would be desirable to reduce the size of the imaging system in order to reduce the overall size of an instrument used for MIMPs and other applications. To perform diagnostic or therapeutic MIMPs today, one or more separate instruments are used within the FOV of a standard endoscopic imager, and any additional separate instrument must often be held and maneuvered by a second medical practitioner. Typically, the second instrument provides a high-intensity point source of light for optical therapies, a hot-tipped probe for thermal therapies, or a trocar used for mechanical cutting. The second instrument is moved to the surface of the tissue and usually moved within or across the surface of the tissue, covering the area of interest as the tool is scanned and manipulated by hand. These secondary instruments are often inserted into the patient's body through a separate port and thus, while being used, are viewed from a different point of view in the visual image. Furthermore, the therapeutic instrument often blocks the practitioner's direct view of the ROI with the imaging tool, making highly accurate therapies quite difficult for the medical practitioner to achieve. Significant amounts of training and practice are required to overcome these difficulties, as well as the capability to work with a reduced sense of touch that is conveyed through the shaft of an instrument having friction and a non-intuitive pivot at the point of entry. Thus, to work effectively with current imaging and therapeutic technologies, the practitioner of MIMPs must be highly trained and skilled.
Clearly, there is a need for an imaging system that is small enough to be integrated with diagnostic and/or therapeutic functions to create an instrument that is sufficiently intuitive to use as to require little training or skill. Similarly, a small, integrated display system would greatly improve mobility for a head-mounted display and enable very localized display of images. Ideally, an image acquisition or display system should integrate a light source, an actuation system, a position sensing system, light detectors, and a local control system, yet be smaller than currently available systems. Despite its small size, the integrated system should still be capable of providing a sufficient FOV, a good image size, and high resolution. The integrated system should also enable a practitioner to ensure that therapy can be administered to the ROI imaged within a patient's body. Currently, no integrated system is small enough to provide these capabilities, nor can existing systems be easily modified to provide them.
This application specifically incorporates by reference the disclosures and drawings of each patent application and issued patent identified above as a related application.
In accord with the present invention, an apparatus is defined for providing image acquisition and/or image display in a limited ROI. The apparatus comprises a micro-electro-mechanical system (MEMS) that integrates needed components into a miniature device that can be produced with conventional micro-fabrication techniques. The apparatus preferably includes one or more integrated light sources, such as one or more laser diodes, for illuminating an ROI, displaying an image, providing a therapy, and/or performing another function. Alternatively, the light source can include a generation component and a delivery component, whereby only the delivery component is integrated into the apparatus. The light source can further include a modulator or filter, or modulation and filtering can be performed on the input or output of the light source. Also included is a cantilever having a fixed end and a free end. The fixed end is attached to a substrate that supports many or all other components of the apparatus. Preferably, the free end of the cantilever is released from the substrate during fabrication of the cantilever, such that the cantilever can move in two orthogonal directions. The cantilever also preferably comprises a light-transmissive material, such as an epoxy resin, that acts as a waveguide to direct light from the light source toward the ROI. In that case, the one or more light sources are optically coupled to the fixed end of the cantilever waveguide, and the free end of the cantilever waveguide is adapted to be positioned adjacent to the ROI. One or more scanning actuators are disposed adjacent to the cantilever and supported by the substrate. The scanning actuators cause the light from the free end of the cantilever to scan the ROI, illuminating the ROI for image acquisition or for image display. The light may pass through a lens attached to, or positioned just beyond, the free end of the cantilever.
One or more position sensors also detect the position of the free end of the cantilever, providing feedback for control. When used for image acquisition, one or more light detectors receive light backscattered from the ROI, producing a signal corresponding to an intensity of the backscattered light. The signal can be used to produce an image of the ROI on a display. A control circuit is preferably coupled to the scanning actuators, the light sources, the position sensors, and the light detectors. The control circuit selectively energizes the one or more light sources to image the ROI, display an image in the ROI, and/or render another function to the ROI. Other functions can include diagnosing a condition, rendering therapy, sensing a condition, and monitoring a medical procedure—all in regard to the ROI. The control circuit can also provide long term control of scanning stability. The above components can be fabricated on multiple substrates, each substrate being best suited for a corresponding component. These subassemblies can then be bonded together or otherwise integrated to form the complete image acquisition and/or display device.
In an alternate embodiment of the present invention, light from the one or more light sources is directed along one or more stationary waveguides to illuminate the ROI. Backscattered light is then received at the free end of the cantilever, which may have a lens fabricated on the free end. The cantilever scans the backscattered light and directs the backscattered light to the fixed end where a light detector is optically coupled. Another embodiment does not require the cantilever to be a waveguide. Instead, a light source is located at the free end of the cantilever to directly scan into the ROI. Yet another embodiment uses a flexible fiber to receive backscattered light from the ROI and direct the light to a separate light detection component. It is also contemplated that a plurality of cantilever waveguides can be used in parallel to convey light to and from the ROI.
In another configuration, the free end of the cantilever waveguide is tapered to a substantially smaller cross-sectional size than the fixed end, producing a tapered end that emits light having a substantially smaller point spread function (PSF) than light that would be emitted from a non-tapered end. The tapered free end can also form a gradient index lens. Alternatively, a micro refractive lens can be provided at the free end of the cantilever. As another alternative, a diffractive lens can be micro-fabricated at the free end. In addition, or alternatively, a lens can be provided between the ROI and the light detectors and/or between a therapeutic light source and the ROI.
One form of the scanning actuator includes one or more electrostatic actuators that use electrostatic forces to preferably move the free end of the cantilever in substantially transverse directions. In another embodiment, the scanning actuator comprises one or more piezoelectric actuators that harness the piezoelectric effect to move the free end of the cantilever. Other actuation methods can be employed, although it is preferable to use actuation methods that can be integrated into the apparatus with conventional micro-fabrication techniques. In any case, the actuators can be used to drive the cantilever into resonance with one or more nodes. The free end can then scan the ROI in a raster, spiral, or other pattern. Alternatively, the actuators can be used to selectively drive the free end to precise positions.
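As a sketch of how a spiral scan of the kind mentioned above might be generated, the snippet below amplitude-modulates two orthogonal resonant deflections that are 90 degrees out of phase. The function, parameter names, and numeric values are all hypothetical; the actual drive electronics are not specified in this disclosure.

```python
import math

def spiral_scan(n_samples, f_res, sample_rate, max_amp):
    """Return (x, y) tip-deflection samples for one spiral, center outward.

    Two resonant deflections at frequency f_res, 90 degrees out of phase,
    are amplitude-modulated from zero to max_amp, sweeping the tip outward.
    """
    pts = []
    for i in range(n_samples):
        t = i / sample_rate
        amp = max_amp * i / (n_samples - 1)      # linear amplitude ramp
        phase = 2 * math.pi * f_res * t
        pts.append((amp * math.cos(phase), amp * math.sin(phase)))
    return pts

# Hypothetical parameters: a 5 kHz resonance sampled at 200 kHz.
pts = spiral_scan(n_samples=1000, f_res=5_000.0, sample_rate=200_000.0, max_amp=1.0)
```

Ramping the amplitude back down retraces the spiral inward; a raster pattern would instead combine a fast sinusoidal axis with a slow linear axis.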
Additionally, the position sensors can be used for feedback control of the free end of the cantilever. Similar to the scanning actuators, multiple embodiments of the position sensors can be implemented. The scanning actuators can be used in an alternating fashion to drive and detect the free end of the cantilever. Preferably, however, separate transducers are integrated into the waveguide or substrate to detect the position of the free end in orthogonal directions. The transducers can comprise piezoelectric detectors, capacitive sensors, piezoresistive sensors, or other micro-fabricated position sensors. Position of the free end can also be determined by detection of light lost or light scattered from the cantilever waveguide.
Another aspect of the invention comprises a method for enabling either far-field image acquisition or display of an image in a limited ROI. The method includes forming a cantilever on a substrate, and removing a portion of the substrate underlying the cantilever so that the cantilever can be deflected. The method also includes supporting the cantilever at a fixed end so that the fixed end remains fixed to the substrate, and a free end of the cantilever extends freely beyond where the portion of the substrate was removed from supporting the cantilever. This enables the free end to move relative to a target in the limited ROI. The cantilever is deflected so as to move the free end in a desired motion. The cantilever conveys light, so that if the cantilever is employed for acquiring the far-field image, the light is reflected from a target and conveyed from the free end toward the fixed end. Alternatively, if the cantilever is employed for displaying the image, the light is emitted toward the target from the free end. The method further includes detecting a position of the free end of the cantilever, and producing a signal indicative of the position for use in controlling the cantilever to move in the desired motion. Other aspects and details of the invention are described in further detail below.
This Summary has been provided to introduce a few concepts in a simplified form that are further described in detail below in the Description. However, this Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Various aspects and attendant advantages of one or more exemplary embodiments and modifications thereto will become more readily appreciated as the same becomes better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
Figures and Disclosed Embodiments Are Not Limiting
Exemplary embodiments are illustrated in referenced Figures of the drawings. It is intended that the embodiments and Figures disclosed herein are to be considered illustrative rather than restrictive. No limitation on the scope of the technology and of the claims that follow is to be imputed to the examples shown in the drawings and discussed herein.
Example Applications of the Present Invention
As indicated above, the present application is a continuation of copending patent application Ser. No. 10/655,482, filed on Sep. 4, 2003, which itself is a continuation-in-part of prior copending patent application Ser. No. 09/850,594, filed on May 7, 2001, now issued as U.S. Pat. No. 6,975,898, which is based on prior copending provisional patent application Ser. No. 60/212,411, filed on Jun. 19, 2000, all of which are hereby explicitly incorporated by reference. These applications describe a medical imaging, diagnostic, and therapy device. However, the present invention may be used for acquiring an image, for displaying an image, or for otherwise detecting or delivering light. Nevertheless, for exemplary purposes, the present invention is primarily described with regard to a preferred embodiment as a miniature medical device, such as an endoscope. Those skilled in the art will recognize that the present invention can also be embodied in a non-medical image acquisition device, a wearable display, a biological, chemical, or mechanical sensor, or any other miniature, high-resolution, far-field device. For example, the invention can be used to illuminate specific patterns for microlithography, micro-inspection, and micro-illumination. The invention can further be embodied in a bar code reader, a range finder, or a device for combining simultaneous sensing and delivery functions.
As an endoscope, the present invention can be used to integrate both imaging and non-imaging functionality, such as diagnosis, monitoring, and therapy of an internal ROI, instead of requiring separate instruments for imaging and for rendering therapy or other functions to a site. For example, an integrated endoscope can provide ultraviolet therapy and monitoring. Also, many optical diagnostic and therapeutic techniques rely on high quality illumination at elevated intensities, which is inherent in optical scanning and cannot be achieved with diffuse illumination. A scanned beam of intense optical energy is more effective at overcoming the signal-to-noise limitations of photon detectors used in conventional diagnostic imaging systems. When fluorescent dye molecules are used as tracers for specific cells or structures, the signal conversion rates from illumination to fluorescence are very low and often buried in noise. In many therapeutic applications, such as photodynamic therapy (PDT), the optical excitation of PDT labels on cancerous cells creates free radicals that kill nearby cells. Doses of intense optical illumination are applied to overcome the natural buffering mechanisms within the body and to attain effective concentrations of free radicals. Laser therapies that rely on optical heating, cutting, and cauterization of tissues require the highest optical intensities that can be delivered and cannot be used effectively with diffuse illumination. Directing focused beams of light onto tissue for precise exposure times, as provided by a controlled optical scanning system, is necessary to reduce damage to surrounding tissue. Furthermore, high quality illumination can include a high degree of optical monochromaticity, coherence, polarization, high modulation frequency, high pulse repetition rates, and short pulse duration.
Image Acquisition System Processing Overview
An example system for providing imaging and non-imaging functionality through an endoscope is shown in
Externally, additional or alternate illumination sources, modulators, filters, and detectors may be provided, as shown in a block 56. For example, external light source systems for producing red, green, blue (RGB), ultraviolet (UV), infrared (IR), and/or high-intensity light may include a delivery component to convey light to the distal end of the endoscope. As illustrated, all or portions of the additional or alternate illumination sources may be partially or completely inside the patient's body. For instance, additional light-emitting diodes may be integrated with the components of block 54 at the distal end of the endoscope. External illumination sources, modulators, filters, and detectors are also optionally coupled to the electromechanical scan actuator(s) inside the patient's body and to the scanner control actuators. Scanner motion detectors are used for controlling the scanning and produce a signal that may be fed back to the scanner actuators, illumination source, and modulators to implement scanning control.
In a block 60, image signal filtering, buffering, scan conversion, amplification, and other processing functions are implemented using the electronic signals produced by the imaging photon detectors and by the other photon detectors employed for diagnosis/therapy and monitoring purposes. As illustrated, some or all of these functions may alternatively be implemented with integrated circuitry that is near the distal end of the endoscope inside the patient's body. Blocks 56 and 60 are interconnected bi-directionally to convey signals that facilitate the functions performed by each respective block. Similarly, each of these blocks is bi-directionally coupled in communication with a block 62 in which analog-to-digital (A/D) and digital-to-analog (D/A) converters are provided for processing signals that are supplied to a computer workstation user interface, shown in a block 68, employed for image acquisition and processing, for executing related programs, and for other functions. Control signals from the computer workstation are fed back to block 62 and converted into analog signals, where appropriate, for controlling or actuating each of the functions provided in blocks 56 and 60. The A/D converters and D/A converters within block 62 are also coupled bi-directionally to a block 64 in which data storage is provided, and to a block 66. Block 66 represents a user interface for maneuvering, positioning, and stabilizing the end of the scanning optical waveguide within a patient's body.
In block 64, the data storage is used for storing the image data produced by the detectors within a patient's body, and for storing other data related to the imaging and functions implemented by the scanning optical waveguide. Block 64 is also coupled bi-directionally to the computer workstation and to an interactive display monitor(s) in a block 70. Block 70 receives an input from block 60, enabling images of the ROI to be displayed interactively. In addition, one or more passive video display monitors may be included within the system, as indicated in a block 72. As indicated in a block 74, other types of display devices, for example, a head-mounted display (HMD) system, can also be provided, enabling medical personnel to view an ROI as a pseudo-stereo image. The HMD system can include a display embodiment of the invention to display the image acquired from within the patient's body. The display embodiment is effectively an inverse of the image acquisition system.
Display System Processing Overview
An example system for providing display functionality is shown in
Other external or non-integrated components are similar to those used for the image acquisition system discussed above. For example, in a block 90, image signal filtering, buffering, scan conversion, amplification, and other processing functions are implemented. Blocks 86 and 90 are interconnected bi-directionally to convey signals that facilitate the functions performed by each respective block. Similarly, each of these blocks is bi-directionally coupled in communication with a block 92 in which analog-to-digital (A/D) and digital-to-analog (D/A) converters are provided for processing signals that are supplied to or by a computer workstation user interface, shown in a block 98, employed for image display, for processing, for executing related programs, and for other functions. Control signals from the computer workstation can be fed back to block 92 and converted into analog signals, where appropriate, for controlling or actuating each of the functions provided in blocks 86, 90 and 84. The A/D converters and D/A converters within block 92 are also coupled bi-directionally to a block 94 in which data storage is provided, and to a block 96. Block 96 represents a user interface for maneuvering, positioning, and stabilizing the end of the scanning optical waveguide for display. Block 94 is also coupled bi-directionally to the computer workstation, and the data storage is used for storing the image data, and for storing other data related to the display and functions implemented by the scanning optical waveguide. In addition, one or more passive video display monitors may be included within the system (not shown) for test purposes. Further detail is discussed below with regard to
Prototype Fiberoptic-MEMS Hybrid Embodiment
As indicated above, it is desirable to produce a scanning device with a small cross-sectional area that can be produced at relatively low cost and high volume to make endoscopes and other imaging and display devices more economical and thereby facilitate their widespread use as disposable devices. Micro-electromechanical systems (MEMS) technology makes these goals achievable using an integrated thin film device.
In this embodiment, electrostatic actuators 156 act on a thin film optical waveguide 150 that is supported on a raised ledge 148. The thin film optical waveguide is only approximately 0.003 mm in diameter. A distal portion 152 of the thin film optical waveguide is caused to scan in the two orthogonal directions indicated by the curved arrows in
Optical fiber 144 is preferably affixed to silicon substrate 146 within a centering V notch 160 to ensure that it is aligned with thin film optical waveguide 150. Since the optical fiber is approximately 0.1 mm in diameter, care must be taken to provide accurate alignment between the ends of the optical fiber and the thin film optical waveguide.
In the view of the embodiments shown in
Integrated Image Acquisition Emitting Scanner Embodiment
Further detail is now provided regarding preferred embodiments.
Emitting scanner system 200 creates an illuminated spot on a target 190 by scanning light from a light source 202. Light source 202 preferably comprises a semiconductor device such as a light-emitting diode (LED) or a laser diode, so that the light source can be fabricated along with a scanner 210 using conventional micro-fabrication techniques. However, as suggested above, an external light source can alternatively be coupled to scanner 210 with an optical fiber or other waveguide. Preferably, when voltage is applied to light source 202 from a power lead 203, an emitter 204 generates light. Light source 202 is preferably end-butted to cantilever 212, which acts as a waveguide to direct emitted light from a fixed end 214 to a free end 216.
Fixed end 214 is attached to a substrate 220 such as n-type silicon. During fabrication of emitting scanner system 200, substrate 220 is etched to create a channel 222 into which free end 216 can deflect. Free end 216 may include a lens 218 to collimate or focus the light onto target 190. Free end 216 is preferably driven into resonance in one or more orthogonal directions to create an illuminated spot on target 190. For example, a vertical actuator 230 can deflect cantilever 212 in a vertical direction relative to a primary plane of substrate 220. Similarly, a horizontal actuator can be implemented with deflection electrodes 234a-234d to deflect cantilever 212 in a horizontal direction relative to the primary plane of substrate 220. By controlling the vertical and horizontal deflections, free end 216 can illuminate target 190 in a raster scan pattern, a spiral pattern, or other pattern. Alternatively, those skilled in the art will recognize that the free end can be deflected into a two-dimensional circular motion or into a two-dimensional rocking motion using a single actuator. The illumination can be used for acquiring an image, displaying an image, performing a therapy, or performing another function. For image acquisition, the image created from the backscattered light is captured by hybrid photon detectors 224a and 224b that are integrated onto substrate 220. Position sensor array 236 detects the vertical and horizontal position of free end 216 as a function of a conductive layer 235 that is integrated onto cantilever 212. This or other position sensor implementations enable long term control of scanning stability.
Integrated Display Scanner Embodiment
As with the image acquisition system, mechanical scanning is provided by vertical actuator 230 and horizontal actuators 234a-234d. The position of free end 216 is monitored by the same type of position sensors as used for the image acquisition system. The information provided by the position sensors is used by an actuator control circuit to maneuver, position and/or stabilize free end 216 for creating the displayed image.
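The description does not give a specific control law for the actuator control circuit. As a minimal sketch, assuming the tip amplitude responds linearly to the drive amplitude, a simple proportional update driven by the position-sensor reading can hold the scan amplitude at a setpoint; the gain and plant sensitivity below are hypothetical.

```python
def stabilize_amplitude(measured, setpoint, drive, gain=0.1):
    """One proportional-control update: nudge the drive amplitude so that
    the sensed tip amplitude converges toward the setpoint."""
    return drive + gain * (setpoint - measured)

# Toy plant: tip amplitude responds linearly to drive with sensitivity k.
k = 2.0
drive = 0.0
for _ in range(200):
    measured = k * drive               # simulated position-sensor reading
    drive = stabilize_amplitude(measured, setpoint=1.0, drive=drive)
```

After repeated updates the measured amplitude settles at the setpoint, which is the kind of long-term scanning stability the position sensors make possible.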
Integrated Image Acquisition Collecting Scanner Embodiment
An alternative preferred embodiment is provided for image acquisition.
Integrated Hybrid Embodiments
As an alternative to collecting scanner system 240 and emitting scanner system 200 described above, a number of hybrids of the two systems can be implemented. For example,
Another hybrid embodiment uses the cantilever to support and move the light source directly. For instance,
Other hybrid approaches can be used that maintain an integrated system. For example, a parallel array of cantilevers could be fabricated adjacent to each other and actuated in one dimension, thus creating a light scan over an area. This device would not require the relatively fast scan rates and large amplitudes of a single scanning waveguide. As a further example, the functional components, such as the laser diode light source, the waveguide, the photodiodes, the coupler, the position sensors and actuators can be integrated on separate substrates, each consisting of stacked functional modules. The light source might use a different substrate from that supporting the cantilever. For instance, GaAs can be used for the laser diode substrate for its ideal optical emission spectrum and output power efficiency. Further details regarding embodiments of the components comprising collecting scanner systems and emitting scanner systems are described below.
Cantilever
To balance the need for the cantilever to both transmit light and mechanically resonate, a number of alternative materials or material combinations are possible. In one embodiment, the cantilever comprises a two-layer composite of silicon dioxide (SiO2) and silicon. The SiO2 serves as an optical core through which the light travels. However, as a thermal oxide, SiO2 has less-than-desirable mechanical properties. Thus, a thin SiO2 layer of approximately 2.2 micrometers (μm) is thermally grown on a layer of single-crystal silicon that is approximately 30 μm thick. The silicon layer gives the composite cantilever increased mechanical stiffness and durability. It is also preferable to include a low-index buffer layer to optically isolate the silicon layer from the SiO2 layer, because silicon has a high index of refraction and is absorbing in the visible band. However, for a short cantilever (e.g., less than 2 mm long), the buffer layer can be omitted without excessive optical power loss.
Alternatively, a film of silicon nitride (SixNy) or another compound can be used as the waveguide. However, a thick SixNy film (e.g., >1 μm) cantilever waveguide can be difficult to fabricate. It is also difficult to align a thin optically transmissive layer with the emitter of an integrated light source, and even more difficult to align it with a fiber from an external light source. Thus, a preferred cantilever embodiment includes a thicker cantilever waveguide comprised of a mechanically durable material that still provides good optical transmission. One such material, which is also well suited to micro-fabrication, is SU-8 photoresist, originally developed by IBM™ (see U.S. Pat. No. 4,882,245). SU-8 offers beneficial imaging capabilities, such as vertical sidewall profiles and dimensional control over an entire structure height, and its high functionality results in minimal swelling. Processing advantages include a highly cross-linked structure, which provides chemical resistance and good thermal stability, withstanding processing temperatures greater than 200° C. As an epoxy-based resin, SU-8 offers good adhesion to most surfaces, as well as improved wetting on silicon, glass, metals, and other low-surface-energy substrates. Because SU-8 is exposed at near-ultraviolet (UV) wavelengths (350-400 nm), it is a cost-effective alternative to expensive X-ray processing.
Thus, cantilever 212 preferably comprises an SU-8 cantilever waveguide that is approximately 85 μm thick at the fixed end and tapered to fit within a smaller diameter toward the free end. Cantilever 212 is also approximately 125 μm wide, and approximately 0.5 mm to 1.0 mm long from fixed end 214 to free end 216. The overall larger coupling area at the fixed end makes it much easier to couple a light source to the cantilever waveguide, and increases the amount of light coupled into the cantilever waveguide. To further assist optical coupling, a tapered waveguide coupler (not shown) can be fabricated between the light source and the cantilever waveguide (or stationary waveguides). The SU-8 epoxy resin also makes the cantilever waveguide more durable. Although the modulus of elasticity of SU-8 (4.02 GPa) is less than that of the SiO2/Si composite beam (silicon, 125 GPa; silicon dioxide, 57 GPa), the increased thickness of the SU-8 cantilever waveguide results in resonant frequencies of approximately 20 kHz, which is consistent with SVGA video rates.
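The stated ~20 kHz resonance can be sanity-checked with the standard first-mode formula for a rectangular cantilever beam, f1 = (λ1²/2π)·√(E·I/(ρ·A))/L² with λ1 ≈ 1.875. The dimensions and modulus come from the text above; the SU-8 density (~1200 kg/m³) is an assumed typical value, so this is an order-of-magnitude estimate, not a figure from the description.

```python
import math

E = 4.02e9          # Pa, SU-8 modulus of elasticity (from the text)
t = 85e-6           # m, thickness at the fixed end (from the text)
w = 125e-6          # m, width (from the text)
L = 1.0e-3          # m, length (upper end of the 0.5-1.0 mm range)
rho = 1200.0        # kg/m^3, assumed SU-8 density

I = w * t**3 / 12.0                # area moment of inertia of the cross section
A = w * t                          # cross-sectional area
f1 = (1.875**2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A)) / L**2
```

Under these assumptions f1 comes out in the low tens of kHz, the same order as the ~20 kHz quoted above.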
Cantilever 212 is preferably formed by first spin coating the SU-8 photoresist onto the silicon substrate. The SU-8 is exposed through a mask to define the shape of the cantilever. The unexposed SU-8 is removed with a developing solution. A deep reactive ion etching (RIE) process then etches the silicon substrate down to near the fixed end to release the SU-8 cantilever. A detailed description of the fabrication steps and RIE process is provided in “Development of a Microfabricated Scanning Endoscope Using SU-8 Based Optical Waveguide” (Wei-Chih Wang, Reynolds Panergo, and Per Reinhall, Proceedings of Society of Photo-Optical Instrumentation Engineers, September 2003) and “Deep Reactive Ion Etching of Silicon Using an Aluminum Etching Mask” (Wei-Chih Wang, Joe Nhut Ho, and Per Reinhall, Proceedings of Society of Photo-Optical Instrumentation Engineers, Vol. 4876, 2003), both of which are hereby explicitly incorporated by reference. Other layers of material and micro-fabrication steps can be used to create the other integrated components along with one or more cantilevers.
Cantilever 212 can also be tapered with fixed end 214 being wider than free end 216. Tapering increases angular tip deflection, which provides a larger FOV. However, increased tip deflection may have to be balanced against the overall device size that is desired for a given application. Tapering can also be used to reduce the effective point source size of light emitted from the cantilever.
Lens
A variety of lenses can be implemented at the free end of the cantilever waveguide or stationary waveguides.
Actuators
A variety of actuators can be implemented to drive the cantilever. One embodiment utilizes electrostatic force. An electrostatic actuator is advantageous for a number of reasons, including:
For two-dimensional rectilinear raster scanning, the cantilever is scanned in two orthogonal axes simultaneously. To produce such a scan, a second set of independent and orthogonally oriented deflection electrodes is used to control the horizontal motion of the cantilever.
Since the electrostatic actuator is not bandwidth limited by the scanning frequency, this technique can provide a higher scan rate than is required by most standard video displays, such as 31.5 kHz for VGA and 37.5-40 kHz for SVGA. Note, however, that for bidirectional scanning, the frequency of the cantilever need only be half of these stated values. Alternatively, a macro-scale raster scanning device can be used as an identification (ID) scanner or a bar code scanner. For two-dimensional nonrectilinear scans using single or dual actuators, the waveguide must be driven with a large base excitation to attain a large FOV. By controlling the excitation frequency, phase, and amplitude, a steady in-and-out swirling scan pattern can be achieved at free end 216. A circular scan pattern can be excited by applying excitations in the horizontal and vertical directions ninety degrees out of phase. A circular pattern with varying radius can be produced by modulating the amplitude of the excitation. A rotating rectilinear scan pattern can be excited by applying electric potentials to two electrodes placed at a slight angle to each other rather than orthogonal to each other. To generate the rotation of the rectilinear motion, a larger voltage must be applied to the electrodes for one direction (e.g., the angled electrodes) than to the electrodes for the other direction (e.g., the vertical electrodes). The result is a line sweep rocking back and forth between 0 and 180 degrees.
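The circular-to-spiral behavior described above follows directly from driving the two axes ninety degrees out of phase while ramping the excitation amplitude. The following Python sketch demonstrates this; the 10 kHz drive frequency and 5 ms sweep are hypothetical values chosen only to make the example concrete.

```python
import numpy as np

f = 10e3                                  # Hz, assumed resonant drive frequency
t = np.linspace(0.0, 5e-3, 50000)
r = t / t[-1]                             # slowly growing excitation amplitude

x = r * np.cos(2 * np.pi * f * t)         # horizontal drive
y = r * np.sin(2 * np.pi * f * t)         # vertical drive, 90 degrees out of phase

radius = np.hypot(x, y)                   # instantaneous scan radius equals r
```

Because the instantaneous radius tracks the excitation amplitude exactly, amplitude modulation alone turns the circular pattern into the in-and-out swirling (spiral) scan.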
The relationship between deflection of free end 216 and the applied voltage can be nonlinear. To improve its linearity, an electrostatic comb drive can be used as the actuator such as that described by W. C. Tang et al. (IEEE Sensors and Actuator Workshop. A21, 23 (1990)). In a comb drive, the capacitance is varied through changing area, not the gap. Since capacitance is linearly related to area, the displacement will vary as the square of the applied voltage. In addition, harnessing the nonlinearity of cantilever deflection would be advantageous in that it would then be possible for a single actuator to generate two-dimensional (2D) motion of free end 216.
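The linearity argument can be made concrete with the standard textbook force expressions for the two electrode geometries; the finger count, film thickness, and gap values used below are hypothetical. The gap-closing (parallel-plate) force depends on the instantaneous gap, whereas the lateral comb-drive force does not depend on finger engagement, so deflection of a linear spring varies simply as V².

```python
EPS0 = 8.854e-12   # F/m, permittivity of free space

def gap_closing_force(V, area, gap):
    """Parallel-plate electrostatic force; varies as 1/gap**2, making the
    deflection-voltage relationship strongly nonlinear as the gap closes."""
    return EPS0 * area * V**2 / (2.0 * gap**2)

def comb_drive_force(V, n_fingers, thickness, gap):
    """Lateral comb-drive force; independent of finger engagement, so the
    static deflection of a linear spring scales simply with V**2."""
    return n_fingers * EPS0 * thickness * V**2 / gap

# Hypothetical geometry: doubling the voltage quadruples the comb-drive force,
# independent of how far the fingers are already engaged.
f1 = comb_drive_force(1.0, n_fingers=100, thickness=10e-6, gap=2e-6)
f2 = comb_drive_force(2.0, n_fingers=100, thickness=10e-6, gap=2e-6)
```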
For vertical actuator 230 shown in
The disadvantage of using a piezoelectric thin film for an actuator is that the actuator requires a high voltage for displacement in the micron regime. However, the problem can be partially alleviated by implementing a bimorph configuration. When mechanical pressure is applied to a piezoelectric material, its crystalline structure produces a voltage proportional to the pressure. Conversely, when an electric field is applied, the structure changes shape, producing dimensional changes in the material. Within a certain range of electric and thermal stress, a voltage change ΔV corresponds to a force change ΔF based on
ΔV = dij·x·ΔF/(εo·εr·A)
where dij is a charge sensitivity coefficient, x is the spacing between the two conducting plates of area A, and εo and εr are the permittivity of free space and the relative dielectric constant of the material, respectively. (For further detail, see G. S. Kino, Acoustic Waves: Devices, Imaging, and Analog Signal Processing (1987)). The electromechanical materials preferably used for a microactuator are zinc oxide (ZnO), lead zirconate titanate (PZT), and polyvinylidene fluoride (PVDF). A preferred way of depositing a ZnO thin film is sputtering. (For further detail, see S. B. Krupanidhi et al., J. Appl. Phys., 56, 3308 (1984); B. T. Khuri-Yakub et al., J. Appl. Phys., 52, 4772 (1981)). Depositing PZT usually involves either sputtering or a sol-gel method, which is based on spin application of a chemical solution. (For further detail, see A. Okada, J. Appl. Phys., 48, 2905 (1977); T. Tunkasiri et al., J. Mat. Sci. Lett., 19, 1913 (2000); G. Yi et al., J. Appl. Phys., 64 (1989); M. L. Wen et al., Proceedings of the SPIE, 3892 (1999)). PVDF is preferably deposited as a spin-cast film from a dilute solution in which PVDF powder has been dissolved.
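A quick numerical instance of the relation above helps fix the magnitudes involved. All parameter values here are illustrative assumptions (approximate literature figures for a thin ZnO film), not values taken from this description.

```python
EPS0 = 8.854e-12   # F/m, permittivity of free space

def piezo_voltage(d_ij, x, dF, eps_r, A):
    """Voltage change for a force change dF across a piezoelectric film,
    per the relation dV = d_ij * x * dF / (eps0 * eps_r * A)."""
    return d_ij * x * dF / (EPS0 * eps_r * A)

# Illustrative, assumed values for a thin ZnO film:
dV = piezo_voltage(d_ij=12e-12,          # C/N, approximate ZnO piezoelectric coefficient
                   x=1e-6,               # m, film thickness (plate spacing)
                   dF=1e-6,              # N, force change
                   eps_r=8.5,            # approximate ZnO relative permittivity
                   A=100e-6 * 100e-6)    # m^2, electrode area
```

Under these assumptions the micronewton-scale force change produces only tens of microvolts, consistent with the text's point that micron-scale piezoelectric actuation requires comparatively high drive voltages.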
As indicated by functional blocks 54 and 84 of
Position Sensors
A variety of position sensors can also be implemented to detect the position or other motion characteristic of the cantilever. The position sensor embodiment shown in
Another position sensor embodiment utilizes the piezoelectric effect. As with a piezoelectric actuator, a piezoelectric position sensor includes a piezoelectric thin film deposited on both sides of the cantilever. Displacement of the cantilever is determined by measuring the strain-induced electric field on the piezoelectric thin film. The configuration of a piezoelectric position sensor is the same as the piezoelectric actuator shown in
A similar, but alternative position sensor embodiment uses the piezoresistive effect, which results in a change of carrier mobility as a function of stress. Effectively, a piezoresistive position sensor comprises a semiconductor strain gauge, such as those taught by J. J. Wortman et al., IEEE Elect. Dev. 16, 855 (1969); B. Puers et al., IEEE Elect. Dev. 35, 764 (1988); and S. R. Manalis, Appl. Phys. Lett. 69, 3944 (1996). The position sensor comprises electrically conducting, strain-sensitive regions that are fabricated by diffusing impurities, such as a boron dopant, into a highly resistive, single-crystal cantilever. For example, a p-type layer of boron can be diffused into an n-type silicon layer on the cantilever. The doping process preferably comprises open-tube boron diffusion from boron nitride, or boron ion implantation.
A bi-axial displacement sensor is illustrated in
ΔR/R = πl·Tl + πt·Tt
where πl and πt are longitudinal and transverse piezoresistance coefficients, and Tl and Tt are stresses parallel and perpendicular to the direction of current in the layer.
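As a worked instance of the relation above, the fractional resistance change for a purely axial stress can be computed directly. The coefficient values below are approximate literature figures for p-type silicon, assumed here for illustration only.

```python
def fractional_resistance_change(pi_l, pi_t, T_l, T_t):
    """dR/R = pi_l*T_l + pi_t*T_t for a piezoresistive strain-gauge layer."""
    return pi_l * T_l + pi_t * T_t

# Approximate literature values for p-type silicon (assumed, illustrative):
pi_l = 72e-11    # 1/Pa, longitudinal piezoresistance coefficient
pi_t = -66e-11   # 1/Pa, transverse piezoresistance coefficient

# 10 MPa of stress parallel to the current, none perpendicular:
dRR = fractional_resistance_change(pi_l, pi_t, T_l=10e6, T_t=0.0)
```

A stress of 10 MPa thus yields a fractional resistance change of roughly 0.7%, a readily measurable signal for tracking cantilever deflection.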
As indicated by functional blocks 54 and 84 of
Alternatively, a ferromagnetic material can be deposited on the scanning waveguide, so that the free end position can be tracked with inductive coils. Conversely, a magnetic sensor can detect a change in magnetic field. As another alternative, an integrated, dual axis interferometer can be used to detect the vertical and horizontal position of the free end of the waveguide. A piezoresistive sensor can also be used in the cantilever to detect position. If space is available, a quadrant fiber bundle can be used to detect light provided from the free end of the waveguide. Additionally, the actuator itself can be used to detect position.
Light Detectors
As illustrated in
As an alternative to using a conventional mesa-geometry photodiode configuration for the light detector, the diode can be hybridized with a waveguide and a fiber detector to optimize the intensity absorption. As illustrated by
In any case, the small detecting area on the diode (approximately a few tens of microns in diameter) provides gigahertz (GHz)-range bandwidth. Also, because pairs of red, green, and blue photodetectors are required for capturing a color image, it is significant that these silicon-based photodiodes offer sufficient bandwidth across the visible spectrum (e.g., the photodetector bandwidths must exceed 12.5 MHz for the VGA and 19.8 MHz for the SVGA video standards). To improve the overall wavelength response with a modest bias, an intrinsic region of high resistivity can be added to the p-n junction to form a so-called PIN structure. To obtain a high current gain while maintaining a high operating frequency, an avalanche photodetector (APD) structure can be implemented, such as that described by P. P. Webb, IEEE Solid State Sensors Symposium, 96 (1970). In this device, a basic p-n structure is operated under a very high reverse bias. By setting the bias precisely at the point of avalanche breakdown, carrier multiplication due to impact ionization can produce significant gain in terms of an increase in the carrier-to-photon ratio. The current multiplication for an avalanche diode can be as high as four orders of magnitude (based on commercially available photovoltaic photodiodes and APDs from UDT Sensors, Ltd.).
Although the present invention has been described in connection with the preferred form of practicing it and modifications thereto, those of ordinary skill in the art will understand that many other modifications can be made to the present invention within the scope of the claims that follow. For example, modular components can be constructed separately, each with an optimal substrate. The modular components can then be bonded together using anodic, adhesive, or other bonding methods. Accordingly, it is not intended that the scope of the invention in any way be limited by the above description, but instead be determined entirely by reference to the claims that follow.
This application is a continuation of a copending patent application Ser. No. 10/655,482, filed on Sep. 4, 2003, which itself is a continuation-in-part of prior patent application Ser. No. 09/850,594, filed on May 7, 2001, and now issued on Dec. 13, 2005, as U.S. Pat. No. 6,975,898, the benefit of the filing date of which is hereby claimed under 35 U.S.C. §120. Application Ser. No. 09/850,594 is itself based on a prior copending provisional application Ser. No. 60/212,411, filed on Jun. 19, 2000, the benefit of the filing date of which is hereby claimed under 35 U.S.C. §119(e).
This invention was made with government support under grant numbers CA094303 and CA096633 awarded by the National Institutes of Health. The government has certain rights in this invention.
Number | Name | Date | Kind |
---|---|---|---|
3889662 | Mitsui | Jun 1975 | A |
3918438 | Hayamizu et al. | Nov 1975 | A |
4118270 | Pan et al. | Oct 1978 | A |
4234788 | Palmer et al. | Nov 1980 | A |
4265699 | Ladany | May 1981 | A |
4410235 | Klement et al. | Oct 1983 | A |
4454547 | Yip et al. | Jun 1984 | A |
4686963 | Cohen et al. | Aug 1987 | A |
4688555 | Wardle | Aug 1987 | A |
4695163 | Schachar | Sep 1987 | A |
4710619 | Haberl | Dec 1987 | A |
4743283 | Borsuk | May 1988 | A |
4758222 | McCoy | Jul 1988 | A |
4762118 | Lia et al. | Aug 1988 | A |
4768513 | Suzuki | Sep 1988 | A |
4782228 | Westell | Nov 1988 | A |
4804395 | Clark et al. | Feb 1989 | A |
4824195 | Khoe | Apr 1989 | A |
4850364 | Leavitt | Jul 1989 | A |
4928316 | Heritage et al. | May 1990 | A |
4979496 | Komi | Dec 1990 | A |
4983165 | Loiterman | Jan 1991 | A |
5037174 | Thompson | Aug 1991 | A |
5074642 | Hicks | Dec 1991 | A |
5103497 | Hicks | Apr 1992 | A |
5172685 | Nudelman | Dec 1992 | A |
5209117 | Bennett | May 1993 | A |
5231286 | Kajimura et al. | Jul 1993 | A |
5247174 | Berman | Sep 1993 | A |
5272330 | Betzig et al. | Dec 1993 | A |
5286970 | Betzig et al. | Feb 1994 | A |
5305759 | Kaneko et al. | Apr 1994 | A |
5321501 | Swanson et al. | Jun 1994 | A |
5360968 | Scott | Nov 1994 | A |
5381782 | DeLaRama et al. | Jan 1995 | A |
5394500 | Marchman | Feb 1995 | A |
5405337 | Maynard | Apr 1995 | A |
5425123 | Hicks | Jun 1995 | A |
5459803 | Yamane et al. | Oct 1995 | A |
5480046 | Filas et al. | Jan 1996 | A |
5507725 | Savage et al. | Apr 1996 | A |
5512035 | Konstorum et al. | Apr 1996 | A |
5535759 | Wilk | Jul 1996 | A |
5549542 | Kovalcheck | Aug 1996 | A |
5563969 | Honmou | Oct 1996 | A |
5570441 | Filas et al. | Oct 1996 | A |
5621830 | Lucey et al. | Apr 1997 | A |
5627922 | Kopelman et al. | May 1997 | A |
5643175 | Adair | Jul 1997 | A |
5649897 | Nakamura | Jul 1997 | A |
5668644 | Kuroiwa et al. | Sep 1997 | A |
5703979 | Filas et al. | Dec 1997 | A |
5715337 | Spitzer et al. | Feb 1998 | A |
5724169 | LaGasse | Mar 1998 | A |
5727098 | Jacobson | Mar 1998 | A |
5765561 | Chen et al. | Jun 1998 | A |
5894122 | Tomita | Apr 1999 | A |
5906620 | Nakao et al. | May 1999 | A |
5919200 | Stambaugh et al. | Jul 1999 | A |
5939709 | Ghislain et al. | Aug 1999 | A |
5947905 | Hadjicostis et al. | Sep 1999 | A |
5984860 | Shan | Nov 1999 | A |
5991697 | Nelson et al. | Nov 1999 | A |
6035229 | Silverstein et al. | Mar 2000 | A |
6046720 | Melville et al. | Apr 2000 | A |
6059719 | Yamamoto et al. | May 2000 | A |
6069698 | Ozawa et al. | May 2000 | A |
6081605 | Roth et al. | Jun 2000 | A |
6091067 | Drobot et al. | Jul 2000 | A |
6096054 | Wyzgala et al. | Aug 2000 | A |
6097528 | Lebby et al. | Aug 2000 | A |
6134003 | Tearney et al. | Oct 2000 | A |
6142957 | Diamond et al. | Nov 2000 | A |
6148095 | Prause et al. | Nov 2000 | A |
6161035 | Furusawa | Dec 2000 | A |
6169281 | Chen et al. | Jan 2001 | B1 |
6185443 | Crowley | Feb 2001 | B1 |
6191862 | Swanson et al. | Feb 2001 | B1 |
6211904 | Adair et al. | Apr 2001 | B1 |
6215437 | Schurmann et al. | Apr 2001 | B1 |
6240312 | Alfano et al. | May 2001 | B1 |
6241657 | Chen et al. | Jun 2001 | B1 |
6246914 | de la Rama et al. | Jun 2001 | B1 |
6289144 | Neuschafer et al. | Sep 2001 | B1 |
6294775 | Seibel et al. | Sep 2001 | B1 |
6327493 | Ozawa et al. | Dec 2001 | B1 |
6370422 | Richards-Kortum et al. | Apr 2002 | B1 |
6387119 | Wolf et al. | May 2002 | B2 |
6388641 | Tidwell et al. | May 2002 | B2 |
6441359 | Cozier et al. | Aug 2002 | B1 |
6443894 | Sumanaweera et al. | Sep 2002 | B1 |
6461337 | Minotti et al. | Oct 2002 | B1 |
6466687 | Uppaluri et al. | Oct 2002 | B1 |
6485413 | Boppart et al. | Nov 2002 | B1 |
6492962 | Melville et al. | Dec 2002 | B2 |
6515274 | Moskovits et al. | Feb 2003 | B1 |
6515781 | Lewis et al. | Feb 2003 | B2 |
6525310 | Dunfield | Feb 2003 | B2 |
6545260 | Ono et al. | Apr 2003 | B1 |
6546271 | Reisfeld | Apr 2003 | B1 |
6549801 | Chen et al. | Apr 2003 | B1 |
6550918 | Agostinelli et al. | Apr 2003 | B1 |
6563105 | Seibel et al. | May 2003 | B2 |
6563998 | Farah et al. | May 2003 | B1 |
6564087 | Pitris et al. | May 2003 | B1 |
6564089 | Izatt et al. | May 2003 | B2 |
6567678 | Oosta et al. | May 2003 | B1 |
6612980 | Chen et al. | Sep 2003 | B2 |
6615072 | Izatt et al. | Sep 2003 | B1 |
6678541 | Durkin et al. | Jan 2004 | B1 |
6685718 | Wyzgala et al. | Feb 2004 | B1 |
6687010 | Horii et al. | Feb 2004 | B1 |
6689064 | Hager et al. | Feb 2004 | B2 |
6690963 | Ben-Haim et al. | Feb 2004 | B2 |
6694983 | Wolf et al. | Feb 2004 | B2 |
6735463 | Izatt et al. | May 2004 | B2 |
6755532 | Cobb | Jun 2004 | B1 |
6773394 | Taniguchi et al. | Aug 2004 | B2 |
6779892 | Agostinelli et al. | Aug 2004 | B2 |
6785571 | Glossop | Aug 2004 | B2 |
6788967 | Ben-Haim et al. | Sep 2004 | B2 |
6818001 | Wulfman et al. | Nov 2004 | B2 |
6826342 | Bise et al. | Nov 2004 | B1 |
6832984 | Stelzer et al. | Dec 2004 | B2 |
6836560 | Emery | Dec 2004 | B2 |
6839586 | Webb | Jan 2005 | B2 |
6845190 | Smithwick et al. | Jan 2005 | B1 |
6856712 | Fauver et al. | Feb 2005 | B2 |
6858005 | Ohline et al. | Feb 2005 | B2 |
6872433 | Seward et al. | Mar 2005 | B2 |
6882429 | Weitekamp et al. | Apr 2005 | B1 |
6889175 | Green | May 2005 | B2 |
6892090 | Verard et al. | May 2005 | B2 |
6895270 | Ostrovsky | May 2005 | B2 |
6902528 | Garibaldi et al. | Jun 2005 | B1 |
6932829 | Majercak | Aug 2005 | B2 |
6975898 | Seibel et al. | Dec 2005 | B2 |
7004173 | Sparks et al. | Feb 2006 | B2 |
7023558 | Fee et al. | Apr 2006 | B2 |
7038191 | Kare et al. | May 2006 | B2 |
7072046 | Xie et al. | Jul 2006 | B2 |
7158234 | Uchiyama et al. | Jan 2007 | B2 |
7170610 | Knuttel | Jan 2007 | B2 |
7179220 | Kukuk | Feb 2007 | B2 |
7189961 | Johnston et al. | Mar 2007 | B2 |
7230583 | Tidwell et al. | Jun 2007 | B2 |
7252674 | Wyzgala et al. | Aug 2007 | B2 |
7261687 | Yang | Aug 2007 | B2 |
7324211 | Tsujita | Jan 2008 | B2 |
7349098 | Li et al. | Mar 2008 | B2 |
7366376 | Shishkov et al. | Apr 2008 | B2 |
7404929 | Fulghum, Jr. | Jul 2008 | B2 |
7447408 | Bouma et al. | Nov 2008 | B2 |
7473232 | Teague | Jan 2009 | B2 |
7515274 | Gelikonov et al. | Apr 2009 | B2 |
7530948 | Seibel et al. | May 2009 | B2 |
7615005 | Stefanchik et al. | Nov 2009 | B2 |
7616986 | Seibel et al. | Nov 2009 | B2 |
7747312 | Barrick et al. | Jun 2010 | B2 |
7783337 | Feldman et al. | Aug 2010 | B2 |
7901348 | Soper et al. | Mar 2011 | B2 |
20010030744 | Chang et al. | Oct 2001 | A1 |
20020071625 | Bartholomew et al. | Jun 2002 | A1 |
20030009189 | Gilson et al. | Jan 2003 | A1 |
20030032878 | Shahidi | Feb 2003 | A1 |
20030045778 | Ohline et al. | Mar 2003 | A1 |
20030055317 | Taniguchi et al. | Mar 2003 | A1 |
20030103199 | Jung et al. | Jun 2003 | A1 |
20030103665 | Uppaluri et al. | Jun 2003 | A1 |
20030142934 | Pan et al. | Jul 2003 | A1 |
20030160721 | Gilboa et al. | Aug 2003 | A1 |
20030179428 | Suzuki et al. | Sep 2003 | A1 |
20030208107 | Refael | Nov 2003 | A1 |
20030208134 | Secrest et al. | Nov 2003 | A1 |
20030216639 | Gilboa et al. | Nov 2003 | A1 |
20030220749 | Chen et al. | Nov 2003 | A1 |
20030236564 | Majercak | Dec 2003 | A1 |
20040015049 | Zaar | Jan 2004 | A1 |
20040015053 | Bieger et al. | Jan 2004 | A1 |
20040033006 | Farah | Feb 2004 | A1 |
20040061072 | Gu et al. | Apr 2004 | A1 |
20040118415 | Hall et al. | Jun 2004 | A1 |
20040147827 | Bowe | Jul 2004 | A1 |
20040176683 | Whitin et al. | Sep 2004 | A1 |
20040181148 | Uchiyama et al. | Sep 2004 | A1 |
20040199052 | Banik et al. | Oct 2004 | A1 |
20040243227 | Starksen et al. | Dec 2004 | A1 |
20040249267 | Gilboa | Dec 2004 | A1 |
20040260199 | Hardia et al. | Dec 2004 | A1 |
20050020878 | Ohnishi et al. | Jan 2005 | A1 |
20050020926 | Wiklof et al. | Jan 2005 | A1 |
20050036150 | Izatt et al. | Feb 2005 | A1 |
20050054931 | Clark | Mar 2005 | A1 |
20050065433 | Anderson | Mar 2005 | A1 |
20050085693 | Belson et al. | Apr 2005 | A1 |
20050111009 | Keightley et al. | May 2005 | A1 |
20050168751 | Horii et al. | Aug 2005 | A1 |
20050171438 | Chen et al. | Aug 2005 | A1 |
20050171592 | Majercak | Aug 2005 | A1 |
20050183733 | Kawano et al. | Aug 2005 | A1 |
20050206774 | Tsujimoto | Sep 2005 | A1 |
20050215854 | Ozaki et al. | Sep 2005 | A1 |
20050215911 | Alfano et al. | Sep 2005 | A1 |
20050228290 | Borovsky et al. | Oct 2005 | A1 |
20050250983 | Tremaglio et al. | Nov 2005 | A1 |
20050272975 | McWeeney et al. | Dec 2005 | A1 |
20060015126 | Sher | Jan 2006 | A1 |
20060030753 | Boutillette et al. | Feb 2006 | A1 |
20060052662 | Kress | Mar 2006 | A1 |
20060100480 | Ewers et al. | May 2006 | A1 |
20060126064 | Bambot et al. | Jun 2006 | A1 |
20060149134 | Soper et al. | Jul 2006 | A1 |
20060149163 | Hibner et al. | Jul 2006 | A1 |
20060187462 | Srinivasan et al. | Aug 2006 | A1 |
20060202115 | Lizotte et al. | Sep 2006 | A1 |
20060252993 | Freed et al. | Nov 2006 | A1 |
20070038119 | Chen et al. | Feb 2007 | A1 |
20070066983 | Maschke | Mar 2007 | A1 |
20070088219 | Xie et al. | Apr 2007 | A1 |
20070093703 | Sievert et al. | Apr 2007 | A1 |
20070129601 | Johnston et al. | Jun 2007 | A1 |
20070213618 | Li et al. | Sep 2007 | A1 |
20070270650 | Eno et al. | Nov 2007 | A1 |
20080004491 | Karasawa | Jan 2008 | A1 |
20080221388 | Seibel et al. | Sep 2008 | A1 |
Number | Date | Country |
---|---|---|
4428967 | Dec 1995 | DE |
0 713 672 | May 1996 | EP |
0 520 388 | Sep 1996 | EP |
1 077 360 | Feb 2001 | EP |
1 088 515 | Apr 2001 | EP |
1 142 529 | Oct 2001 | EP |
0 712 032 | Dec 2001 | EP |
1 310 206 | May 2003 | EP |
1 421 913 | May 2004 | EP |
0 910 284 | Jan 2007 | EP |
1 063 921 | Feb 2007 | EP |
05-154154 | Jun 1993 | JP |
06-511312 | Dec 1994 | JP |
2001174744 | Jun 2001 | JP |
WO 9320742 | Oct 1993 | WO |
WO 9602184 | Feb 1996 | WO |
WO 9838907 | Sep 1998 | WO |
WO 9843530 | Oct 1998 | WO |
WO 9904301 | Jan 1999 | WO |
WO 0197902 | Dec 2001 | WO |
WO 2005024496 | Mar 2005 | WO |
Entry |
---|
Barhoum et al., “Optical modeling of an ultrathin scanning fiber endoscope, a preliminary study of confocal versus non-confocal detection.” Optics Express, vol. 13, No. 19: 7548-7562, Sep. 19, 2005. |
Barnard et al., “Mode Transforming Properties of Tapered Single-mode Fiber Microlens.” Appl. Opt. vol. 32, No. 12: 2090-2094, Apr. 20, 1993. |
Barnard et al., “Single-mode Fiber Microlens with Controllable Spot Size.” Appl. Opt. vol. 30, No. 15: 1958-1962, May 20, 1991. |
Bird et al., “Two-photon fluorescence endoscopy with a micro-optic scanning head.” Optics Letters, vol. 28, No. 17: 1552-1554, 2003. |
Borreman et al., “Fabrication of Polymeric Multimode Waveguides and Devices in SU-8 Photoresist Using Selective Polymerization.” Proceedings Symposium IEEE/LEOS Benelux Chapter, Amsterdam: pp. 83-86, 2002. |
Brown et al., “Recognising Panoramas.” Proceedings of the Ninth IEEE International Conference on Computer Vision 8pp., Apr. 2003. |
Brunetaud et al., “Lasers in Digestive Endoscopy.” Journal of Biomedical Optics vol. 2, No. 1: 42-52, Jan. 1997. |
Chen et al., “Dispersion management up to the third order for real-time optical coherence tomography involving a phase or frequency modulator.” Optics Express vol. 12, No. 24: 5968-5978, 2004. |
Chen et al., “Optical Doppler tomographic imaging of fluid flow velocity in highly scattering media.” Optics Letters, vol. 22, No. 1: 64-66, 1997. |
Clark et al., “Fiber delivery of femtosecond pulses from a Ti:sapphire laser.” Optics Letters, vol. 26, No. 17: 1320-1322, 2001. |
Deschamps et al., “Automatic construction of minimal paths in 3D images: An application to virtual endoscopy.” CARS'99 —H. U Lemke, M.W. Vannier, K. Inamura & A.G. Fannan (Editors) Elsevier Science B.V.: 151-155, 1999. |
Dickensheets et al., “A Scanned Optical Fiber Confocal Microscope.” Three-Dimensional Microscopy SPIE vol. 2184: 39-47, 1994. |
Dickensheets et al., “Micromachined scanning confocal optical microscope.” Optics Letters, vol. 21, No. 10: 764-766, May 15, 1996. |
Drexler et al., “In vivo ultrahigh-resolution optical coherence tomography.” Optics Letters, vol. 24, No. 17: 1221-1223, 1999. |
Finci et al., “Tandem balloon catheter for coronary angioplasty.” Catheter Cardiovascular Diagnosis vol. 12, No. 6: 421-425, 1986. 2pp Abstract. |
Flusberg et al., “In vivo brain imaging using a portable 3.9 gram two-photon fluorescence microendoscope.” Optics Letters, vol. 30, No. 17: 2272-2274. 2005. |
Fu et al., “Nonlinear optical microscopy based on double-clad photonic crystal fibers.” Optics Express vol. 13, No. 14: 5528-5534 + supplemental page, 2005. |
Göbel et al., “Miniaturized two-photon microscope based on a flexible coherent fiber bundle and a gradient-index lens objective.” Optics Letters, vol. 29, No. 21: 2521-2523, 2004. |
Helmchen et al., “A Miniature Head-Mounted Two-Photon Microscope: High Resolution Brain Imaging in Freely Moving Animals.” NEURON vol. 31: 903-912, Sep. 27, 2001. |
Herline et al., “Surface Registration for Use in Interactive, Image-Guided Liver Surgery.” Computer Aided Surgery, vol. 5: 11-17, 1999. |
Higgins et al., “Integrated Bronchoscopic Video Tracking and 3D CT Registration for Virtual Bronchoscopy.” Medical Imaging 2003, vol. 5031: 80-89, 2003. |
Huang et al., “Optical Coherence Tomography.” Science vol. 254, Issue 5035: 1178-1181, 1991. |
Huber et al., “Amplified, frequency swept lasers for frequency domain reflectometry and OCT imaging: design and scaling principles.” Optics Express vol. 13, No. 9: 3513-3528, May 2, 2005.
Jung et al., “Multiphoton endoscopy.” Optics Letters, vol. 28, No. 11: 902-904, 2003.
Kiesslich et al., “Diagnosing Helicobacter pylori In Vivo by Confocal Laser Endoscopy.” Gastroenterology vol. 128: 2119-2123, 2005.
Kiraly et al., “Three-Dimensional Path Planning for Virtual Bronchoscopy.” IEEE Transactions on Medical Imaging, vol. 23, No. 9: 1365-1379, Sep. 2004.
Lee et al., “Microlenses on the End of Single-mode Optical Fibers for Laser Applications.” Appl. Opt. vol. 24, No. 19: 3134-3139, Oct. 1, 1985.
Lewis et al., “Scanned beam medical imager.” MOEMS Display and Imaging Systems II, edited by Hakan Urey, David L. Dickensheets, Proceedings of SPIE, Bellingham, WA, vol. 5348: 40-51, 2004.
Lexer et al., “Dynamic coherent focus OCT with depth-independent transversal resolution.” Journal of Modern Optics vol. 46, No. 3: 541-553, 1999.
Li et al., “Optical Coherence Tomography: Advanced Technology for the Endoscopic Imaging of Barrett's Esophagus.” Endoscopy, vol. 32, No. 12: 921-930, 2000.
Liu et al., “3D Navigation for Endoscope by Magnetic Field.” Proceedings of SPIE, vol. 4556: 25-28, 2001.
Liu et al., “Rapid-scanning forward-imaging miniature endoscope for real-time optical coherence tomography.” Optics Letters, vol. 29, No. 15: 1763-1765, 2004.
Martinez, O.E., “3000 Times Grating Compressor with Positive Group-Velocity Dispersion—Application to Fiber Compensation in 1.3-1.6 μm Region.” IEEE Journal of Quantum Electronics vol. 23: 59-64, 1987.
Mori et al., “A Method for Tracking camera motion of real endoscope by using virtual endoscopy system.” Proceedings of SPIE: 1-12, 2000. <http://www.toriwaki.nuie.nagoya-u.ac.jp>.
Morofke et al., “Wide dynamic range detection of bidirectional flow in Doppler optical coherence tomography using a two-dimensional Kasai estimator.” Optics Letters, vol. 32, No. 3: 253-255, Feb. 1, 2007.
Murakami et al., “A Miniature Confocal Optical Microscope With MEMS Gimbal Scanner.” The 12th International Conference on Solid State Sensors, Actuators and Microsystems, Boston: 587-590, Jun. 8-12, 2003.
Myaing et al., “Enhanced two-photon biosensing with double-clad photonic crystal fibers.” Optics Letters, vol. 28, No. 14: 1224-1226, 2003.
Ohmi et al., “Quasi In-Focus Optical Coherence Tomography.” Japanese Journal of Applied Physics vol. 43, No. 2: 845-849, 2004.
Oikawa et al., “Intra-operative Guidance with Real-time Information of Open MRI and Manipulators Using Coordinate-Integration Module.” Proceedings of SPIE, vol. 5029: 653-660, 2003.
Pagoulatos et al., “Image-based Registration of Ultrasound and Magnetic Resonance Images: A Preliminary Study.” Proceedings of SPIE, vol. 3976: 156-164, 2000.
Patterson et al., “Applications of time-resolved light scattering measurements to photodynamic therapy dosimetry.” SPIE vol. 1203, Photodynamic Therapy: Mechanism II: 62-75, 1990.
Pyhtila et al., “Determining nuclear morphology using an improved angle-resolved low coherence interferometry system.” Optics Express, vol. 11, No. 25: 3473-3484, Dec. 15, 2003.
Pyhtila et al., “Fourier-domain angle-resolved low coherence interferometry through an endoscopic fiber bundle for light-scattering spectroscopy.” Optics Letters, vol. 31, No. 6: 772-774, Dec. 1, 2005.
Pyhtila et al., “Rapid, depth-resolved light scattering measurements using Fourier domain, angle-resolved low coherence interferometry.” Optical Society of America: 6pp, 2004.
Podoleanu et al., “Three dimensional OCT images from retina and skin.” Optics Express vol. 7, No. 9: 292-298, 2000.
Qi et al., “Dynamic focus control in high-speed optical coherence tomography based on a microelectromechanical mirror.” Optics Communications vol. 232: 123-128, 2004.
Russo et al., “Lens-ended Fibers for Medical Applications: A New Fabrication Technique.” Appl. Opt. vol. 23, No. 19: 3277-3283, Oct. 1, 1984.
Sasaki et al., “Scanning Near-Field Optical Microscope using Cantilever Integrated with Light Emitting Diode, Waveguide, Aperture, and Photodiode.” Optical MEMS 2000 Invited Speakers: Advance Program, Sponsored by IEEE Lasers and Electro-Optics Society: 16pp, 2000. Available at <http://www.ieee.org/organizations/society/leos/LEOSCONF/MEMS/omspeak.html>.
Schmitt et al., “An optical coherence microscope with enhanced resolving power in thick tissue.” Optics Communications 142: 203-207, 1997.
Schwartz et al., “Electromagnetic Navigation during Flexible Bronchoscopy.” Interventional Pulmonology: Respiration, vol. 70: 516-522, 2003.
Seibel et al., “Unique Features of Optical Scanning, Single Fiber Endoscopy.” Lasers in Surgery and Medicine vol. 30: 177-183, 2002.
Shahidi et al., “Implementation, Calibration and Accuracy Testing of an Image-Enhanced Endoscopy System.” IEEE Transactions On Medical Imaging, vol. 21, No. 12: 1524-1535, 2002.
Shinagawa et al., “CT-Guided Transbronchial Biopsy Using an Ultrathin Bronchoscopic Navigation.” Chest, vol. 125, No. 3: 1138-1143, 2003.
Shiraishi et al., “Spot Size Reducer for Standard Single-Mode Fibers Utilizing a Graded-Index Fiber Tip.” ECOC 97: 50-53, Sep. 22-25, 1997.
Shoji et al., “Camera motion tracking of real endoscope by using virtual endoscopy system and texture information.” Proceedings of SPIE, vol. 4321: 122-133, 2001.
Skala et al., “Multiphoton Microscopy of Endogenous Fluorescence Differentiates Normal, Precancerous, and Cancerous Squamous Epithelial Tissues.” Cancer Research vol. 65, No. 4: 1180-1186, Feb. 15, 2005. Available at <www.aacrjournals.org>.
Smithwick et al., “Modeling and Control of the Resonant Fiber Scanner for Laser Scanning Display or Acquisition.” SID 03 DIGEST: 1455-1457, 2003.
Solomon et al., “Three-dimensional CT-Guided Bronchoscopy With a Real-Time Electromagnetic Position Sensor: A Comparison of Two Image Registration Methods.” Chest, vol. 118, No. 6: 1783-1787, 2000.
Srivastava, S., “Computer-Aided Identification of Ovarian Cancer in Confocal Microendoscope Images.” Department of Electrical and Computer Engineering, University of Arizona Graduate School, Thesis: 213pp, 2004.
Tearney et al., “Determination of the Refractive-Index of Highly Scattering Human Tissue by Optical Coherence Tomography.” Optics Letters, vol. 20, No. 21: 2258-2260, 1995.
Tsai et al., “All-Optical Histology Using Ultrashort Laser Pulses.” Neuron Cell Press, vol. 39: 27-41, Jul. 3, 2003.
Vakoc et al., “Comprehensive esophageal microscopy by using optical frequency-domain imaging (with video).” Gastrointestinal Endoscopy, vol. 65, No. 6: 898-905, 2007.
Wang et al., “Deep Reactive Ion Etching of Silicon Using an Aluminum Etching Mask.” Proceedings of SPIE, vol. 4876: 633-640, 2003.
Wilson et al., “Optical Reflectance and Transmittance of Tissues: Principles and Applications.” IEEE Journal of Quantum Electronics, vol. 26, No. 12: 2186-2199, Dec. 1990.
Xu et al., “3D Motion Tracking of pulmonary lesions using CT fluoroscopy images for robotically assisted lung biopsy.” Proceedings of SPIE, vol. 5367: 394-402, 2004.
Yamada et al., “Characteristics of a Hemispherical Microlens for Coupling Between a Semiconductor Laser and Single-Mode Fiber.” IEEE J. Quant. Electron, vol. QE-16, No. 10: 1067-1072, Oct. 1980.
Yamamoto et al., “Total enteroscopy with a nonsurgical steerable double-balloon method.” Gastrointestinal Endoscopy vol. 53, No. 2: 216-220, Feb. 2001. Abstract only.
Yang et al., “High speed, wide velocity dynamic range Doppler optical coherence tomography (Part I): System design, signal processing, and performance.” Optics Express, vol. 11, No. 7: 794-809, Apr. 7, 2003.
Yang et al., “Micromachined array tip for multifocus fiber-based optical coherence tomography.” Optics Letters, vol. 29, No. 15: 1754-1756, 2004.
Yelin et al., “Double-clad fiber for endoscopy.” Optics Letters, vol. 29, No. 20: 2408-2410, Oct. 15, 2004.
Yelin et al., “Three-dimensional miniature endoscopy.” Nature vol. 443: 765 plus supplemental information, Oct. 19, 2006. <www.nature.com/nature/journal/v443/n7113/extref/443765a-s2.doc>.
Yoon et al., “Analysis of Electro Active Polymer Bending: A Component in a Low Cost Ultrathin Scanning Endoscope.” Sensors and Actuators A: Physical: pp. 1-26, Submitted Jan. 13, 2006, Published Jul. 2006.
Yun et al., “Comprehensive volumetric optical microscopy in vivo.” Nature Medicine vol. 12, No. 12: 1429-1433, Dec. 2006.
Yun et al., “Motion artifacts in optical coherence tomography with frequency-domain ranging.” Optics Express vol. 12, No. 13: 2977-2998, Jun. 28, 2004.
Zhang et al., “In vivo blood flow imaging by a swept laser source based Fourier domain optical Doppler tomography.” Optics Express vol. 13, No. 19: 7449-7457, Sep. 19, 2005.
Zipfel et al., “Live tissue intrinsic emission microscopy using multiphoton-excited native fluorescence and second harmonic generation.” PNAS vol. 100, No. 12: 7075-7080, Jun. 10, 2003. Available at <www.pnas.org/cgi/doi/10.1073/pnas.0832308100>.
n. a., “Given® Diagnostic System: The Platform for PillCam™ Endoscopy.” Given Imaging Ltd.: 4pp, 2001-2004. <http://www.givenimaging.com>.
n. a., “NASA-Inspired Shape-Sensing Fibers Enable Minimally Invasive Surgery.” NASA Tech Briefs vol. 32, No. 2: 12, 14, Feb. 2008.
n. a., “NANO™ SU-8 2000 Negative Tone Photoresist Formulations 2002-2025.” MicroChem: 5pp, © 2001.
Number | Date | Country
---|---|---
20090235396 A1 | Sep 2009 | US

Number | Date | Country
---|---|---
60212411 | Jun 2000 | US

| Number | Date | Country
---|---|---|---
Parent | 10655482 | Sep 2003 | US
Child | 12434129 | | US

| Number | Date | Country
---|---|---|---
Parent | 09850594 | May 2001 | US
Child | 10655482 | | US