Embodiments of the present invention relate to microendoscopy and, specifically, to Three Dimensional Light-Field Microendoscopy with a GRIN Lens array.
Optical endoscopy has emerged as an indispensable clinical tool for modern minimally invasive surgery. Most endoscopy systems primarily capture a 2D projection of the 3D surgical field. Currently available 3D endoscopes can restore stereoscopic vision directly by projecting laterally shifted views of the operating field to each eye through 3D glasses. These tools provide surgeons with informative 3D visualizations, but they do not enable quantitative volumetric rendering of tissue, nor do they provide quantitative depth perception, effective anatomic landmark recognition, or an efficient learning curve for trainees. Therefore, advanced tools are desired to quantify tissue topography for high-precision microsurgery or medical robotics. Accordingly, there is a need for a device that provides surgeons with the depth perception they need, especially in surgical areas with in-depth extension, vascular encasement, or dense tumor structures.
Accordingly, the present invention is directed to Three Dimensional Light-Field Microendoscopy with a GRIN Lens array that obviates one or more of the problems due to limitations and disadvantages of the related art.
An advantage of the present invention is to provide an endoscopic imaging platform that allows for real-time, quantitative 3D anatomical visualization and interpretation during complex procedures.
In accordance with the purpose(s) of this invention, as embodied and broadly described herein, this invention, in one aspect, relates to a microendoscopy system. The system is based on an endoscopic probe. The probe has a plurality of integrated fiber optics for uniform illumination and an array of gradient index (GRIN) lenses configured to capture a reflected light field from one or more on-axis sampling of the reflected light field and two or more off-axis sampling of the reflected light field.
In the microendoscopic system or probe, the captured reflected light field may include 6 off-axis samples. The plurality of integrated fiber optics may include 3 or more fibers embedded within an endoscopic probe core, each having an illumination output. Each of the fiber optics may include an illumination output at an end of the endoscopic probe, wherein the illumination outputs are distributed equally at the end of the endoscopic probe. In an aspect, the GRIN lens array may have 6 GRIN lenses. The 6 GRIN lenses may be positioned in a hexagonal array at an end of the endoscopic probe. The probe may further include an illumination output of one of the integrated fiber optics at a center of the hexagonal array at the end of the endoscopic probe. The GRIN lenses of the GRIN lens array may alternate with illumination outputs of individual ones of the integrated fiber optics at an end of the endoscopic probe. The system may include an image processing unit configured to receive the reflected light field from the endoscopic probe to reconstruct an image using a ray-optics reconstruction operation. A first subset of the GRIN lenses in the array may have a first property, and a second subset of the GRIN lenses in the array may have a second property different from the first. The system or probe may include a third subset of GRIN lenses having a third property different from the first property and the second property. The integrated fiber optics may include LED light sources.
According to principles described herein, the point-spread function (PSF) calibration may involve fitting an experimental depth-dependent magnification function M(z) with a ray-optics model of magnification. The system or probe may further include a coherent light source and a camera for receiving reflected laser light.
A method of performing microendoscopy according to the principles described herein includes providing illumination from an end of an endoscopic probe, the endoscopic probe having a plurality of integrated fiber optic illumination sources for uniform illumination; capturing a reflected light field, for light-field imaging, through an array of gradient index (GRIN) lenses, including (i) one or more on-axis sampling of the reflected light field and (ii) two or more off-axis sampling of the reflected light field; and reconstructing an image via a reconstruction algorithm using (i) the one or more on-axis sampling of the reflected light field and (ii) the two or more off-axis sampling of the reflected light field. The method may further include providing a laser light source from the end of the endoscope and performing laser speckle contrast imaging (LSCI) using reflection of light provided by the laser light source.
Further embodiments, features, and advantages of the Three Dimensional Light-Field Microendoscopy with GRIN Lens array, as well as the structure and operation of the various embodiments of the Three Dimensional Light-Field Microendoscopy with GRIN Lens array, are described in detail below with reference to the accompanying drawings.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only, and are not restrictive of the invention as claimed.
The accompanying figures, which are incorporated herein and form part of the specification, illustrate Three Dimensional Light-Field Microendoscopy with GRIN Lens array. Together with the description, the figures further serve to explain the principles of the Three Dimensional Light-Field Microendoscopy with GRIN Lens array described herein and thereby enable a person skilled in the pertinent art to make and use the Three Dimensional Light-Field Microendoscopy with GRIN Lens array.
Reference will now be made in detail to embodiments of the Three Dimensional Light-Field Microendoscopy with GRIN Lens array with reference to the accompanying figures. The same reference numbers in different drawings may identify the same or similar elements.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention, provided they come within the scope of the appended claims and their equivalents.
Light-field imaging presents a promising solution to this challenge. The approach can capture both the spatial and angular information of optical signals, permitting the computational synthesis of the 3D volume of an object. Presented herein is GRIN lens array microendoscopy (GLAM) as a single-shot, full-color, and quantitative 3D microendoscopy system. GLAM contains integrated fiber optics for illumination and a GRIN lens array to capture the reflected light field. The system can provide a 3D resolution of ˜100 μm over an imaging depth of ˜22 mm and a field of view up to 1 cm2. GLAM maintains, in some embodiments, a small form factor consistent with clinically desirable designs, making the system readily translatable to a clinical prototype.
GLAM can be configured, in some embodiments, as a compact, single-shot, full-color, and quantitative 3D microendoscopy system. By subsampling the angular component of the light field, we gain access to the axial dimension, achieving a 3D resolution of ˜100 μm over an imaging depth of ˜22 mm and a field of view up to ˜1 cm2. The system can incorporate a GRIN lens array instead of the two-lens stereoscopic scheme, promising an alternative paradigm for clinical applications requiring high-resolution, quantitative volumetric measurements. GLAM exhibits a small form factor, a prototype readily translatable to further preclinical and clinical testing.
Specifically, in some embodiments, the system PSF is used for pre-calibration of the 3D reconstruction algorithm. This has the advantages of 1) accounting for misalignments or other experimental anomalies and 2) making the quantitative 3D reconstruction sample-independent. One current limitation of the PSF calibration approach is that it assumes a nominal lens separation of 1.4 mm, which may differ slightly between lenses due to inhomogeneities in the 3D-printed core. Future versions of the analysis software could address this by directly calibrating the lens pitch of the GLAM system, which may also improve the estimation of the PSF offset and GRIN-to-relay spacing. For the reconstruction, speeds of ˜0.9 seconds per millimeter have been obtained over multiple millimeters of depth without any further optimization of the algorithm or processing hardware. Through the use of a GPU and optimization of the reconstruction algorithm to fully utilize such hardware, the reconstruction algorithm should be able to achieve video-rate, real-time 3D imaging and visualization.
The use of the GRIN lens array (GLA) enables quantitative depth estimation and allows for simple chromatic calibration for accurate RGB depth encoding. The pinhole image offsets in the off-axis elemental images provide a quick readout for axial chromatic aberrations in the GLA, which has been incorporated into the system magnification function M(z). Traditionally, this information would be obtained through a more complicated imaging protocol involving scanning optics and a fluorescent sample. In contrast, this calibration method offers a fast readout of axial aberrations suitable for incorporating into the downstream analysis.
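A minimal sketch of such a readout, assuming the pinhole spot has already been localized within each off-axis elemental image (the ROI bounds and array-center coordinates here are hypothetical placeholders, not values from the system):

```python
import numpy as np

def pinhole_centroid(channel, roi):
    """Intensity-weighted centroid of the pinhole spot inside one
    elemental-image region of interest (roi = (y0, y1, x0, x1),
    hypothetical bounds located during lens alignment)."""
    y0, y1, x0, x1 = roi
    patch = channel[y0:y1, x0:x1].astype(np.float64)
    ys, xs = np.mgrid[y0:y1, x0:x1]
    total = patch.sum()
    return np.array([(ys * patch).sum() / total, (xs * patch).sum() / total])

def chromatic_offsets(rgb_image, roi, array_center):
    """Radial offset of the pinhole image from the array center, per RGB
    channel; channel-to-channel differences in this offset read out axial
    chromatic aberration for incorporation into the per-channel M(z)."""
    center = np.asarray(array_center, dtype=np.float64)
    return [np.linalg.norm(pinhole_centroid(rgb_image[..., c], roi) - center)
            for c in range(3)]
```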
The 3D reconstruction obtained with the GLAM system demonstrates robust axial sectioning capability and shows recovered depth information about opaque, reflective samples on the microscale. Notably, the GLA assembly used in this system can be used as a blueprint that is readily reconfigurable and scalable by altering the pitch and focal distance of the system to match desired applications. Additionally, GRIN-to-relay spacing or additional relay lenses can tune the magnification function as needed. Increasing the pitch will improve axial resolution with the trade-off of a smaller field of view and a more prominent form factor, while decreasing the pitch has the opposite effect. An exemplary effect of adjusting the GRIN-to-relay spacing can be seen in the accompanying figures.
The use of integrated illumination is another characteristic that makes the endoscope system viable for practical use. Around the GRIN lens array are six optical fibers, which provide uniform illumination to the area in front of the endoscope. A computer can continuously control the intensities of the illumination to provide the appropriate amount of lighting for data sampling.
The properties, including the high 3D resolution, full-color acquisition, and computational simplicity, promise GLAM for future advancement and realization for surgical procedures. Furthermore, the system offers the potential to integrate quantitative 3D imaging with other devices, such as surgical robotics, to conduct more accurate automated 3D navigation within tight spaces in the body. Such a strategy for microimaging in three dimensions could be extended beyond the medical realm for general engineering and manufacturing purposes.
An optical microscopy system may include an endoscopic probe that has a plurality of integrated fiber optics for uniform illumination, and a gradient index (GRIN) lens array configured to capture a reflected light field (e.g., for light-field imaging) from one or more on-axis sampling of the reflected light field (e.g., 1) and two or more off-axis sampling of the reflected light field (e.g., 6). The plurality of integrated fiber optics may include 3 or more fibers (e.g., 6) embedded within an endoscopic probe core. The system may further include an image processing unit configured to receive the reflected light field from the endoscopic probe to reconstruct an image using a ray-optics reconstruction operation. The image processing unit may be configured to employ point-spread function (PSF) calibration in the ray-optics reconstruction operation. The PSF calibration may involve fitting an experimental depth-dependent magnification function M(z) with a ray-optics model of magnification.
A method of operating an optical microscopy system according to principles described herein may include uniformly illuminating an endoscopic probe having a plurality of integrated fiber optics for uniform illumination; capturing a reflected light field, for light-field imaging, through a gradient index (GRIN) lens array, including (i) one or more on-axis sampling of the reflected light field and (ii) two or more off-axis sampling of the reflected light field; and reconstructing an image via a reconstruction algorithm using (i) the one or more on-axis sampling of the reflected light field and (ii) the two or more off-axis sampling of the reflected light field.
The method may include calibrating the reconstruction algorithm using a point-spread function (PSF) that fits an experimental depth-dependent magnification function M(z) with a ray-optics model of magnification.
A non-transitory computer-readable medium may be provided having instructions stored thereon, wherein execution of the instructions by a processor causes the processor to perform any of the methods described herein.
A non-transitory computer-readable medium may be provided having instructions stored thereon, wherein execution of the instructions by a processor causes the processor to perform operations for any of the systems described herein.
In an exemplary embodiment, an optical microendoscopy system may include an endoscopic probe, the probe having a plurality of integrated fiber optics for uniform illumination and an array of gradient index (GRIN) lenses configured to capture a reflected light field from one or more on-axis sampling of the reflected light field and two or more off-axis sampling of the reflected light field. The captured reflected light field may include 6 off-axis samples.
The plurality of integrated fiber optics may include 3 or more fibers embedded within an endoscopic probe core, each having an illumination output.
Each of the fiber optics may include an illumination output at an end of the endoscopic probe, wherein the illumination outputs are distributed equally at the end of the endoscopic probe. The GRIN lens array may have 6 GRIN lenses.
The 6 GRIN lenses may be positioned in a hexagonal array at the end of the endoscopic probe. In this aspect, the optical microendoscopy system may include a seventh GRIN lens at a center of the hexagonal array at the end of the endoscopic probe.
The optical microscopy system may further include an illumination output of one of the integrated fiber optics at a center of the hexagonal array at the end of the endoscopic probe.
GRIN lenses of the GRIN lens array may alternate with illumination outputs of individual ones of the integrated fiber optic at an end of the endoscopic probe.
In an aspect, the optical microendoscopy system may further include an image processing unit configured to receive the reflected light field from the endoscopic probe to reconstruct an image using a ray-optics reconstruction operation.
The image processing unit may be configured to employ point-spread function (PSF) calibration in the ray-optics reconstruction operation. The point-spread function (PSF) calibration may involve fitting an experimental depth-dependent magnification function M(z) with a ray-optics model of magnification.
In an aspect, a first subset of the GRIN lenses in the array may comprise a first property, and a second subset of the GRIN lenses in the array may comprise a second property different from the first. For example, the first property may be a first polarization and the second property may be a second polarization. A third subset of GRIN lenses may comprise a third property different from the first property and the second property.
The first property, the second property and the third property may be, for example, polarization, fluorescence in addition to the bright field, wavelength, or time-gated light. The integrated fiber optics may include LED light sources. The endoscopic probe may include a Hopkins rod lens. The endoscopic probe may include a gradient index relay. There may be a laser light source at an end of the endoscopic probe and a camera for receiving reflected laser light. The laser light source may illuminate in a near-infrared range.
A method of operating an optical microendoscopy system according to principles described herein may include providing illumination from the end of an endoscopic probe, the endoscopic probe having a plurality of integrated fiber optic illumination sources for uniform illumination; capturing a reflected light field, for light-field imaging, through an array of gradient index (GRIN) lenses, including (i) one or more on-axis sampling of the reflected light field and (ii) two or more off-axis sampling of the reflected light field; and reconstructing an image via a reconstruction algorithm using (i) the one or more on-axis sampling of the reflected light field and (ii) the two or more off-axis sampling of the reflected light field. The method may also include calibrating the reconstruction algorithm using a point-spread function (PSF) that fits an experimental depth-dependent magnification function M(z) with a ray-optics model of magnification.
The method may also include reconstructing images of a microscopy sample at different z-axis distances from the microscopy sample. Illumination from the end of the endoscopic probe may be provided by a plurality of illumination outputs of the integrated fiber optic light sources in a uniform distribution at the end of the endoscopic probe.
The GRIN lenses of the GRIN lens array may alternate with illumination outputs of individual ones of the integrated fiber optic at an end of the endoscopic probe. The array of GRIN lenses may be a hexagonal array.
The method may include providing a laser light source from the end of the endoscope and performing laser speckle contrast imaging (LSCI) using reflection of light provided by the laser light source. The laser light source may illuminate in a near infrared range. The laser speckle contrast imaging may involve capturing an image of reflected laser light and measuring speckle contrast of different areas of the image of the reflected laser light. The speckle contrast may be a function of pixel illumination and window mean intensity. The LSCI may be performed before reconstructing the image via the reconstruction algorithm, and results of the reconstructing are superimposed on results of the LSCI. Alternatively, reconstructing the image via the reconstruction algorithm may be performed before the LSCI, and results of the LSCI are superimposed on results of the reconstructing.
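As described above, the speckle contrast is a function of pixel illumination and window mean intensity; one common formulation is the ratio of the local standard deviation to the local mean over a sliding window. A minimal sketch of that computation (the 7-pixel window is an assumed default, not a value specified herein):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw_speckle, window=7):
    """Local speckle contrast K = sigma / mean over a sliding window;
    low K indicates motion (e.g., flow) blurring the speckle pattern."""
    img = raw_speckle.astype(np.float64)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    var = np.maximum(mean_sq - mean * mean, 0.0)  # clamp round-off negatives
    return np.sqrt(var) / np.maximum(mean, 1e-12)
```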
A non-transitory computer readable medium may be provided having instructions stored thereon, wherein execution of the instructions by a processor causes the processor to perform any method described herein.
A system for processing images in microendoscopy according to principles described herein may include a memory comprising executable instructions and a processor configured to execute the executable instructions and cause the system to perform any method described herein.
While described herein as being directed to an optical microendoscopy system, the probe described herein, separate from the overall optical microendoscopy system, falls within the scope of this disclosure. That is, an endoscopic probe may have a plurality of integrated fiber optics for uniform illumination and an array of gradient index (GRIN) lenses configured to capture a reflected light field from one or more on-axis sampling of the reflected light field and two or more off-axis sampling of the reflected light field. The probe can include all of the configurations described herein. For example, the plurality of integrated fiber optics may include 3 or more fibers embedded within an endoscopic probe core, each having an illumination output. Each of the fiber optics may include an illumination output at an end of the endoscopic probe, wherein the illumination outputs are distributed equally at the end of the endoscopic probe. The GRIN lens array may have 6 GRIN lenses. The 6 GRIN lenses may be positioned in a hexagonal array at an end of the endoscopic probe. The probe may include a seventh GRIN lens at a center of the hexagonal array at the end of the endoscopic probe. The optical microscopy system may further include an illumination output of one of the integrated fiber optics at a center of the hexagonal array at the end of the endoscopic probe. GRIN lenses of the GRIN lens array may alternate with illumination outputs of individual ones of the integrated fiber optics at an end of the endoscopic probe.
It should be noted that the number of GRIN lenses can be reduced, perhaps at the cost of some resolution, without departing from the spirit and scope of the principles described herein. For example, 3, 4, 5, 6, or 7 GRIN lenses can be used as long as there is sufficient angular information to reconstruct a 3D image as described herein. It is also possible to replace some of the GRIN lenses with a light source. For example, in one configuration, 6 GRIN lenses are provided in a hexagonal configuration with a light source at the center. Additional or alternate light sources can be provided at a predetermined pitch on the end of the endoscopic probe to provide uniform or known illumination.
The prototype included a hexagonal GRIN lens array (GLA) developed to harness the benefits of light-field over stereoscopic imaging while mitigating the imaging quality trade-offs induced with conventional light-field methods, as depicted in the accompanying figures.
In some embodiments, GLAM can be constructed from a combination of off-the-shelf components and custom 3D-printed parts. All custom parts were designed in SOLIDWORKS® and printed with a resin 3D printer (Form 2, FLGPBK023D, Formlabs). The GLAM imaging probe is composed of an inner 3D-printed core that houses seven 1-mm-diameter 0.5-NA GRIN lens assemblies (GRINTECH, GT-ERLS-100-005-150) arranged hexagonally with a pitch of 1.4 mm within individual tubes. Illumination is provided by an LED source (Thorlabs, MWWHF2), and the outside of the core has six grooves for threading optical fibers (Thorlabs, BF72HS01) between the inner core and outer sheath. The outer sheath is a 304 Stainless Steel Tube (McMaster-Carr, 8987K7) with an outer diameter of 0.203″ (5.15 mm) and a wall thickness of 0.01″ (0.25 mm). The probe fits into a custom holder with a side access door to align the lenses.
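For illustration, the hexagonal arrangement described above can be expressed as lens-center coordinates; the following sketch uses the 1.4-mm pitch of the prototype (the function name and layout convention are illustrative):

```python
import numpy as np

PITCH_MM = 1.4  # center-to-center GRIN lens spacing in the prototype

def hex_lens_centers(pitch=PITCH_MM):
    """(x, y) centers in mm for the seven-lens layout: one on-axis lens
    surrounded by six off-axis lenses at 60-degree increments."""
    angles = np.deg2rad(np.arange(0, 360, 60))
    ring = np.column_stack([pitch * np.cos(angles), pitch * np.sin(angles)])
    return np.vstack([[0.0, 0.0], ring])
```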
The system may utilize a ray-optics reconstruction without any computationally costly deconvolution to reduce the computational burden for eventual clinical application. This approach can enable full-color quantitative reconstruction of spatially dense samples over centimeters of depth while reducing reconstruction time by an order of magnitude over the previous wave-optics model. The reconstruction procedure is based on the optical parametrization outlined in the accompanying figures.
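As an illustration of the general idea behind such a ray-optics back projection (a simplified shift-and-add sketch, not the exact implementation in the referenced software), each elemental image is shifted to undo the parallax that the calibrated magnification predicts at a candidate depth, and the shifted views are averaged:

```python
import numpy as np
from scipy.ndimage import shift as subpixel_shift

def backproject(elemental_images, lens_centers_px, disparity):
    """Shift-and-add back projection at one candidate depth.
    disparity: image shift (pixels) per unit lens offset at this depth,
    derived from the calibrated magnification function M(z).
    lens_centers_px: (x, y) lens offsets from the array center, in the
    same units the disparity was calibrated against."""
    acc = np.zeros(elemental_images[0].shape, dtype=np.float64)
    for img, (cx, cy) in zip(elemental_images, lens_centers_px):
        # undo the depth-dependent parallax of this viewpoint
        acc += subpixel_shift(img.astype(np.float64),
                              (-disparity * cy, -disparity * cx), order=1)
    return acc / len(elemental_images)
```

Sweeping the disparity over a range of candidate depths and keeping, per pixel, the depth that maximizes a local sharpness metric is one simple way such a back projection can yield a quantitative depth map.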
A detailed discussion of the point-spread function (PSF) calibration can be found below. The latest version of the software will be available upon publication at: https://github.com/ShuJiaLab/3D_endoscopy.
A light-field system maps axial shifts in the object space to lateral shifts in the image space—thus encoding the depth into 2D information. The reconstruction process is an inverse mapping, i.e., a back projection. As a pre-calibration step for reconstruction, the PSF of the system is acquired as depicted schematically in the accompanying figures.
As the pinhole is translated in steps of Δz0 = 50 μm, there is a corresponding shift in image space given by Δx. The shift Δx is proportional to the magnification change of the system over this depth interval.
To fully characterize the mapping from the object to image space, we further solved for the unknowns offset O and GRIN-to-relay spacing a. This is achieved by fitting the experimental M(z) with a ray-optics model of magnification in our system. Changing the GRIN-to-relay spacing will alter the magnitude of M(z), while different PSF offsets will left- or right-shift the portion of the curve captured by our PSF.
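A sketch of this calibration under simplifying assumptions: the pinhole's centroid offset in an off-axis elemental image, divided by the nominal lens pitch, yields M at each depth, and the resulting curve is fit with a simple hyperbolic stand-in for the full ray-optics model. The true model follows from the system's ray-transfer matrices with the offset O and spacing a as free parameters; the functional form and the synthetic data below are illustrative only:

```python
import numpy as np
from scipy.optimize import curve_fit

LENS_PITCH_MM = 1.4  # nominal lens separation discussed above

def mag_model(z, amplitude, offset):
    """Illustrative hyperbolic stand-in for the ray-optics magnification,
    M(z) = amplitude / (z + offset): 'offset' plays the role of the PSF
    offset O, while 'amplitude' absorbs the GRIN-to-relay spacing a."""
    return amplitude / (z + offset)

# Hypothetical calibration data: pinhole depths (mm) and the measured
# centroid offsets (mm) of the pinhole image in one off-axis lens.
z_steps = np.arange(0.0, 22.0, 0.05)                       # 50-um axial steps
centroid_offsets = 4.2 * LENS_PITCH_MM / (z_steps + 3.0)   # synthetic stand-in
m_measured = centroid_offsets / LENS_PITCH_MM              # per-depth M(z)

popt, pcov = curve_fit(mag_model, z_steps, m_measured, p0=(1.0, 1.0))
```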
GLAM is calibrated through the acquisition of the PSF of the system. A pinhole is aligned along the optical axis of the center GRIN lens and translated axially.
The RGB magnification curves were averaged prior to fitting the model to determine the average pinhole position, as shown in the accompanying figures.
Each of the seven GRIN lenses in GLAM is separately aligned until its image sharpness reaches a maximum. Characterization of the lens-by-lens alignment of the off-axis lenses can be seen in the accompanying figures.
The six M(z) functions are averaged in each RGB channel, as shown in the accompanying figures.
The effective field of view of GLAM will change over the imaging depth depending on the overlap in the viewing region of each lens as well as the changing effective pixel size (see the accompanying figures).
The magnification can also be used to calculate the effective image pixel size at different depths, which was used to estimate the resolution of the system. To quantify the lateral resolution of GLAM, we mounted and imaged a USAF resolution target (R1DS1N, Thorlabs) on a motorized linear translation stage. These images of the target were taken using transmitted light.
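As a simple illustration of that conversion (the sensor pixel pitch and magnification below are hypothetical placeholders, not specifications of the actual camera):

```python
def effective_pixel_size_um(camera_pixel_um, magnification):
    """Object-space size of one camera pixel at a given depth:
    camera pixel pitch divided by |M(z)|."""
    return camera_pixel_um / abs(magnification)

# e.g., a hypothetical 3.45-um sensor pixel at M(z) = 0.3 maps to
# 11.5 um in object space; two-pixel (Nyquist) sampling would then
# bound the recoverable lateral detail near ~23 um at that depth.
print(effective_pixel_size_um(3.45, 0.3))
```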
The axial resolution of the system was measured using the same USAF resolution target and transmitted light. The USAF target was mounted on a rotating stage, allowing imaging of the target at a range of angles. Angling the target introduced a variable deviation in the axial position of the bars on the target that was used to determine the smallest axial distance that the system could resolve. The target was imaged at 0°, 10°, 20°, 30°, 40°, and 45° angles.
The distance between the target and the endoscope was such that the middle bar of the (2,2) group on the USAF target was 6.5 mm from the endoscope when the target was angled at 20°.
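The geometry behind this measurement: rotating a flat target by an angle θ maps an in-plane separation d between two bars to a lateral separation of d·cos θ and an axial separation of d·sin θ. A short sketch with hypothetical numbers:

```python
import numpy as np

def axial_offset_mm(in_plane_separation_mm, tilt_deg):
    """Axial separation between two features on a flat target rotated
    by tilt_deg: d * sin(theta)."""
    return in_plane_separation_mm * np.sin(np.deg2rad(tilt_deg))

# e.g., bars 0.5 mm apart on the target, tilted 20 degrees (hypothetical):
dz = axial_offset_mm(0.5, 20.0)   # about 0.17 mm of axial separation
```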
Lastly, we validated the performance of the GLAM system for phantom organs that contain fine features at various axial positions. In particular, the anatomical structures within a 3D printed heart model were imaged.
Laser speckle is an interference pattern produced by light reflected or scattered from different parts of the illuminated surface. If the surface is rough (surface height variations larger than the wavelength of the laser light used), light from different parts of the surface within a resolution cell (the area just resolved by the optical system imaging the surface) traverses different optical path lengths to reach the image plane. (In the case of an observer looking at a laser-illuminated surface, the resolution cell is the resolution limit of the eye and the image plane is the retina.) The resulting intensity at a given point on the image is determined by the superposition of all waves arriving at that point. If the resultant amplitude is zero because all the individual waves cancel out, a dark speckle is seen at the point; if all the waves arrive at the point in phase, an intensity maximum is observed.
Physicians are accustomed to endoscopes that provide two-dimensional images originating from three-dimensional structures. However, surgeons' access to topological information in the axial dimension of the surgical field facilitated by rigid 3D endoscopy has been reported to reduce operation times and errors, especially in surgeries involving significant in-depth extension or 3D tissue complexity. Furthermore, 3D endoscopy has shown promise for improving the learning experience of medical trainees.
Current clinical technologies incorporate stereoscopic imaging principles, in which two apertures record the tissue landscape simultaneously. This information is relayed to the surgeon as a 2D image on either a head-mounted display or a specialized monitor and can be converted to a 3D perception using polarized eyewear. While demonstrating the power of restoring depth perception to minimally invasive surgery, stereoscopic approaches suffer ergonomic and analytical downsides. Practically, the eyewear can cause dizziness and headaches after long periods of use, with some surgeons reporting excessive strain with the stereoscopic systems compared to conventional 2D endoscopy, even though operation times can be reduced. Additionally, stereoscopic vision lacks quantitative 3D recording and reconstruction for intraoperative decisions, subsequent diagnostics, or data storage. As a result, surgeries may require other imaging procedures, such as micro-CT or MRI, to quantify the 3D morphology of tissue. Indeed, acquiring quantitative volumetric information during surgical procedures has significant implications for diagnostics, treatment, and integration with digital and robotic devices.
In contrast, to achieve 3D reconstruction without eyewear, computational approaches to stereoscopic endoscopy, such as deformable shape-from-motion and shape-from-shading, have been proposed to quantify the 3D surface. However, these algorithms are highly sample-dependent and may suffer from reduced temporal resolution due to required probe translation. Other quantitative approaches to stereoscopic imaging using epipolar geometry and the pinhole camera model have been shown to attain quantitative results but remain limited by the nonuniform field of view due to the use of only two apertures.
Light-field imaging is an optical methodology that addresses the limitations of the two-aperture approach in stereoscopic imaging. Light-field imaging, often used in 3D microscopy, is characterized by sampling the 2D spatial and 2D angular components of the light field with a lens or camera array. For example, this approach has been applied to endoscopy by Orth and colleagues, who demonstrated that 3D light-field imaging could be obtained with a flexible multi-mode fiber bundle. In contrast, rigid light-field endoscopy and otoscopy systems have also been proposed, utilizing microlenses or tunable electro-optic lens arrays, both of which, however, lead to a significant reduction in lateral resolution. Therefore, there remains a demand for lens-based 3D endoscopy techniques that maintain high spatial resolution while providing quantitative depth information.
In this work, we demonstrate fast, 3D, multi-color microendoscopic imaging achieved by using a hexagonal gradient index (GRIN) lens array. This GRIN lens array microendoscopy system, or GLAM, provides a quantitative 3D imaging methodology with high lateral and axial resolution. With the capability to detect the depth of features with sub-millimeter accuracy, GLAM is designed to meet many of the functional and physical requirements of a practical endoscope, including a small-diameter stainless steel shaft, built-in illumination, and multi-color imaging. With this combination of optical functionality and realistic endoscope design, GLAM demonstrates that quantitative 3D light-field imaging using GRIN lenses can be practically applied to rigid endoscopy, paving the way for future development using this optical paradigm. We expect GLAM to provide a necessary prototype to increase operative safety and efficiency with further implications on improving instrument control during robotic surgery.
Additional experimental results and examples are provided herein, each of which is incorporated by reference herein in its entirety. The presently described systems and methods endeavor to adapt to and enhance 3D human visual navigation into various complex microenvironments, to provide quantitative and machine-intelligent recognition and display of 3D anatomies, to allow for “glasses-free”, real-time guidance and intervention, and to lead to breakthroughs in patient care, clinical screening, diagnostics, decision-making, as well as medical training. Embodiments described herein utilize clinically relevant design parameters: compact light-field propagation and an LED-based array of fiber optics for uniform illumination.
It should be appreciated that the logical operations described above can be implemented (1) as a sequence of computer-implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance and other requirements of the computing system. Accordingly, the logical operations described herein are referred to variously as state operations, acts, or modules. These operations, acts, and/or modules can be implemented in software, in firmware, in special purpose digital logic, in hardware, and any combination thereof. It should also be appreciated that more or fewer operations can be performed than shown in the figures and described herein. These operations can also be performed in a different order than those described herein.
In addition to the various microendoscopy systems and probes discussed herein, an example computing device 200 upon which the methods described herein may be implemented is also contemplated, as illustrated in the accompanying figures.
In an embodiment, the computing device 200 may comprise two or more computers in communication with each other that collaborate to perform a task. For example, but not by way of limitation, an application may be partitioned in such a way as to permit concurrent and/or parallel processing of the instructions of the application. Alternatively, the data processed by the application may be partitioned in such a way as to permit concurrent and/or parallel processing of different portions of a data set by the two or more computers. In an embodiment, virtualization software may be employed by the computing device 200 to provide the functionality of a number of servers that is not directly bound to the number of computers in the computing device 200. For example, virtualization software may provide twenty virtual servers on four physical computers. In an embodiment, the functionality disclosed above may be provided by executing the application and/or applications in a cloud computing environment. Cloud computing may comprise providing computing services via a network connection using dynamically scalable computing resources. Cloud computing may be supported, at least in part, by virtualization software. A cloud computing environment may be established by an enterprise and/or may be hired on an as-needed basis from a third-party provider. Some cloud computing environments may comprise cloud computing resources owned and operated by the enterprise as well as cloud computing resources hired and/or leased from a third-party provider.
In its most basic configuration, computing device 200 typically includes at least one processing unit 220 and system memory 230. Depending on the exact configuration and type of computing device, system memory 230 may be volatile (such as random-access memory (RAM)), non-volatile (such as read-only memory (ROM), flash memory, etc.), or some combination of the two.
This most basic configuration is illustrated in the accompanying figures.
Computing device 200 may have additional features/functionality. For example, computing device 200 may include additional storage such as removable storage 240 and non-removable storage 250, including, but not limited to, magnetic or optical disks or tapes. Computing device 200 may also contain network connection(s) 280 that allow the device to communicate with other devices, such as over the communication pathways described herein. The network connection(s) 280 may take the form of modems, modem banks, Ethernet cards, universal serial bus (USB) interface cards, serial interfaces, token ring cards, fiber distributed data interface (FDDI) cards, wireless local area network (WLAN) cards, radio transceiver cards such as code division multiple access (CDMA), global system for mobile communications (GSM), long-term evolution (LTE), worldwide interoperability for microwave access (WiMAX), and/or other air interface protocol radio transceiver cards, and other well-known network devices. Computing device 200 may also have input device(s) 270 such as keyboards, keypads, switches, dials, mice, track balls, touch screens, voice recognizers, card readers, paper tape readers, or other well-known input devices. Output device(s) 260 such as printers, video monitors, liquid crystal displays (LCDs), touch screen displays, displays, speakers, etc. may also be included. The additional devices may be connected to the bus in order to facilitate communication of data among the components of the computing device 200. All these devices are well known in the art and need not be discussed at length here.
The processing unit 220 may be configured to execute program code encoded in tangible, computer-readable media. Tangible, computer-readable media refers to any media that is capable of providing data that causes the computing device 200 (i.e., a machine) to operate in a particular fashion. Various computer-readable media may be utilized to provide instructions to the processing unit 220 for execution. Example tangible, computer-readable media may include, but is not limited to, volatile media, non-volatile media, removable media and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. System memory 230, removable storage 240, and non-removable storage 250 are all examples of tangible, computer storage media. Example tangible, computer-readable recording media include, but are not limited to, an integrated circuit (e.g., field-programmable gate array or application-specific IC), a hard disk, an optical disk, a magneto-optical disk, a floppy disk, a magnetic tape, a holographic storage medium, a solid-state device, RAM, ROM, electrically erasable program read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices.
In light of the above, it should be appreciated that many types of physical transformations take place in the computer architecture 200 in order to store and execute the software components presented herein. It also should be appreciated that the computer architecture 200 may include other types of computing devices, including hand-held computers, embedded computer systems, personal digital assistants, and other types of computing devices known to those skilled in the art. It is also contemplated that the computer architecture 200 may not include all of the components shown in the accompanying figures.
In an example implementation, the processing unit 220 may execute program code stored in the system memory 230. For example, the bus may carry data to the system memory 230, from which the processing unit 220 receives and executes instructions. The data received by the system memory 230 may optionally be stored on the removable storage 240 or the non-removable storage 250 before or after execution by the processing unit 220.
It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination thereof. Thus, the methods and apparatuses of the presently disclosed subject matter, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a computing device, the machine becomes an apparatus for practicing the presently disclosed subject matter. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or utilize the processes described in connection with the presently disclosed subject matter, e.g., through the use of an application programming interface (API), reusable controls, or the like. Such programs may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language and it may be combined with hardware implementations.
Moreover, the various components may be in communication via wireless and/or hardwire or other desirable and available communication means, systems and hardware. Moreover, various components and modules may be substituted with other modules or components that provide similar functions.
Although example embodiments of the present disclosure are explained in some instances in detail herein, it is to be understood that other embodiments are contemplated. Accordingly, it is not intended that the present disclosure be limited in its scope to the details of construction and arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or carried out in various ways.
It must also be noted that, as used in the specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Ranges may be expressed herein as from “about” or “approximately” one particular value and/or to “about” or “approximately” another particular value. When such a range is expressed, other exemplary embodiments include from the one particular value and/or to the other particular value.
By “comprising” or “containing” or “including” is meant that at least the named compound, element, particle, or method step is present in the composition or article or method, but does not exclude the presence of other compounds, materials, particles, or method steps, even if the other such compounds, materials, particles, or method steps have the same function as what is named.
In describing example embodiments, terminology will be resorted to for the sake of clarity. It is intended that each term contemplates its broadest meaning as understood by those skilled in the art and includes all technical equivalents that operate in a similar manner to accomplish a similar purpose. It is also to be understood that the mention of one or more steps of a method does not preclude the presence of additional method steps or intervening method steps between those steps expressly identified. Steps of a method may be performed in a different order than those described herein without departing from the scope of the present disclosure. Similarly, it is also to be understood that the mention of one or more components in a device or system does not preclude the presence of additional components or intervening components between those components expressly identified.
As discussed herein, a “subject” may be any applicable human, animal, or other organism, living or dead, or other biological or molecular structure or chemical environment, and may relate to particular components of the subject, for instance specific tissues or fluids of a subject (e.g., human tissue in a particular area of the body of a living subject), which may be in a particular location of the subject, referred to herein as an “area of interest” or a “region of interest.”
The construction and arrangement of the systems and methods as shown in the various exemplary embodiments are illustrative only. Although only a few embodiments have been described in detail in this disclosure, many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.). For example, the position of elements may be reversed or otherwise varied, and the nature or number of discrete elements or positions may be altered or varied. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the exemplary embodiments without departing from the scope of the present disclosure.
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the present invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
This application is a non-provisional conversion of and claims priority benefit to U.S. Provisional Application Ser. No. 63/285,551, filed Dec. 3, 2021, pending, which is hereby incorporated by this reference in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2022/051698 | 12/2/2022 | WO |

Number | Date | Country
---|---|---
63285551 | Dec 2021 | US