The present invention relates generally to medical devices. More particularly, the present invention relates to a compact endoscope design for three-dimensional surgical guidance.
Surgeons have increasingly utilized minimally invasive surgical guidance techniques not only to reduce surgical trauma but also to achieve accurate and objective surgical risk evaluations. A typical minimally invasive surgical guidance system provides visual assistance with the two-dimensional (2D) anatomy and pathology of internal organs within a limited field of view (FOV). Current developments in minimally invasive 3D surgical endoscopes involve techniques such as stereoscopy, time of flight (ToF), and structured illumination to achieve depth discrimination.
3D stereoscopy is a well-developed technique that searches for stereo correspondence between two distinct views of a scene and produces a disparity estimate from the two images. From the calculated disparities, the 3D structure can be deduced using a triangulation method based on the geometric calibration of the camera. Different stereo reconstruction algorithms have been used to establish disparities both spatially and temporally for correspondence extraction between the two views, achieving depth resolutions from 0.05 mm to 0.6 mm (the hybrid recursive matching method), with flexibility in disparity extraction based on a feature-based technique (the seed propagation method) or using an efficient convex optimization for disparity searching based on a 3D cost-volume configuration (the cost-volume method). These methods of 3D reconstruction have been commercially developed and have found wide acceptance throughout the United States and Europe.
The time-of-flight method calculates the traveling distance of light emitted from the illumination source and reflected by the object to the sensor, and deduces depth information from the time difference between the emitted and reflected light to reconstruct the surface structure in 3D. This technique therefore does not rely on correspondence search or a baseline restriction, leading to a compact design for use in a surgical endoscopic system. Depth resolution using a 3D time-of-flight surgical endoscope ranges from 0.89 mm to about 4 mm. However, due to the low-light environment in tissue imaging, a ToF camera often uses a high-power laser source to illuminate internal targets. Other limitations in depth evaluation using ToF come from specular reflectance, inhomogeneous illumination, and systemic errors such as the sensor run time, its temperature tolerance, and the imaging exposure time. Tissue-light interaction factors, such as the tissue's biological properties in absorption and scattering, also contribute to the ToF systematic error. So far, an industrial prototype of a ToF surgical endoscope (Richard Wolf GmbH, Knittlingen, Germany) has been introduced, but the device has not yet been widely accepted.
Plenoptic imaging, or light field imaging, calculates depth from a single image formed by multiple rays reflected from the object through a microlens array located in front of a camera sensor. Each image from the sensor is composed of multiple micro-images from the microlens array; each pixel of a micro-image relates to a particular direction of incoming light. The microlenses are aligned in a specific configuration and can have different focal lengths (the multi-focus plenoptic camera) to jointly optimize the maximal effective lateral resolution and the required depth of field. The lateral resolution of a multi-focus plenoptic camera can reach up to a quarter of the sensor's pixel resolution, with an axial resolution on the order of 1 mm. Although it benefits from high depth resolution, 3D reconstruction using plenoptic imaging often requires a customized sensor and microlens array for a particular imaging field. Plenoptic imaging has also been commercially developed, primarily for consumer and non-medical applications.
Structured illumination, or fringe projection profilometry (FPP), provides depth quantification similar to the stereoscopy technique, i.e., FPP relies on the parallax and triangulation of the tissue location in relation to the camera and the projector. However, instead of searching for disparities, FPP detects fringe patterns that are actively projected onto the tissue, and can therefore highlight feature points on discontinuous surfaces or in homogeneous regions. Moreover, its hardware flexibility makes FPP simpler to implement than the abovementioned techniques. For in-vitro medical applications, FPP has been used widely in dermatology for skin and wound inspection and in health care for idiopathic scoliosis diagnostics. However, FPP endoscopic imaging for medical intraoperative inspection has so far been rather limited. Certain efforts using FPP in laparoscopes have been made. One example is an FPP endoscope for augmented reality visualization, in which the device provides a direct depth perception overlay on recorded images of a trial phantom and cadaver tissue. Another example is tubular tissue scanning using a collection of colored light rings, which achieved an average dimensional error of 92 μm. The majority of laparoscope developments in tissue profilometry involve the use of a high-power light source for SNR enhancement. Examples include motion tracking using a 100 mW, 660 nm wavelength laser diode, which produces a tracking error of 0.05 mm, and tissue profilometry with an accuracy of 0.15 mm using multispectral spot projection dispersed via a prism from a 4 W supercontinuum light source. Although these techniques achieve high precision and small cavity access, they often require complicated and potentially prohibitively expensive hardware setups.
Accordingly, there is a need in the art for a compact, cost-effective endoscope design for three-dimensional surgical guidance.
The foregoing needs are met, to a great extent, by the present invention which provides a device for 3D imagery including an imaging probe configured for fringe projection profilometry (FPP) and configured to provide a wide field of view (FOV). The device includes a CCD sensor. The device also includes an illumination probe and a digital projector.
In accordance with an aspect of the present invention, the device includes an angle controller. The angle controller sets a distance for separation between the imaging probe and the illumination probe. The device includes a housing for the imaging probe, the illumination probe, and the angle controller. The device also includes a non-transitory computer readable medium programmed to execute a flexible camera calibration method for the 3D reconstruction in free space to provide an optimal fringe pattern for an inner tissue profile.
In accordance with another aspect of the present invention, the imaging probe and the illumination probe each have a diameter of 5 mm. The device can also include a digital micromirror device (DMD). The DMD is configured to project a fringe pattern. There is a minimum 15° angle between the imaging and illumination probes. The device also synchronizes the structured patterns from the DMD with the imaging camera.
In accordance with yet another aspect of the present invention, a method for a 3D image of a region of interest includes providing a wide field of view (FOV) with an imaging probe, such that quantitative depth information is addressed. The method includes illuminating the region of interest. The method also includes projecting a fringe pattern and capturing the 3D image of the region of interest.
In accordance with still another aspect of the present invention, the method includes using an illumination probe for illuminating the region of interest. The method includes setting a distance of separation between the imaging probe and the illumination probe. The method includes setting the separation such that the angle between the imaging probe and the illumination probe is a minimum of 15°. The method includes executing the method in conjunction with a non-transitory computer readable medium. The non-transitory computer readable medium is programmed to execute a flexible camera calibration method for 3D reconstruction in free space to provide an optimal fringe pattern for an inner tissue profile. The method includes using a digital micromirror device (DMD) to project the fringe pattern. The method includes using a coordinate transform to correspond each image point to each respective point on the DMD via the collected fringe patterns. The method includes using a CCD camera to capture the 3D image of the region of interest. The method can also include providing tissue profilometry.
The accompanying drawings provide visual representations, which will be used to more fully describe the representative embodiments disclosed herein and can be used by those skilled in the art to better understand them and their inherent advantages. In these drawings, like reference numerals identify corresponding elements and:
The presently disclosed subject matter now will be described more fully hereinafter with reference to the accompanying Drawings, in which some, but not all embodiments of the inventions are shown. Like numbers refer to like elements throughout. The presently disclosed subject matter may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Indeed, many modifications and other embodiments of the presently disclosed subject matter set forth herein will come to mind to one skilled in the art to which the presently disclosed subject matter pertains having the benefit of the teachings presented in the foregoing descriptions and the associated Drawings. Therefore, it is to be understood that the presently disclosed subject matter is not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims.
The present invention is directed to endoscopic structured illumination to provide a simple, inexpensive 3D endoscopic technique for conducting high-resolution 3D imagery for use in a surgical guidance system. The present invention is directed to an FPP endoscopic imaging setup that provides a wide field of view (FOV), addresses quantitative depth information, and can be integrated with commercially available endoscopes to provide tissue profilometry. Furthermore, by adapting a flexible camera calibration method for the 3D reconstruction technique in free space, the present invention provides an optimal fringe pattern for inner tissue profile capture within the endoscopic view; the method was validated using both static and dynamic samples and exhibits a depth of field (DOF) of approximately 20 mm and a relative accuracy of 0.1% using a customized printed calibration board.
The fringe-projection-based 3D endoscopic system of the present invention, as illustrated in
The FPP reconstruction method is based on parallax and triangulation between the camera and the structured light from the projector, in relation to the sample surface. In other words, a coordinate transform is used to correspond each image point on the camera sensor to its respective point on the DMD via the collected fringe patterns, i.e., treating the DMD as a second camera, similar to the stereo-vision technique. To establish the transformation, a governing equation such as Eq. (5) was derived to relate the object depth information to the phase map of the projection fringes. A vertical, frequency-shifted sinusoidal fringe wave as formulated in Eq. (1) is typically used to perform FPP to achieve full-field and fast image processing. This fringe wave pattern is given as:
where Io is the intensity modulation amplitude, (u, v) are the spatial pixel indices, δ is the shifted phase, k is the fringe number, and w is the pattern width. In this application, k = {1, 2, 6, 30}.
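Because Eq. (1) itself is not reproduced here, the following sketch assumes the standard FPP fringe form I(u, v) = Io·(1 + cos(2πku/w + δ)) consistent with the variables defined above; pattern dimensions and the four-step phase shift of 2π/4 between patterns are illustrative assumptions:

```python
import numpy as np

def fringe_patterns(width, height, k, n_steps=4, i0=0.5):
    """Generate n_steps phase-shifted vertical sinusoidal fringe patterns.

    Assumes the standard form I(u, v) = i0 * (1 + cos(2*pi*k*u/w + delta)),
    where k is the fringe number and w the pattern width; the shifted phase
    delta advances by 2*pi/n_steps between successive patterns.
    """
    u = np.arange(width)                              # spatial pixel index u
    patterns = []
    for step in range(n_steps):
        delta = 2 * np.pi * step / n_steps            # shifted phase
        row = i0 * (1 + np.cos(2 * np.pi * k * u / width + delta))
        patterns.append(np.tile(row, (height, 1)))    # vertical fringes
    return patterns

# One pattern set per fringe number used in the text: k = {1, 2, 6, 30}
pattern_sets = {k: fringe_patterns(640, 480, k) for k in (1, 2, 6, 30)}
```

Each set would be projected in sequence by the DMD and captured by the CCD camera for phase retrieval.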
After projection of the patterns and collection of the images, the wrapped phase of each obtained image with a different fringe pattern is calculated using the conventional four-step phase-shift method as given by:
where I1-4 represent the intensities of the four phase-shifted images.
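Since the body of Eq. (2) is not reproduced here, the sketch below uses the conventional four-step result, φ = arctan[(I4 − I2)/(I1 − I3)], which follows from phase shifts of 0, π/2, π, and 3π/2; the synthetic check values are illustrative:

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Conventional four-step phase-shift wrapped phase.

    Assumes the four images were captured with phase shifts of
    0, pi/2, pi and 3*pi/2; the result lies in (-pi, pi].
    """
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic check: four images of a known phase phi recover phi exactly
phi = 0.7
imgs = [0.5 + 0.4 * np.cos(phi + n * np.pi / 2) for n in range(4)]
recovered = wrapped_phase(*imgs)
```

Using arctan2 rather than a plain arctan keeps the full (−π, π] quadrant information needed for unwrapping.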
Because of the periodicity of the collected structured patterns, the phase map at each point is restricted to a principal range, leading to phase discontinuities at higher-frequency fringes; therefore, phase unwrapping is necessary to extract the absolute phase value. The phase unwrapping technique of the present invention is formulated based on the relation between the current unwrapped phase and the unwrapped phase information from the previous frequency, as described in Eq. (4), with the lowest frequency defined to have one fringe in its pattern, so that its unwrapped phase is equal to its wrapped phase. For the other, higher frequencies, the unwrapped phase distribution can be calculated based on the unwrapped phase distribution of the previous frequency
where the wrapping operator denotes rounding the argument to the closest integer, the superscripts uw and w refer to unwrapped and wrapped, respectively, f represents the fringe frequency, i.e., the number of fringes per projection pattern, and n = {2, 3, 4} is the nth order of fringe frequency, where
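As the body of Eq. (4) is not reproduced here, the following is a minimal sketch of standard temporal phase unwrapping consistent with the description above: the lowest frequency (one fringe) is taken as already unwrapped, and each higher frequency is unwrapped against the scaled previous result by adding an integer number of 2π jumps:

```python
import numpy as np

def unwrap_temporal(wrapped_by_freq):
    """Temporal phase unwrapping across increasing fringe frequencies.

    wrapped_by_freq: list of (frequency, wrapped_phase_array) pairs in
    ascending frequency order. The lowest frequency has one fringe per
    pattern, so its unwrapped phase equals its wrapped phase.
    """
    f_prev, unwrapped = wrapped_by_freq[0]
    for f, wrapped in wrapped_by_freq[1:]:
        predicted = unwrapped * (f / f_prev)        # scale previous unwrapped phase
        cycles = np.round((predicted - wrapped) / (2 * np.pi))
        unwrapped = wrapped + 2 * np.pi * cycles    # restore integer 2*pi jumps
        f_prev = f
    return unwrapped
```

With the fringe numbers used in the text (k = {1, 2, 6, 30}), each stage's frequency ratio stays small, which keeps the rounding step robust against phase noise.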
The out-of-plane height z at each pixel index (i,j) is proportional to the unwrapped phase as in Eq. (5):
where c0-c11 and d0-d11 are constants determined by geometrical and other relevant parameters, and φ is the unwrapped phase distribution. The extension to second order in u and v enhances accuracy for complex real-world structures.
The calibration technique to determine c0-c11 and d0-d11 was performed using a customized printed ring-centered calibration board. Its positions and tilt angles are varied to cover the imaging volume of interest. Each calibration control point j at a board position i on the calibration board is transformed to the corresponding point (Xc,ij, Yc,ij, Zc,ij) in the camera coordinate system. The first board position (i = 1) is taken as the reference plane. The reference plane is the zero height of the 3D image structure and is constructed by placing the calibration board perpendicular to the optical axis from the camera to the object. The reference plane is then formulated by fitting a planar equation with constant coefficients A, B, C to every point of the first board image.
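The reference-plane fit described above can be sketched as follows; the exact parameterization of the planar equation is not given in the text, so the common form z = A·x + B·y + C is assumed here:

```python
import numpy as np

def fit_reference_plane(points):
    """Fit the plane z = A*x + B*y + C to the calibration-board points
    of the first board position (the zero-height reference plane)."""
    pts = np.asarray(points, dtype=float)
    design = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(design, pts[:, 2], rcond=None)
    return a, b, c

def height_above_plane(point, plane):
    """Signed distance of a 3D point from the fitted reference plane,
    i.e. its height relative to the zero-height surface."""
    a, b, c = plane
    x, y, z = point
    return (z - (a * x + b * y + c)) / np.sqrt(a * a + b * b + 1.0)
```

Heights of subsequent board positions, measured against this plane, supply the zij values used in the coefficient fitting below.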
After the calculation of the unwrapped phase φ and the height zij of each calibration control point j, the Levenberg-Marquardt least-squares fitting method is used to obtain the coefficients c0-c11 and d0-d11 in Eq. (5) by minimizing:
Σ_{i=1}^{a} Σ_{j=1}^{b} (z − z_{ij})². (7)
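The fitting step can be sketched with SciPy's Levenberg-Marquardt solver. Because the exact form of Eq. (5) with c0-c11 and d0-d11 is not reproduced here, a simplified low-order polynomial phase-to-height model stands in for it; only the minimization procedure mirrors the text:

```python
import numpy as np
from scipy.optimize import least_squares

def phase_height_model(coeffs, u, v, phi):
    """Illustrative stand-in for Eq. (5): a low-order polynomial mapping
    pixel indices (u, v) and unwrapped phase phi to out-of-plane height z.
    The actual model's c0-c11, d0-d11 coefficients are not reproduced here."""
    c0, c1, c2, c3, c4, c5 = coeffs
    return c0 + c1 * u + c2 * v + c3 * phi + c4 * u * phi + c5 * v * phi

def calibrate_coefficients(u, v, phi, z_measured):
    """Levenberg-Marquardt least-squares fit of the model coefficients,
    minimizing the sum over calibration points of (z_model - z_measured)^2."""
    residuals = lambda c: phase_height_model(c, u, v, phi) - z_measured
    return least_squares(residuals, x0=np.zeros(6), method="lm").x
```

The same residual structure extends directly to the full Eq. (5) model; only `phase_height_model` and the initial-guess length would change.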
To validate the 3D endoscope, several static and dynamic objects with structural complexity were imaged, such as a depth-of-field target (DOF 5-15, Edmund Optics, York, UK), a dried human temporal bone, and a human mouth cavity.
To measure the depth of field of the system, the intensity along a horizontal line-pair section was examined to determine the region of greatest contrast between line pairs. The DOF target was uniformly illuminated by a white LED light source (MCWHL1, Thorlabs, Newton, N.J., USA), and the target was located such that its first point was in the same plane as the endoscopes' distal ends. The optimal DOF is about 20 mm, i.e., images within 20 mm give the best height accuracy.
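The text does not specify the contrast metric used on the line-pair sections; a common choice, assumed here, is the Michelson contrast of each section's intensity profile:

```python
import numpy as np

def michelson_contrast(profile):
    """Michelson contrast (Imax - Imin) / (Imax + Imin) of an intensity
    profile sampled along a horizontal line-pair section."""
    p = np.asarray(profile, dtype=float)
    return (p.max() - p.min()) / (p.max() + p.min())

def sharpest_section(profiles):
    """Index of the line-pair section with the greatest contrast,
    i.e. the depth region where the target is best resolved."""
    return max(range(len(profiles)),
               key=lambda i: michelson_contrast(profiles[i]))
```

Scanning this metric over sections at increasing depth locates the region of greatest contrast, from which the usable DOF is read off.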
To minimize the specular reflectance on the DOF target, the target was moved 10 mm away from the distal ends of the endoscopes. To calculate the height accuracy within the DOF range, which is indicated in
To validate the system performance for imaging biological samples, 3D imaging of a dried human temporal bone and an in vivo human mouth cavity was performed. The depth reconstruction height maps in both 2D and 3D are displayed in
The next experiment focused on the dynamic capture of a mouth cavity in both closed and open states. As shown in
The system's relative sensitivity is about 0.1% with a depth FOV of 2 cm. The proposed imaging system is the foundation for a real-time 3D endoscope through the use of parallelization and the integration of both illumination and imaging scopes into a single scope. Besides 3D capture, endoscopically guided imaging can be integrated with other tissue analysis techniques, such as speckle imaging and multispectral imaging, to evaluate tissue perfusion kinetics and classify tissue types based on their spectral signatures.
Besides the setup described above, several other embodiments meet the relative height accuracy of 0.1% and the overall housing diameter of about 10 mm or less. These designs integrate the optical components for structured illumination and imaging and avoid the need for two separate endoscopes, thereby supporting incision minimization and system stability.
The first design utilizes two flexible imaging probes for illumination and imaging purposes, housed in two smaller ports in a rigid tube, as described in
For the purposes of pattern projection and image collection, both the illumination and imaging fibers are attached to a pair of achromatic doublets, as illustrated in
Another embodiment provides an alternative to using a flexible scope by utilizing a commercially available rigid surgical endoscope, as indicated in
Another embodiment allows the use of common medical-grade scopes in both rigid and flexible forms, where the illumination and imaging tasks can be performed by either scope.
In addition, the above-mentioned design, set up in
As an extension of the angle controller design, a design with multiple joints as indicated in
The use of the common rigid/flexible scope for the 3D reconstruction can also be used to characterize tissue biological properties using multispectral or speckle imaging techniques. A customized spectral light source can be coupled into the light port of the scope or into a separate light pipe (for a customized fiber-bundle scheme), which supports the illumination of the object. Tissue information such as texture, tissue classification, tissue thickness, vascular structure, blood perfusion, and blood flow can be deduced based on spectral analysis.
The present invention can be carried out and/or supported using a computer, non-transitory computer readable medium, or alternately a computing device or non-transitory computer readable medium incorporated into the imaging device. Indeed, any suitable method of calculation known to or conceivable by one of skill in the art could be used. It should also be noted that while specific equations are detailed herein, variations on these equations can also be derived, and this application includes any such equation known to or conceivable by one of skill in the art.
A non-transitory computer readable medium is understood to mean any article of manufacture that can be read by a computer. Such non-transitory computer readable media includes, but is not limited to, magnetic media, such as a floppy disk, flexible disk, hard disk, reel-to-reel tape, cartridge tape, cassette tape or cards, optical media such as CD-ROM, writable compact disc, magneto-optical media in disc, tape or card form, and paper media, such as punched cards and paper tape.
The computing device can be a special computer designed specifically for this purpose. The computing device can be unique to the present invention and designed specifically to carry out the method of the present invention. Imaging devices generally have a console which is a proprietary master control center of the imager designed specifically to carry out the operations of the imager and receive the imaging data created by the imager. Typically, this console is made up of a specialized computer, custom keyboard, and multiple monitors. There can be two different types of control consoles, one used by the operator and the other used by the physician. The operator's console controls such variables as the thickness of the image, the amount of tube current/voltage, mechanical movement of the patient table and other radiographic technique factors. The physician's viewing console allows viewing of the images without interfering with the normal imager operation. This console is capable of rudimentary image analysis. The operating console computer is a non-generic computer specifically designed by the imager manufacturer for bilateral (input output) communication with the scanner. It is not a standard business or personal computer that can be purchased at a local store. Additionally this console computer carries out communications with the imager through the execution of proprietary custom built software that is designed and written by the imager manufacturer for the computer hardware to specifically operate the hardware.
The many features and advantages of the invention are apparent from the detailed specification, and thus, it is intended by the appended claims to cover all such features and advantages of the invention which fall within the true spirit and scope of the invention. Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction and operation illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention. While exemplary embodiments are provided herein, these examples are not meant to be considered limiting. The examples are provided merely as a way to illustrate the present invention. Any suitable implementation of the present invention known to or conceivable by one of skill in the art could also be used.
This application claims the benefit of U.S. Provisional Patent Application No. 62/374,267, filed on Aug. 12, 2016, which is incorporated by reference herein, in its entirety.
This invention was made with government support under CBET-1430040 awarded by the National Science Foundation. The government has certain rights in the invention.