Advances in technology have yielded corresponding advances in imaging capabilities for medical use. One area that has enjoyed some of the most beneficial of these advances is that of endoscopic surgical procedures, owing to improvements in the components that make up an endoscope.
Conventional digital video systems used for laparoscopy, arthroscopy, ENT, gynecology, and urology are based upon rigid endoscopes, which are optically and mechanically coupled to a separate hand-piece unit. The hand-piece may comprise one or more image sensors. Image information is transmitted optically along the length of the endoscope, after which it is focused upon the sensor via an optical coupler. The endoscope is free to rotate with respect to the image sensor, and the operator will typically exploit this fact to cover a greater range of a surgical scene when using an endoscope with a non-zero viewing angle. The orientation of the image as seen on the viewing display or monitor depends on the orientation of the hand-piece unit with respect to the scene. Generally, the user or operator of the hand-piece wishes the vertical direction in the image to be the same as his or her own upright direction.
Non-limiting and non-exhaustive implementations of the disclosure are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified. Advantages of the disclosure will become better understood with regard to the following description and accompanying drawings.
For reasons of cost and simplicity, an improved endoscope design concept involves placing an image sensor within the endoscope itself and transmitting the image data to the remainder of the camera system electrically. In an implementation of the disclosure, the image sensor may be placed within a distal end of the endoscope. The challenge for such a system is to maintain high image quality using a sensor that is space constrained. This challenge may be overcome by a system that incorporates a monochrome image sensor with minimal peripheral circuitry, connection pads, and logic. Color information is provided by pulsing different frames with different wavelengths of light using, e.g., laser or LED light sources. The image sensor is able to capture frames in 1/120 s or less, thereby producing full-color video at a rate of 60 Hz or higher.
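By way of illustration only, the following sketch (Python with NumPy; the function name and frame variables are hypothetical, and the three-wavelength red/green/blue pulsing pattern is an assumption, since the disclosure specifies only "different wavelengths") assembles consecutive monochrome captures into a single full-color frame:

```python
import numpy as np

def assemble_color_frame(red_frame, green_frame, blue_frame):
    """Combine three monochrome captures, each taken under a different
    pulsed illumination wavelength, into one H x W x 3 color frame.
    Assumes one sensor readout per light pulse (an R/G/B pulsing pattern)."""
    return np.stack([red_frame, green_frame, blue_frame], axis=-1)

# Hypothetical usage: three consecutive 8-bit monochrome sensor readouts.
h, w = 480, 640
r, g, b = (np.zeros((h, w), dtype=np.uint8) for _ in range(3))
color = assemble_color_frame(r, g, b)  # shape (480, 640, 3)
```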
Another challenge arising from this approach is in providing a final image orientation for the user that still reflects the hand-piece orientation with respect to the scene. One purely mechanical approach is to rigidly couple the sensor to the hand-piece and to rotate the endoscope, including the lens stack, at the front end independently. This may be accomplished by incorporating two concentric tubes. The system allows a distal prism to rotate, which changes the user's or operator's angle of view, while the sensor remains fixed at a constant location. The user or operator may rotate an outer lumen, thereby changing the angle of view, while the sensor remains in a fixed position and the image viewable on screen remains at a constant horizon, such that the user does not lose orientation. This allows the device to be used in the same manner as expected by a user or operator experienced with conventional rigid endoscopy systems.
This disclosure extends to an alternative approach in which the sensor is rigidly coupled, along with the lens stack, to a single tube while the digital images are rotated in the image signal processing pipeline or chain (ISP). The disclosure contemplates using a digital representation of the angle of the endoscope tube with respect to the hand-piece that is continuously available to the ISP during operation. Several approaches to this are possible, as described more fully herein.
The disclosure also extends to a solution for endoscopy applications in which the image sensor is resident at the distal end of the endoscope. An image sensor located in the distal end of an endoscopic device presents challenges that are not at issue when the sensor is located remotely from the distal end. For example, when a user or operator rotates or changes the angle of the endoscopic device, which is common during surgery, the image sensor changes orientation and the image horizon shown on screen changes with it. What is needed are devices and systems that accommodate an image sensor located in the distal end of the endoscopic device while maintaining a constant image horizon for the user or operator. As will be seen, the disclosure provides devices and systems that do this in an efficient and elegant manner.
In the following description of the disclosure, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific implementations in which the disclosure may be practiced. It is understood that other implementations may be utilized and structural changes may be made without departing from the scope of the disclosure.
It must be noted that, as used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
As used herein, the terms “comprising,” “including,” “containing,” “characterized by,” and grammatical equivalents thereof are inclusive or open-ended terms that do not exclude additional, unrecited elements or method steps.
Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application-specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the following description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name but not function.
For each of the implementations discussed above, any pertinent resultant voltage 300 is fed to an analog-to-digital converter (ADC) or digitizer 400. The digital number is then relayed to the ISP or the camera processing chain. The angle-sensing elements (112, 114; 212, 213, 214; and 312, 313, 314) discussed above may be placed into or as part of a fixed hand-piece 110, where the scope 120 rotates with respect to the fixed hand-piece system, as best illustrated in the accompanying figures.
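A sketch of the digitization step might look as follows (Python; the 12-bit ADC resolution and the linear 0-360° mapping are assumptions for illustration, not specified by the disclosure):

```python
ADC_BITS = 12                    # assumed digitizer resolution
ADC_MAX = (1 << ADC_BITS) - 1    # highest raw code the ADC can output

def adc_to_degrees(raw_code: int) -> float:
    """Map a raw ADC code (0..ADC_MAX) linearly onto a 0..360 degree
    scope-to-hand-piece rotation angle for use by the ISP."""
    return (raw_code / ADC_MAX) * 360.0
```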
The rotation post 175 allows the user to rotate the scope 120 in a way that is similar to rotating a conventional scope (as shown in the accompanying figures).
For each embodiment, the rotating and fixed components of the angle detection system can be mounted to the rotation sleeve 170 and hand-piece 110, respectively.
It will be appreciated that the digital angle information may be made available to the image processing chain, where it is sampled periodically (e.g., each frame) and quantized appropriately to, e.g., 5° or 10° units. In order to prevent rapid angular oscillation of the final image between adjacent angles, a degree of hysteresis is required. One approach is to allow an image transformation only if the same quantized angle has been observed consistently within the previous n samples, where n would be tuned to the satisfaction of the user.
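A minimal sketch of such a quantize-and-debounce scheme, assuming a 5° quantization step (the class and function names are hypothetical):

```python
from collections import deque

QUANTUM_DEG = 5  # quantization step; e.g., 5 or 10 degrees per the text

def quantize(angle_deg: float) -> int:
    """Quantize a measured angle to the nearest QUANTUM_DEG step, modulo 360."""
    return (round(angle_deg / QUANTUM_DEG) * QUANTUM_DEG) % 360

class AngleDebouncer:
    """Commit to a new quantized angle only after it has been observed
    consistently in each of the previous n samples (hysteresis)."""

    def __init__(self, n: int, initial_angle: int = 0):
        self.n = n
        self.history = deque(maxlen=n)
        self.committed = initial_angle

    def update(self, angle_deg: float) -> int:
        q = quantize(angle_deg)
        self.history.append(q)
        if len(self.history) == self.n and all(h == q for h in self.history):
            self.committed = q  # stable for n samples; accept the new angle
        return self.committed
```

A larger n suppresses oscillation between adjacent quantization bins at the cost of added latency before a deliberate rotation is reflected on screen.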
The basis of rotation of an image plane through angle θ is described by the following transformation:
x2=(X1−x0)cos θ−(Y1−y0)sin θ+x0
y2=(Y1−y0)cos θ+(X1−x0)sin θ+y0
where (X1, Y1) are the original integer pixel coordinates, (x2,y2) are the final real-number pixel coordinates and (x0,y0) marks the axis of rotation. In general, unless θ is a multiple of 90°, x2 and y2 are not integers. The pixel locations in the final image buffer can be filled by truncating or rounding the (x2,y2) values to integer coordinates (X2,Y2):
X2=int(x2)
Y2=int(y2)
However, this approach results in some final pixel locations receiving multiple candidate values and in others, void pixels, receiving none. The void pixels can be filled by nearest-neighbor substitution, which carries a resolution and artifact penalty, or by interpolation (e.g., bilinear or bicubic), which requires investigating which pixel locations in their localities are occupied.
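A brief sketch (Python/NumPy; the function name is hypothetical) of this forward-mapping approach makes the void-pixel problem concrete; destination locations on which no truncated (X2, Y2) pair lands remain unfilled:

```python
import numpy as np

def rotate_forward(img, theta, x0, y0):
    """Forward-map each source pixel (X1, Y1) through the rotation and
    truncate to integer destination coordinates (X2, Y2). Collisions
    overwrite (multiple candidates); unreached destinations stay 0
    (void pixels)."""
    h, w = img.shape
    out = np.zeros_like(img)
    c, s = np.cos(theta), np.sin(theta)
    for Y1 in range(h):
        for X1 in range(w):
            X2 = int((X1 - x0) * c - (Y1 - y0) * s + x0)
            Y2 = int((Y1 - y0) * c + (X1 - x0) * s + y0)
            if 0 <= X2 < w and 0 <= Y2 < h:
                out[Y2, X2] = img[Y1, X1]
    return out
```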
A more practical approach is afforded by taking each final integer pixel location and applying the inverse rotation transformation to arrive at real-number coordinates within the original plane:
x1=(X2−x0)cos θ+(Y2−y0)sin θ+x0
y1=(Y2−y0)cos θ−(X2−x0)sin θ+y0
Since pixel data within that plane are known to exist at all integer coordinates, it is straightforward to derive an interpolated image content estimate. This interpolation can again be either bilinear or bicubic. Bilinear interpolation, for example, requires knowing only the closest four pixels (two in each dimension). They are identified as (Xa, Ya), (Xa, Yb), (Xb, Ya) and (Xb, Yb), where:
Xa=int(x1); Xb=1+int(x1)
Ya=int(y1); Yb=1+int(y1)
The convolution kernel is described by:
P(x1, y1)=(1−α)(1−β)P(Xa, Ya)+α(1−β)P(Xb, Ya)+(1−α)βP(Xa, Yb)+αβP(Xb, Yb)
where P denotes the pixel value at the given coordinates and:
α=x1−Xa
β=y1−Ya
in pixel units.
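Putting the inverse transformation and the bilinear weights together, a sketch of the full rotation (Python/NumPy; the function name is hypothetical, and the int() truncation assumes non-negative source coordinates, as in the text) might read:

```python
import numpy as np

def rotate_inverse_bilinear(img, theta, x0, y0):
    """For each destination pixel (X2, Y2), inverse-rotate into the source
    plane and bilinearly interpolate the four surrounding source pixels."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.float64)
    c, s = np.cos(theta), np.sin(theta)
    for Y2 in range(h):
        for X2 in range(w):
            # Inverse rotation: real-valued coordinates in the original plane.
            x1 = (X2 - x0) * c + (Y2 - y0) * s + x0
            y1 = (Y2 - y0) * c - (X2 - x0) * s + y0
            Xa, Ya = int(x1), int(y1)   # Xa = int(x1); Ya = int(y1)
            Xb, Yb = Xa + 1, Ya + 1     # Xb = 1 + int(x1); Yb = 1 + int(y1)
            if x1 < 0 or y1 < 0 or Xb >= w or Yb >= h:
                continue                # falls outside the source image
            a, b = x1 - Xa, y1 - Ya     # alpha and beta, in pixel units
            out[Y2, X2] = ((1 - a) * (1 - b) * img[Ya, Xa]
                           + a * (1 - b) * img[Ya, Xb]
                           + (1 - a) * b * img[Yb, Xa]
                           + a * b * img[Yb, Xb])
    return out
```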
It will be appreciated that the teachings and principles of the disclosure may be used in a reusable device platform, a limited-use device platform, a re-posable-use device platform, or a single-use/disposable device platform without departing from the scope of the disclosure. It will be appreciated that in a reusable device platform, an end-user is responsible for cleaning and sterilization of the device. In a limited-use device platform, the device can be used a specified number of times before becoming inoperable. A typical new device is delivered sterile, with additional uses requiring the end-user to clean and sterilize the device before each additional use. In a re-posable-use device platform, a third party may reprocess (e.g., clean, package, and sterilize) a single-use device for additional uses at a lower cost than a new unit. In a single-use/disposable device platform, a device is provided sterile to the operating room and used only once before being disposed of.
Additionally, the teachings and principles of the disclosure may include any and all wavelengths of electromagnetic energy, including the visible and non-visible spectrums, such as infrared (IR), ultraviolet (UV), and X-ray.
The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the disclosure.
Further, although specific implementations of the disclosure have been described and illustrated, the disclosure is not to be limited to the specific forms or arrangements of parts so described and illustrated. The scope of the disclosure is to be defined by the claims appended hereto, any future claims submitted here and in different applications, and their equivalents.
This application is a continuation of U.S. application Ser. No. 14/214,412, filed Mar. 14, 2014 (U.S. Pat. No. 10,362,240), and claims the benefit of U.S. Provisional Application No. 61/792,119, filed Mar. 15, 2013, which are incorporated herein by reference in their entirety, including but not limited to those portions that specifically appear hereinafter, the incorporation by reference being made with the following exception: in the event that any portion of the above-referenced applications is inconsistent with this application, this application supersedes said above-referenced applications.
Number | Name | Date | Kind |
---|---|---|---|
4433675 | Konoshima | Feb 1984 | A |
5187572 | Nakamura et al. | Feb 1993 | A |
5241170 | Field, Jr. et al. | Aug 1993 | A |
5877819 | Branson | Mar 1999 | A |
6073043 | Schneider | Jun 2000 | A |
6272269 | Naum | Aug 2001 | B1 |
6296635 | Smith | Oct 2001 | B1 |
6331156 | Haefele et al. | Dec 2001 | B1 |
6387043 | Yoon | May 2002 | B1 |
6419626 | Yoon | Jul 2002 | B1 |
6471637 | Green et al. | Oct 2002 | B1 |
6485414 | Neuberger | Nov 2002 | B1 |
6690466 | Miller et al. | Feb 2004 | B2 |
6692431 | Kazakevich | Feb 2004 | B2 |
6899675 | Cline et al. | May 2005 | B2 |
6916286 | Kazakevich | Jul 2005 | B2 |
6921920 | Kazakevich | Jul 2005 | B2 |
6961461 | MacKinnon et al. | Nov 2005 | B2 |
7037258 | Chatenever et al. | May 2006 | B2 |
7037259 | Hakamata et al. | May 2006 | B2 |
7189226 | Auld et al. | Mar 2007 | B2 |
7211042 | Chatenever et al. | May 2007 | B2 |
7258663 | Doguchi et al. | Aug 2007 | B2 |
7540645 | Kazakevich | Jun 2009 | B2 |
7544163 | MacKinnon et al. | Jun 2009 | B2 |
7783133 | Dunki-Jacobs et al. | Aug 2010 | B2 |
7794394 | Frangioni | Sep 2010 | B2 |
7833152 | Chatenever et al. | Nov 2010 | B2 |
10362240 | Richardson et al. | Jul 2019 | B2 |
20010018553 | Krattiger et al. | Aug 2001 | A1 |
20010030744 | Chang | Oct 2001 | A1 |
20030142753 | Gunday | Jul 2003 | A1 |
20040249267 | Gilboa | Dec 2004 | A1 |
20050159740 | Paul et al. | Jul 2005 | A1 |
20050234302 | MacKinnon et al. | Oct 2005 | A1 |
20050250983 | Tremaglio et al. | Nov 2005 | A1 |
20060069314 | Farr | Mar 2006 | A1 |
20080045800 | Farr | Feb 2008 | A2 |
20080071142 | Gattani | Mar 2008 | A1 |
20080159653 | Dunki-Jacobs | Jul 2008 | A1 |
20090012361 | MacKinnon et al. | Jan 2009 | A1 |
20090220156 | Ito | Sep 2009 | A1 |
20090292168 | Farr | Nov 2009 | A1 |
20100022829 | Irion et al. | Jan 2010 | A1 |
20100033170 | Velasquez | Feb 2010 | A1 |
20100048999 | Boulais et al. | Feb 2010 | A1 |
20100125166 | Henzler | May 2010 | A1 |
20100249817 | Mark | Sep 2010 | A1 |
20100286475 | Robertson | Nov 2010 | A1 |
20110181840 | Cobb | Jul 2011 | A1 |
20110237882 | Saito | Sep 2011 | A1 |
20110237884 | Saito | Sep 2011 | A1 |
20110263941 | Wright et al. | Oct 2011 | A1 |
20120004508 | McDowall et al. | Jan 2012 | A1 |
20120041267 | Benning et al. | Feb 2012 | A1 |
20120078052 | Cheng | Mar 2012 | A1 |
20120257030 | Lim et al. | Oct 2012 | A1 |
20120307030 | Blanquart | Dec 2012 | A1 |
20130060086 | Talbert et al. | Mar 2013 | A1 |
20130155305 | Shintani | Jun 2013 | A1 |
Number | Date | Country |
---|---|---|
2005328970 | Dec 2005 | JP |
2011-019549 | Feb 2011 | JP |
2005031433 | Apr 2005 | WO |
Entry |
---|
Holler, Kurt, et al., “Endoscopic Orientation Correction,” Sep. 20, 2019, International Conference on Computer Analysis of Images and Patterns, CAIP 2017: Computer Analysis of Images and Patterns [Lecture Notes in Computer Science], Springer, Berlin, Heidelberg, pp. 459-466. |
Anonymous: “Olympus Microscopy Resource Center: Geometrical Transformation—Java Tutorial,” Dec. 31, 2012 (Dec. 31, 2012), XP055294458, Retrieved from the Internet: URL:http://www.olympusmicro.com/primer/java/digitalimaging/processing/geometricaltransformation/index.html [retrieved on Aug. 9, 2016]. |
Anonymous: “Potentiometergeber—Wikipedia,” Oct. 8, 2012 (Oct. 8, 2012), XP055294467, Retrieved from the Internet: URL:https://de.wikipedia.org/w/index.php?title=Potentiometergeber&oldid=109084688 [retrieved on Aug. 9, 2016]. |
Anonymous: “Resolver (electrical)—Wikipedia, the free encyclopedia,” Mar. 2, 2013 (Mar. 2, 2013), XP055294466, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Resolver_(electrical)&oldid=541673781 [retrieved on Aug. 9, 2016]. |
Anonymous: “Rotary encoder—Wikipedia, the free encyclopedia,” Mar. 5, 2013 (Mar. 5, 2013), XP055294463, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Rotary_encoder&oldid=542165237 [retrieved on Aug. 9, 2016]. |
Eberly, David: “Integer-Based Rotations of Images,” Mar. 2, 2008 (Mar. 2, 2008), pp. 1-5, XP055294442, Retrieved from the Internet: URL:http://www.geometrictools.com/Documentation/IntegerBasedRotation.pdf [retrieved on Aug. 9, 2016]. |
Computer generated English translation of Japanese Publication No. 2011-019549, Published Feb. 3, 2011. |
Number | Date | Country | |
---|---|---|---|
20190356867 A1 | Nov 2019 | US |
Number | Date | Country | |
---|---|---|---|
61792119 | Mar 2013 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 14214412 | Mar 2014 | US |
Child | 16509297 | | US |