The embodiments herein relate to scopes, such as endoscopes, borescopes, and microscopes. Embodiments relate more specifically to dual-tube stereoscopes.
Endoscopes, borescopes, and microscopes typically provide a single optical path between an object and the imaging plane or the eye(s) of the viewer. An endoscope is an optical viewing device typically consisting of a rigid or flexible elongated body with an eyepiece at the proximal end and an objective lens at the distal end, the two ends being linked together by relay optics, fiber bundles, or other waveguides. Borescopes and microscopes are similarly constructed. The optical system can be surrounded by optical fibers or other light sources used to illuminate the remote object. An internal image of the illuminated object is formed by the objective lens and magnified by the eyepiece, which presents it to the viewer's eye.
Endoscopes are typically used to view the inside of the human body. There are numerous types of endoscopes, including laparoscopes, colonoscopes, fetoscopes, bronchoscopes, and others. Borescopes are used for inspection work, to view areas that are otherwise inaccessible, such as the insides of engines, industrial gas turbines, steam turbines, etc. Microscopes are typically used to view small objects under magnification.
Scopes that have a single optical path are limited in that they provide only a monoscopic view of the object being viewed. Further, previous methods of adding a second optical path to allow stereoscopic viewing have been cumbersome. These problems and others are addressed by the techniques, systems, methods, devices and computer-readable media described herein.
Presented herein are techniques, methods, systems, devices, and computer-readable media for dual-tube stereoscopes. In some embodiments, a scope may include an elongated body comprising a proximal end and a distal end, the proximal end having at least one proximal opening, the distal end having combined first and second distal openings; a first waveguide coupled to the first distal opening; and a second waveguide coupled to the second distal opening. There may also be optics situated near the proximal end of the elongated body and configured to receive light from the first and second waveguides and to transmit the received light through the at least one proximal opening onto a single light-receiving device.
Various techniques for producing dual images using a single camera and a dual-tube endoscope described herein may include, in various embodiments, receiving light through two distal lenses; transmitting the received light to two waveguides; transmitting light from the two waveguides onto a single light-receiving device as a single image containing two sub-images; and processing the single image to produce two images based at least in part on the two sub-images.
Some embodiments for processing dual-tube stereoscope images include receiving a single digital image from a single light-receiving device, the single digital image comprising two sub-images, the two sub-images having been received at the single light-receiving device from optics, which in turn received light from dual waveguides in a scope; and processing the single digital image in order to produce one output image for each of the two sub-images.
Numerous other embodiments are described throughout herein.
For purposes of summarizing the invention and the advantages achieved over the prior art, certain objects and advantages of the invention are described herein. Of course, it is to be understood that not necessarily all such objects or advantages need to be achieved in accordance with any particular embodiment. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught or suggested herein, without necessarily achieving other objects or advantages as may be taught or suggested herein.
All of these embodiments are intended to be within the scope herein disclosed. These and other embodiments will become readily apparent to those skilled in the art from the following detailed description and from referring to the attached figures, the invention not being limited to any particular disclosed embodiment(s).
Various embodiments herein provide for dual-tube stereo endoscopes. Consider an endoscope inside the body, taking images of a colon, for example. In order to view the colon stereoscopically, a left-eye image and a right-eye image must be produced. One approach would be to use two cameras at the distal end of the endoscope, one of which would take the right-eye view while the other would take the left-eye view. Together, these two images would enable stereoscopic viewing. A problem with this approach is that the resolution of these cameras, given that they must be very small, would be quite low. Further, when sterilizing the equipment, the camera and the endoscope would typically be put in an autoclave, and it is difficult to protect electronic equipment in the autoclave. Another approach would be to use two optical paths that connect the distal end of the endoscope to two cameras at the proximal end of the endoscope. An issue with this approach is that the stereo endoscope's cameras would be bulky and heavy, and therefore difficult to use.
As described herein, there is another approach: using a dual-tube scope and providing dual images on a single imager, such as those typically used with single-tube scopes. In some embodiments, the dual-tube stereoscopes are usable with a standard single-tube scope's mount, which has a single camera. The embodiments include dual, parallel optical paths, each of which can have a waveguide. As used herein, a ‘waveguide’ is a broad term and is intended to encompass its plain and ordinary meaning, including without limitation, any device or group of devices that can transmit light along a path or in a direction, such as relay optics, coherent fiber bundles, fiber optics, or other waveguides. The scope may also include fiber optics leading to the objective end, or lights mounted at the objective end, designed to illuminate the inside of the body or other objects being viewed with the scope. Light reflects off the objects, enters the dual lenses at the distal end, and passes through the waveguides to the exit optics, which prepare the light for capture by a single light-receiving device. As used herein, a ‘light-receiving device’ is a broad term encompassing its plain and ordinary meaning, including without limitation, an apparatus for taking photos or video, such as any of the standard cameras used in current single-tube scopes. The resolution of the light-receiving device, such as a camera and its imager, can be a currently used resolution, for example, full high definition (“HD”) or “quarter HD.” For example, a five-millimeter scope may have a theoretical resolution limit somewhere under five hundred lines, and a standard HD or quarter-HD imager may be able to capture images above that resolution.
As noted above, exit optics may be used to transmit the dual images through the dual light paths and reproject them onto the single imager. There may be a single, shared exit optical device (e.g., a lens or a group of lenses) or dual exit devices (e.g., dual lenses or dual groups of lenses). A single exit optical device may combine the two optical paths and reproject them onto the single camera, which has built-in optics to refocus the light on its imager(s). Dual exit optics, one for each optical path, may also be used to focus and/or project the light into the single camera. The optics used by the camera to focus the dual optical paths onto the imager may be any known optics, lens, or set of lenses, such as 20 mm, 24 mm, 28 mm, or 35 mm optics or lenses and the like. In some embodiments, a scope's camera may have multiple individual imagers, each viewing a different color band of the full image, the different bands of light being separated by beam splitters or other such devices.
After the light from the dual-tube scope has been projected onto the camera's imager(s), processing may take place using a computer or other device to calibrate the images, correct for distortions, separate the two images, and the like. Once these dual images are received, they may be used to display a left-eye image and a right-eye image to an end user.
Additionally, more than two light paths may be used. For example, there may be four lenses at the distal end of the scope. Those four lenses may be attached to four waveguides and the four waveguides may transmit light to a single or to multiple optics at the proximal end of the scope. The optics at the proximal end of the scope may prepare the light for acquisition by a single camera. This single image with the four sub-images may then be processed by a computer or multiple computers, by a processor, or by multiple processors in order to produce four images that can be used to produce stereoscopic or depth information, for example.
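As an illustration only, the following sketch (in Python, which is not required by any embodiment) shows how a single captured frame containing four sub-images might be divided in software, assuming the four optical paths land on the imager in a simple 2x2 grid; the actual layout would depend on the exit optics:

    import numpy as np

    def split_quad_frame(frame: np.ndarray) -> list[np.ndarray]:
        """Split one captured frame into four sub-images, assuming the four
        optical paths land on the imager in a 2x2 grid layout."""
        h, w = frame.shape[:2]
        top, bottom = frame[: h // 2], frame[h // 2 :]
        return [top[:, : w // 2], top[:, w // 2 :],
                bottom[:, : w // 2], bottom[:, w // 2 :]]

    # Example with a synthetic 1080x1920 RGB frame:
    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
    quads = split_quad_frame(frame)
    print([q.shape for q in quads])  # four 540x960x3 sub-images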
A scope 110 is also part of the system 100. The scope 110 may include lenses 140 and 141 at the distal end of the scope 110, as well as waveguides 120 and 121 within the scope. As discussed above, the waveguides 120 and 121 may transmit light to optics 130. The optics 130 may prepare the light for transmission to the single camera 180. The light may pass through a single opening or multiple openings at the proximal end of the scope 110 (not illustrated).
After light has been transmitted through optics 230 onto the single camera 280, the single image, with its two sub-images, is transmitted to computer system 290. At computer system 290, the two sub-images may be calibrated and/or otherwise corrected. In some embodiments, processing the two sub-images includes calibrating and/or (re-)aligning the two sub-images, for example if the scope has been bent or twisted, or is otherwise out of alignment.
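As a non-limiting example, such re-alignment of a sub-image might be performed in software as in the following sketch, which uses the OpenCV library; the rotation and shift values shown are hypothetical calibration results, not values taken from any particular scope:

    import cv2
    import numpy as np

    def realign_sub_image(sub_img: np.ndarray,
                          rotation_deg: float,
                          shift_xy: tuple[float, float]) -> np.ndarray:
        """Re-align one sub-image by applying a rotation and translation that
        were measured during calibration (e.g., by imaging a known target
        through both optical paths)."""
        h, w = sub_img.shape[:2]
        m = cv2.getRotationMatrix2D((w / 2, h / 2), rotation_deg, 1.0)
        m[0, 2] += shift_xy[0]  # fold the horizontal offset into the transform
        m[1, 2] += shift_xy[1]  # fold the vertical offset into the transform
        return cv2.warpAffine(sub_img, m, (w, h))

    # Placeholder sub-image and hypothetical calibration values: the second
    # path is twisted 1.5 degrees and shifted 4 pixels left relative to the first.
    right_sub_image = np.zeros((540, 960, 3), dtype=np.uint8)
    right_aligned = realign_sub_image(right_sub_image, 1.5, (-4.0, 0.0))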
Processing the two sub-images may also include correcting for distortion in the two sub-images. The distortion may be caused by the optics in the scope, including the distal lenses, the optical relays, and/or the exit optics. Correcting for distortion in images received through lenses can be performed by processes known in the art. Processing the two sub-images may also include zooming the images in or out, detecting zooming performed by the camera or coupler, scaling the images to be larger or smaller, or the like. This may be useful, for example, when the zoom on a camera is not the desired zoom.
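One possible software sketch of such a distortion correction, using the OpenCV library, is shown below; the camera matrix and distortion coefficients are placeholder values that would, in practice, come from a prior calibration of each optical path:

    import cv2
    import numpy as np

    # Hypothetical intrinsics for one optical path, obtained from a prior
    # calibration (e.g., with a checkerboard target); values are placeholders.
    camera_matrix = np.array([[800.0, 0.0, 480.0],
                              [0.0, 800.0, 270.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.array([-0.30, 0.12, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

    sub_image = np.zeros((540, 960, 3), dtype=np.uint8)  # placeholder sub-image
    undistorted = cv2.undistort(sub_image, camera_matrix, dist_coeffs)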
After the two sub-images have been calibrated and corrected for distortion, the two sub-images can be separated. Separating the two sub-images into two images, in some embodiments, may include writing the portion of the corrected single image corresponding to the first sub-image into one portion of memory and writing the portion of the corrected single image corresponding to the second sub-image into another portion of memory. These two images, once processed and separated, can be shown to an operator as a dual image (e.g., an image pair) or as a stereoscopic image. Displaying these two images as a stereoscopic image can allow an operator to view objects seen through the scope stereoscopically, “in 3D,” almost as if the operator's eyes were observing from the end of the scope. In the case of an endoscope, for example, if the doctor using the endoscope is stereoscopically viewing images from inside the body, the appearance of the stereo images may be such that the doctor can perceive depth corresponding to the depth of the objects inside the body.
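A minimal sketch of such a separation step is shown below, under the assumption that the two sub-images sit side by side on the imager; other layouts would change only the slicing:

    import numpy as np

    def split_sub_images(corrected: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
        """Copy the two halves of the corrected single image into separate
        memory buffers, assuming the sub-images sit side by side."""
        h, w = corrected.shape[:2]
        left = np.copy(corrected[:, : w // 2])    # first sub-image buffer
        right = np.copy(corrected[:, w // 2 :])   # second sub-image buffer
        return left, right

    corrected_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder
    left_img, right_img = split_sub_images(corrected_frame)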
In some embodiments, the sub-images received through the distal lenses may be diffraction-limited or approximately diffraction-limited. For example, the sub-images received through the distal lenses, transmitted through the two waveguides, and projected through the optics onto a single camera may have a resolution lower than that of the single camera. In some embodiments, diffraction limits the resolution of light that can be focused by standard optics. For standard spherical ground optics, the diffraction limit may be determined by:
sin(θ) = 1.22 · λ / D
where θ is the minimum resolvable angular separation, λ is the wavelength of the light, and D is the diameter of the lens aperture.
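As a worked example only, the following sketch evaluates this diffraction limit for an assumed 550 nm (green) wavelength and an assumed 1 mm effective aperture; these numbers are illustrative and do not correspond to any particular scope:

    import math

    def rayleigh_limit_radians(wavelength_m: float, aperture_m: float) -> float:
        """Minimum resolvable angle from sin(theta) = 1.22 * lambda / D."""
        return math.asin(1.22 * wavelength_m / aperture_m)

    # Illustrative values only: green light (550 nm) through a 1 mm aperture.
    theta = rayleigh_limit_radians(550e-9, 1.0e-3)
    print(f"diffraction-limited angle: {theta:.2e} rad")  # about 6.7e-4 rad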
As described above, in the earlier blocks of method 400, light is received through the two distal lenses, transmitted through the two waveguides, and projected by the exit optics onto the single camera as a single image containing two sub-images.
In block 440, the single camera's image is processed to produce two images, each of which is associated with the light path from one of the two distal lenses. Processing the single camera's image (with its two sub-images) to produce two separate images may include calibrating and/or aligning the image and correcting distortion in the image in order to produce the two images. This processing is described elsewhere herein.
The blocks of method 400 may be performed in a different order, additional blocks may be performed as part of the method, and/or blocks may be omitted from the method.
Dual-tube stereoscopes may be used to produce dual images or stereoscopic images, or may be used to extract or reconstruct depth from a scene in order to produce 3D models. In some embodiments, the dual-tube stereoscope may be an endoscope, such as a laparoscope, enteroscope, colonoscope, sigmoidoscope, rectoscope, anoscope, proctoscope, rhinoscope, bronchoscope, otoscope, cystoscope, gynoscope, colposcope, hysteroscope, falloposcope, arthroscope, thoracoscope, mediastinoscope, amnioscope, fetoscope, laryngoscope, esophagoscope, epiduroscope, or other type of surgical or medical scope. Non-medical scopes are also embodiments of the scopes discussed herein, such as architectural endoscopes, which may be used for architectural planning and pre-visualization of scale models. Additionally, embodiments of the scopes herein may be borescopes, which may be used for internal inspection of complex technical systems, for example. Additional scopes may include, in various embodiments, microscopes.
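As noted above, the dual images may also be used to extract or reconstruct depth. As one hedged illustration, the sketch below estimates a disparity map from the two sub-images using a standard block-matching algorithm from the OpenCV library; the parameter values are illustrative, and converting disparity to depth would further depend on the baseline between the distal lenses and their focal length:

    import cv2
    import numpy as np

    # Placeholder grayscale sub-images; in practice these would be the
    # calibrated, distortion-corrected left and right sub-images.
    left_gray = np.zeros((540, 960), dtype=np.uint8)
    right_gray = np.zeros((540, 960), dtype=np.uint8)

    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    # 'disparity' can then be converted to depth using the lens baseline
    # and focal length for the particular scope.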
As depicted in the figures, a system may include a scope 710, a camera 780, a mount 793, a camera hub 795, a computer system 790, one or more displays, and an operator 792.
The camera 780 may include a single imager that would traditionally receive a single image corresponding to a single optical path, but instead receives a single image containing two sub-images from the scope 710. As discussed above, the two sub-images on the single image may later be used for stereoscopic presentation, or to display a dual image from the scope. A mount 793 may be connected or coupled to a camera hub 795. The camera hub 795 may transmit the single image to a stereoscopic or monoscopic monitor 781. The dual images may be displayed as raw data, or may first be processed by computer system 790 and returned to the camera hub 795 for production of a dual image or stereoscopic image on monitor 781. Camera hub 795 may also transmit the images to computer system 790. The computer system 790 may then produce the two images from the two sub-images contained within the single received image captured by the scope 710. The two images may be displayed together (e.g., side by side) on monitor 783 or stereoscopically on monitor 783. Operator 792 may also be wearing a head-mounted display 782 or 3D viewing glasses 782. Multiple stereoscopic monitors 783 may present multiple copies of the stereoscopic images simultaneously for multiple viewers. The computer system 790 may also be equipped with a digital recorder or other device that records the video stream being presented at one or more of the displays 781, 782, and/or 783. For example, a program such as “Fraps” or other stereo recording software may be used or integrated into the computer 790 to record the calibrated, aligned, distortion-corrected stereoscopic output.
In embodiments where the operator 792 is wearing 3D viewing glasses 782, the operator may view monitor 781 or 783 in order to see a stereoscopic image of the objects or images captured by scope 710. In embodiments where an operator 792 is wearing a head-mounted display 782, the two sub-images captured by the scope 710 and transmitted from the mount 793 to the camera hub 795 may be processed by the computer 790 in order to produce dual images to be shown to the left and right eyes of the operator 792 by means of the head-mounted display 782. The operator may also manipulate or otherwise interact with the images and/or the computer system 790 using input devices 791, such as a mouse and/or keyboard.
Some embodiments include kits for use with or containing some or all of the parts for a dual-tube stereoscope. For example, one or more parts of a dual-tube stereoscope may be disposable and those disposable parts may come in a kit, such as a sterile bag. For example, if a sheath attachable to the distal end of the scope were removable and disposable, then a kit for the dual-tube stereoscope may include the sheath.
The processes and systems described herein may be performed on or encompass various types of hardware, such as computer systems. In some embodiments, computer 790, displays 781, 782, and 783, camera hub 795, and/or input device 791 may each be separate computer systems, applications, or processes, or may run as part of the same computer systems, applications, or processes; one or more may be combined to run as part of one application or process; and/or each or one or more may be part of or run on a computer system. A computer system may include a bus or other communication mechanism for communicating information, and a processor coupled with the bus for processing information. The computer systems may have a main memory, such as a random access memory or other dynamic storage device, coupled to the bus. The main memory may be used to store instructions and temporary variables. The computer systems may also include a read-only memory or other static storage device coupled to the bus for storing static information and instructions. The computer systems may also be coupled to a display, such as a CRT or LCD monitor. Input devices may also be coupled to the computer system. These input devices may include a mouse, a trackball, a keyboard, a joystick, a touch screen, or cursor direction keys.
Each computer system may be implemented using one or more physical computers or computer systems, or portions thereof. The instructions executed by the computer system may also be read in from a computer-readable storage medium. The computer-readable storage medium may be a CD, DVD, optical or magnetic disk, laserdisc, carrier wave, or any other medium that is readable by the computer system. In some embodiments, hardwired circuitry may be used in place of or in combination with software instructions executed by the processor. Communication among modules, systems, devices, and elements may be over direct or switched connections, and wired or wireless networks or connections, via directly connected wires, or via any other appropriate communication mechanism. The communication among modules, systems, devices, and elements may include handshaking, notifications, coordination, encapsulation, encryption, headers, such as routing or error detecting headers, or any other appropriate communication protocol or attribute. Communication may also make use of messages related to HTTP, HTTPS, FTP, TCP, IP, ebMS OASIS/ebXML, secure sockets, VPN, encrypted or unencrypted pipes, MIME, SMTP, MIME Multipart/Related Content-type, SQL, etc.
The 3D graphics may be produced using two or more captured images and/or based on underlying data models and projected onto one or more 2D planes in order to create left and right eye images for a head mount, lenticular, or other 3D display. Any appropriate 3D graphics processing may be used for displaying or rendering, including processing based on OpenGL, Direct3D, Java 3D, etc. Whole, partial, or modified 3D graphics packages may also be used, such packages including 3DS Max, SolidWorks, Maya, Form Z, Cybermotion 3D, or any others. In some embodiments, various parts of the needed rendering may occur on traditional or specialized graphics hardware. The rendering may also occur on the general-purpose CPU, on programmable hardware, on a separate processor, be distributed over multiple processors, over multiple dedicated graphics cards, or may use any other appropriate combination of hardware or technique.
In some embodiments, displays 781, 782, and/or 783 present stereoscopic 3D images to an operator, such as a physician. Stereoscopic 3D displays deliver separate imagery to each of the user's eyes. This can be accomplished by a passive stereoscopic display, an active frame-sequential stereoscopic display, a lenticular auto-stereoscopic display, or any other appropriate type of display. The displays 781, 782, and/or 783 may be passive alternating-row or alternating-column displays. Examples of polarization-based alternating-row displays include the Miracube G240S, as well as Zalman Trimon monitors. Alternating-column displays include devices manufactured by Sharp, as well as many “auto-stereoscopic” displays (e.g., by Philips). Displays 781, 782, and/or 783 may also be cathode ray tubes (CRTs). CRT-based devices may use temporal sequencing, showing imagery for the left and right eye in temporally sequential alternation; this method may also be used by newer, projection-based devices, as well as by rapidly switchable (e.g., 120 Hz) liquid crystal display (LCD) devices. In some embodiments, a user may wear a head-mounted display 782 in order to receive 3D images from the computer system 790. In such embodiments, a separate display, such as the pictured displays 781 and/or 783, may be omitted.
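For a passive alternating-row display of the kind mentioned above, the left-eye and right-eye images may be interleaved row by row before being sent to the monitor. A minimal sketch of that composition step, assuming equally sized left and right images, might look like the following:

    import numpy as np

    def interleave_rows(left: np.ndarray, right: np.ndarray) -> np.ndarray:
        """Compose a frame for a passive alternating-row stereoscopic display:
        even rows carry the left-eye image, odd rows the right-eye image."""
        out = np.empty_like(left)
        out[0::2] = left[0::2]
        out[1::2] = right[1::2]
        return out

    left_img = np.zeros((1080, 1920, 3), dtype=np.uint8)   # placeholder
    right_img = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder
    frame = interleave_rows(left_img, right_img)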
As will be apparent, the features and attributes of the specific embodiments disclosed above may be combined in different ways to form additional embodiments, all of which fall within the scope of the present disclosure.
Depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence, can be added, merged, or left out all together (e.g., not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently rather than sequentially, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores, or on other parallel architectures.
The various illustrative logical blocks, modules, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, as computer software, or as combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.
The various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine or computing device. Here the term ‘computing device’ includes its plain and ordinary meaning, including, but not limited to any machine, hardware, or other device capable of performing calculations or operations automatically, such as a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor can be a microprocessor, a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method, process, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM or other optical media, or any other form of computer-readable storage medium known in the art. An exemplary storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In some embodiments, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can optionally reside in a user terminal. In some embodiments, the processor and the storage medium can reside as discrete components in a user terminal.
Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment.
Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein, in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
All of the methods and processes described above may be embodied in, and fully automated via, software code modules executed by one or more general-purpose computers or processors, such as those computer systems described above. The code modules may be stored in any type of computer-readable medium or other computer storage device. Some or all of the methods may alternatively be embodied in specialized computer hardware.
It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
The following patents and publications are incorporated by reference herein in their entireties for all purposes: U.S. Pat. No. 6,898,022, U.S. Pat. No. 6,614,595, U.S. Pat. No. 6,450,950, U.S. Pat. No. 6,104,426, U.S. Pat. No. 5,776,049, U.S. Pat. No. 5,673,147, U.S. Pat. No. 5,603,687, U.S. Pat. No. 5,527,263, U.S. Pat. No. 5,522,789, U.S. Pat. No. 5,385,138, U.S. Pat. No. 5,222,477, U.S. Pat. No. 5,191,203, U.S. Pat. No. 5,122,650, U.S. Pat. No. 4,862,873, U.S. Pat. No. 4,873,572, U.S. Pat. No. 7,277,120, and U.S. Pub. No. 2008/0151041.
This application claims the benefit of U.S. Provisional Application No. 61/230,570, filed Jul. 31, 2009, entitled Stereo Endoscope System, to Kurtis Keller et al., which is incorporated by reference herein for all purposes.