Not Applicable.
Advances in technology have improved imaging capabilities for medical use. One area that has benefited most is that of endoscopic surgical procedures, owing to improvements in the components that make up an endoscope.
The disclosure relates generally to electromagnetic sensing and sensors, increasing the color accuracy and reducing the fixed pattern noise. The features and advantages of the disclosure will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by the practice of the disclosure without undue experimentation. The features and advantages of the disclosure may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims.
Non-limiting and non-exhaustive implementations of the disclosure are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified. Advantages of the disclosure will become better understood with regard to the following description and accompanying drawings.
The disclosure extends to methods, systems, and computer-based products for digital imaging that may be primarily suited to medical applications, for producing an image in light deficient environments, and for correction of white balance and/or fixed pattern noise at startup or at any other time during a procedure.
In the following description of the disclosure, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific implementations in which the disclosure may be practiced. It is understood that other implementations may be utilized and structural changes may be made without departing from the scope of the disclosure.
In describing and claiming the subject matter of the disclosure, the following terminology will be used in accordance with the definitions set out below.
It must be noted that, as used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
As used herein, the terms “comprising,” “including,” “containing,” “characterized by,” and grammatical equivalents thereof are inclusive or open-ended terms that do not exclude additional, unrecited elements or method steps.
As used herein, the phrase “consisting of” and grammatical equivalents thereof exclude any element or step not specified in the claim.
As used herein, the phrase “consisting essentially of” and grammatical equivalents thereof limit the scope of a claim to the specified materials or steps and those that do not materially affect the basic and novel characteristic or characteristics of the claimed disclosure.
As used herein, the term “proximal” shall refer broadly to the concept of a portion nearest an origin.
As used herein, the term “distal” shall generally refer to the opposite of proximal, and thus to the concept of a portion farther from an origin, or a furthest portion, depending upon the context.
As used herein, color sensors or multi-spectrum sensors are those sensors known to have a color filter array (CFA) thereon so as to filter the incoming electromagnetic radiation into its separate components. In the visual range of the electromagnetic spectrum, such a CFA may be built on a Bayer pattern or a modification thereof in order to separate the green, red, and blue spectrum components of the light.
Modern digital video systems such as those employed for endoscopy incorporate various levels of calibration for the purpose of rendering the image as ideal as possible. In essence, the prime motivation is to mimic the human visual system as closely as possible. Raw color images captured under different types of broad-spectrum illumination (such as sunlight, tungsten filaments, fluorescent lighting, white LEDs etc.), will all have different overall color casts. The human visual system is highly effective in automatically balancing out the biases introduced by the illumination spectra, so as to, e.g., idealize the perception of white and grey scene components. For example, a white sheet of paper always seems white, irrespective of whether the light is, e.g., incandescent or daylight. Raw digital images of a white sheet of paper may appear different shades of off-white under different illuminants, however. To counter this, a digital imaging system, as opposed to the human visual system, must incorporate a white balance process. In fact, most of the process of white balancing is to adjust for the fact that an image sensor response for each color channel is different. The quantum efficiency for a silicon photodiode or other light sensing element is lower for blue photons than for red and green photons for instance.
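The channel-balancing step described above can be sketched as follows. This is an illustrative sketch only: the function names and the choice to normalize to the green channel (the channel with the strongest quantum efficiency) are assumptions, not the disclosed implementation.

```python
import numpy as np

def white_balance_gains(reference_rgb):
    """Compute per-channel white-balance gains from a raw capture of a
    neutral (white/grey) target, normalizing each channel to green.

    reference_rgb: H x W x 3 array of raw sensor values (hypothetical
    calibration capture).
    """
    # Mean response of each color channel over the reference frame.
    means = reference_rgb.reshape(-1, 3).mean(axis=0)
    # Normalize to green, typically the most sensitive channel in silicon.
    return means[1] / means

def apply_white_balance(raw_rgb, gains):
    # Scale each channel by its gain and clip back to the sensor's range.
    limit = np.iinfo(raw_rgb.dtype).max
    return np.clip(raw_rgb * gains, 0, limit).astype(raw_rgb.dtype)
```

Applying the resulting gains to subsequent frames makes a neutral target render with equal channel means regardless of the illuminant's spectrum.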
Endoscopy systems, such as those illustrated in
Option 2 has an advantage for signal-to-noise ratio, since the dominant source of noise is the Poisson uncertainty in photon arrival rate, which scales as the square root of the signal.
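The square-root scaling can be illustrated with a trivial helper (a sketch for clarity, not part of the disclosed system):

```python
import math

def shot_noise_snr(photon_count):
    """For Poisson-limited detection, noise = sqrt(signal), so
    SNR = signal / sqrt(signal) = sqrt(signal)."""
    return math.sqrt(photon_count)
```

Doubling the collected signal therefore improves the shot-noise-limited SNR by only a factor of sqrt(2), which is why maximizing light throughput matters.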
The digital processing stages associated with CMOS image sensors are also concerned with correcting for non-idealities that are inherent within the sensing technology. One such non-ideality is so-called fixed pattern noise (FPN), which has a strongly detrimental effect on image quality. It arises due to random variations in black level from pixel to pixel. There may also be a column to column component (CFPN) reflecting the analog readout architecture. The degree to which the FPN is offensive to an image signal is dependent on the level of contrast with respect to the true noise sources, such as temporal read noise and photon shot noise. The perception threshold for random pixel FPN is around ¼ of the temporal noise at 60 frames per second, while for CFPN it is around 1/20.
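The perception thresholds quoted above can be expressed as a simple check. This is an illustrative sketch; the function name and interface are assumptions:

```python
def fpn_acceptable(fpn_sigma, temporal_sigma, column=False):
    """Check FPN magnitude against the perception thresholds stated in the
    text: random pixel FPN becomes visible above about 1/4 of the temporal
    noise (at 60 fps), while column FPN (CFPN) is visible above about 1/20.
    """
    threshold = temporal_sigma / (20.0 if column else 4.0)
    return fpn_sigma < threshold
```

The much lower CFPN threshold reflects how readily the eye picks out correlated vertical structure compared with uncorrelated pixel-to-pixel variation.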
In striving for these targets, a strategy may include compensating for FPN using a dark reference buffer stored on the camera or imaging device, which may be accessible by the ISP. As each physical pixel is sampled by the ISP it may have its dedicated black correction applied. If the illumination is under the fast control of the camera (made possible with LEDs and laser diodes), periodic dark frames may be acquired to keep a running average of the black offsets in order to account for temperature variations. An important component of FPN arises from thermal carrier generation within the photosensitive elements, which has an exponential dependence on absolute temperature.
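The running-average dark buffer strategy might be sketched as follows, assuming hypothetical class and parameter names; the exponential smoothing constant is an illustrative choice, not a disclosed value:

```python
import numpy as np

class DarkFrameCorrector:
    """Per-pixel black correction from a running-average dark reference.
    Periodic dark-frame updates let the buffer track the temperature
    dependence of thermal carrier generation."""

    def __init__(self, height, width, alpha=0.1):
        self.dark = np.zeros((height, width), dtype=np.float64)
        self.alpha = alpha          # exponential smoothing weight
        self.initialized = False

    def update(self, dark_frame):
        # Seed the buffer on the first dark frame, then blend subsequent
        # frames in with weight alpha to follow slow thermal drift.
        if not self.initialized:
            self.dark = dark_frame.astype(np.float64)
            self.initialized = True
        else:
            self.dark = (1 - self.alpha) * self.dark + self.alpha * dark_frame

    def correct(self, frame):
        # Subtract each pixel's dedicated black offset, clamping at zero.
        return np.clip(frame.astype(np.float64) - self.dark, 0, None)
```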
This disclosure is concerned with a convenient method of calibration, both initially and at other times during a surgical procedure, for endoscopy systems having full control over their illumination sources. Although an example supported in this disclosure is a single-use system with the sensor at the distal tip, this technique is applicable to re-posable, re-usable, and limited-use endoscopes, with the sensor at the distal tip or within the proximal camera head, with multiple sensors (e.g., for 3D imaging) or a single sensor, and with rigid or flexible scopes. A set of various system configurations for Minimally Invasive Surgery (MIS) and endoscopy is shown in
As illustrated in
It will be appreciated that a dark frame may be created from a single sensing of the pixel array while the cap 230 is covering the distal end of the lumen. It will be appreciated that the cap 230 may be configured, dimensioned, sized and shaped to fit snugly onto the distal end of the lumen (illustrated best in
The endoscope may be a reusable endoscopic device, a limited use endoscopic device, a re-posable use endoscopic device, or a single-use endoscopic device without departing from the scope of the disclosure.
Continuing to refer to
The system configuration shown in 1B of
The system configuration shown in 1C of
The system configuration shown in 1D of
The system configuration shown in 1E of
The system configuration shown in 1F of
The system configuration shown in 1G of
It will be appreciated that any of the above-identified configurations for an endoscopic system shown in
Referring now to
In an embodiment, a manual procedure may be employed, wherein the operator places a specially designed cap 230 over the endoscope distal tip at any time during a procedure and then instructs the system to perform the calibration.
It will be appreciated that the camera system may acquire a number of frames in darkness, e.g., dark frames or dark frame references, to form the seed dark correction data used for FPN cancellation. The system may turn on the light source, which feeds light out through the endoscopic tip, as it is during normal imaging operation, and acquire another set of frames for the purpose of computing the relative color channel responses. The system may record these responses in memory, retrieve the responses from memory, and use them to compute the appropriate coefficients for white balance. The operator may remove the cap 230 and begin using the system as normal.
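The cap-based startup sequence described above might look like the following sketch, where the `camera` and `light_source` interfaces are hypothetical stand-ins for the actual hardware control paths, and the frame counts are illustrative:

```python
import numpy as np

def run_cap_calibration(camera, light_source, n_dark=16, n_lit=16):
    """Calibrate with the opaque cap in place over the distal tip.
    Returns (dark_reference, white_balance_gains)."""
    # 1. Acquire frames in darkness to seed the dark correction data
    #    used for FPN cancellation.
    light_source.off()
    dark = np.mean([camera.capture() for _ in range(n_dark)], axis=0)
    # 2. Turn the light source on; light fed out through the endoscopic
    #    tip and returned inside the cap exposes the relative color
    #    channel responses.
    light_source.on()
    lit = np.mean([camera.capture() for _ in range(n_lit)], axis=0)
    # 3. Compute white-balance coefficients from the dark-corrected
    #    channel means, normalized here to green (an assumed convention).
    means = (lit - dark).reshape(-1, 3).mean(axis=0)
    gains = means[1] / means
    return dark, gains
```

After this returns, the dark reference and gains would be stored in memory for use during normal imaging, and the operator removes the cap.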
It will be appreciated that implementations of the disclosure may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Implementations within the scope of the disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
Computer storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. In an implementation, a sensor and camera control unit may be networked in order to communicate with each other, and other components, connected over the network to which they are connected. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmissions media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
As can be seen in
Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, control units, camera control units, hand-held devices, hand pieces, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. It should be noted that any of the above mentioned computing devices may be provided by or located within a brick and mortar location. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) or field programmable gate arrays can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the following description and Claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
Computing device 300 includes one or more processor(s) 302, one or more memory device(s) 304, one or more interface(s) 306, one or more mass storage device(s) 308, one or more Input/Output (I/O) device(s) 310, and a display device 330 all of which are coupled to a bus 312. Processor(s) 302 include one or more processors or controllers that execute instructions stored in memory device(s) 304 and/or mass storage device(s) 308. Processor(s) 302 may also include various types of computer-readable media, such as cache memory.
Memory device(s) 304 include various computer-readable media, such as volatile memory (e.g., random access memory (RAM) 314) and/or nonvolatile memory (e.g., read-only memory (ROM) 316). Memory device(s) 304 may also include rewritable ROM, such as Flash memory.
Mass storage device(s) 308 include various computer-readable media, such as magnetic tapes, magnetic disks, optical disks, solid-state memory (e.g., Flash memory), and so forth. As shown in
I/O device(s) 310 include various devices that allow data and/or other information to be input to or retrieved from computing device 300. Example I/O device(s) 310 include digital imaging devices, electromagnetic sensors and emitters, cursor control devices, keyboards, keypads, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, lenses, CCDs or other image capture devices, and the like.
Display device 330 includes any type of device capable of displaying information to one or more users of computing device 300. Examples of display device 330 include a monitor, display terminal, video projection device, and the like.
Interface(s) 306 include various interfaces that allow computing device 300 to interact with other systems, devices, or computing environments. Example interface(s) 306 may include any number of different network interfaces 320, such as interfaces to local area networks (LANs), wide area networks (WANs), wireless networks, and the Internet. Other interface(s) include user interface 318 and peripheral device interface 322. The interface(s) 306 may also include one or more user interface elements 318. The interface(s) 306 may also include one or more peripheral interfaces such as interfaces for printers, pointing devices (mice, track pad, etc.), keyboards, and the like.
Bus 312 allows processor(s) 302, memory device(s) 304, interface(s) 306, mass storage device(s) 308, and I/O device(s) 310 to communicate with one another, as well as with other devices or components coupled to bus 312. Bus 312 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
For purposes of illustration, programs and other executable program components are shown herein as discrete blocks, although it is understood that such programs and components may reside at various times in different storage components of computing device 300, and are executed by processor(s) 302. Alternatively, the systems and procedures described herein can be implemented in hardware, or a combination of hardware, software, and/or firmware. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein.
It will be appreciated that the disclosure may be used with any image sensor, whether a CMOS image sensor or CCD image sensor, without departing from the scope of the disclosure. Further, the image sensor may be located in any location within the overall system, including, but not limited to, the tip of the endoscope, the hand piece of the imaging device or camera, the control unit, or any other location within the system without departing from the scope of the disclosure.
Implementations of an image sensor that may be utilized by the disclosure include, but are not limited to, the following, which are merely examples of various types of sensors that may be utilized by the disclosure.
Referring now to
The system and method 700 of
The system and method 700 may comprise actuating an emitter to emit a pulse of a wavelength of electromagnetic radiation to cause illumination within the light deficient environment. In an implementation, the pulse may be within a wavelength range that comprises a portion of the electromagnetic spectrum. In an implementation, the emitter may be a laser emitter and the system and method 700 may further comprise pulsing the laser emitter at a predetermined interval. In an implementation, the method may further comprise actuating a pixel array at a sensing interval that corresponds to the pulse interval of the laser emitter.
The system and method 700 may comprise creating a dark frame from a single sensing of the pixel array while the cap is in place. It will be appreciated that in an implementation, a plurality of dark frames may be created from a plurality of sensing the pixel array while the cap is in place. In an implementation, the dark frames may be created upon startup of a system and stored within memory associated with the system.
In an implementation, the system and method 700 may comprise creating a dark frame after the pixel array is at operating temperature. In an implementation, the dark frame may be created after a surgical procedure has begun. In an implementation, the dark frame may comprise a plurality of dark frames that are created as part of the image frame stream by not applying electromagnetic radiation at given times and stored within memory associated with the system.
The system and method 700 may comprise a plurality of caps that correspond to, and are opaque to, the emitted electromagnetic radiation.
The system and method 700 may comprise a response of the pixel array that corresponds to a photo-signal generated under controlled monochromatic radiation. In an implementation, the system and method 700 may comprise a response of the pixel array that corresponds to the photo-signal generated under a plurality of wavelengths of radiation. In an implementation, a response of the pixel array corresponds to the photo-signal generated under a continuous spectrum of wavelengths of radiation.
The system and method 800 of
The system and method 900 of
In an implementation, the system and method 900 may comprise a plurality of caps that correspond to, and are opaque to, the emitted electromagnetic radiation.
In an implementation, a response of the pixel array may correspond to a photo-signal generated under controlled monochromatic radiation. In an implementation, a response of the pixel array corresponds to the photo-signal generated under a plurality of wavelengths of radiation. In an implementation, a response of the pixel array corresponds to the photo-signal generated under a continuous spectrum of wavelengths of radiation.
In an implementation, the endoscope may be a reusable endoscopic device. In an implementation, the endoscope is a limited use endoscopic device. In an implementation, the endoscope is a re-posable use endoscopic device. In an implementation, the endoscope is a single-use endoscopic device.
In an implementation, the system and method 900 may comprise actuating a laser emitter to emit a pulse of a wavelength of electromagnetic radiation to cause illumination within the light deficient environment. In an implementation, the pulse is within a wavelength range that comprises a portion of the electromagnetic spectrum. In an implementation, the system and method 900 may further comprise pulsing the laser emitter at a predetermined interval. In an implementation, the system and method 900 may comprise actuating a pixel array at a sensing interval that corresponds to the pulse interval of said laser emitter.
It will be appreciated that the teachings and principles of the disclosure may be used in a reusable device platform, a limited use device platform, a re-posable use device platform, or a single-use/disposable device platform without departing from the scope of the disclosure. It will be appreciated that in a re-usable device platform an end-user is responsible for cleaning and sterilization of the device. In a limited use device platform the device can be used a specified number of times before becoming inoperable. Typically, a new device is delivered sterile, with subsequent uses requiring the end-user to clean and sterilize the device. In a re-posable use device platform a third party may reprocess (e.g., clean, package, and sterilize) a single-use device for additional uses at a lower cost than a new unit. In a single-use/disposable device platform a device is provided sterile to the operating room and used only once before being disposed of.
Additionally, the teachings and principles of the disclosure may include any and all wavelengths of electromagnetic energy, including the visible and non-visible spectrums, such as infrared (IR), ultraviolet (UV), and X-ray.
It will be appreciated that various features disclosed herein provide significant advantages and advancements in the art. The following embodiments are exemplary of some of those features.
In the foregoing Detailed Description of the Disclosure, various features of the disclosure are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, inventive aspects lie in less than all features of a single foregoing disclosed embodiment.
It is to be understood that the above-described arrangements are only illustrative of the application of the principles of the disclosure. Numerous modifications and alternative arrangements may be devised by those skilled in the art without departing from the spirit and scope of the disclosure and the appended claims are intended to cover such modifications and arrangements.
Thus, while the disclosure has been shown in the drawings and described above with particularity and detail, it will be apparent to those of ordinary skill in the art that numerous modifications, including, but not limited to, variations in size, materials, shape, form, function and manner of operation, assembly and use may be made without departing from the principles and concepts set forth herein.
This application is a continuation of U.S. patent application Ser. No. 15/351,222 filed Nov. 14, 2016, which is a continuation of U.S. patent application Ser. No. 14/214,791 filed Mar. 15, 2014 (now U.S. Pat. No. 9,492,060) and which claims the benefit of U.S. Provisional Application No. 61/791,186, filed Mar. 15, 2013, which are incorporated herein by reference in their entirety, including but not limited to those portions that specifically appear hereinafter, the incorporation by reference being made with the following exception: In the event that any portion of any of the above-referenced applications is inconsistent with this application, this application supersedes said above-referenced applications.
Number | Name | Date | Kind |
---|---|---|---|
4433675 | Konoshima | Feb 1984 | A |
5187572 | Nakamura et al. | Feb 1993 | A |
5196938 | Blessinger | Mar 1993 | A |
5241170 | Field, Jr. et al. | Aug 1993 | A |
5748234 | Lippincott | May 1998 | A |
5757011 | Whitebook et al. | May 1998 | A |
5784099 | Lippincott | Jul 1998 | A |
5868666 | Okada et al. | Feb 1999 | A |
6181830 | Sato | Jan 2001 | B1 |
6272269 | Naum | Aug 2001 | B1 |
6331156 | Haefele et al. | Dec 2001 | B1 |
6485414 | Neuberger | Nov 2002 | B1 |
6690466 | Miller et al. | Feb 2004 | B2 |
6692431 | Kazakevich | Feb 2004 | B2 |
6899675 | Cline et al. | May 2005 | B2 |
6921920 | Kazakevich | Jul 2005 | B2 |
6961461 | MacKinnon et al. | Nov 2005 | B2 |
6999118 | Suzuki | Feb 2006 | B2 |
7037259 | Hakamata et al. | May 2006 | B2 |
7189226 | Auld et al. | Mar 2007 | B2 |
7258663 | Doguchi et al. | Aug 2007 | B2 |
7540645 | Kazakevich | Jun 2009 | B2 |
7544163 | MacKinnon et al. | Jun 2009 | B2 |
7545434 | Bean et al. | Jun 2009 | B2 |
7794394 | Frangioni | Sep 2010 | B2 |
8100826 | MacKinnon et al. | Jan 2012 | B2 |
8675980 | Liege et al. | Mar 2014 | B2 |
8736007 | Wu et al. | May 2014 | B2 |
9107598 | Cheung et al. | Aug 2015 | B2 |
9492060 | Blanquart | Nov 2016 | B2 |
10477127 | Blanquart | Nov 2019 | B2 |
20010030744 | Chang | Oct 2001 | A1 |
20030222997 | Iketani | Dec 2003 | A1 |
20040109488 | Glukhovsky et al. | Jun 2004 | A1 |
20040135209 | Hsieh et al. | Jul 2004 | A1 |
20050234302 | MacKinnon et al. | Oct 2005 | A1 |
20060069314 | Farr | Mar 2006 | A1 |
20080045800 | Farr | Feb 2008 | A2 |
20090012361 | MacKinnon et al. | Jan 2009 | A1 |
20090046180 | Shibano et al. | Feb 2009 | A1 |
20090057559 | Koyama | Mar 2009 | A1 |
20090160976 | Chen et al. | Jun 2009 | A1 |
20090171159 | Jorgensen | Jul 2009 | A1 |
20090225158 | Kimoto | Sep 2009 | A1 |
20090234183 | Abe | Sep 2009 | A1 |
20090237498 | Modell | Sep 2009 | A1 |
20090292168 | Farr | Nov 2009 | A1 |
20100286475 | Robertson | Nov 2010 | A1 |
20110074943 | Modell | Mar 2011 | A1 |
20110181840 | Cobb | Jul 2011 | A1 |
20110237882 | Saito | Sep 2011 | A1 |
20110237884 | Saito | Sep 2011 | A1 |
20110251484 | Carpenter et al. | Oct 2011 | A1 |
20120004508 | McDowall et al. | Jan 2012 | A1 |
20120041267 | Benning et al. | Feb 2012 | A1 |
20120071720 | Banik | Mar 2012 | A1 |
20120078052 | Chenc | Mar 2012 | A1 |
20120262621 | Sato et al. | Oct 2012 | A1 |
20130016347 | Gono | Jan 2013 | A1 |
20130030247 | Kimoto | Jan 2013 | A1 |
20130035549 | Abe | Feb 2013 | A1 |
20130093912 | Uchida | Apr 2013 | A1 |
20130188144 | Makihira | Jul 2013 | A1 |
20140022365 | Yoshino | Jan 2014 | A1 |
20140200406 | Bennett et al. | Jul 2014 | A1 |
20170064231 | Blanquart | Mar 2017 | A1 |
Number | Date | Country |
---|---|---|
2018824 | Jan 2009 | EP |
2018824 | Jul 2010 | EP |
2005-211231 | Aug 2005 | JP |
2012-139446 | Jul 2012 | JP |
Entry |
---|
Computer generated English translation of JP 2006-314616. |
Number | Date | Country | |
---|---|---|---|
20200084400 A1 | Mar 2020 | US |
Number | Date | Country | |
---|---|---|---|
61791186 | Mar 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15351222 | Nov 2016 | US |
Child | 16680291 | US | |
Parent | 14214791 | Mar 2014 | US |
Child | 15351222 | US |