System and method for producing and improving images

Information

  • Patent Grant
  • Patent Number
    8,197,399
  • Date Filed
    Monday, May 21, 2007
  • Date Issued
    Tuesday, June 12, 2012
Abstract
A method for displaying images includes adjusting at least one characteristic of an image from a first imaging device of an endoscope to match at least one corresponding characteristic of an image from a second imaging device of the endoscope. The at least one characteristic may be one or more of color, contrast and brightness. An endoscopic system includes an endoscope including a first imaging device and a second imaging device, and a display device that displays an image from the first imaging device of the endoscope and an image from the second imaging device of the endoscope, wherein the images are sized so that an object, when placed at the same distance from the imaging devices, appears to have about the same size in the images.
Description
TECHNICAL FIELD OF THE INVENTION

The present invention relates to a system and method for producing and improving images.


BACKGROUND OF THE INVENTION

Endoscopic devices with cameras and light sources may be used for medical procedures, inspection of small pipes, and remote monitoring. For example, such an endoscopic device may be a medical endoscope comprising a flexible tube, and a camera and a light source mounted on the distal end of the flexible tube. The endoscope is insertable into an internal body cavity through a body orifice to examine the body cavity and tissues for diagnosis. The tube of the endoscope has one or more longitudinal channels, through which an instrument can reach the body cavity to take samples of suspicious tissues or to perform other surgical procedures such as polypectomy.


There are many types of endoscopes, and they are named in relation to the organs or areas with which they are used. For example, gastroscopes are used for examination and treatment of the esophagus, stomach and duodenum; colonoscopes for the colon; bronchoscopes for the bronchi; laparoscopes for the peritoneal cavity; sigmoidoscopes for the rectum and the sigmoid colon; arthroscopes for joints; cystoscopes for the urinary bladder; and angioscopes for the examination of blood vessels.


A conventional endoscope has a single forward-viewing camera mounted at the distal end of the flexible tube to transmit an image to an eyepiece or video camera at the proximal end. The camera is used to assist a medical professional in advancing the endoscope into a body cavity and looking for abnormalities. The camera provides the medical professional with a two-dimensional view from the distal end of the endoscope. To capture an image from a different angle or of a different area, the endoscope must be repositioned or moved back and forth. Repositioning and movement of the endoscope prolong the procedure and add discomfort, complications, and risk for the patient. Additionally, in an environment such as the lower gastrointestinal tract, flexures, tissue folds and unusual geometries of the organ may prevent the endoscope's camera from viewing all areas of the organ. An unseen area may cause a potentially malignant (cancerous) polyp to be missed.


This problem can be overcome by providing an auxiliary camera and an auxiliary light source. The auxiliary camera and light source can be oriented to face the main camera and light source, thus providing an image of areas not viewable by the endoscope's main camera. This arrangement of cameras and light sources can provide both front and rear views of an area or an abnormality. In the case of polypectomy where a polyp is excised by placing a wire loop around the base of the polyp, the camera arrangement allows better placement of the wire loop to minimize damage to the adjacent healthy tissue.


SUMMARY OF THE INVENTION

The present invention relates to devices and methods for producing and improving video images generated by the imaging devices of endoscopes.


In accordance with one aspect of the invention, a method for displaying images includes adjusting at least one characteristic of an image from a first imaging device of an endoscope to match at least one corresponding characteristic of an image from a second imaging device of the endoscope. The characteristic may be one or more of color, contrast and brightness.


In a preferred embodiment, the adjusting step includes creating a histogram for each of RGB colors for the image from the first imaging device and a histogram for each of the RGB colors for the image from the second imaging device; adjusting the gamut of each histogram of the image from the first imaging device to match the gamut of the corresponding histogram of the image from the second imaging device; and using gamma coefficients to adjust a color level of each histogram of the image from the first imaging device to match a color level of the corresponding histogram of the image from the second imaging device.


In accordance with another aspect of the invention, a method for displaying images includes placing, side by side, an image from a first imaging device of an endoscope and an image from a second imaging device of the endoscope, wherein the imaging devices face each other; and reversing one of the images left for right.


In accordance with still another aspect of the invention, a method for sizing images includes placing an image from a first imaging device of an endoscope and an image from a second imaging device of the endoscope on a display device; and sizing the images so that an object, when placed at the same distance from the imaging devices, appears to have about the same size in the images.


In accordance with yet another aspect of the invention, a method for processing images includes placing image data from first and second imaging devices of an endoscope in one computer file for simultaneous display on a display device. Preferably, the image data from the imaging devices are time-correlated.


In a preferred embodiment, patient information data is also placed in the computer file for simultaneous display with the images on the display device.


In a further preferred embodiment, a time stamp is placed in the computer file for simultaneous display with the images and patient information data on the display device.


In accordance with still yet another aspect of the invention, an endoscopic system includes an endoscope that has a first imaging device and a second imaging device, and a controller that adjusts at least one characteristic of an image from the first imaging device of the endoscope to match at least one corresponding characteristic of an image from the second imaging device of the endoscope. The at least one characteristic may be one or more of color, contrast and brightness.


In a preferred embodiment, the controller creates a histogram for each of RGB colors for the image from the first imaging device and a histogram for each of the RGB colors for the image from the second imaging device; adjusts the gamut of each histogram of the image from the first imaging device to match the gamut of the corresponding histogram of the image from the second imaging device; and uses gamma coefficients to adjust a color level of each histogram of the image from the first imaging device to match a color level of the corresponding histogram of the image from the second imaging device.


In accordance with a further aspect of the invention, an endoscopic system includes an endoscope including a first imaging device and a second imaging device, and a display device that displays, side by side, an image from the first imaging device of the endoscope and an image from the second imaging device of the endoscope, wherein the imaging devices face each other, and wherein one of the images is reversed left for right.


In accordance with a still further aspect of the invention, an endoscopic system includes an endoscope including a first imaging device and a second imaging device, and a display device that displays an image from the first imaging device of the endoscope and an image from the second imaging device of the endoscope, wherein the images are sized so that an object, when placed at the same distance from the imaging devices, appears to have about the same size in the images.


In accordance with a yet further aspect of the invention, an endoscopic system includes an endoscope including a first imaging device and a second imaging device, and a controller that places image data from the first and second imaging devices of the endoscope in one computer file for simultaneous display on a display device. Preferably, the image data from the imaging devices are time-correlated.


In a preferred embodiment, patient information data is also placed in the computer file for simultaneous display with the images on the display device.


In a further preferred embodiment, a time stamp is placed in the computer file for simultaneous display with the images and patient information data on the display device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a perspective view of an endoscope with an imaging assembly according to one embodiment of the present invention.



FIG. 2 shows a perspective view of the distal end of an insertion tube of the endoscope of FIG. 1.



FIG. 3 shows a perspective view of the imaging assembly shown in FIG. 1.



FIG. 4 shows a perspective view of the distal ends of the endoscope and imaging assembly of FIG. 1.



FIG. 5 shows a schematic representation of a display device used with the endoscope of FIG. 1.



FIG. 6 shows a schematic representation of a screen showing two images and patient information.





DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION


FIG. 1 illustrates an exemplary endoscope 10 of the present invention. This endoscope 10 can be used in a variety of medical procedures in which imaging of a body tissue, organ, cavity or lumen is required. The types of procedures include, for example, anoscopy, arthroscopy, bronchoscopy, colonoscopy, cystoscopy, EGD, laparoscopy, and sigmoidoscopy.


The endoscope 10 of FIG. 1 includes an insertion tube 12 and an imaging assembly 14, a section of which is housed inside the insertion tube 12. As shown in FIG. 2, the insertion tube 12 has two longitudinal channels 16. In general, however, the insertion tube 12 may have any number of longitudinal channels. An instrument can reach the body cavity through one of the channels 16 to perform any desired procedures, such as to take samples of suspicious tissues or to perform other surgical procedures such as polypectomy. The instruments may be, for example, a retractable needle for drug injection, hydraulically actuated scissors, clamps, grasping tools, electrocoagulation systems, ultrasound transducers, electrical sensors, heating elements, laser mechanisms and other ablation means. In some embodiments, one of the channels can be used to supply a washing liquid such as water for washing. Another or the same channel may be used to supply a gas, such as CO2 or air into the organ. The channels 16 may also be used to extract fluids or inject fluids, such as a drug in a liquid carrier, into the body. Various biopsy, drug delivery, and other diagnostic and therapeutic devices may also be inserted via the channels 16 to perform specific functions.


The insertion tube 12 preferably is steerable or has a steerable distal end region 18 as shown in FIG. 1. The length of the distal end region 18 may be any suitable fraction of the length of the insertion tube 12, such as one half, one third, one fourth, one sixth, one tenth, or one twentieth. The insertion tube 12 may have control cables (not shown) for the manipulation of the insertion tube 12. Preferably, the control cables are symmetrically positioned within the insertion tube 12 and extend along the length of the insertion tube 12. The control cables may be anchored at or near the distal end 36 of the insertion tube 12. Each of the control cables may be a Bowden cable, which includes a wire contained in a flexible overlying hollow tube. The wires of the Bowden cables are attached to controls 20 in the handle 22. Using the controls 20, the wires can be pulled to bend the distal end region 18 of the insertion tube 12 in a given direction. The Bowden cables can be used to articulate the distal end region 18 of the insertion tube 12 in different directions.


As shown in FIG. 1, the endoscope 10 may also include a control handle 22 connected to the proximal end 24 of the insertion tube 12. Preferably, the control handle 22 has one or more ports and/or valves (not shown) for controlling access to the channels 16 of the insertion tube 12. The ports and/or valves can be air or water valves, suction valves, instrumentation ports, and suction/instrumentation ports. As shown in FIG. 1, the control handle 22 may additionally include buttons 26 for taking pictures with an imaging device on the insertion tube 12, the imaging assembly 14, or both. The proximal end 28 of the control handle 22 may include an accessory outlet 30 (FIG. 1) that provides fluid communication between the air, water and suction channels and the pumps and related accessories. The same outlet 30 or a different outlet can be used for electrical lines to light and imaging components at the distal end of the endoscope 10.


As shown in FIG. 2, the endoscope 10 may further include an imaging device 32 and light sources 34, both of which are disposed at the distal end 36 of the insertion tube 12. The imaging device 32 may include, for example, a lens, single chip sensor, multiple chip sensor or fiber optic implemented devices. The imaging device 32, in electrical communication with a processor and/or monitor, may provide still images or recorded or live video images. The light sources 34 preferably are equidistant from the imaging device 32 to provide even illumination. The intensity of each light source 34 can be adjusted to achieve optimum imaging. The circuits for the imaging device 32 and light sources 34 may be incorporated into a printed circuit board (PCB).


As shown in FIGS. 3 and 4, the imaging assembly 14 may include a tubular body 38, a handle 42 connected to the proximal end 40 of the tubular body 38, an auxiliary imaging device 44, a link 46 that provides physical and/or electrical connection between the auxiliary imaging device 44 and the distal end 48 of the tubular body 38, and an auxiliary light source 50 (FIG. 4). The auxiliary light source 50 may be an LED device.


As shown in FIG. 4, the imaging assembly 14 of the endoscope 10 is used to provide an auxiliary imaging device at the distal end of the insertion tube 12. To this end, the imaging assembly 14 is placed inside one of the channels 16 of the endoscope's insertion tube 12 with its auxiliary imaging device 44 disposed beyond the distal end 36 of the insertion tube 12. This can be accomplished by first inserting the distal end of the imaging assembly 14 into the insertion tube's channel 16 from the endoscope's handle 22 and then pushing the imaging assembly 14 further into the channel 16 until the auxiliary imaging device 44 and link 46 of the imaging assembly 14 are positioned outside the distal end 36 of the insertion tube 12 as shown in FIG. 4.


Each of the main and auxiliary imaging devices 32, 44 may be an electronic device which converts light incident on photosensitive semiconductor elements into electrical signals. The imaging sensor may detect either color or black-and-white images. The signals from the imaging sensor can be digitized and used to reproduce an image that is incident on the imaging sensor. Two commonly used types of image sensors are Charge Coupled Devices (CCD) such as a VCC-5774 produced by Sanyo of Osaka, Japan and Complementary Metal Oxide Semiconductor (CMOS) camera chips such as an OVT 6910 produced by OmniVision of Sunnyvale, Calif. Preferably, the main imaging device 32 is a CCD imaging device, and the auxiliary imaging device 44 is a CMOS imaging device.


When the imaging assembly 14 is properly installed in the insertion tube 12, the auxiliary imaging device 44 of the imaging assembly 14 preferably faces backwards towards the main imaging device 32 as illustrated in FIG. 4. The auxiliary imaging device 44 may be oriented so that the auxiliary imaging device 44 and the main imaging device 32 have adjacent or overlapping viewing areas. Alternatively, the auxiliary imaging device 44 may be oriented so that the auxiliary imaging device 44 and the main imaging device 32 simultaneously provide different views of the same area. Preferably, the auxiliary imaging device 44 provides a retrograde view of the area, while the main imaging device 32 provides a front view of the area. However, the auxiliary imaging device 44 could be oriented in other directions to provide other views, including views that are substantially parallel to the axis of the main imaging device 32.


As shown in FIG. 4, the link 46 connects the auxiliary imaging device 44 to the distal end 48 of the tubular body 38. Preferably, the link 46 is a flexible link that is at least partially made from a flexible shape memory material that substantially tends to return to its original shape after deformation. Shape memory materials are well known and include shape memory alloys and shape memory polymers. A suitable flexible shape memory material is a shape memory alloy such as nitinol. The flexible link 46 is straightened to allow the distal end of the imaging assembly 14 to be inserted into the proximal end of a channel 16 of the insertion tube 12 and then pushed towards the distal end 36 of the insertion tube 12. When the auxiliary imaging device 44 and flexible link 46 are pushed sufficiently out of the distal end 36 of the insertion tube 12, the flexible link 46 resumes its natural bent configuration as shown in FIG. 3. The natural configuration of the flexible link 46 is the configuration of the flexible link 46 when the flexible link 46 is not subject to any force or stress. When the flexible link 46 resumes its natural bent configuration, the auxiliary imaging device 44 faces substantially back towards the distal end 36 of the insertion tube 12 as shown in FIG. 4.


In the illustrated embodiment, the auxiliary light source 50 of the imaging assembly 14 is placed on the flexible link 46, in particular on the curved concave portion of the flexible link 46. The auxiliary light source 50 provides illumination for the auxiliary imaging device 44 and may face substantially the same direction as the auxiliary imaging device 44 as shown in FIG. 4.


The endoscope of the present invention, such as the endoscope 10 shown in FIG. 1, may be part of an endoscope system that may also include a controller 52 and a display device 54, as shown in FIG. 5. In the preferred embodiment shown in FIG. 5, the controller 52 is connected to the main and auxiliary imaging devices 32, 44 to receive image data. The controller 52 may be used to process the image data and transmit the processed image data to the display device 54. The term “controller” as used in this specification is broadly defined. In some embodiments, for example, the controller may simply be a signal processing unit.


In the embodiment shown in FIG. 5, the display device 54 displays, side by side, the image 56 from the main imaging device 32 and the image 58 from the auxiliary imaging device 44. In the present invention, the images may also be displayed on different display devices, and the term “side by side” may simply mean that the two images are positioned so that they can be viewed by the same operator during a medical procedure. The controller 52 preferably incorporates the image data from the main and auxiliary imaging devices 32, 44 into a single signal and sends the signal to the display device 54. In some embodiments, the display device 54 includes a wide screen display with a 16:9 aspect ratio. Preferably, the two images 56, 58 are sized appropriately for display on the wide screen display. For example, the image 56 from the main imaging device 32 may be displayed about 1.5 times larger than the image 58 from the auxiliary imaging device 44. This sizing ratio may also be used to balance the resolution of the two images, as well as to take into account the different aspect ratios of the two images 56, 58. For example, the image 56 from the main imaging device 32 may be displayed with a 1:1 aspect ratio, while the image 58 from the auxiliary imaging device 44 may have a 4:3 aspect ratio. The images 56, 58 may also be sized so that the same object, when placed at the same distance from the imaging devices 32, 44, appears to have about the same size in the images 56, 58. The images 56, 58 shown in FIG. 5 are not drawn to scale.
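As a minimal sketch of the equal-apparent-size criterion (the function name and the pixel measurements are illustrative assumptions, not from the patent), the relative display scale can be derived from how many pixels the same object spans in each raw image when placed at the same distance from both imaging devices:

```python
def display_scale(object_px_main, object_px_aux):
    """Scale factor to apply to the auxiliary image so that an object
    placed at the same distance from both imaging devices appears to
    have about the same size in both displayed images."""
    return object_px_main / object_px_aux

# Hypothetical measurements: a test object at a fixed distance spans
# 90 pixels in the main image but only 60 pixels in the auxiliary image.
scale = display_scale(90, 60)  # enlarge the auxiliary image by 1.5x
```

The same ratio can absorb differences in sensor resolution between the two imaging devices, which is one reason the patent mentions an approximately 1.5x sizing between the two displayed images.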


As shown in FIG. 5, one of the images 56, 58 on the display device 54 may be reversed left for right. With this arrangement, an object 60 that appears on the left side of one image 56 also appears on the left side of the other image 58. Similarly, an object 62 that appears on the right side of one image 56 also appears on the right side of the other image 58. Additionally, when an object moves from the left side of one of the images 56, 58 to the right side, the same object also moves from the left side of the other image 56, 58 to the right side. And if the object in one image appears to rotate clockwise, the object will appear to rotate clockwise in the other image. Furthermore, the movements of the imaging devices 32, 44 appear to be coordinated. This arrangement makes it easier for an operator to observe, identify and correlate the objects and their movements in both images 56, 58.
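Reversing an image left for right is a horizontal flip of each pixel row. A minimal sketch (the helper name and sample pixel values are illustrative):

```python
def mirror_left_right(image):
    """Reverse each row of a raster image (a list of rows of pixel
    values) so that features on the left of the original appear on
    the right of the result, and vice versa."""
    return [row[::-1] for row in image]

frame = [[1, 2, 3],
         [4, 5, 6]]
flipped = mirror_left_right(frame)  # [[3, 2, 1], [6, 5, 4]]
```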


Preferably, the data for the two images 56, 58 and possibly other data 64, such as patient information data or a time stamp, are stored in one computer file. In some cases, the patient information may be associated with one of the two images 56, 58. Preferably, the stored images 56, 58 and possibly other data 64 are time-correlated (i.e., they are captured at the same time). For example, as shown in FIG. 6, the two images 56, 58 and possibly other data 64 may be incorporated into one screen 66 in an image file. In some embodiments, the two images 56, 58 and possibly other data 64 may be captured in one JPEG file.
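The single-file storage can be sketched as follows; the field names and the JSON container are assumptions for illustration only, since the patent does not specify a format beyond, e.g., one JPEG file:

```python
import json
import time

def make_record(main_frame, aux_frame, patient_info):
    """Bundle the two time-correlated images, the patient information
    data, and a time stamp into one serializable record that can be
    written to a single computer file."""
    return {
        "time_stamp": time.strftime("%Y-%m-%d %H:%M:%S"),
        "patient": patient_info,
        "main_image": main_frame,
        "auxiliary_image": aux_frame,
    }

record = make_record([[0, 1]], [[1, 0]], {"id": "P-001", "name": "Doe"})
with open("capture.json", "w") as f:  # one file holds images and data
    json.dump(record, f)
```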


In some preferred embodiments, one or more characteristics of one image 56, 58 may be adjusted to match the same or similar one or more characteristics of the other image 58, 56, so that the images 56, 58 and the objects in the images 56, 58 have similar appearances. The characteristics may include, for example, color, contrast, and brightness. In one example, one or more characteristics of the auxiliary imaging device's image 58 are adjusted to match those of the main imaging device's image 56. Matched images make it easier for an operator to observe, identify and correlate the objects in the images.


In one preferred embodiment, the following technique is used to adjust the characteristics of the auxiliary imaging device's image 58 to match those of the main imaging device's image 56. First, a histogram for each of the RGB colors is created for the auxiliary imaging device's image 58 (called the “current file”). The image used to create the histograms may be an average of the past images, such as the past two to ten images, preferably the past four images. A histogram for each of the RGB colors is likewise created for the main imaging device's image 56 (called the “master file”). This histogram may be the average of the histograms of the past images, such as the histograms of the past two to ten images, preferably the histograms of the past four images.
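The averaging over the past few frames can be sketched as a rolling window over per-frame statistics (the function name and the statistic chosen are illustrative; the patent averages the past images, or their histograms, preferably over four frames):

```python
from collections import deque

def rolling_average(values, window=4):
    """For each incoming value, the average of up to the last `window`
    values; smooths per-frame statistics before histograms are
    compared, so one noisy frame does not swing the adjustment."""
    recent = deque(maxlen=window)
    out = []
    for v in values:
        recent.append(v)
        out.append(sum(recent) / len(recent))
    return out

# Hypothetical mean red level of five successive frames:
smoothed = rolling_average([100, 110, 120, 130, 140])
# [100.0, 105.0, 110.0, 115.0, 125.0]
```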


Second, a minimum and a maximum are determined for each histogram by means of thresholding. Then a clip and a gain are set for each histogram of the auxiliary imaging device's image to equalize its color gamut to that of the corresponding histogram of the main imaging device's image. In particular, the minimum and maximum of each histogram of the auxiliary imaging device's image are adjusted to match those of the corresponding histogram of the main imaging device's image.


Finally, gamma coefficients are used to adjust the color levels of the histograms of the auxiliary imaging device's image to match those of the histograms of the main imaging device's image. The equations for the gamma coefficients are:

red_gamma_color_balance = (current_profile.m_AverageRed * master_average) / (master_profile.m_AverageRed * current_average);
green_gamma_color_balance = (current_profile.m_AverageGreen * master_average) / (master_profile.m_AverageGreen * current_average);
blue_gamma_color_balance = (current_profile.m_AverageBlue * master_average) / (master_profile.m_AverageBlue * current_average);

Gamma coefficients are used because they are simple and convenient, they preserve the black and white points, and the code can be reused for conventional gamma correction.
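The three steps above can be sketched end to end. Only the gamma-coefficient equations come directly from the patent; the helper names, the thresholding rule, and the clip-and-gain arithmetic are illustrative assumptions:

```python
def histogram(channel, bins=256):
    """Count of pixels at each level for one color channel."""
    h = [0] * bins
    for v in channel:
        h[v] += 1
    return h

def gamut(hist, threshold=0):
    """(min, max) populated levels of a histogram, ignoring bins whose
    count does not exceed the threshold (step two's thresholding)."""
    levels = [i for i, n in enumerate(hist) if n > threshold]
    return levels[0], levels[-1]

def clip_and_gain(channel, cur_lo, cur_hi, mas_lo, mas_hi):
    """Remap the current image's [cur_lo, cur_hi] range onto the master
    image's [mas_lo, mas_hi] range, equalizing the color gamut."""
    gain = (mas_hi - mas_lo) / max(cur_hi - cur_lo, 1)
    return [round(mas_lo + (min(max(v, cur_lo), cur_hi) - cur_lo) * gain)
            for v in channel]

def gamma_color_balance(cur_avg_channel, mas_avg_channel,
                        cur_avg, mas_avg):
    """Per-channel gamma coefficient per the patent's equations, e.g.
    red: (current_profile.m_AverageRed * master_average) /
         (master_profile.m_AverageRed * current_average)."""
    return (cur_avg_channel * mas_avg) / (mas_avg_channel * cur_avg)

# Hypothetical red channel of the auxiliary (current) image compressed
# into [10, 90]; the main (master) image spans the full [0, 255].
red = [10, 50, 90]
lo, hi = gamut(histogram(red))
matched = clip_and_gain(red, lo, hi, 0, 255)  # [0, 128, 255]
```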


Additional processing of the images, such as sharpening, frame averaging, and noise reduction, may be performed.


The images described above may be still pictures or continuous video images (such as television images). When the images are video images, in the embodiment of the invention in which one or more characteristics of one image are adjusted to match those of another image, the characteristics are adjusted continuously in real time (i.e., dynamically). For example, the characteristics of the video image may be adjusted for every frame of the image. The reason for real time adjustment is that the video images are changing constantly as the lighting, object distance or tissue color varies.


The implementation of the above-described features may be performed digitally by software or firmware in the controller. Alternatively, the image manipulation can be performed by hardware image chipsets, FPGAs or other electrical circuitry. These image manipulation techniques are well known in the field of graphics and video processing and will not be described in detail.


Although in the preferred embodiments described above, the images are from the main and auxiliary imaging devices of the same endoscope, the images may also come from imaging devices of different endoscopes such as laparoscopes. For example, when two laparoscopes are used during a procedure, the images from the laparoscopes may have different characteristics due to, for example, different imaging device types, different manufacturing techniques, or differences in lighting sensitivities. The controller that receives images from the laparoscopes may designate any one of the images as a master and then match the second image to the master image. In this way the operator is able to conduct a procedure with consistent visualization across the laparoscopes. Additionally, the present invention may be used with three or more images from two or more endoscopes. For example, two images may be adjusted to match a third image.


While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that changes and modifications can be made without departing from this invention in its broader aspects. Therefore, the appended claims are to encompass within their scope all such changes and modifications as fall within the true spirit and scope of this invention.

Claims
  • 1. A method for matching image characteristics, comprising: acquiring a first image from a first imaging sensor of an endoscope;acquiring a second image from a second imaging sensor, wherein the second imaging sensor faces the first imaging sensor; andusing a controller that has been pre-programmed with an algorithm to dynamically adjust images in real-time to adjust at least one characteristic of the second image to match at least one corresponding characteristic of the first image.
  • 2. The method of claim 1, wherein the characteristic is color.
  • 3. The method of claim 1, wherein the characteristic is contrast.
  • 4. The method of claim 1, wherein the characteristic is brightness.
  • 5. The method of claim 1, wherein the at least one characteristic includes first and second characteristics.
  • 6. The method of claim 5, wherein the first and second characteristics are color and contrast.
  • 7. The method of claim 5, wherein the first and second characteristics are color and brightness.
  • 8. The method of claim 5, wherein the first and second characteristics are contrast and brightness.
  • 9. The method of claim 5, wherein the at least one characteristic includes first, second and third characteristics.
  • 10. The method of claim 9, wherein the first, second and third characteristics are color, contrast and brightness.
  • 11. The method of claim 1, wherein the adjustment is performed continuously in real time.
  • 12. The method of claim 1, wherein the algorithm includes: creating a histogram for each of RGB colors for the first image and a histogram for each of the RGB colors for the second image;adjusting the gamut of each histogram of the second image to match the gamut of the corresponding histogram of the first image; andusing gamma coefficients to adjust a color level of each histogram of the second image to match a color level of the corresponding histogram of the first image.
  • 13. An endoscopic system comprising: a first imaging sensor of an endoscope;a second imaging sensor, wherein the second imaging sensor faces the first imaging sensor; anda controller that has been pre-programmed with an algorithm to dynamically adjust, in real-time, at least one characteristic of an image from the second imaging sensor to match at least one corresponding characteristic of an image from the first imaging sensor of the endoscope.
  • 14. The system of claim 13, wherein the characteristic is color.
  • 15. The system of claim 13, wherein the characteristic is contrast.
  • 16. The system of claim 13, wherein the characteristic is brightness.
  • 17. The system of claim 13, wherein the at least one characteristic includes first and second characteristics.
  • 18. The system of claim 17, wherein the first and second characteristics are color and contrast.
  • 19. The system of claim 17, wherein the first and second characteristics are color and brightness.
  • 20. The system of claim 17, wherein the first and second characteristics are contrast and brightness.
  • 21. The system of claim 17, wherein the at least one characteristic includes first, second and third characteristics.
  • 22. The system of claim 21, wherein the first, second and third characteristics are color, contrast and brightness.
  • 23. The system of claim 13, wherein the controller algorithm includes: creating a histogram for each of RGB colors for the image from the first imaging sensor and a histogram for each of the RGB colors for the image from the second imaging sensor;adjusting the gamut of each histogram of the image from the second imaging sensor to match the gamut of the corresponding histogram of the image from the first imaging sensor; andusing gamma coefficients to adjust a color level of each histogram of the image from the second imaging sensor to match a color level of the corresponding histogram of the image from the first imaging sensor.
Parent Case Info

This application claims the benefit of U.S. Provisional Patent Application No. 60/801,748, filed May 19, 2006, the entire disclosure of which is incorporated herein by reference.

5536236 Yabe et al. Jul 1996 A
5556367 Yabe et al. Sep 1996 A
5613936 Czarnek et al. Mar 1997 A
5614943 Nakamura et al. Mar 1997 A
5626553 Frassica et al. May 1997 A
5634466 Gruner Jun 1997 A
5653677 Okada et al. Aug 1997 A
5667476 Frassica et al. Sep 1997 A
5679216 Takayama et al. Oct 1997 A
5681260 Ueda et al. Oct 1997 A
5682199 Lankford Oct 1997 A
5685822 Harhen Nov 1997 A
5692729 Harhen Dec 1997 A
5696850 Parulski et al. Dec 1997 A
5702348 Harhen Dec 1997 A
5706128 Greenberg Jan 1998 A
5711299 Manwaring et al. Jan 1998 A
5722933 Yabe et al. Mar 1998 A
5752912 Takahashi et al. May 1998 A
5762603 Thompson Jun 1998 A
5817061 Goodwin et al. Oct 1998 A
5827177 Oneda et al. Oct 1998 A
5833603 Kovacs et al. Nov 1998 A
5843103 Wulfman Dec 1998 A
5843460 Labigne et al. Dec 1998 A
5860914 Chiba et al. Jan 1999 A
5876329 Harhen Mar 1999 A
5916147 Boury Jun 1999 A
5924977 Yabe et al. Jul 1999 A
5938587 Taylor et al. Aug 1999 A
5982932 Prokoski Nov 1999 A
5989182 Hori et al. Nov 1999 A
5989224 Exline et al. Nov 1999 A
6017358 Yoon Jan 2000 A
6026323 Skladnev et al. Feb 2000 A
6066090 Yoon May 2000 A
6099464 Shimizu et al. Aug 2000 A
6099466 Sano et al. Aug 2000 A
6099485 Patterson Aug 2000 A
6106463 Wilk Aug 2000 A
6174280 Oneda et al. Jan 2001 B1
6190330 Harhen Feb 2001 B1
6214028 Yoon et al. Apr 2001 B1
6261226 McKenna et al. Jul 2001 B1
6261307 Yoon et al. Jul 2001 B1
6277064 Yoon Aug 2001 B1
6296608 Daniels et al. Oct 2001 B1
6301047 Hoshino et al. Oct 2001 B1
6350231 Ailinger et al. Feb 2002 B1
6369855 Chauvel et al. Apr 2002 B1
6375653 Desai Apr 2002 B1
6387043 Yoon May 2002 B1
6433492 Buonavita Aug 2002 B1
6456684 Mun et al. Sep 2002 B1
6461294 Oneda et al. Oct 2002 B1
6482149 Torii Nov 2002 B1
6527704 Chang et al. Mar 2003 B1
6547724 Soble et al. Apr 2003 B1
6554767 Tanaka Apr 2003 B2
6564088 Soller et al. May 2003 B1
6640017 Tsai et al. Oct 2003 B1
6648816 Irion et al. Nov 2003 B2
6683716 Costales Jan 2004 B1
6687010 Horii et al. Feb 2004 B1
6697536 Yamada Feb 2004 B1
6699180 Kobayashi Mar 2004 B2
6736773 Wendlandt et al. May 2004 B2
6748975 Hartshorne et al. Jun 2004 B2
6796939 Konomura et al. Sep 2004 B1
6833871 Merrill et al. Dec 2004 B1
6845190 Smithwick et al. Jan 2005 B1
6891977 Gallagher May 2005 B2
6916286 Kazakevich Jul 2005 B2
6928314 Johnson et al. Aug 2005 B1
6929636 von Alten Aug 2005 B1
6947784 Zalis Sep 2005 B2
6951536 Yokoi et al. Oct 2005 B2
6965702 Gallagher Nov 2005 B2
6966906 Brown Nov 2005 B2
6974411 Belson Dec 2005 B2
6997871 Sonnenschein et al. Feb 2006 B2
7004900 Wendlandt et al. Feb 2006 B2
7029435 Nakao Apr 2006 B2
7041050 Ronald May 2006 B1
7095548 Cho et al. Aug 2006 B1
7103228 Kraft et al. Sep 2006 B2
7116352 Yaron Oct 2006 B2
7173656 Dunton et al. Feb 2007 B1
7228004 Gallagher et al. Jun 2007 B2
7280141 Frank et al. Oct 2007 B1
7317458 Wada Jan 2008 B2
7322934 Miyake et al. Jan 2008 B2
7341555 Ootawara et al. Mar 2008 B2
7362911 Frank Apr 2008 B1
7405877 Schechterman Jul 2008 B1
7435218 Krattiger et al. Oct 2008 B2
7436562 Nagasawa et al. Oct 2008 B2
7507200 Okada Mar 2009 B2
7551196 Ono et al. Jun 2009 B2
7556599 Rovegno Jul 2009 B2
7561190 Deng et al. Jul 2009 B2
7621869 Ratnakar Nov 2009 B2
7646520 Funaki et al. Jan 2010 B2
7678043 Gilad Mar 2010 B2
7683926 Schechterman et al. Mar 2010 B2
7749156 Ouchi Jul 2010 B2
7825964 Hoshino et al. Nov 2010 B2
7864215 Carlsson et al. Jan 2011 B2
7910295 Hoon et al. Mar 2011 B2
7927272 Bayer et al. Apr 2011 B2
8009167 Dekel et al. Aug 2011 B2
8064666 Bayer Nov 2011 B2
8070743 Kagan et al. Dec 2011 B2
20010007468 Sugimoto et al. Jul 2001 A1
20010037052 Higuchi et al. Nov 2001 A1
20010051766 Gazdzinski Dec 2001 A1
20010056238 Tsujita Dec 2001 A1
20020026188 Balbierz et al. Feb 2002 A1
20020039400 Kaufman et al. Apr 2002 A1
20020089584 Abe Jul 2002 A1
20020095168 Griego et al. Jul 2002 A1
20020099267 Wendlandt et al. Jul 2002 A1
20020101546 Sharp et al. Aug 2002 A1
20020110282 Kraft et al. Aug 2002 A1
20020115908 Farkas et al. Aug 2002 A1
20020156347 Kim et al. Oct 2002 A1
20020193662 Belson Dec 2002 A1
20030004399 Belson Jan 2003 A1
20030011768 Jung et al. Jan 2003 A1
20030032863 Kazakevich Feb 2003 A1
20030040668 Kaneko et al. Feb 2003 A1
20030045778 Ohline et al. Mar 2003 A1
20030065250 Chiel et al. Apr 2003 A1
20030088152 Takada May 2003 A1
20030093031 Long et al. May 2003 A1
20030093088 Long et al. May 2003 A1
20030103199 Jung et al. Jun 2003 A1
20030105386 Voloshin et al. Jun 2003 A1
20030120130 Glukhovsky Jun 2003 A1
20030125630 Furnish Jul 2003 A1
20030125788 Long Jul 2003 A1
20030130711 Pearson et al. Jul 2003 A1
20030153866 Long et al. Aug 2003 A1
20030161545 Gallagher Aug 2003 A1
20030167007 Belson Sep 2003 A1
20030171650 Tartaglia et al. Sep 2003 A1
20030176767 Long et al. Sep 2003 A1
20030179302 Harada et al. Sep 2003 A1
20030187326 Chang Oct 2003 A1
20030195545 Hermann et al. Oct 2003 A1
20030197781 Sugimoto et al. Oct 2003 A1
20030197793 Mitsunaga et al. Oct 2003 A1
20030225433 Nakao Dec 2003 A1
20030233115 Eversull et al. Dec 2003 A1
20040023397 Vig et al. Feb 2004 A1
20040034278 Adams Feb 2004 A1
20040049096 Adams Mar 2004 A1
20040059191 Krupa et al. Mar 2004 A1
20040080613 Moriyama Apr 2004 A1
20040097790 Farkas et al. May 2004 A1
20040109164 Horii et al. Jun 2004 A1
20040111019 Long Jun 2004 A1
20040122291 Takahashi Jun 2004 A1
20040141054 Mochida et al. Jul 2004 A1
20040158124 Okada Aug 2004 A1
20040207618 Williams et al. Oct 2004 A1
20040242987 Liew et al. Dec 2004 A1
20050010084 Tsai Jan 2005 A1
20050014996 Konomura et al. Jan 2005 A1
20050020918 Wilk et al. Jan 2005 A1
20050020926 Wiklof et al. Jan 2005 A1
20050038317 Ratnakar Feb 2005 A1
20050038319 Goldwasser et al. Feb 2005 A1
20050068431 Mori Mar 2005 A1
20050085693 Belson et al. Apr 2005 A1
20050085790 Guest et al. Apr 2005 A1
20050096502 Khalili May 2005 A1
20050154278 Cabiri et al. Jul 2005 A1
20050165272 Okada et al. Jul 2005 A1
20050165279 Adler et al. Jul 2005 A1
20050177024 Mackin Aug 2005 A1
20050203420 Kleen et al. Sep 2005 A1
20050215911 Alfano et al. Sep 2005 A1
20050222500 Itoi Oct 2005 A1
20050228224 Okada et al. Oct 2005 A1
20050267361 Younker et al. Dec 2005 A1
20050272975 McWeeney et al. Dec 2005 A1
20050272977 Saadat et al. Dec 2005 A1
20060044267 Xie et al. Mar 2006 A1
20060052709 DeBaryshe et al. Mar 2006 A1
20060058584 Hirata Mar 2006 A1
20060106286 Wendlandt et al. May 2006 A1
20060149127 Seddiqui et al. Jul 2006 A1
20060149129 Watts et al. Jul 2006 A1
20060183975 Saadat et al. Aug 2006 A1
20060217594 Ferguson Sep 2006 A1
20060279632 Anderson Dec 2006 A1
20060285766 Ali Dec 2006 A1
20060293562 Uchimura et al. Dec 2006 A1
20070015967 Boulais et al. Jan 2007 A1
20070015989 Desai et al. Jan 2007 A1
20070083081 Schlagenhauf et al. Apr 2007 A1
20070103460 Zhang et al. May 2007 A1
20070142711 Bayer et al. Jun 2007 A1
20070173686 Lin et al. Jul 2007 A1
20070177008 Bayer et al. Aug 2007 A1
20070177009 Bayer et al. Aug 2007 A1
20070183685 Wada et al. Aug 2007 A1
20070185384 Bayer et al. Aug 2007 A1
20070225552 Segawa et al. Sep 2007 A1
20070225734 Bell et al. Sep 2007 A1
20070238927 Ueno et al. Oct 2007 A1
20070244354 Bayer Oct 2007 A1
20070279486 Bayer et al. Dec 2007 A1
20070280669 Karim Dec 2007 A1
20070293720 Bayer Dec 2007 A1
20080021269 Tinkham et al. Jan 2008 A1
20080021274 Bayer et al. Jan 2008 A1
20080033450 Bayer et al. Feb 2008 A1
20080039693 Karasawa Feb 2008 A1
20080064931 Schena et al. Mar 2008 A1
20080065110 Duval et al. Mar 2008 A1
20080071291 Duval et al. Mar 2008 A1
20080079827 Hoshino et al. Apr 2008 A1
20080097292 Cabiri et al. Apr 2008 A1
20080114288 Whayne et al. May 2008 A1
20080130108 Bayer et al. Jun 2008 A1
20080154288 Belson Jun 2008 A1
20080199829 Paley et al. Aug 2008 A1
20080275298 Ratnakar Nov 2008 A1
20090015842 Leitgeb et al. Jan 2009 A1
20090023998 Ratnakar Jan 2009 A1
20090036739 Hadani Feb 2009 A1
20090049627 Kritzler Feb 2009 A1
20090082629 Dotan et al. Mar 2009 A1
20090105538 Van Dam et al. Apr 2009 A1
20090137867 Goto May 2009 A1
20090213211 Bayer et al. Aug 2009 A1
20090231419 Bayer et al. Sep 2009 A1
20100217076 Ratnakar Aug 2010 A1
20110160535 Bayer et al. Jun 2011 A1
20110213206 Boutillette et al. Sep 2011 A1
Foreign Referenced Citations (47)
Number Date Country
1628603 Jun 2005 CN
19626433 Jan 1998 DE
20 2006 017 173 Mar 2007 DE
0 586 162 Mar 1994 EP
1 570 778 Sep 2005 EP
1 769 720 Apr 2007 EP
711 949 Sep 1931 FR
49-130235 Dec 1974 JP
56-9712 Jan 1981 JP
62-094312 Jun 1987 JP
63-309912 Dec 1988 JP
3-159629 Jul 1991 JP
5-341210 Dec 1993 JP
6-130308 May 1994 JP
7-352 Jan 1995 JP
7-354 Jan 1995 JP
7-021001 Apr 1995 JP
8-206061 Aug 1996 JP
7-136108 May 1998 JP
11-76150 Mar 1999 JP
WO 9315648 Aug 1993 WO
WO-9917542 Apr 1999 WO
WO-9930506 Jun 1999 WO
WO-02085194 Oct 2002 WO
WO-02084105 Nov 2002 WO
WO-02094105 Nov 2002 WO
WO-2006073676 Jul 2006 WO
WO-2006073725 Jul 2006 WO
WO-2006110275 Oct 2006 WO
WO-2007015241 Feb 2007 WO
WO-2007070644 Jun 2007 WO
WO-2007087421 Aug 2007 WO
WO-2007092533 Aug 2007 WO
WO-2007092636 Aug 2007 WO
WO-2007136859 Nov 2007 WO
WO-2007136879 Nov 2007 WO
WO-2009015396 Jan 2009 WO
WO-2009014895 Jan 2009 WO
WO-2009049322 Apr 2009 WO
WO-2009062179 May 2009 WO
Related Publications (1)
Number Date Country
20070270642 A1 Nov 2007 US
Provisional Applications (1)
Number Date Country
60801748 May 2006 US