Method and device for examining or imaging an interior surface of a cavity

Information

  • Patent Number
    9,044,185
  • Date Filed
    Monday, August 13, 2012
  • Date Issued
    Tuesday, June 2, 2015
Abstract
A method for examining an interior surface of a cavity includes the steps of capturing partial images of an interior surface of a cavity; joining the captured partial images to form a complete image of said interior surface of the cavity; and providing a warning if the joined partial images do not form a complete image of said interior surface of the cavity.
Description
TECHNICAL FIELD OF THE INVENTION

The present invention relates to a method and device for examining or imaging an interior surface of a cavity such as a colon.


BACKGROUND OF THE INVENTION

A scope is often used to view and examine the interior of a cavity. An endoscope is a medical device comprising a flexible tube, which is insertable into an internal body cavity through a body orifice to examine the body cavity and tissues for diagnosis. An endoscope may include a camera and a light source mounted on the distal end of its flexible tube to allow visualization of the internal environment of the body cavity. The tube of the endoscope has one or more longitudinal channels, through which an instrument can reach the body cavity to take samples of suspicious tissues or to perform other surgical procedures such as polypectomy.


To insert an endoscope into an internal body cavity, a physician advances the endoscope's flexible tube into the body cavity with the distal end of the flexible tube at the front. The physician may steer the flexible tube to follow the cavity's contour by controlling a bendable distal end portion of the flexible tube. After the endoscope is advanced to the end of the colon, the physician begins to retract the endoscope and visually scans the colon for abnormalities as the endoscope is retracted.


It is important for the physician to examine all areas of the colon where abnormalities may occur. Failure to do so may have grave consequences. However, it is difficult for the physician to simultaneously focus on examining the colon and keep track of the areas that have not been examined (or the areas of the colon that have been examined). Therefore, it is desirable to have a device or method that assists the physician in keeping track of the unexamined areas of the colon (or the examined areas).


Additionally, to ensure a careful examination of the colon, it is desirable to monitor the amount of time the physician spends examining an area of the colon, and to warn the physician if she spends insufficient time examining the area.


SUMMARY OF THE INVENTION

According to one embodiment of the present invention, an endoscope may be used to examine or image an interior surface of a cavity such as a colon. To examine (or image) a colon, for example, an operator such as a physician may first advance the endoscope to the end of the colon or to a point beyond an area of the colon to be examined. Then the operator may retract the endoscope and start examining the colon by viewing the partial images of the colon captured by the imaging device of the endoscope. The partial images captured by the imaging device are relayed to a video processing device that joins the partial images to generate a two-dimensional image of the colon's interior surface. If the video processing device cannot generate a single complete view of the colon's interior surface (i.e., an area of the colon is missing from the single view), it emits a warning signal, which communicates to the physician that an area of the colon's interior surface has been missed. The physician can then move the imaging device to the missing area and capture one or more additional images. The video processing device can then integrate the additional images into the single image of the colon's interior surface. At the end of the procedure, the processing device has created a complete two-dimensional image of the colon's interior surface.


According to another embodiment of the invention, the video processing device can calculate the scanning speed and/or the total amount of time that the imaging device spends in a segment of the colon such as the ascending or transverse portion of the colon. This information can also be used to warn the physician of a potentially hasty examination. Those and other embodiments of the present invention overcome the disadvantages associated with the prior art.


The following is a more detailed description of some features of the present invention's embodiments. According to one aspect of the invention, a method for examining or imaging an interior surface of a cavity includes the steps of capturing partial images of an interior surface of a cavity; joining the captured partial images to form a complete image of said interior surface of the cavity; and providing a warning if the joined partial images do not form a complete image of said interior surface of the cavity.


In one preferred embodiment, the step of capturing partial images includes the steps of storing the captured partial images; and recording a sequence in which the partial images were captured.


In another preferred embodiment, the cavity is a tubular cavity and each partial image is a partial image of said interior surface of the tubular cavity. And the step of joining the captured partial images includes flattening the partial images of the interior surface of the tubular cavity; and joining the flattened partial images to form a complete flat image of said interior surface of the tubular cavity.


In still another preferred embodiment, the step of flattening each partial image includes outlining the lumen of the tubular cavity in said partial image by analyzing said partial image for the difference in contrast between the lumen of the tubular cavity and said interior surface of the tubular cavity; and excising the lumen from said partial image.


In yet another preferred embodiment, the step of flattening each partial image includes excising an outer edge of the tubular cavity in said partial image.


In still yet another preferred embodiment, the excised outer edge of said interior surface of the tubular cavity is larger than, but similar in shape to, the excised lumen.


In another preferred embodiment, the tubular cavity is a colon, and the excised outer edge of the interior surface of the colon is an outline of a haustral fold of the colon.


In a further preferred embodiment, the step of flattening each partial image includes flattening the excised partial image to create a rectangular image.


In a still further preferred embodiment, the step of flattening the excised partial image to create a rectangular image includes straightening each of the inner and outer edges of said interior surface of the tubular cavity into a substantially straight line.


In a yet further preferred embodiment, the step of joining the captured partial images includes identifying similar regions or corresponding key points between any two images.


In a yet still further preferred embodiment, the step of joining the captured partial images includes calculating a suitable transformation matrix which brings any two images together such that the key points or similar regions overlap.


In another preferred embodiment, the step of joining the captured partial images includes joining the two images by meshing or overlapping the images as dictated by the transformation matrix.


In still another preferred embodiment, the method further includes capturing one or more additional partial images of a missing area in the image of the interior surface of the cavity if the joined partial images do not form a complete image of said interior surface of the cavity; joining the one or more additional partial images with the incomplete image of said interior surface of the cavity to form a complete image of said interior surface of the cavity; and providing a warning if the joined partial images still do not form a complete image of said interior surface of the cavity.


In yet another preferred embodiment, the method further includes providing direction to an operator to reach the missing area.


In yet still another preferred embodiment, the step of providing direction includes using an on-screen navigation cue to direct an operator to the missing area.


In a further preferred embodiment, the on-screen navigation cue includes an arrow and the missing area, both of which are displayed on a screen.


In a further preferred embodiment, the method further includes calculating a scanning speed.


In a still further preferred embodiment, the step of calculating the scanning speed includes identifying similar regions or corresponding key points between any two images; calculating a distance by which a key point or corresponding area has moved from the earlier of the two images to the later of the two images; and calculating the scanning speed by dividing the distance by the time elapsed between the two images.
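The speed computation described in this embodiment can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the function names (`scanning_speed`, `too_fast`) and the optional `mm_per_pixel` calibration factor are hypothetical, and a real system would average over many tracked key points rather than one.

```python
import math

def scanning_speed(p_prev, p_curr, t_prev, t_curr, mm_per_pixel=None):
    """Estimate scanning speed from one tracked key point.

    p_prev, p_curr: (x, y) pixel coordinates of the same key point in the
    earlier and later image; t_prev, t_curr: capture times in seconds.
    Returns speed in pixels/s, or mm/s if a calibration factor is given.
    """
    if t_curr <= t_prev:
        raise ValueError("images must be in capture order")
    # Distance the key point moved between the two images, in pixels.
    d = math.hypot(p_curr[0] - p_prev[0], p_curr[1] - p_prev[1])
    if mm_per_pixel is not None:
        d *= mm_per_pixel
    # Speed = distance divided by the time elapsed between the images.
    return d / (t_curr - t_prev)

def too_fast(speeds, limit):
    """Warn (return True) if the average scanning speed exceeds a given value."""
    return sum(speeds) / len(speeds) > limit
```

A key point that moves 50 pixels in 2 seconds yields a speed of 25 pixels/s, which is then compared against the warning threshold.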


In a yet further preferred embodiment, the step of calculating the distance includes counting the number of image pixels by which the key point or corresponding area has moved.


In another preferred embodiment, the method further includes providing a warning if the scanning speed is greater than a given value.


In still another preferred embodiment, the method further includes calculating an amount of time spent on examining a region of said interior surface of the cavity.


In yet another preferred embodiment, the method further includes recognizing known features of said interior surface of the cavity to determine the region being examined.


In a further preferred embodiment, the method further includes providing a warning if the amount of time spent on examining the region is less than a given value.


According to another aspect of the invention, a method for examining or imaging an interior surface of a cavity includes capturing partial images of an interior surface of a cavity; joining the captured partial images to form a complete image of said interior surface of the cavity; capturing one or more additional partial images of a missing area in the image of said interior surface of the cavity if the joined partial images do not form a complete image of said interior surface of the cavity; and joining the one or more additional partial images with the incomplete image of said interior surface of the cavity to form a complete image of said interior surface of the cavity.


In a preferred embodiment, the method further includes providing direction to an operator to reach the missing area.


In another preferred embodiment, the step of providing direction includes using an on-screen navigation cue to direct an operator to the missing area.


In still another preferred embodiment, the on-screen navigation cue includes an arrow and the missing area, both of which are displayed on a screen.


According to still another aspect of the invention, a method for examining or imaging an interior surface of a colon includes capturing partial images of an interior surface of a colon; and joining the captured partial images to form a complete image of said interior surface of the colon.


In a preferred embodiment, each partial image is a partial image of said interior surface of the colon, and the step of joining the captured partial images includes flattening the partial images of the interior surface of the colon; and joining the flattened partial images to form a complete flat image of said interior surface of the colon.


In another preferred embodiment, the step of flattening each partial image includes outlining the lumen of the colon in said partial image by analyzing said partial image for the difference in contrast between the lumen of the colon and said interior surface of the colon; and excising the lumen from said partial image.


In still another preferred embodiment, the step of flattening each partial image includes excising an outer edge of the colon in said partial image.


In yet another preferred embodiment, the excised outer edge of said interior surface of the colon is larger than, but similar in shape to, the excised lumen.


In a further preferred embodiment, the excised outer edge of the interior surface of the colon is an outline of a haustral fold of the colon.


In a still further preferred embodiment, the step of flattening each partial image includes flattening the excised partial image to create a rectangular image.


In a yet further preferred embodiment, the step of flattening the excised partial image to create a rectangular image includes straightening each of the inner and outer edges of said interior surface of the colon into a substantially straight line.


In a yet still further preferred embodiment, the step of joining the captured partial images includes identifying similar regions or corresponding key points between any two images.


In another preferred embodiment, the step of joining the captured partial images includes calculating a suitable transformation matrix which brings the two images together such that the key points or similar regions overlap.


In still another preferred embodiment, the step of joining the captured partial images includes joining the two images by meshing or overlapping the images as dictated by the transformation matrix.


In yet another preferred embodiment, the method further includes providing a warning if the joined partial images do not form a complete image of said interior surface of the colon.


In yet still another preferred embodiment, the method further includes capturing one or more additional partial images of a missing area in the image of the interior surface of the colon if the joined partial images do not form a complete image of said interior surface of the colon; joining the one or more additional partial images with the incomplete image of said interior surface of the colon to form a complete image of said interior surface of the colon; and providing a warning if the joined partial images still do not form a complete image of said interior surface of the colon.




According to yet another aspect of the invention, a method for examining or imaging an interior surface of a colon includes visually scanning an interior surface of a colon; calculating a scanning speed; and providing a warning if the scanning speed is greater than a given value.


In another preferred embodiment, the step of calculating the scanning speed includes capturing partial images of said interior surface of a colon; identifying similar regions or corresponding key points between any two images; calculating a distance by which a key point or corresponding area has moved from the earlier of the two images to the later of the two images; and calculating the scanning speed by dividing the distance by the time elapsed between the two images.


In still another preferred embodiment, the step of calculating the distance includes counting the number of image pixels by which the key point or corresponding area has moved.


According to a further aspect of the invention, a method for examining an interior surface of a colon includes visually scanning an interior surface of a colon; and calculating an amount of time spent on examining a region of said interior surface of the colon.


In another preferred embodiment, the method further includes recognizing known features of said interior surface of the colon to determine the region being examined.


In a further preferred embodiment, the method further includes providing a warning if the amount of time spent on examining the region is less than a given value.


According to a further aspect of the invention, a device for examining or imaging an interior surface of a cavity includes an element for capturing partial images of an interior surface of a cavity; an element for joining the captured partial images to form a complete image of said interior surface of the cavity; and an element for providing a warning if the joined partial images do not form a complete image of said interior surface of the cavity.


According to a still further aspect of the invention, a device for examining or imaging an interior surface of a cavity includes an element for capturing partial images of an interior surface of a cavity; an element for joining the captured partial images to form a complete image of said interior surface of the cavity; an element for capturing one or more additional partial images of a missing area in the image of said interior surface of the cavity if the joined partial images do not form a complete image of said interior surface of the cavity; and an element for joining the one or more additional partial images with the incomplete image of said interior surface of the cavity to form a complete image of said interior surface of the cavity.


According to a yet further aspect of the invention, a device for examining or imaging an interior surface of a colon includes an element for capturing partial images of an interior surface of a colon; and an element for joining the captured partial images to form a complete image of said interior surface of the colon.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a perspective view of an endoscope that can be used with the present invention.



FIG. 2 shows a perspective view of the distal end of an insertion tube of the endoscope of FIG. 1.



FIGS. 3a to 3g show an example of image transformation.



FIGS. 4a to 4g show another example of image transformation.



FIG. 5 shows a diagram illustrating the joining of images.



FIG. 6 shows an on-screen cue for directing an operator to a missing area of a joined image.





DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION

According to one embodiment of the present invention, an endoscope may be used to examine or image an interior surface of a cavity such as a colon. To examine (or image) a colon, for example, an operator such as a physician may first advance the endoscope to the end of the colon or to a point beyond an area of the colon to be examined. Then the operator may retract the endoscope and start examining the colon by viewing the partial images of the colon captured by the imaging device of the endoscope. The partial images captured by the imaging device are relayed to a video processing device that joins, either in real time or subsequent to a colon examination, the partial images to generate a complete two-dimensional image of the colon's interior surface. If the video processing device cannot generate a complete view of the colon's interior surface, it emits a warning signal, which communicates to the physician that an area of the colon's interior surface has been missed. The physician can then move the imaging device to the missing area and capture one or more additional images. The video processing device can then integrate the additional images into the two-dimensional image of the colon's interior surface. At the end of the procedure, the processing device has created a complete two-dimensional image of the colon's interior surface.
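One way to think about the completeness check and the missing-area warning is as a coverage mask over the flattened surface map. The following is a minimal sketch under that assumption; the names (`find_missing_areas`, `navigation_cue`) are hypothetical and a boolean grid stands in for the real joined-image geometry:

```python
def find_missing_areas(coverage):
    """Return coordinates of surface-map cells not yet covered by any
    joined partial image.

    `coverage` is a 2-D list of booleans: True where some partial image
    contributed, False where the joined image is still incomplete.
    An empty result means a complete image was formed (no warning).
    """
    return [(r, c)
            for r, row in enumerate(coverage)
            for c, covered in enumerate(row)
            if not covered]

def navigation_cue(missing, current):
    """Compute a direction toward the centroid of the missing area,
    e.g. to render an on-screen arrow directing the operator there."""
    if not missing:
        return None  # image is complete: no warning, no cue
    cr = sum(r for r, _ in missing) / len(missing)
    cc = sum(c for _, c in missing) / len(missing)
    # Offset (rows, cols) from the current position to the missing area.
    return (cr - current[0], cc - current[1])
```

When the mask reports uncovered cells, the device would emit the warning and display the cue; once additional images fill those cells, the cue disappears.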


The complete image of the colon's interior surface may be used for various purposes. For example, a series of complete images of the colon's interior surface may be obtained and stored over a period of time. A newer image may be compared with an older image to determine whether there have been any new polyps or whether there has been any enlargement of a polyp. Additionally, stored images may be used to prove in a malpractice lawsuit that the physician did not miss a polyp during a colon examination.



FIG. 1 illustrates an exemplary endoscope 10 that can be used with one or more embodiments of the present invention. In particular, this endoscope 10 can be used in the examining or imaging of the interior surface of a cavity. For example, the endoscope 10 can be used in a variety of medical procedures in which examining or imaging of a body tissue, organ, cavity or lumen is required. The types of procedures include, for example, anoscopy, arthroscopy, bronchoscopy, colonoscopy, cystoscopy, EGD, laparoscopy, and sigmoidoscopy.


The endoscope 10 includes an insertion tube 14 that, as shown in FIG. 2, has two longitudinal channels 16. In general, however, the insertion tube 14 may have any number of longitudinal channels. Each longitudinal channel 16 allows an instrument to reach the body cavity to perform any desired procedures such as to take samples of suspicious tissues or to perform other surgical procedures such as polypectomy. The instruments may be, for example, a retractable needle for drug injection, hydraulically actuated scissors, clamps, grasping tools, electrocoagulation systems, ultrasound transducers, electrical sensors, heating elements, laser mechanisms and other ablation means. In some embodiments, one of the channels can be used to supply a washing liquid such as water for washing. Another or the same channel may be used to supply a gas, such as CO2 or air into the organ. The channels 16 may also be used to extract liquids or inject liquids, such as a drug in a liquid carrier, into the body.


The insertion tube 14 preferably is steerable or has a steerable distal end region 18 as shown in FIG. 1. The length of the distal end region 18 may be any suitable fraction of the length of the insertion tube 14, such as one half, one third, one fourth, one sixth, one tenth, or one twentieth. The insertion tube 14 may have control cables (not shown) for the manipulation of the insertion tube 14. Preferably, the control cables are symmetrically positioned within the insertion tube 14 and extend along the length of the insertion tube 14. The control cables may be anchored at or near the distal end 19 of the insertion tube 14. Each of the control cables may be a Bowden cable, which includes a wire contained in a flexible overlying hollow tube. The wires of the Bowden cables are attached to controls 20 in the handle 22 (FIG. 1). Using the controls 20, the wires can be pulled to bend the distal end region 18 of the insertion tube 14 in a given direction.


As shown in FIG. 1, the endoscope 10 may also include a control handle 22 connected to the proximal end 24 of the insertion tube 14. Preferably, the control handle 22 has one or more ports and/or valves (not shown) for controlling access to the channels 16 of the insertion tube 14. The ports and/or valves can be air or water valves, suction valves, instrumentation ports, and suction/instrumentation ports. As shown in FIG. 1, the control handle 22 may additionally include buttons 26 for taking pictures with an imaging device on the insertion tube 14.


The proximal end 28 of the control handle 22 may include an accessory outlet 30 (FIG. 1) that provides fluid communication between the air, water and suction channels and the pumps and related accessories. The same outlet 30 or a different outlet can be used for electrical lines to light and imaging components at the distal end of the endoscope 10.


As shown in FIG. 2, the endoscope 10 also includes an imaging device 32 and light sources 34, both of which are disposed at the distal end 19 of the insertion tube 14. Alternatively, the imaging device 32 and light sources 34 may be positioned on the cylindrical sidewall of the insertion tube 14. The imaging device 32 may include, for example, a lens, a single-chip sensor, a multiple-chip sensor, or fiber-optic devices. The imaging device 32, in electrical communication with a processor and/or monitor, may provide still images or recorded or live video images. The light sources 34 may be light-emitting diodes (LEDs) or fiber-optic delivery of light from an external light source. The light sources 34 preferably are equidistant from the imaging device 32 to provide even illumination. The intensity of each light source 34 can be adjusted to achieve optimum imaging. The circuits for the imaging device 32 and light sources 34 may be incorporated into a printed circuit board (PCB).


According to one embodiment of the present invention, this endoscope 10 may be used to examine or image an interior surface of a cavity such as a colon. To examine (or image) a colon, for example, an operator such as a physician may insert the endoscope 10 into the patient's rectum and then advance it to the end of the colon or to a point beyond an area of the colon to be examined. Then the operator may retract the endoscope 10 and start examining the colon by viewing the images captured by the imaging device 32 of the endoscope 10. Alternatively, the operator may examine the colon by advancing the endoscope 10 (as opposed to retracting the endoscope 10). In general, the operator may move or position the endoscope 10 in any suitable manner during the examination of the colon.


As the colon is being examined, partial still images of the colon are captured. The still images may be captured from the video signal generated by the imaging device 32. Alternatively, a still camera may be used to capture the images. The still images may be captured either automatically or manually. To manually capture the partial still images, the operator may decide when a still image is captured by pressing a button. Manual operation has the advantage that an image is captured only when the view is sufficiently clear and when there is no fluid or excrement in the view that prevents an unobstructed view of the colon's interior surface. If there is fluid or secretion in the view, the operator may wash the colon or extract the fluid or secretion from the colon before an image is captured. The images captured by the imaging device 32 are then relayed to a processing device, which stores the images in memory. Preferably the order in which the images are captured is also stored.


Given the image capture rate of a typical imaging device, it may be unnecessary to store and use every image in order to obtain the complete two-dimensional image of the colon's interior surface. Accordingly, an image that is blurry or difficult to join may be discarded and the next image may be stored and used. A blurry image may be caused by fluid or excrement in the colon. In addition, when the imaging device 32 is paused at a location, duplicate or similar images can be discarded such that unnecessary images are not stored and used to form the final joined image. For example, if a procedure such as a biopsy or polypectomy needs to be performed using the endoscope, the physician can pause the image capture such that the final joined image is not adversely affected. Furthermore, the operator can decide whether images are being collected merely for display, for creating the final joined image, or both.


If the imaging device 32 is disposed at the distal end 19 of the insertion tube 14, the imaging device 32 faces the longitudinal direction of the colon, and the image of the colon captured by the imaging device 32 will likely show a view of the colon's interior surface along the longitudinal direction. In other words, as shown in FIG. 3a, the captured view of the colon 40 will likely show the colon's lumen 42 surrounded by the colon's interior surface 44, with the colon's interior surface 44 farther away from the imaging device 32 being at the center of the image 40 and surrounding the colon's lumen 42 and with the colon's interior surface 44 closer to the imaging device 32 being at the outer edge of the image 40.


In some embodiments of the present invention, after an image of the colon has been captured and relayed to the video processing device, the video processing device manipulates and scales the image from showing a longitudinal view of the colon's interior surface to showing a “flattened” rectangular view of the colon's interior surface. This procedure, illustrated in FIGS. 3a to 3g, may be carried out by excising the image so that it shows only the interior surface of a given length of the colon. FIG. 3a illustrates an image 40 captured by the imaging device 32 and relayed to the video processing device. As a first step, as shown in FIG. 3b, the processing device outlines the lumen 42 by analyzing the image 40 for the difference in contrast between the colon's lumen 42 and its interior surface 44. After it has outlined the lumen 42, the processing device excises the lumen 42 from the image 40, as shown in FIG. 3c. In some embodiments, both the lumen 42 and the area surrounding the lumen 42 may be excised from the image 40. Then, the outer edge 46 of the image 40 may be excised to produce a ring-shaped image of the colon, as shown in FIGS. 3d and 3e. Preferably, the outer edge of the excised image is similar in shape to its inner edge. In other words, the outer edge of the excised image can be equally spaced from the inner edge in the radial direction. This allows the flattened image to have a substantially rectangular configuration. Alternatively, the outer edge of the image can follow the outline of a haustral fold of the colon as shown in FIG. 3d. A haustral fold can be identified by the unique pattern of shading and contrast exhibited in the image.
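The contrast-based lumen outlining can be approximated with a plain intensity threshold, since the lumen, being farthest from the light source, is the darkest region of the frame. This is a simplified stand-in for the contrast analysis described above; `outline_lumen` and the fixed threshold are illustrative assumptions, not the patented method:

```python
def outline_lumen(image, threshold=40):
    """Locate the lumen in a grayscale frame by its contrast with the wall.

    `image` is a 2-D list of 0-255 intensity values. Pixels darker than
    `threshold` are taken to belong to the lumen. Returns the set of
    lumen pixels and their bounding box (min_row, min_col, max_row,
    max_col), which can then be excised from the frame.
    """
    lumen = {(r, c)
             for r, row in enumerate(image)
             for c, v in enumerate(row)
             if v < threshold}
    if not lumen:
        return lumen, None  # no dark region: lumen not in view
    rs = [r for r, _ in lumen]
    cs = [c for _, c in lumen]
    return lumen, (min(rs), min(cs), max(rs), max(cs))
```

A real implementation would smooth the image first and trace a closed contour rather than collect raw pixels, but the contrast principle is the same.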


This excised image is then “cut” radially and longitudinally along a side of the image (FIG. 3f), and it is manipulated and flattened to show a rectangular view of the colon's interior surface (FIG. 3g). The excised image may be “cut” with or without overlap. When “cut” with overlap, the two “cut” edges of the image may overlap, and a region of the image may be on both sides of the “cut.” To carry out this procedure, the processing device may convert the inner and outer edges of the image into substantially straight lines such that the ring-shaped view is converted into a rectangular view as shown in FIGS. 3f to 3g. This conversion causes certain areas of the image to undergo compression and others expansion.
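The radial “cut and straighten” step amounts to resampling the annulus between the excised lumen edge and the outer edge along radial lines, i.e., a polar-to-rectangular unwrap. The sketch below assumes nearest-neighbor sampling and a known ring center; `unwrap_ring` is a hypothetical name and real systems would interpolate to control the compression/expansion mentioned above:

```python
import math

def unwrap_ring(image, center, r_inner, r_outer, width, height):
    """Flatten a ring-shaped view into a rectangular one.

    Samples the annulus between r_inner (inner, lumen-side edge) and
    r_outer along `width` radial lines; the radial "cut" lies at
    theta = 0. `image` is a 2-D list in row-major order; sampling is
    nearest-neighbor for simplicity.
    """
    cy, cx = center
    out = []
    for i in range(height):                       # radius axis (rows)
        r = r_inner + (r_outer - r_inner) * i / max(height - 1, 1)
        row = []
        for j in range(width):                    # angle axis (columns)
            theta = 2 * math.pi * j / width
            y = int(round(cy + r * math.sin(theta)))
            x = int(round(cx + r * math.cos(theta)))
            row.append(image[y][x])
        out.append(row)
    return out
```

Each output row corresponds to one radius, so the curved inner and outer edges become the straight top and bottom edges of the rectangle.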


The previous discussion presupposes that the image of the colon shows a longitudinal view of the colon. In other words, the endoscope 10 lies parallel with the longitudinal axis of the colon, and the imaging device 32 of the endoscope 10 is disposed at the distal end 19 of the endoscope 10 and faces the longitudinal direction of the colon. In a situation where the imaging device 32 is angled away from the longitudinal axis of the colon, the image 50 may not show the entire lumen 52, and the image may need to be reconstructed in a slightly different manner (FIGS. 4a to 4g). As seen in FIG. 4a, the lumen 52 or part of the lumen 52 is identified by the difference in contrast. Once the lumen 52 has been identified, it is excised from the image 50 as shown in FIG. 4c, and a corresponding arc is also excised from the outer edge of the image 50 as shown in FIG. 4d. The image 50 is then converted to a substantially rectangular view as shown in FIGS. 4f and 4g.


In a situation where the image does not show the colon's lumen, the processing device may locate the image spatially based on the positions of the previous images, such as the positions of the preceding images. For example, if the imaging device 32 faces a direction that is perpendicular to the longitudinal direction of the colon, the processing device can locate the image based on the positions of the previous images that overlap with this particular image. Images captured from this viewpoint may not need to be converted because of their substantially rectangular and flat shape.


Once an image has been converted into a flat view, the image is analyzed in conjunction with other images, such as the preceding images, to find similar regions and define corresponding key points. This can be accomplished by any one of the various methods known in the field of imaging technology. One such method is an algorithm known as SIFT (Scale Invariant Feature Transform), which is invariant to image scaling and rotation and partially invariant to changes in illumination and 3D camera viewpoint. Interest points, which are invariant to scale and rotation, are identified in each image by constructing a multi-scale pyramid of Difference of Gaussian (DoG) images. Key points are identified by localizing maxima or minima in the DoG pyramid across levels. Next, each interest point is assigned an orientation by computing a gradient orientation histogram. A set of orientation histograms in a neighborhood, such as a 4×4 pixel neighborhood, may be used to create the key point descriptor. Finally, the feature descriptors are normalized to account for differences in illumination. Once feature points and descriptors have been identified in each image, corresponding key points are identified. After similar regions or corresponding key points are identified between images, a suitable transformation matrix, which brings the images together such that the key points or similar regions overlap, is calculated. An index or number may be used to measure the degree of similarity between two regions of two images. If this index or number is above a given value, the two regions of the two images are considered to be overlapping.
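A minimal sketch of the matching and alignment steps, assuming descriptors (e.g., SIFT descriptors) have already been extracted from each image. Here a Lowe-style ratio test stands in for the similarity index described above, and a pure translation stands in for the full transformation matrix; the function names and the 0.75 threshold are illustrative assumptions:

```python
import numpy as np

def match_descriptors(d1, d2, ratio=0.75):
    """Ratio-test matching between two sets of feature descriptors.

    d1: (n1, k) and d2: (n2, k) arrays. A point in image 1 matches its
    nearest neighbour in image 2 only if that neighbour is clearly
    closer than the second-nearest one, which plays the role of the
    similarity index being above a given value.
    """
    matches = []
    for i, desc in enumerate(d1):
        dists = np.linalg.norm(d2 - desc, axis=1)
        order = np.argsort(dists)
        if len(order) >= 2 and dists[order[0]] < ratio * dists[order[1]]:
            matches.append((i, int(order[0])))
    return matches

def estimate_translation(p1, p2, matches):
    """Least-squares translation mapping matched key points of image 1
    onto image 2 (a simple stand-in for the transformation matrix)."""
    diffs = [p2[j] - p1[i] for i, j in matches]
    return np.mean(diffs, axis=0)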


In the final step, two images are joined together by meshing or overlapping the images as dictated by the transformation matrix. Every subsequent image may then be joined to the preceding series of joined images. Ideally, by the end of the procedure, a single image that includes a 2-D view of the interior surface of the colon results, as shown in FIG. 5, in which four partially overlapping images 60, 62, 64, 66 are joined to form a single image.
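Joining a pair of images under a known transformation can be sketched as pasting them onto a shared canvas. This illustration is deliberately restricted to a non-negative integer (row, col) translation offset, with overlapping pixels simply overwritten; a real implementation would blend the overlap and support arbitrary transformation matrices:

```python
import numpy as np

def join_pair(img_a, img_b, offset):
    """Join two grayscale images on a shared canvas.

    offset is the (row, col) position of img_b relative to img_a,
    assumed non-negative here for brevity. Overlapping pixels are
    overwritten by img_b.
    """
    r, c = offset
    h = max(img_a.shape[0], r + img_b.shape[0])
    w = max(img_a.shape[1], c + img_b.shape[1])
    canvas = np.zeros((h, w), dtype=img_a.dtype)
    canvas[:img_a.shape[0], :img_a.shape[1]] = img_a
    canvas[r:r + img_b.shape[0], c:c + img_b.shape[1]] = img_b
    return canvas
```

Applied repeatedly, each new image is pasted into the growing mosaic at the position dictated by its estimated transformation, as with the four overlapping images of FIG. 5.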


In another preferred embodiment, to verify that the single joined image of the interior surface of the colon is complete, the processing device checks to ensure that no areas are missing as it continuously joins images together. When an area is missing, the processing device sends a signal, such as an audio and/or visual signal, that alerts the physician to the missed area. The physician can then return to the missing area and capture one or more additional images. In addition, in the event that an image is fuzzy or otherwise unsuitable for the construction of the single joined image, the processing device also alerts the physician so that one or more additional images may be acquired to take the place of the unsuitable image.


A missing area in the joined image can be detected in various manners. For example, there is likely a missing area if an excised inner or outer edge of a partial image is not joined to another partial image or if a region bordering on an excised inner or outer edge of a partial image does not have a corresponding region in another partial image and therefore cannot be joined to another partial image. This, however, does not always apply to the cut edge of a partial image (FIG. 3f), which is made so that the partial image can be flattened (FIGS. 3a-3g). This may also not apply to the first and last images because these two images each have an edge not joined to another image. Another way to detect a missing area is to see whether the cut edges of each partial image can be rejoined after the partial images have been joined to form a single image. If the cut edges of each partial image cannot be rejoined, the single image will likely have a missing area.
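One assumed way to implement such a missing-area check is to maintain a boolean coverage mask of the joined mosaic and test for uncovered regions that do not touch the mosaic border, i.e., interior holes. A NumPy sketch using an iterative flood fill from the border (the function name and 4-connectivity are illustrative choices):

```python
import numpy as np

def has_missing_area(covered):
    """True if the coverage mask contains an interior hole: an
    uncovered region not connected to the mask border."""
    uncovered = ~covered
    # Seed a flood fill with uncovered pixels on the border.
    reach = np.zeros_like(uncovered)
    reach[0, :] = uncovered[0, :]
    reach[-1, :] = uncovered[-1, :]
    reach[:, 0] |= uncovered[:, 0]
    reach[:, -1] |= uncovered[:, -1]
    # Grow the reachable set (4-connected) until it stops changing.
    while True:
        grown = reach.copy()
        grown[1:, :] |= reach[:-1, :]
        grown[:-1, :] |= reach[1:, :]
        grown[:, 1:] |= reach[:, :-1]
        grown[:, :-1] |= reach[:, 1:]
        grown &= uncovered
        if (grown == reach).all():
            break
        reach = grown
    # Any uncovered pixel not reachable from the border is a hole.
    return bool((uncovered & ~reach).any())
```

Uncovered pixels touching the border are ignored because they correspond to the mosaic's natural irregular outline (and to the unmatched edges of the first and last images), whereas an enclosed uncovered region signals a missed area.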


In another preferred embodiment, when there is a missing area in the single joined image or an unsuitable image, the processing device uses an on-screen navigation cue to direct the operator to the location of the missing area or unsuitable image. As shown in FIG. 6, the on-screen navigation cue may include an arrow 72 on the screen 70 indicating the desired direction of movement for the imaging device 32. Additionally or alternatively, the on-screen navigation cue may include a highlighted area 74 on the screen 70 representing the location of the missing area or unsuitable image. The processing device may implement this feature by comparing the images joined previously with the current image being captured by the imaging device 32. By comparing corresponding key points between these images, the processing device can direct the operator to move the imaging device 32 to the desired location.


In a further preferred embodiment, the processing device can calculate the scanning speed and/or the total amount of time that the imaging device spends in a segment of the colon, such as the ascending or transverse portion of the colon. To calculate the scanning speed, the processing device may rely on an algorithm that determines tracking time or speed in a manner similar to an optical computer mouse. The processing device can perform the same or similar steps of analyzing captured images. Before joining the images, the processing device analyzes the distance by which key points or corresponding areas have moved from one image to a subsequent image. The processing device can calculate the scanning speed by dividing the distance traveled by the time elapsed between the two images. The distance by which a given point or feature travels can be denoted by the number of image pixels. Each pixel can then be standardized to a measurement of actual distance so that the calculation can be performed.
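A sketch of the speed calculation, assuming key points have already been matched between two consecutive frames and that a pixel-to-millimeter scale is available. Using the median displacement to resist outlier matches is an implementation choice assumed here, not taken from the specification:

```python
import numpy as np

def scanning_speed(pts_prev, pts_curr, mm_per_pixel, dt):
    """Estimate scanning speed from matched key points of two frames.

    pts_prev, pts_curr: (n, 2) arrays of corresponding key-point
    coordinates in pixels; dt: time between frames in seconds.
    Returns speed in mm/s.
    """
    shifts = np.linalg.norm(pts_curr - pts_prev, axis=1)  # pixels moved
    return np.median(shifts) * mm_per_pixel / dt
```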


The distance traveled can also be calculated by measuring the size change of a geometric feature from one image to another image. For example, if the geometric feature appears larger in an earlier image and smaller in a later image, it can be concluded that the imaging device is moving away from the geometric feature. The distance traveled by the imaging device can be calculated from the change in size.
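Under a simple pinhole-camera assumption (apparent size in pixels = focal length in pixels × physical width / distance), the size change of a feature of known physical width yields the axial distance traveled. The following sketch illustrates this relationship with assumed parameter names:

```python
def distance_change(size_prev_px, size_curr_px, focal_px, width_mm):
    """Axial distance change of the imaging device relative to a
    feature of known physical width, from the feature's apparent size
    in two frames.

    Pinhole model: size_px = focal_px * width_mm / distance_mm, so
    distance_mm = focal_px * width_mm / size_px. A positive result
    means the imaging device moved away from the feature.
    """
    z_prev = focal_px * width_mm / size_prev_px
    z_curr = focal_px * width_mm / size_curr_px
    return z_curr - z_prev
```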


To calculate the total amount of time spent in a segment of the colon, the processing device needs to recognize when the imaging device is in that segment of the colon. In one preferred embodiment, the processing device recognizes a segment of the colon by its distinctive features, which can be, for example, the various flexures (e.g., splenic, hepatic, etc.). Preferably, the processing device recognizes a feature by comparing the captured image of the feature with a stored standard or typical image of the feature. For example, as the endoscope is withdrawn from the end of the colon, the hepatic flexure is expected first, and images are filtered for properties that would suggest an image of the hepatic flexure. The location of areas of shading/contrast and the location of the lumen in the image would suggest to the processing device that the image is of a flexure. The processing device can alert the operator about whether she is scanning the colon too fast and provide data on how much time was spent in each segment of the colon.


Another feature that can be used to recognize a segment of the colon is the geometric shape of the colon. For example, the lumen of the transverse colon has a characteristically triangular shape. An image of the colon's geometric shape can be compared with a database of images of the colon's geometric shape to determine which segment of the colon the imaging device is in.


In an alternate embodiment of the invention, the endoscope may have a sensor or transducer for communicating the position (such as the location and/or orientation) of the imaging device to the processing device. Examples of such a positioning sensor include magnetic positioning sensors such as the Aurora System manufactured by NDI International of Waterloo, Canada, RF positioning sensors, and optical positioning sensors. The processing device can integrate the positional information of the imaging device with the image capturing and joining algorithm to better determine how to join the images. Joining of the images can be improved based on the imaging device's position or on information about the geometry of the particular colon segment in which the imaging device is located.


While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that changes and modifications can be made without departing from this invention in its broader aspects. For instance, the above embodiments involve the examination of a colon. In general, however, a method or device of the present invention can be used to examine any cavity, such as any body cavity. Therefore, the appended claims are to encompass within their scope all such changes and modifications as fall within the true spirit and scope of this invention.

Claims
  • 1. An accessory device for a colonoscope, the device comprising: an imaging device configured to capture images of an interior surface of a colon, wherein the imaging device is configured to be positioned on a sidewall of a colonoscope; a positioning sensor for detecting positional information of the imaging device within the colon; and a processing device in communication with the imaging device and positioning sensor, wherein the processing device is programmed to join the images using the positional information, and wherein the processing device is further programmed to calculate an amount of time spent by the imaging device in an identified colon segment.
  • 2. The device of claim 1, wherein the processing device comprises a memory capable of storing images and positional information.
  • 3. The device of claim 2, wherein the memory is capable of storing the order in which images are captured.
  • 4. The device of claim 1, wherein the positioning sensor is further configured to detect orientational information of the imaging device within the colon.
  • 5. The device of claim 4, wherein the processing device is further programmed to join the images using the orientational information.
  • 6. The device of claim 1, wherein the processing device is programmed to generate a complete image of the interior surface of the colon segment using the joined images.
  • 7. The device of claim 6, wherein the processing device is programmed to detect and provide a warning if the joined images do not form a complete image of the interior surface of the colon segment.
  • 8. The device of claim 1, wherein the positioning sensor comprises a magnetic positioning sensor.
  • 9. The device of claim 1, wherein the positioning sensor comprises a RF positioning sensor.
  • 10. The device of claim 1, wherein the positioning sensor comprises an optical positioning sensor.
  • 11. The device of claim 2, wherein the processing device is further programmed to compare a newly captured image of the interior surface of the colon with the joined images.
  • 12. The device of claim 2, wherein the processing device is further programmed to compare a newly captured image with an old image retrieved from the memory to identify changes in the interior surface of the colon.
  • 13. The device of claim 2, wherein the processing device is further programmed to provide image and positional information to an operator so that the operator can return to a previously imaged location during a procedure.
  • 14. The device of claim 1, wherein the processing device is programmed to recognize distinctive features of the colon segment.
  • 15. A method of using an accessory device for a colonoscope, the method comprising: capturing images of an interior surface of a colon using an imaging device of an accessory device for a colonoscope, wherein the imaging device is positioned on a sidewall of the colonoscope and wherein the accessory device further comprises a positioning sensor for detecting positional information of the imaging device within the colon, and a processing device in communication with the imaging device and the positioning sensor; and joining the captured images using the processing device, wherein the processing device is programmed to join the captured images using the positional information, and wherein the processing device is further programmed to calculate an amount of time spent by the imaging device in an identified colon segment.
  • 16. The method of claim 15, wherein the processing device comprises a memory capable of storing images, positional information, and the order in which the images are captured.
  • 17. The method of claim 15, further comprising clearing fluids from the colon prior to capturing images.
  • 18. The method of claim 15, further comprising generating a complete image of the interior surface of the colon segment using the joined images.
  • 19. The method of claim 18, further comprising detecting if the joined images do not form a complete image of the interior surface of the colon segment and generating a warning.
  • 20. The method of claim 15, wherein the positioning sensor is configured to detect orientational information of the imaging device within the colon.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 13/275,206, filed on Oct. 17, 2011, which is a continuation of U.S. patent application Ser. No. 12/101,050, filed on Apr. 10, 2008, which claims the benefit of U.S. Provisional Patent Application No. 60/911,054, filed Apr. 10, 2007, the entire disclosures of which are incorporated herein by reference in their entirety.

US Referenced Citations (373)
Number Name Date Kind
3437747 Sheldon Apr 1969 A
3610231 Takahashi et al. Oct 1971 A
3643653 Takahashi et al. Feb 1972 A
3739770 Mori Jun 1973 A
3889662 Mitsui Jun 1975 A
3897775 Furihata Aug 1975 A
3918438 Hayamizu et al. Nov 1975 A
4066071 Nagel Jan 1978 A
4261344 Moore et al. Apr 1981 A
4351587 Matsuo et al. Sep 1982 A
4398811 Nishioka et al. Aug 1983 A
4494549 Namba et al. Jan 1985 A
4573450 Arakawa Mar 1986 A
4586491 Carpenter May 1986 A
4602281 Nagasaki et al. Jul 1986 A
4625236 Fujimori et al. Nov 1986 A
4646722 Silverstein et al. Mar 1987 A
4699463 D'Amelio et al. Oct 1987 A
4721097 D'Amelio Jan 1988 A
4727859 Lia Mar 1988 A
4741326 Sidall et al. May 1988 A
4742817 Kawashima et al. May 1988 A
4790295 Tashiro Dec 1988 A
4800870 Reid, Jr. Jan 1989 A
4825850 Opie et al. May 1989 A
4836211 Sekino et al. Jun 1989 A
4846154 MacAnally et al. Jul 1989 A
4852551 Opie et al. Aug 1989 A
4853773 Hibino et al. Aug 1989 A
4862873 Yajima et al. Sep 1989 A
4867138 Kubota et al. Sep 1989 A
4869238 Opie et al. Sep 1989 A
4870488 Ikuno et al. Sep 1989 A
4873572 Miyazaki et al. Oct 1989 A
4873965 Danieli Oct 1989 A
4884133 Kanno et al. Nov 1989 A
4899732 Cohen Feb 1990 A
4905667 Foerster et al. Mar 1990 A
4907395 Opie et al. Mar 1990 A
4911148 Sosnowski et al. Mar 1990 A
4911564 Baker Mar 1990 A
4926258 Sasaki May 1990 A
4947827 Opie et al. Aug 1990 A
4947828 Carpenter et al. Aug 1990 A
4979496 Komi Dec 1990 A
4991565 Takahashi et al. Feb 1991 A
5019040 Itaoka et al. May 1991 A
5025778 Silverstein et al. Jun 1991 A
5026377 Burton et al. Jun 1991 A
5050585 Takahashi Sep 1991 A
RE34110 Opie et al. Oct 1992 E
5159446 Hibino et al. Oct 1992 A
5166787 Irion Nov 1992 A
5178130 Kaiya et al. Jan 1993 A
5187572 Nakamura et al. Feb 1993 A
5193525 Silverstein et al. Mar 1993 A
5196928 Karasawa et al. Mar 1993 A
5253638 Tamburrino et al. Oct 1993 A
5260780 Staudt, III Nov 1993 A
5271381 Ailinger et al. Dec 1993 A
5305121 Moll Apr 1994 A
5318031 Mountford et al. Jun 1994 A
5329887 Ailinger et al. Jul 1994 A
5337734 Saab Aug 1994 A
5381784 Adair Jan 1995 A
5398685 Wilk et al. Mar 1995 A
5406938 Mersch et al. Apr 1995 A
5434669 Tabata et al. Jul 1995 A
5443781 Saab Aug 1995 A
5447148 Oneda et al. Sep 1995 A
5483951 Frassica et al. Jan 1996 A
5494483 Adair Feb 1996 A
5518501 Oneda et al. May 1996 A
5520607 Frassica et al. May 1996 A
5530238 Meulenbrugge et al. Jun 1996 A
5533496 De Faria-Correa et al. Jul 1996 A
5536236 Yabe et al. Jul 1996 A
5556367 Yabe et al. Sep 1996 A
5613936 Czarnek et al. Mar 1997 A
5614943 Nakamura et al. Mar 1997 A
5626553 Frassica et al. May 1997 A
5634466 Gruner Jun 1997 A
5653677 Okada et al. Aug 1997 A
5667476 Frassica et al. Sep 1997 A
5679216 Takayama et al. Oct 1997 A
5681260 Ueda et al. Oct 1997 A
5682199 Lankford Oct 1997 A
5685822 Harhen Nov 1997 A
5692729 Harhen Dec 1997 A
5696850 Parulski et al. Dec 1997 A
5702348 Harhen Dec 1997 A
5706128 Greenberg Jan 1998 A
5711299 Manwaring et al. Jan 1998 A
5722933 Yabe et al. Mar 1998 A
5752912 Takahashi et al. May 1998 A
5762603 Thompson Jun 1998 A
5817061 Goodwin et al. Oct 1998 A
5827177 Oneda et al. Oct 1998 A
5833603 Kovacs et al. Nov 1998 A
5843103 Wulfman Dec 1998 A
5843460 Labigne et al. Dec 1998 A
5854859 Sobol Dec 1998 A
5860914 Chiba et al. Jan 1999 A
5876329 Harhen Mar 1999 A
5916147 Boury Jun 1999 A
5924977 Yabe et al. Jul 1999 A
5938587 Taylor et al. Aug 1999 A
5982932 Prokoski Nov 1999 A
5989182 Hori et al. Nov 1999 A
5989224 Exline et al. Nov 1999 A
6017358 Yoon Jan 2000 A
6026323 Skladnev et al. Feb 2000 A
6066090 Yoon May 2000 A
6099464 Shimizu et al. Aug 2000 A
6099466 Sano et al. Aug 2000 A
6099485 Patterson Aug 2000 A
6106463 Wilk Aug 2000 A
6174280 Oneda et al. Jan 2001 B1
6190330 Harhen Feb 2001 B1
6214028 Yoon et al. Apr 2001 B1
6261226 McKenna et al. Jul 2001 B1
6261307 Yoon et al. Jul 2001 B1
6277064 Yoon Aug 2001 B1
6296608 Daniels et al. Oct 2001 B1
6301047 Hoshino et al. Oct 2001 B1
6350231 Ailinger et al. Feb 2002 B1
6369855 Chauvel et al. Apr 2002 B1
6375653 Desai Apr 2002 B1
6387043 Yoon May 2002 B1
6433492 Buonavita Aug 2002 B1
6454702 Smith Sep 2002 B1
6456684 Mun et al. Sep 2002 B1
6461294 Oneda et al. Oct 2002 B1
6464633 Hosoda et al. Oct 2002 B1
6482149 Torii Nov 2002 B1
6503195 Keller et al. Jan 2003 B1
6527704 Chang et al. Mar 2003 B1
6547724 Soble et al. Apr 2003 B1
6554767 Tanaka Apr 2003 B2
6564088 Soller et al. May 2003 B1
6640017 Tsai et al. Oct 2003 B1
6648816 Irion et al. Nov 2003 B2
6683716 Costales Jan 2004 B1
6687010 Horii et al. Feb 2004 B1
6697536 Yamada Feb 2004 B1
6699180 Kobayashi Mar 2004 B2
6736773 Wendlandt et al. May 2004 B2
6748975 Hartshorne et al. Jun 2004 B2
6796939 Hirata et al. Sep 2004 B1
6833871 Merrill et al. Dec 2004 B1
6845190 Smithwick et al. Jan 2005 B1
6891977 Gallagher May 2005 B2
6916286 Kazakevich Jul 2005 B2
6928314 Johnson et al. Aug 2005 B1
6929636 von Alten Aug 2005 B1
6947784 Zalis Sep 2005 B2
6951536 Yokoi et al. Oct 2005 B2
6965702 Gallagher Nov 2005 B2
6966906 Brown Nov 2005 B2
6974240 Takahashi Dec 2005 B2
6974411 Belson Dec 2005 B2
6992694 Abe Jan 2006 B2
6997871 Sonnenschein et al. Feb 2006 B2
7004900 Wendlandt et al. Feb 2006 B2
7029435 Nakao Apr 2006 B2
7041050 Ronald May 2006 B1
7095548 Cho et al. Aug 2006 B1
7103228 Kraft et al. Sep 2006 B2
7116352 Yaron Oct 2006 B2
7173656 Dunton et al. Feb 2007 B1
7228004 Gallagher et al. Jun 2007 B2
7258663 Doguchi et al. Aug 2007 B2
7280141 Frank et al. Oct 2007 B1
7317458 Wada Jan 2008 B2
7322934 Miyake et al. Jan 2008 B2
7341555 Ootawara et al. Mar 2008 B2
7362911 Frank Apr 2008 B1
7389892 Park Jun 2008 B2
7405877 Schechterman Jul 2008 B1
7435218 Krattiger et al. Oct 2008 B2
7436562 Nagasawa et al. Oct 2008 B2
7507200 Okada Mar 2009 B2
7551196 Ono et al. Jun 2009 B2
7556599 Rovegno Jul 2009 B2
7561190 Deng et al. Jul 2009 B2
7621869 Ratnakar Nov 2009 B2
7646520 Funaki et al. Jan 2010 B2
7678043 Gilad Mar 2010 B2
7683926 Schechterman et al. Mar 2010 B2
7749156 Ouchi Jul 2010 B2
7825964 Hoshino et al. Nov 2010 B2
7864215 Carlsson et al. Jan 2011 B2
7910295 Hoon et al. Mar 2011 B2
7927272 Bayer et al. Apr 2011 B2
8009167 Dekel et al. Aug 2011 B2
8064666 Bayer Nov 2011 B2
8070743 Kagan et al. Dec 2011 B2
8182422 Bayer et al. May 2012 B2
8197399 Bayer et al. Jun 2012 B2
8235887 Bayer et al. Aug 2012 B2
8287446 Bayer Oct 2012 B2
8289381 Bayer et al. Oct 2012 B2
8310530 Bayer et al. Nov 2012 B2
8587645 Bayer et al. Nov 2013 B2
8797392 Bayer et al. Aug 2014 B2
8872906 Bayer et al. Oct 2014 B2
20010007468 Sugimoto et al. Jul 2001 A1
20010031912 Adler Oct 2001 A1
20010037052 Higuchi et al. Nov 2001 A1
20010051766 Gazdinski Dec 2001 A1
20010056238 Tsujita Dec 2001 A1
20020026188 Balbierz et al. Feb 2002 A1
20020039400 Kaufman et al. Apr 2002 A1
20020089584 Abe Jul 2002 A1
20020095168 Griego et al. Jul 2002 A1
20020099267 Wendlandt et al. Jul 2002 A1
20020101546 Sharp et al. Aug 2002 A1
20020103420 Coleman et al. Aug 2002 A1
20020110282 Kraft et al. Aug 2002 A1
20020115908 Farkas et al. Aug 2002 A1
20020156347 Kim et al. Oct 2002 A1
20020165444 Whitman Nov 2002 A1
20020193662 Belson Dec 2002 A1
20030001951 Tsujita et al. Jan 2003 A1
20030004399 Belson Jan 2003 A1
20030011768 Jung et al. Jan 2003 A1
20030032863 Kazakevich et al. Feb 2003 A1
20030040668 Kaneko et al. Feb 2003 A1
20030045778 Ohline et al. Mar 2003 A1
20030065250 Chiel et al. Apr 2003 A1
20030088152 Takada May 2003 A1
20030093031 Long et al. May 2003 A1
20030093088 Long et al. May 2003 A1
20030103199 Jung et al. Jun 2003 A1
20030105386 Voloshin et al. Jun 2003 A1
20030120130 Glukhovsky Jun 2003 A1
20030125630 Furnish Jul 2003 A1
20030128892 Avinash Jul 2003 A1
20030130711 Pearson et al. Jul 2003 A1
20030153866 Long et al. Aug 2003 A1
20030161545 Gallagher Aug 2003 A1
20030167007 Belson Sep 2003 A1
20030171650 Tartaglia et al. Sep 2003 A1
20030176767 Long et al. Sep 2003 A1
20030179302 Harada et al. Sep 2003 A1
20030187326 Chang Oct 2003 A1
20030195545 Hermann et al. Oct 2003 A1
20030197781 Sugimoto et al. Oct 2003 A1
20030197793 Mitsunaga et al. Oct 2003 A1
20030215788 Long Nov 2003 A1
20030225433 Nakao Dec 2003 A1
20030233115 Eversull et al. Dec 2003 A1
20040015049 Zaar Jan 2004 A1
20040023397 Vig et al. Feb 2004 A1
20040034278 Adams Feb 2004 A1
20040049096 Adams Mar 2004 A1
20040059191 Krupa et al. Mar 2004 A1
20040080613 Moriyama Apr 2004 A1
20040085443 Kallioniemi et al. May 2004 A1
20040097790 Farkas et al. May 2004 A1
20040109164 Horii et al. Jun 2004 A1
20040109319 Takahashi Jun 2004 A1
20040111019 Long Jun 2004 A1
20040122291 Takahashi Jun 2004 A1
20040141054 Mochida et al. Jul 2004 A1
20040158124 Okada Aug 2004 A1
20040207618 Williams et al. Oct 2004 A1
20040228544 Endo et al. Nov 2004 A1
20040242987 Liew et al. Dec 2004 A1
20050010084 Tsai Jan 2005 A1
20050014996 Konomura et al. Jan 2005 A1
20050020918 Wilk et al. Jan 2005 A1
20050020926 Wiklof et al. Jan 2005 A1
20050038317 Ratnakar Feb 2005 A1
20050038319 Goldwasser et al. Feb 2005 A1
20050068431 Mori Mar 2005 A1
20050085693 Belson et al. Apr 2005 A1
20050085790 Guest et al. Apr 2005 A1
20050096502 Khalili May 2005 A1
20050154278 Cabiri et al. Jul 2005 A1
20050165272 Okada et al. Jul 2005 A1
20050165279 Adler et al. Jul 2005 A1
20050177024 Mackin Aug 2005 A1
20050203420 Kleen et al. Sep 2005 A1
20050215911 Alfano et al. Sep 2005 A1
20050222500 Itoi Oct 2005 A1
20050228224 Okada et al. Oct 2005 A1
20050267361 Younker et al. Dec 2005 A1
20050272975 McWeeney et al. Dec 2005 A1
20050272977 Saadat et al. Dec 2005 A1
20060044267 Xie et al. Mar 2006 A1
20060052709 DeBaryshe et al. Mar 2006 A1
20060058584 Hirata Mar 2006 A1
20060106286 Wendlandt et al. May 2006 A1
20060114986 Knapp et al. Jun 2006 A1
20060149127 Seddiqui et al. Jul 2006 A1
20060149129 Watts et al. Jul 2006 A1
20060183975 Saadat et al. Aug 2006 A1
20060184037 Ince et al. Aug 2006 A1
20060217594 Ferguson Sep 2006 A1
20060238614 Konno Oct 2006 A1
20060250503 Crutchfield et al. Nov 2006 A1
20060252994 Ratnakar Nov 2006 A1
20060279632 Anderson Dec 2006 A1
20060280360 Holub Dec 2006 A1
20060285766 Ali Dec 2006 A1
20060293562 Uchimura et al. Dec 2006 A1
20070015967 Boulais et al. Jan 2007 A1
20070015989 Desai et al. Jan 2007 A1
20070066868 Shikii Mar 2007 A1
20070083081 Schlagenhauf et al. Apr 2007 A1
20070097206 Houvener et al. May 2007 A1
20070103460 Zhang et al. May 2007 A1
20070118017 Honda May 2007 A1
20070142711 Bayer et al. Jun 2007 A1
20070173686 Lin et al. Jul 2007 A1
20070177008 Bayer et al. Aug 2007 A1
20070177009 Bayer et al. Aug 2007 A1
20070183685 Wada et al. Aug 2007 A1
20070185384 Bayer et al. Aug 2007 A1
20070225552 Segawa et al. Sep 2007 A1
20070225554 Maseda et al. Sep 2007 A1
20070225734 Bell et al. Sep 2007 A1
20070238927 Ueno et al. Oct 2007 A1
20070244354 Bayer Oct 2007 A1
20070270642 Bayer et al. Nov 2007 A1
20070279486 Bayer et al. Dec 2007 A1
20070280669 Karim Dec 2007 A1
20070293720 Bayer Dec 2007 A1
20080021269 Tinkham et al. Jan 2008 A1
20080021274 Bayer et al. Jan 2008 A1
20080033450 Bayer et al. Feb 2008 A1
20080039693 Karasawa Feb 2008 A1
20080064931 Schena et al. Mar 2008 A1
20080065110 Duval et al. Mar 2008 A1
20080071291 Duval et al. Mar 2008 A1
20080079827 Hoshino et al. Apr 2008 A1
20080084478 Gilad et al. Apr 2008 A1
20080097292 Cabiri et al. Apr 2008 A1
20080114288 Whayne et al. May 2008 A1
20080130108 Bayer et al. Jun 2008 A1
20080154288 Belson Jun 2008 A1
20080199829 Paley et al. Aug 2008 A1
20080200763 Ueno Aug 2008 A1
20080275298 Ratnakar Nov 2008 A1
20090015842 Leitgeb et al. Jan 2009 A1
20090023998 Ratnakar Jan 2009 A1
20090028407 Seibel et al. Jan 2009 A1
20090036739 Hadani Feb 2009 A1
20090049627 Kritzler Feb 2009 A1
20090082629 Dotan et al. Mar 2009 A1
20090105538 Van Dam et al. Apr 2009 A1
20090137867 Goto May 2009 A1
20090208071 Nishimura et al. Aug 2009 A1
20090213211 Bayer et al. Aug 2009 A1
20090231419 Bayer Sep 2009 A1
20100217076 Ratnakar Aug 2010 A1
20110160535 Bayer et al. Jun 2011 A1
20110213206 Boutillette et al. Sep 2011 A1
20120033062 Bayer Feb 2012 A1
20120053407 Levy Mar 2012 A1
20120065468 Levy et al. Mar 2012 A1
20120209071 Bayer et al. Aug 2012 A1
20120224026 Bayer et al. Sep 2012 A1
20120229615 Kirma et al. Sep 2012 A1
20120232340 Levy et al. Sep 2012 A1
20120232343 Levy et al. Sep 2012 A1
20120232345 Levy et al. Sep 2012 A1
20130012778 Bayer et al. Jan 2013 A1
20130116506 Bayer et al. May 2013 A1
20140018624 Bayer et al. Jan 2014 A1
20140046136 Bayer et al. Feb 2014 A1
20140336459 Bayer et al. Nov 2014 A1
Foreign Referenced Citations (79)
Number Date Country
1628603 Jun 2005 CN
196 26 433 Jan 1998 DE
20 2006 017 173 Mar 2007 DE
0 586 162 Mar 1994 EP
1 570 778 Sep 2005 EP
1 769 720 Apr 2007 EP
711 949 Sep 1931 FR
49-130235 Dec 1974 JP
56-9712 Jan 1981 JP
56-56486 May 1981 JP
57-170707 Oct 1982 JP
60-83636 May 1985 JP
60-111217 Jun 1985 JP
62-094312 Jun 1987 JP
63-309912 Dec 1988 JP
1-267514 Oct 1989 JP
1-172847 Dec 1989 JP
2-295530 Dec 1990 JP
3-159629 Jul 1991 JP
4-500768 Feb 1992 JP
4-341232 Nov 1992 JP
5-285091 Nov 1993 JP
5-307144 Nov 1993 JP
5-341210 Dec 1993 JP
6-9228 Feb 1994 JP
60-76714 Mar 1994 JP
6-130308 May 1994 JP
6-169880 Jun 1994 JP
7-352 Jan 1995 JP
7-354 Jan 1995 JP
7-021001 Apr 1995 JP
7-136108 May 1995 JP
7-275197 Oct 1995 JP
8-024208 Jan 1996 JP
8-206061 Aug 1996 JP
9-56662 Mar 1997 JP
11-76150 Mar 1999 JP
11-253401 Sep 1999 JP
11-332821 Dec 1999 JP
2003-135388 May 2003 JP
2003-220023 Aug 2003 JP
2004-202252 Jul 2004 JP
2004-525717 Aug 2004 JP
2004-537362 Dec 2004 JP
2007-143580 Jun 2007 JP
WO-9315648 Aug 1993 WO
WO-9917542 Apr 1999 WO
WO-9930506 Jun 1999 WO
WO-02085194 Oct 2002 WO
WO-02094105 Nov 2002 WO
WO-02094105 Nov 2002 WO
WO-03013349 Feb 2003 WO
WO-2006073676 Jul 2006 WO
WO-2006073725 Jul 2006 WO
WO-2006087981 Aug 2006 WO
WO-2006110275 Oct 2006 WO
WO-2006110275 Oct 2006 WO
WO-2007015241 Feb 2007 WO
WO-2007015241 Feb 2007 WO
WO-2007070644 Jun 2007 WO
WO-2007070644 Jun 2007 WO
WO-2007087421 Aug 2007 WO
WO-2007087421 Aug 2007 WO
WO-2007092533 Aug 2007 WO
WO-2007092533 Aug 2007 WO
WO-2007092636 Aug 2007 WO
WO-2007092636 Aug 2007 WO
WO-2007136859 Nov 2007 WO
WO-2007136859 Nov 2007 WO
WO-2007136879 Nov 2007 WO
WO-2007136879 Nov 2007 WO
WO-2007136879 Nov 2007 WO
WO-2009014895 Jan 2009 WO
WO-2009015396 Jan 2009 WO
WO-2009015396 Jan 2009 WO
WO-2009049322 Apr 2009 WO
WO-2009049322 Apr 2009 WO
WO-2009049324 Apr 2009 WO
WO-2009062179 May 2009 WO
Non-Patent Literature Citations (168)
Entry
Advisory Action mailed on Nov. 2, 2010, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006, 3 pages.
Advisory Action mailed on May 23, 2011, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 3 pages.
Amendment in Response to Non-Final Office Action filed on Jun. 29, 2009, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 9 pages.
Amendment in Response to Final Office Action filed on Mar. 8, 2010, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 11 pages.
Amendment in Response to Non-Final Office Action filed on Jun. 25, 2010, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006, 14 pages.
Amendment in Response to Non-Final Office Action filed on Aug. 30, 2010, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 17 pages.
Amendment in Response to Final Office Action filed on Oct. 22, 2010, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006, 15 pages.
Amendment in Response to Non-Final Office Action filed on Oct. 22, 2010, for U.S. Appl. No. 11/834,540, filed Aug. 6, 2007, 13 pages.
Amendment in Response to Non-Final Office Action filed on Feb. 9, 2011, for U.S. Appl. No. 11/828,835, filed Jul. 26, 2007, 10 pages.
Amendment in Response to Non-Final Office Action filed on Feb. 25, 2011, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 15 pages.
Amendment in Response to Final Office Action filed on Feb. 28, 2011, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 11 pages.
Amendment in Response to Non-Final Office Action filed on Apr. 12, 2011, for U.S. Appl. No. 11/626,189, filed Jan. 23, 2007, 18 pages.
Amendment in Response to Non-Final Office Action filed on May 17, 2011, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006,18 pages.
Amendment in Response to Final Office Action filed on May 17, 2011, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 10 pages.
Amendment in Response to Final Office Action filed on May 24, 2011, for U.S. Appl. No. 11/828,835, filed Jul. 26, 2007, 13 pages.
Amendment in Response to Non-Final Office Action filed on May 23, 2011, for U.S. Appl. No. 11/736,438, filed Apr. 17, 2007, 11 pages.
Amendment in Response to Non-Final Office Action filed on Jun. 6, 2011, for U.S. Appl. No. 12/101,050, filed Apr. 10, 2008, 17 pages.
Amendment in response to Final Office Action filed on Jun. 7, 2011, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 11 pages.
Amendment in Response to Final Office Action filed on Dec. 7, 2011, for U.S. Appl. No. 11/736,438, filed Apr. 17, 2007, 10 pages.
Amendment in Response to Non-Final Office Action filed on Dec. 16, 2011, for U.S. Appl. No. 11/938,256, filed Nov. 10, 2007, 10 pages.
Amendment in Response to Non-Final Office Action filed on Jan. 9, 2012, for U.S. Appl. No. 11/751,596, filed May 21, 2007, 9 pages.
Amendment in Response to Non-Final Office Action filed on Feb. 15, 2012, for U.S. Appl. No. 11/626,189, filed Jan. 23, 2007, 13 pages.
Amendment in Response to Non-Final Office Action filed on Feb. 17, 2012, for U.S. Appl. No. 11/751,597, filed May 21, 2007, 18 pages.
Amendment in Response to Non-Final Office Action filed on Mar. 23, 2012, for U.S. Appl. No. 11/672,020, filed Feb. 6, 2007, 12 pages.
European Communication mailed on Jan. 22, 2009, for European Application No. 07777255.6, filed May 21, 2007, 2 pages.
European Office Action mailed on May 5, 2009, for European Patent Application No. 07763368.3, filed on Feb. 6, 2007, 3 pages.
European Office Action mailed on Feb. 5, 2010, for European Patent Application No. 06845440.4, filed on Dec. 13, 2006, 4 pages.
European Office Action mailed on Apr. 1, 2010, for European Patent Application No. 07717235.1, filed on Feb. 9, 2007, 2 pages.
European Office Action mailed on Nov. 8, 2010, for European Patent Application No. 05854262.2, filed on Dec. 8, 2005, 5 pages.
European Office Action mailed on Jun. 14, 2011, for European Patent Application No. 07795177.0, filed on May 21, 2007, 6 pages.
Final Office Action mailed on Oct. 8, 2009, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 12 pages.
Final Office Action mailed on Aug. 23, 2010, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006, 20 pages.
Final Office Action mailed on Nov. 1, 2010, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 12 pages.
Final Office Action mailed on Mar. 22, 2011, for U.S. Appl. No. 11/828,835, filed Jul. 26, 2007, 11 pages.
Final Office Action mailed on Apr. 29, 2011, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 9 pages.
Final Office Action mailed on Aug. 3, 2011, for U.S. Appl. No. 11/736,438, filed Apr. 17, 2007, 11 pages.
Final Office Action mailed on Apr. 25, 2012, for U.S. Appl. No. 12/251,406, filed Oct. 14, 2008, 10 pages.
International Search Report mailed on May 18, 2006, for PCT Patent Application No. PCT/US2005/045499, filed on Dec. 8, 2005, 4 pages.
International Search Report mailed on May 19, 2006, for PCT Patent Application No. PCT/US2005/044624, filed on Dec. 8, 2005, 4 pages.
International Search Report mailed on Jun. 20, 2007, for PCT Patent Application No. PCT/US2006/047748, filed on Dec. 13, 2006, 3 pages.
International Search Report mailed on Sep. 28, 2007, for PCT Patent Application No. PCT/US2007/002096, filed on Jan. 23, 2007, 4 pages.
International Search Report mailed on Oct. 25, 2007, for PCT Patent Application No. PCT/US2007/003322, filed on Feb. 6, 2007, 5 pages.
International Search Report mailed on Oct. 26, 2007, for PCT Patent Application No. PCT/US2007/003631, filed on Feb. 9, 2007, 5 pages.
International Search Report mailed on Dec. 11, 2007, for PCT Patent Application No. PCT/US2007/012358, filed on May 21, 2007, 3 pages.
International Search Report mailed on Jan. 28, 2008, for PCT Patent Application No. PCT/US2007/012189, filed on May 21, 2007, 2 pages.
International Search Report mailed on Oct. 23, 2008, for PCT Patent Application No. PCT/US2008/069435, filed on Jul. 8, 2008, 4 pages.
International Search Report mailed on Feb. 25, 2009, for PCT Patent Application No. PCT/US2008/071390, filed on Jul. 28, 2008, 2 pages.
International Search Report mailed on Mar. 13, 2009, for PCT Patent Application No. PCT/US2008/083034, filed on Nov. 10, 2008, 2 pages.
International Search Report mailed on Mar. 13, 2009, for PCT Patent Application No. PCT/US2008/079891, filed on Nov. 10, 2008, 2 pages.
International Search Report mailed on Apr. 6, 2009, for PCT Patent Application No. PCT/US2008/079878, filed on Oct. 14, 2008, 3 pages.
Invitation to Pay Additional Fees mailed on Jul. 6, 2007, for PCT Patent Application No. PCT/US2007/002096, filed on Jan. 23, 2007, 4 pages.
Invitation to Pay Additional Fees mailed on Aug. 7, 2007, for PCT Patent Application No. PCT/US2007/003322, filed on Feb. 6, 2007, 5 pages.
Invitation to Pay Additional Fees mailed on Aug. 7, 2007, for PCT Patent Application No. PCT/US2007/003631, filed on Feb. 9, 2007, 5 pages.
Invitation to Pay Additional Fees mailed on Nov. 11, 2008, for PCT Patent Application No. PCT/US2008/071390, filed on Jul. 28, 2008, 5 pages.
Invitation to Pay Additional Fees mailed on Dec. 29, 2008, for PCT Patent Application No. PCT/US2008/079891, filed on Oct. 14, 2008, 7 pages.
Japanese Office Action mailed on Jul. 19, 2011, for Japanese Patent Application No. 2007-550378, filed on Dec. 8, 2005, with English Translation, 11 pages.
Japanese Office Action mailed on Feb. 28, 2012, for Japanese Patent Application No. 2008-545817, filed on Dec. 13, 2006, with English Translation, 6 pages.
Japanese Office Action mailed on Feb. 28, 2012, for Japanese Patent Application No. 2008-551487, filed on Jan. 23, 2007, with English Translation, 9 pages.
Japanese Office Action mailed on Feb. 28, 2012, for Japanese Patent Application No. 2008-554410, filed on Feb. 9, 2007, with English Translation, 6 pages.
Japanese Office Action mailed on Mar. 6, 2012, for Japanese Patent Application No. 2008-553430, filed on Feb. 6, 2007, with English Translation, 6 pages.
Japanese Office Action mailed on Jan. 15, 2013, for Japanese Patent Application No. 2010-518438, filed on Jan. 25, 2010, with English Translation, 7 pages.
Non-Final Office Action mailed on Jan. 10, 2008, for U.S. Appl. No. 11/160,646, filed Jul. 1, 2005, 6 pages.
Non-Final Office Action mailed on Mar. 12, 2008, for U.S. Appl. No. 11/153,007, filed Jun. 14, 2005, 11 pages.
Non-Final Office Action mailed on Mar. 25, 2009, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 11 pages.
Non-Final Office Action mailed on Mar. 29, 2010, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 16 pages.
Non-Final Office Action mailed on Apr. 6, 2010, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006, 25 pages.
Non-Final Office Action mailed on Aug. 24, 2010, for U.S. Appl. No. 11/834,540, filed Aug. 6, 2007, 11 pages.
Non-Final Office Action mailed on Oct. 18, 2010, for U.S. Appl. No. 11/626,189, filed Jan. 23, 2007, 11 pages.
Non-Final Office Action mailed on Oct. 28, 2010, for U.S. Appl. No. 11/828,835, filed Jul. 26, 2007, 11 pages.
Non-Final Office Action mailed on Dec. 22, 2010, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 10 pages.
Non-Final Office Action mailed on Feb. 17, 2011, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006, 24 pages.
Non-Final Office Action mailed on Mar. 2, 2011, for U.S. Appl. No. 11/736,438, filed Apr. 17, 2007, 10 pages.
Non-Final Office Action mailed on May 23, 2011, for U.S. Appl. No. 12/101,050, filed Apr. 10, 2008, 11 pages.
Non-Final Office Action mailed on Jun. 28, 2011, for U.S. Appl. No. 11/938,256, filed Nov. 10, 2007, 23 pages.
Non-Final Office Action mailed on Aug. 4, 2011, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006, 16 pages.
Non-Final Office Action mailed on Aug. 15, 2011, for U.S. Appl. No. 11/626,189, filed Jan. 23, 2007, 13 pages.
Non-Final Office Action mailed on Aug. 18, 2011, for U.S. Appl. No. 11/751,597, filed May 21, 2007, 25 pages.
Non-Final Office Action mailed on Sep. 9, 2011, for U.S. Appl. No. 11/751,596, filed May 21, 2007, 6 pages.
Non-Final Office Action mailed on Oct. 21, 2011, for U.S. Appl. No. 12/251,406, filed Oct. 14, 2008, 8 pages.
Non-Final Office Action mailed on Oct. 26, 2011, for U.S. Appl. No. 11/673,470, filed Feb. 9, 2007, 40 pages.
Non-Final Office Action mailed on Nov. 23, 2011, for U.S. Appl. No. 11/672,020, filed Feb. 6, 2007, 12 pages.
Non-Final Office Action mailed on Feb. 13, 2012, for U.S. Appl. No. 13/275,206, filed Oct. 17, 2011, 13 pages.
Non-Final Office Action mailed on Feb. 14, 2012, for U.S. Appl. No. 12/251,383, filed Oct. 14, 2008, 9 pages.
Non-Final Office Action mailed on Jun. 22, 2012, for U.S. Appl. No. 12/181,280, filed Jul. 28, 2008, 12 pages.
Notice of Allowance mailed on Dec. 13, 2010, for U.S. Appl. No. 11/834,540, filed Aug. 6, 2007, 4 pages.
Notice of Allowance mailed on Jul. 22, 2011, for U.S. Appl. No. 12/101,050, filed Apr. 10, 2008, 7 pages.
Notice of Allowance mailed on Feb. 8, 2012, for U.S. Appl. No. 11/609,838, filed Dec. 12, 2006, 8 pages.
Notice of Allowance mailed on Feb. 29, 2012, for U.S. Appl. No. 11/751,596, filed May 21, 2007, 10 pages.
Notice of Allowance mailed on Mar. 14, 2012, for U.S. Appl. No. 11/626,189, filed Jan. 23, 2007, 13 pages.
Notice of Allowance mailed on Jun. 7, 2012, for U.S. Appl. No. 11/751,597, filed May 21, 2007, 17 pages.
Notice of Allowance mailed on Jul. 19, 2013, for U.S. Appl. No. 13/606,465, filed Sep. 7, 2012, 14 pages.
Preliminary Amendment filed on Jan. 26, 2009, for U.S. Appl. No. 11/672,020, filed Feb. 6, 2007, 11 pages.
Response to European Communication filed on Feb. 6, 2009, for European Patent Application No. 07777255.6, filed on May 21, 2007, 5 pages.
Response to European Office Action filed on Nov. 11, 2009, for European Patent Application No. 07783368.3, filed on Feb. 6, 2007, 12 pages.
Response to European Office Action filed on Jul. 7, 2010, for European Patent Application No. 06845440.4, filed on Dec. 13, 2006, 13 pages.
Response to European Office Action filed on Aug. 18, 2010, for European Patent Application No. 07717235.1, filed on Feb. 9, 2007, 7 pages.
Response to European Office Action filed on Mar. 8, 2011, for European Patent Application No. 05854262.2, filed on Dec. 8, 2005, 11 pages.
Response to European Office Action filed on Dec. 13, 2011, for European Patent Application No. 07795177.0, filed on May 21, 2007, 9 pages.
Response to Restriction Requirement filed on Jan. 26, 2009, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 2 pages.
Response to Restriction Requirement filed on Jul. 23, 2010, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 9 pages.
Response to Restriction Requirement filed on Aug. 4, 2010, for U.S. Appl. No. 11/834,540, filed Aug. 6, 2007, 5 pages.
Response to Restriction Requirement filed on Sep. 9, 2010, for U.S. Appl. No. 11/626,189, filed Jan. 23, 2007, 8 pages.
Response to Restriction Requirement filed on Oct. 21, 2010, for U.S. Appl. No. 11/828,835, filed Jul. 26, 2007, 7 pages.
Response to Restriction Requirement filed on Feb. 8, 2011, for U.S. Appl. No. 11/736,438, filed Apr. 17, 2007, 8 pages.
Response to Restriction Requirement filed on Apr. 27, 2011, for U.S. Appl. No. 12/101,050, filed Apr. 10, 2008, 13 pages.
Response to Restriction Requirement filed on Jun. 16, 2011, for U.S. Appl. No. 11/751,596, filed May 21, 2007, 8 pages.
Response to Restriction Requirement filed on Oct. 31, 2011, for U.S. Appl. No. 11/672,020, filed Feb. 6, 2007, 3 pages.
Restriction Requirement mailed on Oct. 30, 2008, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 7 pages.
Restriction Requirement mailed on Jun. 25, 2010, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 9 pages.
Restriction Requirement mailed on Jul. 13, 2010, for U.S. Appl. No. 11/834,540, filed Aug. 6, 2007, 8 pages.
Restriction Requirement mailed on Aug. 10, 2010, for U.S. Appl. No. 11/626,189, filed Jan. 23, 2007, 5 pages.
Restriction Requirement mailed on Sep. 21, 2010, for U.S. Appl. No. 11/828,835, filed Jul. 26, 2007, 6 pages.
Restriction Requirement mailed on Dec. 10, 2010, for U.S. Appl. No. 11/736,438, filed Apr. 17, 2007, 16 pages.
Restriction Requirement mailed on Mar. 11, 2011, for U.S. Appl. No. 12/101,050, filed Apr. 10, 2008, 6 pages.
Restriction Requirement mailed on Jun. 6, 2011, for U.S. Appl. No. 11/751,596, filed May 21, 2007, 6 pages.
Restriction Requirement mailed on Sep. 29, 2011, for U.S. Appl. No. 11/672,020, filed Feb. 6, 2007, 6 pages.
Restriction Requirement mailed on Nov. 28, 2011, for U.S. Appl. No. 12/251,383, filed Oct. 14, 2008, 6 pages.
Substitute Preliminary Amendment filed on Mar. 8, 2010, for U.S. Appl. No. 11/672,020, filed Feb. 6, 2007, 2 pages.
Written Opinion of the International Searching Authority mailed on May 18, 2006, for PCT Patent Application No. PCT/US2005/045499, filed on Dec. 8, 2005, 9 pages.
Written Opinion of the International Searching Authority mailed on May 19, 2006, for PCT Patent Application No. PCT/US2005/044624, filed on Dec. 8, 2005, 8 pages.
Written Opinion of the International Searching Authority mailed on Jun. 20, 2007, for PCT Patent Application No. PCT/US2006/047748, filed on Dec. 13, 2006, 7 pages.
Written Opinion of the International Searching Authority mailed on Sep. 28, 2007, for PCT Patent Application No. PCT/US2007/002096, filed on Jan. 23, 2007, 8 pages.
Written Opinion of the International Searching Authority mailed on Oct. 25, 2007, for PCT Patent Application No. PCT/US2007/003322, filed on Feb. 6, 2007, 9 pages.
Written Opinion of the International Searching Authority mailed on Oct. 26, 2007, for PCT Patent Application No. PCT/US2007/003631, filed on Feb. 9, 2007, 7 pages.
Written Opinion of the International Searching Authority mailed on Dec. 11, 2007, for PCT Patent Application No. PCT/US2007/012358, filed on May 21, 2007, 6 pages.
Written Opinion of the International Searching Authority mailed on Jan. 28, 2008, for PCT Patent Application No. PCT/US2007/012189, filed on May 21, 2007, 7 pages.
Written Opinion of the International Searching Authority mailed on Oct. 23, 2008, for PCT Patent Application No. PCT/US2008/069435, filed on Jul. 8, 2008, 6 pages.
Written Opinion of the International Searching Authority mailed on Feb. 25, 2009, for PCT Patent Application No. PCT/US2008/071390, filed on Jul. 28, 2008, 7 pages.
Written Opinion of the International Searching Authority mailed on Mar. 13, 2009, for PCT Patent Application No. PCT/US2008/083034, filed on Nov. 10, 2008, 4 pages.
Written Opinion of the International Searching Authority mailed on Mar. 13, 2009, for PCT Patent Application No. PCT/US2008/079891, filed on Nov. 10, 2008, 5 pages.
Written Opinion of the International Searching Authority mailed on Apr. 6, 2009, for PCT Patent Application No. PCT/US2008/079878, filed on Oct. 14, 2008, 13 pages.
Amendment in Response to Non-Final Office Action filed on Mar. 5, 2014, for U.S. Appl. No. 13/454,974, filed Apr. 24, 2012, 11 pages.
Amendment in Response to Final Office Action filed on Sep. 4, 2014, for U.S. Appl. No. 13/454,974, filed Apr. 24, 2012, 15 pages.
Amendment in Response to Non-Final Office Action filed on Dec. 10, 2014, for U.S. Appl. No. 13/463,690, filed May 3, 2012, 12 pages.
European Office Action mailed on May 21, 2012, for European Patent Application No. 06845440.4, filed on Jul. 10, 2008, 6 pages.
European Office Action mailed on May 21, 2012, for European Patent Application No. 07717024.9, filed on Aug. 21, 2008, 5 pages.
European Office Action mailed on May 21, 2012, for European Patent Application No. 07717235.1, filed on Sep. 5, 2008, 5 pages.
European Office Action mailed on Sep. 3, 2013, for European Patent Application No. 05854262.2, filed on Dec. 8, 2005, 6 pages.
Extended European Search Report mailed on Apr. 26, 2012, for European Patent Application No. 12153946.4, filed on Feb. 3, 2012, 6 pages.
Extended European Search Report mailed on Oct. 5, 2012, for European Patent Application No. 12162806.9, filed on Apr. 2, 2012, 5 pages.
Final Office Action mailed on Apr. 23, 2012, for U.S. Appl. No. 11/938,256, filed Nov. 10, 2007, 13 pages.
Final Office Action mailed on Aug. 1, 2012, for U.S. Appl. No. 11/673,470, filed Feb. 9, 2007, 33 pages.
Final Office Action mailed on Mar. 6, 2014, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 8 pages.
Final Office Action mailed on Jun. 10, 2014, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 14 pages.
Final Office Action mailed on Jul. 7, 2014, for U.S. Appl. No. 13/454,974, filed Apr. 24, 2012, 11 pages.
Final Office Action mailed on Dec. 24, 2014, for U.S. Appl. No. 13/463,690, filed May 3, 2012, 11 pages.
Japanese Office Action mailed on Jan. 8, 2013, for Japanese Patent Application No. 2008-545817, filed on Dec. 13, 2006, with English Translation, 6 pages.
Japanese Office Action mailed on Mar. 5, 2013, for Japanese Patent Application No. 2008-553430, filed on Feb. 6, 2007, with English Translation, 4 pages.
Japanese Office Action mailed on May 13, 2014, for Japanese Patent Application No. 2013-138949, filed on Dec. 13, 2006, 6 pages.
Japanese Office Action mailed on May 27, 2014, for Japanese Patent Application No. 2013-111118, filed on Jan. 23, 2007, 13 pages.
Non-Final Office Action mailed on May 14, 2013, for U.S. Appl. No. 11/751,605, filed May 21, 2007, 11 pages.
Non-Final Office Action mailed on Jul. 19, 2013, for U.S. Appl. No. 11/938,256, filed Nov. 10, 2007, 12 pages.
Non-Final Office Action mailed on Aug. 30, 2013, for U.S. Appl. No. 11/673,470, filed Feb. 9, 2007, 37 pages.
Non-Final Office Action mailed on Sep. 5, 2013, for U.S. Appl. No. 13/454,974, filed Apr. 24, 2012, 11 pages.
Non-Final Office Action mailed on Nov. 14, 2013, for U.S. Appl. No. 11/215,660, filed Aug. 29, 2005, 10 pages.
Non-Final Office Action mailed on Jun. 11, 2014, for U.S. Appl. No. 13/463,690, filed May 3, 2012, 13 pages.
Non-Final Office Action mailed on Oct. 22, 2014, for U.S. Appl. No. 13/467,909, filed May 9, 2012, 8 pages.
Non-Final Office Action mailed on Dec. 22, 2014, for U.S. Appl. No. 14/025,539, filed Sep. 12, 2013, 12 pages.
Non-Final Office Action mailed on Jan. 26, 2015, for U.S. Appl. No. 13/454,974, filed Apr. 24, 2012, 14 pages.
Response to European Office Action filed on Feb. 20, 2014, for European Patent Application No. 05854262.2, filed on Dec. 8, 2005, 5 pages.
Response to Japanese Office Action filed on Nov. 6, 2014, for Japanese Patent Application No. 2013-138949, filed on Dec. 13, 2006, 7 pages.
Response to Japanese Office Action filed on Nov. 17, 2014, for Japanese Patent Application No. 2013-111118, filed on Jan. 23, 2007, 4 pages.
Response to Restriction Requirement filed on Jul. 19, 2013, for U.S. Appl. No. 13/454,974, filed Apr. 24, 2012, 7 pages.
Response to Restriction Requirement filed on Apr. 21, 2014, for U.S. Appl. No. 13/463,690, filed May 3, 2012, 9 pages.
Response to Restriction Requirement filed on Sep. 2, 2014, for U.S. Appl. No. 13/467,909, filed May 9, 2012, 4 pages.
Restriction Requirement mailed on Jun. 21, 2013, for U.S. Appl. No. 13/454,974, filed Apr. 24, 2012, 11 pages.
Restriction Requirement mailed on Feb. 20, 2014, for U.S. Appl. No. 13/463,690, filed May 3, 2012, 6 pages.
Restriction Requirement mailed on Jul. 1, 2014, for U.S. Appl. No. 13/467,909, filed May 9, 2012, 9 pages.
Related Publications (1)
Number: US 2012/0300999 A1; Date: Nov. 2012; Country: US

Provisional Applications (1)
Number: 60/911,054; Date: Apr. 2007; Country: US

Continuations (2)
Parent: 13/275,206, Oct. 2011, US; Child: 13/584,647, US
Parent: 12/101,050, Apr. 2008, US; Child: 13/275,206, US