Systems and methods for motion adaptive filtering

Information

  • Patent Grant
  • Patent Number
    7,864,194
  • Date Filed
    Friday, January 19, 2007
  • Date Issued
    Tuesday, January 4, 2011
Abstract
Methods and systems for motion adaptive filtering detect movement of text or areas of high spatial frequency from one frame to another frame of an image. When such movement is detected and meets a certain level or threshold, the subpixel rendering processing of such text or areas of high spatial frequency may be changed.
Description
BACKGROUND

In commonly owned United States patent Applications: (1) U.S. Pat. No. 6,903,754 (the '754 patent) [U.S. patent application Ser. No. 09/916,232], entitled “ARRANGEMENT OF COLOR PIXELS FOR FULL COLOR IMAGING DEVICES WITH SIMPLIFIED ADDRESSING,” filed Jul. 25, 2001; (2) United States patent Publication No. 2003/0128225 (the '225 application) [U.S. patent application Ser. No. 10/278,353], entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS FOR SUB-PIXEL RENDERING WITH INCREASED MODULATION TRANSFER FUNCTION RESPONSE,” filed Oct. 22, 2002; (3) United States patent Publication No. 2003/0128179 (the '179 application) [U.S. patent application Ser. No. 10/278,352], entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS FOR SUB-PIXEL RENDERING WITH SPLIT BLUE SUBPIXELS,” filed Oct. 22, 2002; (4) United States patent Publication No. 2004/0051724 (the '724 application) [U.S. patent application Ser. No. 10/243,094], entitled “IMPROVED FOUR COLOR ARRANGEMENTS AND EMITTERS FOR SUBPIXEL RENDERING,” filed Sep. 13, 2002; (5) United States patent Publication No. 2003/0117423 (the '423 application) [U.S. patent application Ser. No. 10/278,328], entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS WITH REDUCED BLUE LUMINANCE WELL VISIBILITY,” filed Oct. 22, 2002; (6) United States patent Publication No. 2003/0090581 (the '581 application) [U.S. patent application Ser. No. 10/278,393], entitled “COLOR DISPLAY HAVING HORIZONTAL SUB-PIXEL ARRANGEMENTS AND LAYOUTS,” filed Oct. 22, 2002; and (7) United States patent Publication No. 2004/0080479 (the '479 application) [U.S. patent application Ser. No. 10/347,001], entitled “IMPROVED SUB-PIXEL ARRANGEMENTS FOR STRIPED DISPLAYS AND METHODS AND SYSTEMS FOR SUB-PIXEL RENDERING SAME,” filed Jan. 16, 2003, novel subpixel arrangements are disclosed for improving the cost/performance curves for image display devices; each of these applications is herein incorporated by reference.


These improvements are particularly pronounced when coupled with subpixel rendering (SPR) systems and methods further disclosed in those applications and in commonly owned United States patent Applications: (1) U.S. patent Publication No. 2003/0034992 (the '992 application) [U.S. patent application Ser. No. 10/051,612], entitled “CONVERSION OF RGB PIXEL FORMAT DATA TO PENTILE MATRIX SUB-PIXEL DATA FORMAT,” filed Jan. 16, 2002; (2) United States patent Publication No. 2003/0103058 (the '058 application) [U.S. patent application Ser. No. 10/150,355], entitled “METHODS AND SYSTEMS FOR SUB-PIXEL RENDERING WITH GAMMA ADJUSTMENT,” filed May 17, 2002; (3) U.S. patent Publication No. 2003/0085906 (the '906 application) [U.S. patent application Ser. No. 10/215,843], entitled “METHODS AND SYSTEMS FOR SUB-PIXEL RENDERING WITH ADAPTIVE FILTERING,” filed Aug. 8, 2002—all patent applications and other references mentioned in this specification are herein incorporated by reference.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in, and constitute a part of, this specification, illustrate exemplary implementations and embodiments of the invention and, together with the description, serve to explain principles of the invention.



FIGS. 1A and 1B depict a display screen with an open word processing window having text rendered in the window and scrolling down the screen.



FIG. 2 shows the performance curves of a liquid crystal display (the 100% illumination curve and the 50% illumination curve) versus time.



FIG. 3 depicts one possible embodiment of a system made in accordance with the principles of the present invention.



FIGS. 4, 5, 6 and 7 are flowcharts of several embodiments of techniques made in accordance with the principles of the present invention.





DETAILED DESCRIPTION

Reference will now be made in detail to implementations and embodiments, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.


As was described in the two related patent applications noted above, some panel technologies—most notably liquid crystal displays (LCDs)—exhibit color error on subpixel rendered text or other areas of high spatial frequency (“HSF”) when viewed by an observer at an off-normal axis viewing angle. Those related applications disclose systems and methods for correcting such color error from off-normal viewing angles.


Color errors from causes other than off-normal viewing angles may be noticeable to viewers on some LCDs—even when observing from the normal axis to the display panel. For example, moving subpixel rendered text (or other areas of high spatial frequency) may produce color error while in motion. One example of this effect is scrolling text in a word processor application window. Depending on the panel technology (e.g. twisted nematic, or TN-LCD), the color error may be quite noticeable—and possibly distracting to a user—while the text is scrolling. Of course, once the scrolling or motion stops, the color error typically ceases, as the response of the TN LCD has time to “catch up” to the now-stationary text.



FIGS. 1A and 1B depict the situation in the one example noted. Display panel 100 is shown having a word processing application sending image data to the panel in a window 101. In the word processing window, there is some text 102, which is a paradigm example of high spatial frequency data. One point of the text 104 (e.g. an edge point of the character “T”) is at coordinate point (X,Y) on the panel in FIG. 1A. As the text is scrolled down, the edge point moves to a new coordinate point 106 (X′, Y′) on the panel in FIG. 1B. During the time the edge point was in transit between points 104 and 106, the “T” was visible on screen and was “moving” frame-by-frame down to its new point. If the response time of the panel's rendering technology (e.g. liquid crystal) is not sufficiently fast and the “T” was being subpixel rendered at the time, color error may be noticeable.



FIG. 2 shows response curves for liquid crystal going from either 100% or 50% illumination down to 0% illumination (i.e. curves 202 and 204 respectively) and from 50% or 0% illumination up to 100% illumination (i.e. curves 206 and 208 respectively), and gives a better explanation as to the nature of the problem. For example, when black text is rendered on a display having a repeating subpixel grouping such as found in FIG. 5 of the related application entitled “SYSTEMS AND METHODS FOR TEMPORAL SUBPIXEL RENDERING OF IMAGE DATA,” the green subpixels switch from 100% to 0% while the red and blue subpixels switch from 100% to 50%. During motion of black text, the green subpixels are therefore switching from 100% to 0% to 100%, while the red and blue are switching from 100% to 50% to 100%. As is illustrated, the response time of the 100% to 0% transition is much faster than that of the 100% to 50% transition.


During motion of black text, then, there will be an unbalanced condition in the brightness of the red, green and blue subpixels, which leads to color error. In fact, there will tend to be too much red and blue brightness, which gives the text a magenta hue. The transition from 0% to 100% is approximately the same as 50% to 100%, so it does not materially add to color error in this example. However, in other LCD modes, this transition could also exhibit larger differences and would lead to color error during motion.
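
To make this imbalance concrete, the following sketch (not from the patent) models the liquid crystal with an assumed first-order exponential response and hypothetical time constants, letting the half-swing 100%-to-50% transition settle more slowly than the full-swing 100%-to-0% one as FIG. 2 suggests, and compares the frame-averaged brightness of the green versus the red/blue channels while black text passes over a pixel:

    import numpy as np

    def lc_response(start, target, tau, t):
        # Assumed first-order (exponential) settling model of the liquid
        # crystal; `tau` is a hypothetical time constant in milliseconds.
        return target + (start - target) * np.exp(-t / tau)

    t = np.linspace(0.0, 16.7, 1000)  # one 60 Hz frame, in milliseconds

    # Hypothetical time constants: the full-swing 100%->0% transition
    # settles faster than the half-swing 100%->50% one (per FIG. 2).
    green = lc_response(1.0, 0.0, tau=3.0, t=t)      # green subpixels
    red_blue = lc_response(1.0, 0.5, tau=8.0, t=t)   # red/blue subpixels

    # Frame-averaged brightness above the intended target for each channel.
    print("green excess:    %.3f" % (green.mean() - 0.0))     # ~0.18
    print("red/blue excess: %.3f" % (red_blue.mean() - 0.5))  # ~0.21

    # Red and blue linger above their targets longer than green does, so
    # the average color of the moving text is skewed toward magenta.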


One embodiment to reduce the amount of color error on moving subpixel rendered text and other high spatial frequency image data is to employ an adaptive filter technique. An adaptive filter test may be used to detect motion of high-spatial-frequency edges in an image. When moving edges are detected, the subpixel rendering (SPR) of the text can be changed to a new state. Once the moving edges are stationary, the SPR is turned back to the regular mode. Techniques such as those disclosed in the '906 application can be used to detect the edges and the high frequency transitions in the data. A simple counter can be used with the SPR algorithm to count the number of times an edge is detected in an image. Statistically, a large number of edges means that text has been detected. If a low number of edges is detected, then the image is probably pictorial. Since this problem occurs primarily on edges of text, one embodiment might employ the change in filters for text only.
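
The adaptive filter tests of the '906 application are incorporated by reference rather than reproduced here; as a stand-in, the sketch below uses a simple, hypothetical neighbor-difference test to flag high-spatial-frequency edge points and count them over a frame:

    def count_hsf_edges(lum, threshold=0.5):
        # `lum` is a frame of luminance rows with values in [0, 1].  A
        # pixel is flagged as an HSF edge when it differs sharply from
        # its horizontal neighbor -- an illustrative stand-in for the
        # adaptive filter tests of the '906 application.
        edges = 0
        for row in lum:
            for x in range(len(row) - 1):
                if abs(row[x] - row[x + 1]) >= threshold:
                    edges += 1
        return edges

    # Per the heuristic above: a large count statistically suggests text,
    # while a small count suggests a pictorial image.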



FIG. 3 depicts one system embodiment 300 for motion adaptive filtering. System 300 comprises a graphics subsystem 302 having a SPR subsystem 304, which comprises systems and methods (such as those disclosed in the '612, '058, and '843 applications) for subpixel rendering source image data 312 that is input into the graphics subsystem. The source image data may typically be generated by the operating system 314 or an application 316 and sent to the graphics subsystem for rendering on the display.


A memory 306 is available to SPR subsystem 304 to retain information about the number of and/or locations of points of high spatial frequency in the source image data. A timing controller (TCON) 308 is optionally provided to give timing commands to a display panel 310, which could be an LCD or any other technology having suitably different response times versus grey level to produce the color error discussed above. It will be appreciated that the system 300 is merely one possible embodiment to implement the techniques disclosed herein. For example, the SPR subsystem could be an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), implemented entirely in software under the control of a general processor, or even implemented on the glass of the panel itself (particularly for low temperature polysilicon (LTPS) panels). Additionally, memory 306 could be implemented in RAM of any known species or any other known or future embodiment of memory. One embodiment comprises a graphical subsystem further comprising a subpixel rendering subsystem; a memory coupled to said subpixel rendering subsystem for storing input image data in a plurality of image frames; and a processing subsystem that tests for moving text or other points of high spatial frequency and, when the test indicates moving text or said other areas of high spatial frequency, sends signals to said subpixel rendering subsystem to change the subpixel rendering in successive frames of image data. It will be appreciated that the processing subsystem may be implemented integrally with, or as a part of, the subpixel rendering subsystem itself.



FIG. 4 shows one embodiment 400 of a technique for correcting such color errors. In essence, the embodiment involves a display system comprising a graphics subsystem having a subpixel rendering system. The system notes points of high spatial frequency in a first frame of image data; compares the points of high spatial frequency with corresponding points in a second frame of image data; and, when a number of points meeting a certain threshold have changed from high spatial frequency to low spatial frequency in the second frame, changes the subpixel rendering on the input image data.


The technique starts at step 402, where an image data point at coordinate (X,Y) is input into the SPR subsystem. The point is tested at step 404 to see if it is the point at the end of a frame. If yes, then the technique continues processing at step 406. If not, then the point is tested (via an adaptive filter or by any other means known now or in the future) at step 408 to determine whether it is at the edge of a high spatial frequency area (e.g. text). If it is not, then at step 410 the image data is incremented to the next coordinate point and processing returns to step 402. Of course, other SPR functions could be applied to the point at any step of this technique, so the present embodiment may work with other SPR functions in conjunction with noting areas of moving text or HSF areas.


If the point is detected as an edge of text or an HSF area, then a “current” edge counter is incremented at step 412 to count the number of edge points in a frame (thus, it may be desirable to reset the counter to zero at the beginning of each frame). At step 414, the location of every current n-th edge point is stored—possibly in storage 306—where “n” is selected to give good statistical performance in noting areas of moving text or HSF areas. The number “n” may take any value between 1 and the total number of addressable points on the screen. At one extreme, n=1 (i.e. save every possible addressable point on the screen) may be useful if the system designer wants near-perfect information as to where all edges of HSF text and images are located—but a lesser number of points would suffice to give a good indication that there are HSF areas in motion on the screen. At the other extreme, with n equal to the total number of addressable points on the screen (i.e. saving one point of information per screen), the metric may not be useful, as there may not be enough data to indicate whether a significant amount of HSF text and imagery is moving to warrant taking an action. Thus, the number “n” is optimally in between these two extreme values.


It will be appreciated that other embodiments could have other criteria for selecting and storing locations of points, including random selection. It is not necessary that the data be stored in modulo-arithmetic fashion. It suffices that there are a sufficient number of points to note moving text and HSF areas. At step 416, the image data is incremented to the next location and processing resumes at step 402 until an end-of-frame condition is detected.
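
A minimal sketch of the FIG. 4 loop (steps 402 through 416) follows; the `is_edge` callable is a placeholder for the adaptive filter test, and the every-n-th sampling matches the modulo storage described above, though as just noted other selection criteria would serve:

    def scan_frame(frame, is_edge, n=8):
        # One pass of the FIG. 4 loop (steps 402-416).  `frame` is a 2-D
        # array of image data points; `is_edge(frame, x, y)` stands in
        # for the adaptive filter test of step 408; n=8 is an arbitrary
        # illustrative sampling interval.
        edge_count = 0          # reset per frame, as noted for step 412
        sampled_locations = []  # every n-th edge location (step 414)
        for y in range(len(frame)):            # steps 410/416: next point
            for x in range(len(frame[0])):
                if is_edge(frame, x, y):       # step 408: edge of HSF area?
                    edge_count += 1            # step 412
                    if edge_count % n == 0:
                        sampled_locations.append((x, y))
        return edge_count, sampled_locations   # end of frame -> step 406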



FIG. 5 continues the processing at step 406/502 with a comparison of “current” edge points against the same points in the “previous” frame's stored values. If the comparison indicates that the current edge points are the “same” as the previous frame's edge points, then the test indicates that there is little or no motion, and the system goes on to process the next frame of data, turning SPR on as necessary (for example, if SPR had been turned off via this processing previously) at step 504. When, on the other hand, the comparison indicates that the current edge points are “different” enough from the previous frame's edge points, then motion is detected—enough so that the system can take corrective action at step 506—such as turning off SPR for the next frame (and other successive frames as necessary until the motion has stopped).


It should be appreciated that “same” and “different” encompass many possible metrics. “Same” could mean that not one edge point has changed (or has been added or deleted) from one frame to the next. Alternatively, “same” could mean that the system tolerates up to a certain number or a certain percentage of edge changes without needing to take corrective action. Also, the system might even take into consideration a percentage change in a certain subset area of the screen as either “same” or “different”. One embodiment might consider that a certain percentage change in a smaller subset area of the screen means that there is a high possibility that a window is open (e.g. a word processor) that does not take up the full screen and that HSF information is moving. In such a case, the system might turn off SPR for that portion of the screen and leave the remaining screen as previously treated. Of course, the levels of “same” and “different” could be preset into the system according to either a heuristic analysis or an empirical data analysis. These levels may be considered threshold levels or values and may be dynamically controlled by an artificially intelligent mechanism or, alternatively, be set by the users themselves.


It should also be appreciated that the “current” frame and the “previous” frame may not necessarily be successive frames. It may suffice that the “current” and “previous” frames have a relevant relationship (i.e. every other frame or the like, or two related frames in an MPEG format) that might serve as a basis for noting that motion is being detected. Additionally, instead of comparing individual points frame by frame, if there is an MPEG encoding (or some other suitable encoding), it may be possible to detect changes in motion vectors.


At step 508, the current frame's edge data is transferred to the previous frame's data, and the system is effectively reset at step 510 (e.g., the edge counter and the current memory location for storing edge data can be reset) and is ready to process another frame's worth of image data at step 512.
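
Continuing the sketch above, the FIG. 5 end-of-frame processing (steps 502 through 512) might look like the following; the 5% tolerance is purely a placeholder for whichever “same”/“different” metric the designer selects, per the discussion above:

    def end_of_frame(current, previous, spr, tolerance=0.05):
        # FIG. 5 processing (steps 502-512), continuing the sketch above.
        # `current` and `previous` are the sampled edge locations of the
        # two frames; `spr` is any object exposing enable()/disable().
        changed = len(set(current) ^ set(previous))  # edges added/removed
        if changed <= tolerance * max(len(previous), 1):
            spr.enable()     # step 504: "same" -> SPR on as necessary
        else:
            spr.disable()    # step 506: motion detected -> corrective action
        new_previous = list(current)  # step 508: current becomes previous
        current.clear()               # step 510: reset for the next frame
        return new_previous           # step 512: ready for the next frame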


It will be appreciated that there are many possible embodiments and variations on the above embodiments and notions. It would suffice for the purposes of the present invention that the system be able to detect that HSF image data is in motion and that, if there is a level of such motion detected that would—in the estimation of the system—detract from the user's viewing experience (i.e. too much color error introduced), then the system can take corrective actions—such as turning off subpixel rendering for all or a portion of the screen or effectively altering the SPR in some way to correct the viewer's experience of the image. Instead of turning off SPR, the SPR can also be changed to another filter that is less sensitive to motion artifacts of the LCD. In fact, some of the alternative corrective actions are described in the two related patent applications noted above and incorporated herein, and in the other patent applications also incorporated herein.


Another alternative way of describing this technique is as follows (a code sketch follows the list):

    • At beginning of frame, set counter to zero.
    • Increment counter every time an edge is detected.
    • Store the location of every “nth” edge detection in a temporary memory.
    • Compare number of edges detected to a preset number.
    • If number of edges exceeds preset number, then set a flag indicating “text is present”.
    • Next frame, repeat the process AND check to see if the edges are stationary, i.e. the locations of the statistical sample of edges are the same.
    • If flag is set AND locations are not the same, then motion is detected. In this case, turn off all SPR on edges (when the adaptive filter returns a “true” value). Color error would be observed, but since the text is moving, it cannot be easily seen. The rest of the image will be unchanged, for example, if text is moving only in a small region of the screen.
    • If flag is not set AND locations are not the same, then SPR is on normally.
    • If flag is set AND locations are the same, then SPR is on normally.
    • If flag is not set AND locations are the same, then SPR is on normally.
    • Repeat.
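
A compact sketch of the flag logic in the list above, with a hypothetical preset count, might read:

    def spr_decision(edge_count, locations, prev_locations, preset=1000):
        # One frame of the flag logic above; `preset` is a hypothetical
        # "text is present" threshold, to be chosen empirically.
        text_flag = edge_count > preset
        stationary = set(locations) == set(prev_locations)
        if text_flag and not stationary:
            return False   # text in motion: turn SPR off on edges
        return True        # every other combination: SPR on normally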


A simplification (as shown in FIGS. 6 and 7) to the above process is to just compare the number of edges from frame to frame in memories Mem1 and Mem2 that can store count values; if the number is different by a preset amount, the text is moving (as long as the number of edges exceeds the minimum). Even in the case of the text being “turned off”, there will be little disturbance in image quality, since the action will be detected the next frame (1/60th of a second later) and the flag will be set to off. The size of the temporary memory (the value of “n”) will depend on the accuracy required for this process. This can be determined empirically.


Referring to the embodiment starting with step 600 in FIG. 6, a counter for counting edge data points is reset at step 602. A data point is input at step 604. The data point is tested at step 606 to determine if it is at the end of a frame. If the data point is at the end of the frame, the process continues to step 608 of FIG. 7. If the data point is not at the end of the frame, it is tested at step 610 to determine if it is an edge data point. If not, the process continues from step 610 back to step 604. If yes, the counter is incremented at step 612.


At step 614 of FIG. 7, which continues from step 608 of FIG. 6, the total number of edges (i.e., the count value from the counter) is stored in a memory (Mem2). This count value can provide the total number of edges for a current frame of input data. Another memory (Mem1) can store the total number of edges for a previous frame of input data. At step 616, a test is made to determine if the total number of edges in Mem2 is different from the total number of edges in Mem1. At step 620, if the total number is not different (i.e., if Mem2−Mem1=0), then no motion is detected and the number of edges in Mem2 can be stored in Mem1. At step 618, if the total number is different, motion is detected, a different SPR can be applied on the edge data points, and the new value can be stored in Mem1.
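
A sketch of this count-only comparison, with hypothetical preset values, follows; `prev_count` and `curr_count` play the roles of Mem1 and Mem2:

    def frame_motion_test(prev_count, curr_count, min_edges=200, delta=50):
        # FIGS. 6/7 simplification: `prev_count` plays the role of Mem1
        # and `curr_count` that of Mem2 (step 614).  `min_edges` and
        # `delta` are hypothetical presets, determined empirically.
        motion = (curr_count > min_edges and
                  abs(curr_count - prev_count) > delta)   # step 616
        # Steps 618/620: either way, the current count is stored in Mem1.
        return motion, curr_count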


As a refinement to all embodiments, the SPR could be altered on only the text or edges of HSF areas that are moving, and not on edges that are not moving. In one embodiment for accomplishing this task, when moving edges are detected, the graphical subsystem can send a query back to the operating system to inquire as to what active windows might be open that would have moving HSF edges (e.g. word processors, image writers, etc.). If there is an application having such an open window, the graphical subsystem could either ask the operating system and/or application to suspend any subpixel rendering mode for its image data inside the window, or ask the operating system and/or application to give the dimensions of such window, whereupon the graphical subsystem would alter or shut off SPR for those dimensions on screen.


An alternative embodiment that would not need to talk to the operating system might be for the graphical subsystem to turn off (or otherwise alter) SPR for all edges within a certain neighborhood of edges that are detected as moving. In this manner, most moving edges would have their SPR altered to substantially correct the color error introduced by movement. In such a case, it would be desirable to have a sufficiently large number of edges stored for comparison so that desirable subsets of the screen (i.e. scrolling windows) would be shut off or suitably altered.
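
One way to realize this neighborhood idea, sketched with an illustrative radius that is not from the patent, is to mark every point within a fixed distance of a moved edge for altered or disabled SPR:

    def moving_edge_mask(width, height, moved_edges, radius=16):
        # Mark every point within `radius` pixels (an illustrative value)
        # of an edge that moved between frames; SPR would be turned off
        # or otherwise altered wherever the mask is True.
        mask = [[False] * width for _ in range(height)]
        for ex, ey in moved_edges:
            for y in range(max(0, ey - radius), min(height, ey + radius + 1)):
                for x in range(max(0, ex - radius), min(width, ex + radius + 1)):
                    mask[y][x] = True
        return mask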


Several embodiments of the techniques, systems and methods of the present invention have now been disclosed. It should be appreciated that many variations on these embodiments are possible and that the scope of the present invention is not limited to the embodiments disclosed herein, but encompasses the many possible variations of the same.

Claims
  • 1. A method for improving viewing characteristics of moving text or areas of high spatial frequency, the method implemented in a display system comprising a subpixel rendering system, the method comprising: automatically noting points of high spatial frequency in a first frame of image data; automatically comparing the points of high spatial frequency with corresponding points in a second frame of image data; and when a number of points meeting a certain threshold have changed from high spatial frequency in the second frame, automatically changing the subpixel rendering on input image data and turning the subpixel rendering off in successive frames of image data such that visual error due to movement of said high spatial frequency across said display system is reduced.
  • 2. The method as recited in claim 1 wherein automatically noting points of high spatial frequency further comprises storing a number of points that satisfy a test for whether a point is an edge point of text or other areas of high spatial frequency.
  • 3. The method as recited in claim 1 wherein automatically noting points of high spatial frequency further comprises storing every n-th point that satisfies a test for whether a point is an edge point of text or other areas of high spatial frequency.
  • 4. The method as recited in claim 1 wherein automatically noting points of high spatial frequency further comprises noting motion vectors in said first frame that have a video encoding that utilizes such motion vectors.
  • 5. The method as recited in claim 4 wherein automatically comparing the points of high spatial frequency with corresponding points in a second frame of image data further comprises noting motion vectors in said second frame and determining if there is a difference between said motion vectors in said second frame and said motion vectors in said first frame.
  • 6. The method as recited in claim 1 wherein automatically comparing the points of high spatial frequency with corresponding points in a second frame of image data further comprises noting points in said second frame corresponding to the location of points of high spatial frequency in said first frame and testing the points in the second frame to determine if the points are of high spatial frequencies.
  • 7. A method for improving viewing characteristics of moving text or areas of high spatial frequency, the method implemented in a display system comprising a subpixel rendering system, the method comprising: automatically noting points of high spatial frequency in a first frame of image data; automatically comparing the points of high spatial frequency with corresponding points in a second frame of image data; and when a number of points meeting a certain threshold have changed from high spatial frequency in the second frame, automatically determining which portion of the frame meets said certain threshold; and automatically changing the subpixel rendering of high spatial frequency image data and turning the subpixel rendering off in successive frames of image data in said portion of the frame.
  • 8. A graphical subsystem for a liquid crystal display, said display comprising a plurality of colored subpixels, said graphical subsystem comprising: a subpixel rendering subsystem; a memory coupled to said subpixel rendering subsystem for storing input image data regarding a plurality of image frames, and a processing subsystem that tests for moving text or other points of high spatial frequency and, when the test indicates moving text or said other areas of high spatial frequency, sends signals to said subpixel rendering subsystem to change the subpixel rendering upon said colored subpixels and to turn said subpixel rendering off in successive frames of image data.
  • 9. The graphical subsystem as recited in claim 8 wherein said subpixel rendering subsystem comprises one of a group, said group comprising: an ASIC, an FPGA, TFTs implemented on glass, or a computer-readable medium containing instructions to be executed on a general processor.
  • 10. The graphical subsystem as recited in claim 8 wherein said processing subsystem is implemented as a part of said subpixel rendering subsystem.
  • 11. A computer readable non-transitory medium, said medium having instructions that, when read by a general processing system, performs a method for improving viewing characteristics of moving text or areas of high spatial frequency upon liquid crystal displays comprising a plurality of colored subpixels, said method comprising: noting points of high spatial frequency in a first frame of image data; comparing the points of high spatial frequency with corresponding points in a second frame of image data; and when a number of points meeting a certain threshold have changed from high spatial frequency in the second frame, changing the subpixel rendering of input image data upon said colored subpixels and turning the subpixel rendering off.
  • 12. A subpixel rendering (SPR) system for displaying image data on displays having a plurality of colored subpixels, the SPR comprising: means for detecting points of high spatial frequency in a first frame of image data; means for comparing the detected points of high spatial frequency with corresponding points in a second frame of image data; and when the points of high spatial frequency are different in the second frame of image data from the points of high spatial frequency in the first frame of image data, means for changing subpixel rendering of input image data upon said colored subpixels and turning the subpixel rendering off.
  • 13. The subpixel rendering system as recited in claim 12 wherein the points relate to text data.
  • 14. The subpixel rendering system as recited in claim 12 further comprising means for changing the subpixel rendering in successive frames of image data.
RELATED APPLICATIONS

This application is a continuation of commonly owned and copending U.S. patent application Ser. No. 10/379,765 filed on Mar. 4, 2003, and claims as a priority date the benefit of the filing date thereof under 35 U.S.C. §120. U.S. patent application Ser. No. 10/379,765 is published as United States patent Publication No. 2004/0174380 A1 which is hereby incorporated by reference herein for all that it teaches. Subject matter in the present application is related to subject matter in the following United States patent documents: (1) United States patent Publication No. 2004/0196302 (the '302 application) [U.S. patent application Ser. No. 10/379,767] entitled “SYSTEMS AND METHODS FOR TEMPORAL SUBPIXEL RENDERING OF IMAGE DATA,” and (2) commonly owned U.S. Pat. No. 6,917,368 (the '368 patent) [U.S. patent application Ser. No. 10/379,766], entitled “SUB-PIXEL RENDERING SYSTEM AND METHOD FOR IMPROVED DISPLAY VIEWING ANGLES,” which are hereby incorporated herein by reference.

US Referenced Citations (181)
Number Name Date Kind
3971065 Bayer Jul 1976 A
4353062 Lorteije et al. Oct 1982 A
4439759 Fleming et al. Mar 1984 A
4593978 Mourey et al. Jun 1986 A
4642619 Togashi Feb 1987 A
4651148 Takeda et al. Mar 1987 A
4751535 Myers Jun 1988 A
4773737 Yokono et al. Sep 1988 A
4786964 Plummer et al. Nov 1988 A
4792728 Chang et al. Dec 1988 A
4800375 Silverstein et al. Jan 1989 A
4853592 Strathman Aug 1989 A
4874986 Menn et al. Oct 1989 A
4886343 Johnson Dec 1989 A
4908609 Stroomer Mar 1990 A
4920409 Yamagishi Apr 1990 A
4946259 Matino et al. Aug 1990 A
4965565 Noguchi Oct 1990 A
4966441 Conner Oct 1990 A
4967264 Parulski et al. Oct 1990 A
5006840 Hamada et al. Apr 1991 A
5010413 Bahr Apr 1991 A
5052785 Takimoto et al. Oct 1991 A
5062057 Blacken et al. Oct 1991 A
5113274 Takahashi et al. May 1992 A
5132674 Bottorf Jul 1992 A
5144288 Hamada et al. Sep 1992 A
5184114 Brown Feb 1993 A
5189404 Masimo et al. Feb 1993 A
5196924 Lumelsky et al. Mar 1993 A
5233385 Sampsell Aug 1993 A
5311337 McCartney, Jr. May 1994 A
5315418 Sprague et al. May 1994 A
5334996 Tanigaki et al. Aug 1994 A
5341153 Benzschawel et al. Aug 1994 A
5398066 Martinez-Uriegas et al. Mar 1995 A
5416890 Beretta May 1995 A
5436747 Suzuki Jul 1995 A
5438649 Ruetz Aug 1995 A
5448652 Vaidyanathan et al. Sep 1995 A
5450216 Kasson Sep 1995 A
5461503 Deffontaines et al. Oct 1995 A
5477240 Huebner et al. Dec 1995 A
5485293 Robinder Jan 1996 A
5535028 Bae et al. Jul 1996 A
5541653 Peters et al. Jul 1996 A
5543819 Farwell et al. Aug 1996 A
5561460 Katoh et al. Oct 1996 A
5563621 Silsby Oct 1996 A
5579027 Sakurai et al. Nov 1996 A
5642176 Abukawa et al. Jun 1997 A
5648793 Chen Jul 1997 A
5694186 Yanagawa et al. Dec 1997 A
5719639 Imamura Feb 1998 A
5724442 Ogatsu et al. Mar 1998 A
5731818 Wan et al. Mar 1998 A
5739802 Mosier Apr 1998 A
5748828 Steiner et al. May 1998 A
5754163 Kwon May 1998 A
5754226 Yamada et al. May 1998 A
5792579 Phillips Aug 1998 A
5815101 Fonte Sep 1998 A
5818405 Eglit et al. Oct 1998 A
5821913 Mamiya Oct 1998 A
5899550 Masaki May 1999 A
5917556 Katayama Jun 1999 A
5929843 Tanioka Jul 1999 A
5933253 Ito et al. Aug 1999 A
5949496 Kim Sep 1999 A
5973664 Badger Oct 1999 A
5990997 Jones et al. Nov 1999 A
6002446 Eglit Dec 1999 A
6008868 Silverbrook Dec 1999 A
6034666 Kanai et al. Mar 2000 A
6038031 Murphy Mar 2000 A
6049626 Kim Apr 2000 A
6054832 Kunzman et al. Apr 2000 A
6061533 Kajiwara May 2000 A
6064363 Kwon May 2000 A
6097367 Kuriwaki et al. Aug 2000 A
6100872 Aratani et al. Aug 2000 A
6108122 Ulrich et al. Aug 2000 A
6144352 Matsuda et al. Nov 2000 A
6147664 Hansen Nov 2000 A
6151001 Anderson et al. Nov 2000 A
6160535 Park Dec 2000 A
6184903 Omori Feb 2001 B1
6188385 Hill et al. Feb 2001 B1
6198507 Ishigami Mar 2001 B1
6219019 Hasegawa Apr 2001 B1
6219025 Hill et al. Apr 2001 B1
6225967 Hebiguchi May 2001 B1
6225973 Hill et al. May 2001 B1
6236390 Hitchcock May 2001 B1
6239783 Hill et al. May 2001 B1
6243055 Fergason Jun 2001 B1
6243070 Hill et al. Jun 2001 B1
6262710 Smith Jul 2001 B1
6271891 Ogawa et al. Aug 2001 B1
6278434 Hill et al. Aug 2001 B1
6297826 Semba et al. Oct 2001 B1
6299329 Mui et al. Oct 2001 B1
6326981 Mori et al. Dec 2001 B1
6327008 Fujiyoshi Dec 2001 B1
6332030 Manjunath Dec 2001 B1
6335719 An et al. Jan 2002 B1
6342876 Kim Jan 2002 B1
6346972 Kim Feb 2002 B1
6348929 Acharya Feb 2002 B1
6360008 Suzuki et al. Mar 2002 B1
6360023 Betrisey et al. Mar 2002 B1
6377262 Hitchcock et al. Apr 2002 B1
6384836 Naylor, Jr. et al. May 2002 B1
6392717 Kunzman May 2002 B1
6393145 Betrisey et al. May 2002 B2
6396505 Lui et al. May 2002 B1
6414719 Parikh Jul 2002 B1
6417867 Hallberg Jul 2002 B1
6429867 Deering Aug 2002 B1
6441867 Daly Aug 2002 B1
6453067 Morgan et al. Sep 2002 B1
6466618 Messing et al. Oct 2002 B1
6469766 Waterman et al. Oct 2002 B2
6483518 Perry et al. Nov 2002 B1
6545653 Takahara Apr 2003 B1
6545740 Werner Apr 2003 B2
6552706 Ikeda et al. Apr 2003 B1
6570584 Cok et al. May 2003 B1
6600495 Boland et al. Jul 2003 B1
6624828 Dresevic et al. Sep 2003 B1
6661429 Phan Dec 2003 B1
6674430 Kaufman Jan 2004 B1
6674436 Dresevic et al. Jan 2004 B1
6681053 Zhu Jan 2004 B1
6714212 Tsuboyama et al. Mar 2004 B1
6738526 Betrisey et al. May 2004 B1
6750875 Keely, Jr. Jun 2004 B1
6781626 Wang Aug 2004 B1
6801220 Greier et al. Oct 2004 B2
6804407 Weldy Oct 2004 B2
7167186 Credelle et al. Jan 2007 B2
20010017515 Kusunoki et al. Aug 2001 A1
20010040645 Yamazaki Nov 2001 A1
20010048764 Betrisey et al. Dec 2001 A1
20020012071 Sun Jan 2002 A1
20020015110 Elliott Feb 2002 A1
20020017645 Yamazaki Feb 2002 A1
20020122160 Kunzman Sep 2002 A1
20020140831 Hayashi Oct 2002 A1
20020149598 Greier et al. Oct 2002 A1
20020190648 Bechtel et al. Dec 2002 A1
20030011603 Koyama Jan 2003 A1
20030011613 Booth, Jr. Jan 2003 A1
20030034992 Brown Elliott et al. Feb 2003 A1
20030043567 Hoelen et al. Mar 2003 A1
20030071775 Ohashi et al. Apr 2003 A1
20030071826 Goertzen Apr 2003 A1
20030071943 Choo et al. Apr 2003 A1
20030077000 Blinn Apr 2003 A1
20030085906 Elliott et al. May 2003 A1
20030146893 Sawabe Aug 2003 A1
20030218618 Phan Nov 2003 A1
20040008208 Dresevic et al. Jan 2004 A1
20040036704 Han et al. Feb 2004 A1
20041002180 Hong et al. Feb 2004
20040061710 Messing et al. Apr 2004 A1
20040085495 Roh et al. May 2004 A1
20040095521 Song et al. May 2004 A1
20040114046 Lee et al. Jun 2004 A1
20040150651 Phan Aug 2004 A1
20040169807 Rho et al. Sep 2004 A1
20040179160 Rhee et al. Sep 2004 A1
20040213449 Safaee-Rad Oct 2004 A1
20040239837 Hong et al. Dec 2004 A1
20040247070 Ali Dec 2004 A1
20050031199 Ben-Chorin et al. Feb 2005 A1
20050094871 Berns et al. May 2005 A1
20050134600 Credelle et al. Jun 2005 A1
20050151752 Phan Jul 2005 A1
20050162600 Rho et al. Jul 2005 A1
20050169551 Messing et al. Aug 2005 A1
Foreign Referenced Citations (43)
Number Date Country
299 09 537 Oct 1999 DE
199 23 527 Nov 2000 DE
201 09 354 Sep 2001 DE
0 158 366 Oct 1985 EP
0 203 005 Nov 1986 EP
0 322 106 Jun 1989 EP
0 671 650 Sep 1995 EP
0 793 214 Sep 1997 EP
0 812 114 Dec 1997 EP
0 878 969 Nov 1998 EP
0 899 604 Mar 1999 EP
1 083 539 Mar 2001 EP
1 261 014 Nov 2002 EP
1 381 020 Jan 2004 EP
2 133 912 Aug 1984 GB
2 146 478 Apr 1985 GB
60-107022 Jun 1985 JP
02-000826 Jan 1990 JP
03-078390 Apr 1991 JP
03-36239 May 1991 JP
06-102503 Apr 1994 JP
11-282008 Oct 1999 JP
02-983027 Nov 1999 JP
2001203919 Jul 2001 JP
2002215082 Jul 2002 JP
2004-4822 Aug 2004 JP
2004-78218 Nov 2004 JP
WO 9723860 Jul 1997 WO
WO 0021067 Apr 2000 WO
WO 0042564 Jul 2000 WO
WO 0042762 Jul 2000 WO
WO 0045365 Aug 2000 WO
WO 0067196 Nov 2000 WO
WO 0110112 Feb 2001 WO
WO 0129817 Apr 2001 WO
WO 0152546 Jul 2001 WO
WO 02059685 Aug 2002 WO
WO 03014819 Feb 2003 WO
WO 2004017129 Feb 2004 WO
WO 2004086128 Mar 2004 WO
WO 2004027503 Apr 2004 WO
WO 2004040548 May 2004 WO
WO 2005065027 Jul 2005 WO
Related Publications (1)
Number Date Country
20070115298 A1 May 2007 US
Continuations (1)
Number Date Country
Parent 10379765 Mar 2003 US
Child 11625211 US