Facial feature-localized and global real-time video morphing

Information

  • Patent Grant
  • Patent Number
    7,397,932
  • Date Filed
    Monday, March 12, 2007
  • Date Issued
    Tuesday, July 8, 2008
Abstract
A method and apparatus for modifying, or morphing, a video image. A video feed is received which includes an image of a person's face. A number of facial features are detected from the video feed. One or more of those facial features are selected and modified, with the modification being applied to the video feed to produce a morphed image. For example, the morphing can produce fun effects such as oversized or undersized eyes, lips, etc.
Description
BACKGROUND OF THE INVENTION

The present invention relates to modifying video images, and in particular to varying the features of a person's face in live video.


Software for modifying images, such as images of a person's face, has existed for some time. U.S. Pat. No. 5,825,941 is an example of software allowing a patient to see the effects of proposed plastic surgery before it is done. Other software allows users to modify static pictures that can be saved and printed, such as by varying the features of the face or morphing two faces together. U.S. Published Application No. 20050026685 shows a system for morphing video game characters. U.S. Published Application No. 20040201666 shows using user voice or other inputs to modify a virtual character.


A variety of applications are directed to modifying avatars for use with instant messaging. These applications allow a user to personalize the avatar image that is the representation of the user. For example, U.S. Published Application No. 20030043153 shows determining facial expressions from a webcam video feed and mapping those expressions to an avatar. An animation vector is mapped to a target mix vector. Each morph target may represent a facial expression, such as a neutral facial expression, a surprised facial expression, an angry facial expression, and a smiling facial expression.


BRIEF SUMMARY OF THE INVENTION

The present invention provides a method for modifying, or morphing, a video image. A video feed is received which includes an image of a person's face. A number of facial features are detected from the video feed. One or more of those facial features are selected and modified, with the modification being applied to the video feed to produce a morphed image. For example, the morphing can produce fun effects such as oversized or undersized eyes, lips, etc.


In one embodiment, the user can choose to modify particular features directly. For example, the user can thicken the lips, widen the mouth, etc. The user can then make similar modifications separately to the eyes, nose, ears, etc. Additionally, the user can choose to have a hat or other accessories superimposed over the face.


In another embodiment, the user can select from multiple expressions. The selected expression would then automatically change the particular features required to achieve that expression. For example, the expression for a smile can cause the lips and mouth to rise in a smile, and additionally can cause smile lines or dimples and a twinkling appearance in the eyes. Alternately, another “expression,” such as a younger age, could cause smoothing of the skin to eliminate wrinkles and modification of the facial features to produce a more defined chin, eyes, etc.


In one embodiment, a video feed from a web cam, such as a USB video feed, is provided through a camera driver to feature recognition software. That software then provides the locations of the identified features to a separate morphing software module. The morphing software module then responds to user inputs to select and apply modifications to the various facial features. These modifications are then applied to the video feed to produce an output video feed which is only slightly delayed in real time.
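
The patent includes no source code; as a rough illustration of the pipeline just described, the following Python sketch chains a camera capture to stand-in feature-recognition and morphing stages. The functions detect_features and apply_morphs are hypothetical placeholders, and OpenCV is assumed for capture and display.

```python
# Minimal sketch of the described pipeline: camera driver -> feature
# recognition -> morphing -> application. detect_features() and
# apply_morphs() are hypothetical stand-ins; the patent names no
# particular libraries. Assumes OpenCV (cv2) for capture and display.
import cv2

def detect_features(frame):
    """Stand-in for the feature recognition module: would return the
    locations of eyes, lips, nose, etc. found in the frame."""
    return {}  # e.g., {"lips": (x, y, w, h), ...}

def apply_morphs(frame, features, settings):
    """Stand-in for the morphing module: applies the user-selected
    modifications to the located features."""
    return frame  # returned unmodified in this sketch

settings = {"lips": {"fullness": 1.2}}   # user-selected modifications

cap = cv2.VideoCapture(0)                # USB web cam via camera driver
while True:
    ok, frame = cap.read()
    if not ok:
        break
    features = detect_features(frame)
    morphed = apply_morphs(frame, features, settings)
    cv2.imshow("output video feed", morphed)  # stands in for the application
    if cv2.waitKey(1) == 27:             # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```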





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram illustrating the system and user layers of software in an embodiment of a system incorporating the morphing features of the present invention.



FIG. 2 is a block diagram of an embodiment of a system incorporating the morphing features of the present invention.



FIGS. 3A and 3B are diagrams of a hovering dialog box and drop down menu in one embodiment of the invention.



FIG. 4 is a diagram of an embodiment of a user interface screen for selecting the type of morphing.



FIGS. 5A and 5B are diagrams of an embodiment of a user interface screen for a manual morphing of features directly.



FIG. 6 is a diagram of a user interface screen for expression selection.



FIG. 7 is a screen shot of an embodiment of an instant messenger application incorporating the morphing feature of the invention.





DETAILED DESCRIPTION OF THE INVENTION

System



FIG. 1 illustrates the operation of one embodiment of the invention. Incoming video 100 is provided to a video pipe 102 which is received by the morphing engine 104 of the present invention. The video pipe can be provided directly to applications 106 as indicated by arrow 108, or can be provided to the morphing engine. The morphing engine provides a modified video pipe to the applications as indicated by arrow 110.


The modified video pipe can be used by any application that uses live video input. This includes instant messaging applications (MSN Messenger, Yahoo Messenger, AOL Messenger, etc.) as well as video capture applications (when, for example, an artist is capturing the video to disk). The morphing can thus be applied before the video is captured to disk and stored, or before it is sent to the interlocutor on the other side of the communication channel. This is because the morphing engine is integrated into the system-level part of the software, which makes it transparent to the applications that use the video stream and lie above the system layer at the user layer. In this way the present invention is generic: it can coexist with any video application and modify that application's input according to the settings in the quick assistant.
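
As a sketch of this transparency, a morphing stage can be interposed so that it exposes the same read() interface as the raw video source, leaving consumers unchanged. The class below is an illustrative assumption; the invention itself does this at the driver level, below the application layer.

```python
# Sketch of interposing a morphing stage in the video pipe. The wrapper
# exposes the same read() interface as the raw source, so any consumer
# written against the source works unchanged; the patent's engine does
# the equivalent at the driver level, invisibly to applications.
class MorphingVideoPipe:
    def __init__(self, source, morph):
        self.source = source  # e.g., a cv2.VideoCapture
        self.morph = morph    # hypothetical frame -> frame callable

    def read(self):
        ok, frame = self.source.read()
        if ok:
            frame = self.morph(frame)
        return ok, frame

# Usage: pipe = MorphingVideoPipe(cv2.VideoCapture(0), my_effect)
# An application calling pipe.read() cannot tell it from the raw feed.
```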



FIG. 2 is a block diagram of an embodiment of a system incorporating the morphing of the present invention. A web cam 10 is connected through a USB bus 12 to a computer 14. The USB bus 12 acts as a video feed from the camera to the computer. Alternately, a wireless USB feed could be provided, or any other bus or interface could be used to provide the video. A camera other than the web cam, or any other video source, could also be used. Additionally, the video source could come from within the computer, or through a network to the computer from a remote source.


Computer 14 includes a camera driver 16 which receives the live video and provides it to feature recognition software module 18. Feature recognition module 18 could be any software which extracts information regarding facial features from a video image, such as the software produced by Neven Vision, a machine vision technology company with offices in Santa Monica, Calif.


The detected information regarding facial features is provided to a morphing module 20. The morphing module responds to user inputs to select facial features, determine the type of morphing, and apply the morphing. The user input can be provided through any peripheral, such as keyboard 22 or mouse 24 with associated keyboard driver 26 and mouse driver 28. After the morphing has been applied, the video feed is provided to application software 30. The application software could be an instant messenger application, a video conferencing application, or any other application which uses video. In one embodiment, the delay caused by the use of the feature recognition and morphing software is less than one frame of video.
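
The sub-frame delay figure implies that feature detection plus morphing must complete within one frame period. A minimal sketch of checking that budget, assuming a 30 fps feed and stand-in detect and morph callables:

```python
import time

FRAME_PERIOD_S = 1.0 / 30.0  # 30 fps feed: roughly 33 ms per frame

def process_with_budget(frame, detect, morph):
    """Run detection and morphing on one frame and report whether the
    added delay stayed under one frame period, per the embodiment."""
    t0 = time.perf_counter()
    features = detect(frame)
    out = morph(frame, features)
    elapsed = time.perf_counter() - t0
    return out, elapsed, elapsed < FRAME_PERIOD_S
```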


Morphing Quick Assistant



FIG. 3A shows a hovering dialog box 32, which is a quick assistant that will open and appear on the screen when live video is requested. This automatic activation is accomplished by the camera or video driver software, which detects when a request is made for a video feed from an application, such as an instant messenger application. The quick assistant will appear close to the window of the program that called for the live video, so it will appear as if it were part of that application. An example for instant messaging is shown in FIG. 7. This provides an advantage in allowing the morphing application to work with third-party applications that use a video feed, without those applications needing to be modified to accommodate the morphing engine. The user will typically assume that the quick assistant is part of the third-party application. This increases the ease of use and acceptance of the new technology.
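
A sketch of the auto-activation idea: a hook in the driver layer's feed-opening routine fires a UI callback whenever an application requests video. The patent specifies the behavior rather than an API, so every name below is hypothetical.

```python
# Hypothetical sketch of quick-assistant auto-activation: when any
# application opens the video feed, the driver layer fires a callback
# that displays the assistant near the caller's window.
class CameraDriver:
    def __init__(self, on_feed_requested=None):
        self.on_feed_requested = on_feed_requested

    def open_feed(self, requesting_app):
        if self.on_feed_requested:
            self.on_feed_requested(requesting_app)  # show quick assistant
        # ...then hand the (possibly morphed) video feed to the app...

def show_quick_assistant(app_name):
    print(f"quick assistant opened next to the {app_name} window")

driver = CameraDriver(on_feed_requested=show_quick_assistant)
driver.open_feed("instant messenger")
```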



FIG. 3B illustrates a drop down menu 34 of the quick assistant dialog box 32. Examples of options are shown; others may be provided in other embodiments. The user can turn face tracking on or off, and can enable tracking of multiple faces. Face tracking may or may not be required for the feature recognition module. The user can click on Video Effect Selection, and another dialog box will appear, such as the examples set forth in FIG. 4 or FIG. 5A. The menu can include options that are not part of the morphing features but which a user may desire to modify at the same time, such as camera position. The options may allow the user to enable a video morphing effect, select an effect, or go to the Internet to download more effects.


Options



FIG. 4 is a diagram of one embodiment of an interface screen 40 for the morphing module to allow user selection of different options. In one embodiment this is the first screen the user sees when clicking on Video Effect Selection in FIG. 3B. A first button 42 is for selection of manual adjustment. This allows the user to independently and directly adjust the facial features of the eyes, nose, mouth, etc. Button 44 allows expression selection. This allows selection of an expression, such as a smile, frown, anger, age, etc. The selection of one of these expressions, such as anger, would cause multiple facial features to be automatically adjusted appropriately without the user needing to adjust each one independently. For example, if anger is chosen, the software could automatically adjust the mouth position, the eyebrows, frown lines in the cheeks, etc. Outfits button 46 allows the user to apply various accessories to the video image. For example, a hat may be placed on the head, a scarf around the neck, earrings or other jewelry, piercings, tattoos, etc.


The morphing module allows a virtual makeover to be applied to a person. The size of the eyes can be changed, and their location can be modified slightly. The iris color can be changed. The lips could be made wider or narrower, or could be accentuated in other ways. A variety of other modifications are available.


Manual Selection



FIG. 5A is a diagram of an embodiment of a user interface box 50 for manual adjustment of facial features. Box 50 includes a series of buttons or a drop-down menu 56 which allows selection of particular features, such as enhance lips, stretch eyes, trim eyebrows, change iris color, and remove wrinkles. When a particular button is selected, a specialized modification box will appear for the particular facial feature selected.


When Lips Enhancement is selected, the box 58 of FIG. 5B will appear. In the example shown, separate adjustment sliders are provided for fullness, width, color saturation, and curvature. Other facial features will bring up different combinations of adjustment options with different input methods. After setting the desired effects, when the preview button is clicked, a preview window, such as shown in FIG. 7 below, will be provided to allow previewing in real time. In one embodiment, the adjustments of FIG. 5B are displayed at the same time as the preview window, or in the preview window, to allow the user to see the effects of moving the different sliders. It will be appreciated that inputs other than sliders can be used.
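
For the color saturation slider, the mapping to pixels is straightforward to illustrate: scale the saturation channel inside the lip region reported by the feature recognition module. (Fullness, width, and curvature would instead require local warping.) A sketch, with the region box as an assumed input:

```python
import cv2
import numpy as np

def enhance_lip_saturation(frame, lip_box, saturation=1.3):
    """Scale color saturation inside the detected lip region.
    lip_box is an assumed (x, y, w, h) rectangle from the feature
    recognition module; saturation follows the slider setting."""
    x, y, w, h = lip_box
    roi = frame[y:y + h, x:x + w]
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 1] = np.clip(hsv[..., 1] * saturation, 0, 255)
    frame[y:y + h, x:x + w] = cv2.cvtColor(hsv.astype(np.uint8),
                                           cv2.COLOR_HSV2BGR)
    return frame
```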


Expression Selection



FIG. 6 is a diagram of an embodiment of a user interface screen 60 for selecting particular expressions. A drop down menu 62 is provided, from which the user can select the particular expression desired. Upon such a selection, a screen similar to that shown in FIG. 5B would appear with appropriate adjustment buttons. For example, for age, a slider icon may be provided to allow the user to linearly move a pointer toward a younger or older age. The software then automatically applies the appropriate adjustments to the various features of the face.
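
A sketch of how a single expression slider could fan out into several per-feature adjustments; the parameter names and weights below are illustrative assumptions, not the patent's actual mapping.

```python
def age_slider_to_adjustments(t):
    """Map one age-slider value t in [-1, 1] (negative = younger,
    positive = older) to a bundle of per-feature adjustments that
    the morphing module would then apply together."""
    return {
        "skin_smoothing":  max(0.0, -t),          # smooth wrinkles when younger
        "wrinkle_depth":   max(0.0, t),           # deepen lines when older
        "chin_definition": 1.0 + 0.3 * max(0.0, -t),
        "eye_openness":    1.0 - 0.1 * t,
    }

# e.g., age_slider_to_adjustments(-0.5) adjusts four features at once
```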


Preview Screen



FIG. 7 illustrates one embodiment of a preview screen. An instant messenger window 70 and a video window 72 are shown. Next to them is the hovering dialog box, or quick assistant 32. A drop down menu 74 next to the quick assistant highlights the effect being displayed. In the example shown, a celebrity effect, “Spock,” is displayed, with the ears of the user being made pointed and the eyebrows being arched. Becoming more logical is up to the user. Alternate embodiments are possible, such as additional dialog boxes which allow other adjustments or combinations of adjustments. For example, the Spock effect could be applied, and the user could also separately enhance the lips or eyes or put on a hat.


Variations


In one embodiment, the deformation applied to a face (e.g., ‘fuller lips’) moves with the subject. In some prior art applications, the deformation is applied only to a frontal face. In an embodiment of the present invention, the 3D position of the head and its features is localized, and the deformation is adjusted from the frontal position to the actual position of the user. Thus, even if the face is not frontal, the deformation (e.g., fuller lips) is maintained when the subject rotates his head, so the lips or other feature still look natural, but ‘fuller’. This is accomplished either by storing and modifying a 3D image for each feature, or by using algorithms that adjust the deformation in 3D on the fly.
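
A minimal sketch of the on-the-fly adjustment: a displacement defined in the frontal pose is rotated by the estimated 3D head rotation before being applied, so the deformation follows the head. The rotation matrix is an assumed input from the 3D feature localizer.

```python
import numpy as np

def pose_adjusted_displacement(frontal_disp, head_rotation):
    """Re-express a deformation vector defined in the frontal pose in
    the subject's current head pose. head_rotation is the 3x3 matrix
    estimated by the 3D feature localizer (an assumed input)."""
    return head_rotation @ frontal_disp

# Example: 'fuller lips' push the lip contour outward along z in the
# frontal frame; with the head yawed 30 degrees, the same deformation
# is rotated so the morphed lips still look natural, but fuller.
yaw = np.radians(30.0)
R = np.array([[ np.cos(yaw), 0.0, np.sin(yaw)],
              [ 0.0,         1.0, 0.0        ],
              [-np.sin(yaw), 0.0, np.cos(yaw)]])
print(pose_adjusted_displacement(np.array([0.0, 0.0, 1.0]), R))
```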


In one embodiment, the present invention is able to apply the modifications to live video on the fly with a delay of less than one frame. This is possible because only certain features of the face are morphed or modified, not the entire face. In one embodiment, less than 20% or less than 10% of the video is modified. This limits the amount of processing necessary and increases the speed at which the morphing can be done.
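
The processing saving can be made concrete: if only the feature bounding boxes are touched, the modified fraction of the frame is the ratio of box area to frame area. A sketch, with hypothetical box values:

```python
def modified_fraction(frame_shape, feature_boxes):
    """Fraction of the frame touched by the morph, given (x, y, w, h)
    boxes for the selected features; overlaps are ignored for brevity."""
    frame_h, frame_w = frame_shape[:2]
    touched = sum(w * h for (_, _, w, h) in feature_boxes)
    return touched / float(frame_w * frame_h)

# e.g., two eye boxes plus a lip box on a 640x480 frame
boxes = [(200, 150, 90, 40), (350, 150, 90, 40), (270, 300, 120, 60)]
print(modified_fraction((480, 640), boxes))  # ~0.047, well under 10%
```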


In one embodiment, the user selects the morphing to be applied before the video is fed to the application software program. In another embodiment, the user can also make the adjustments on the fly, morphing or changing the video image in the middle of an instant messenger conversation, for example.


In one embodiment, the invention is integrated into the video pipe so that it is transparent to any application program. The application program need not be modified to accommodate the present invention, which is applied at the driver level of the video feed.


In another embodiment, popular morphing combinations are stored as macros or particular expressions. These are provided on a website accessible to users, who can also create their own macros and upload them to the website for sharing with other users. Morphs that produce features similar to those of celebrities can be stored as macros on the website for downloading.
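
A plausible, though assumed rather than documented, storage format for such shared macros, sketched as JSON:

```python
import json

# Hypothetical macro format for sharing morphing combinations; the
# field names are illustrative, not a schema given in the patent.
spock_macro = {
    "name": "Spock",
    "adjustments": {
        "ears":     {"point": 0.8},
        "eyebrows": {"arch": 0.6},
    },
}
print(json.dumps(spock_macro, indent=2))
```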


In one embodiment, such as for a conference call with multiple people present, the morphing can be applied independently to different faces. Thus, one person could be made to have wider eyes, another could be made to smile all the time, another could be made to look angry, and another could be made to look younger.


In addition, an entire image can be morphed with, for example, effects similar to those found in a fun house mirror room. By keeping the effects simple, they can be applied in real time without undue delay. Delay is also cut down by the ability to bypass the facial feature detection software, since this is not needed when the entire image is morphed. In one embodiment, the morphing causes a delay of less than 20 milliseconds.
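
A minimal sketch of such a whole-image warp, assuming OpenCV: the remap table is precomputed once and reused for every frame, so no feature detection is needed and the per-frame cost is a single remap call.

```python
import numpy as np
import cv2

def build_funhouse_map(w, h, amplitude=20.0, period=80.0):
    """Precompute a sinusoidal 'fun house mirror' remap table; it is
    fixed, so it can be built once and applied to every frame."""
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    map_x = xs + amplitude * np.sin(ys / period * 2.0 * np.pi)
    return map_x, ys

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in video frame
map_x, map_y = build_funhouse_map(640, 480)
warped = cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)
```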


As will be understood by those of skill in the art, the present invention could be embodied in other specific forms without departing from the essential characteristics of the invention. For example, the morphing could be applied to features of the face of a pet, such as a dog, or any other image. The morphing module could also be made to appear upon initiation of an application program which uses still pictures, such as the sending of snapshots from a cell phone. The morphing engine could be made to appear any time a video feed is requested by an application program, or only for certain types of application programs. Accordingly, reference should be made to the appended claims, which set forth the scope of the claimed invention.

Claims
  • 1. A method for modifying a video image, comprising: receiving a live video feed including an image of a person's face; detecting a plurality of facial features from said live video feed; selecting at least a first one of said facial features; selecting a modification for said first facial feature; applying said modification to said live video feed to produce a modified video feed; and providing said modified video feed to an application program.
  • 2. The method of claim 1 wherein said selecting at least a first one of said facial features comprises: selecting an expression; and modifying a plurality of said facial features to achieve said expression.
  • 3. The method of claim 1 wherein said application program is an instant messaging program.
  • 4. The method of claim 1 further comprising: displaying a graphical user interface indicating the availability of said selecting steps upon detection of said live video feed.
  • 5. The method of claim 1 further comprising: displaying a graphical user interface indicating the availability of said selecting steps upon detection of said application program.
  • 6. An apparatus for modifying a video image, comprising: a live video input feed including an image of a person's face; a feature detection software module configured to detect a plurality of facial features from said live video feed; a morphing software module configured to receive an indication of detected facial features from said feature detection software and enable selection of at least a first one of said facial features, selection of a modification for said first facial feature and application of said modification to said live video feed to produce a modified video feed; and a modified video feed output directed to an application program.
  • 7. The apparatus of claim 6 wherein said morphing software module is further configured to allow: selecting an expression; and modifying a plurality of said facial features to achieve said expression.
  • 8. The apparatus of claim 6 further comprising a quick assistant for accessing said morphing video software.
  • 9. A system for modifying a video image, comprising: a live video input feed including an image of a person's face; a feature detection software module configured to detect a plurality of facial features from said live video feed; a morphing software module configured to receive an indication of detected facial features from said feature detection software and enable selection of at least a first one of said facial features, selection of a modification for said first facial feature and application of said modification to said live video feed to produce a modified video feed; and a modified video feed output directed to an application program.
  • 10. The system of claim 9 further comprising: a camera generating said live video input feed; a computer, communicatively coupled to said camera, containing said morphing software module.
CROSS-REFERENCES TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 11/183,179, filed Jul. 14, 2005, now U.S. Pat. No. 7,209,577, entitled “FACIAL FEATURE-LOCALIZED AND GLOBAL REAL-TIME VIDEO MORPHING”, which application is incorporated herein by reference.

US Referenced Citations (325)
Number Name Date Kind
2297844 Shoemaker Oct 1942 A
3941925 Busch et al. Mar 1976 A
4005261 Sato et al. Jan 1977 A
4021847 Van Atta May 1977 A
4023064 Schagen et al. May 1977 A
4032785 Green et al. Jun 1977 A
4093959 Hedlund Jun 1978 A
4107741 Lemelson Aug 1978 A
4160263 Christy et al. Jul 1979 A
4206965 McGrew Jun 1980 A
4234894 Tokumaru et al. Nov 1980 A
4255764 Howe Mar 1981 A
4267561 Karpinsky et al. May 1981 A
4306252 Fernside Dec 1981 A
4315282 Schumacher Feb 1982 A
4337482 Coutta Jun 1982 A
4340905 Balding Jul 1982 A
4371893 Rabeisen Feb 1983 A
4383279 Kenney, II May 1983 A
4398211 Young Aug 1983 A
4411489 McGrew Oct 1983 A
4445140 Netzer Apr 1984 A
4470816 Marshall et al. Sep 1984 A
4491868 Berridge, Jr. et al. Jan 1985 A
4558329 Honda Dec 1985 A
4567515 Schumacher Jan 1986 A
4602280 Maloomian Jul 1986 A
4606068 Habitzreiter et al. Aug 1986 A
4629298 Trumbull et al. Dec 1986 A
4658297 Nomura et al. Apr 1987 A
4660101 Martin Apr 1987 A
4668095 Maeda May 1987 A
4682220 Beurskens Jul 1987 A
4717951 Fling Jan 1988 A
4727416 Cooper et al. Feb 1988 A
4731666 Csesznegi Mar 1988 A
4731743 Blancato Mar 1988 A
4750041 Vogel et al. Jun 1988 A
4757387 Saito Jul 1988 A
4763146 Niikura Aug 1988 A
4776796 Nossal Oct 1988 A
4777527 Camps et al. Oct 1988 A
4795253 Sandridge et al. Jan 1989 A
4795261 Nakata et al. Jan 1989 A
4804983 Thayer, Jr. Feb 1989 A
4809247 Elliott Feb 1989 A
4815845 Colbaugh et al. Mar 1989 A
4823285 Blancato Apr 1989 A
4827532 Bloomstein May 1989 A
4844475 Saffer et al. Jul 1989 A
4845641 Ninomiya et al. Jul 1989 A
4855813 Russell et al. Aug 1989 A
4864410 Andrews et al. Sep 1989 A
4868661 Takahashi Sep 1989 A
4872056 Hicks et al. Oct 1989 A
4888668 Roll Dec 1989 A
4890159 Ogin Dec 1989 A
4896175 Thayer, Jr. Jan 1990 A
4897827 Raetzer et al. Jan 1990 A
4899921 Bendat et al. Feb 1990 A
4903068 Shiota Feb 1990 A
4908700 Ishii et al. Mar 1990 A
RE33224 Speigelstein May 1990 E
4933773 Shiota et al. Jun 1990 A
4934773 Becker Jun 1990 A
4937665 Schiffman Jun 1990 A
4943867 Suetaka et al. Jul 1990 A
4945379 Date et al. Jul 1990 A
4959670 Thayer, Jr. Sep 1990 A
4965819 Kannes Oct 1990 A
4969040 Gharaui Nov 1990 A
4971312 Weinreich Nov 1990 A
4973890 Desjardins Nov 1990 A
4979815 Tsikos Dec 1990 A
4985762 Smith Jan 1991 A
4987432 Landwehr Jan 1991 A
4991005 Smith Feb 1991 A
4991014 Takahashi et al. Feb 1991 A
4995068 Chou et al. Feb 1991 A
4999586 Meyer et al. Mar 1991 A
5027223 Suetaka et al. Jun 1991 A
5032930 Suetaka et al. Jul 1991 A
5034817 Everett, Jr. Jul 1991 A
5111300 Nam May 1992 A
5155600 Maeda Oct 1992 A
5164834 Fukuda et al. Nov 1992 A
5168354 Martinez et al. Dec 1992 A
5179421 Parker et al. Jan 1993 A
5184228 Kobayashi et al. Feb 1993 A
5189490 Shetty et al. Feb 1993 A
5190286 Watanebe et al. Mar 1993 A
5196949 Swanberg Mar 1993 A
5198305 Wada et al. Mar 1993 A
5206629 Demond et al. Apr 1993 A
5218441 Karcher Jun 1993 A
5218626 Mckee Jun 1993 A
5223927 Kageyama et al. Jun 1993 A
5231432 Glenn Jul 1993 A
5235420 Gharaui Aug 1993 A
5249967 O'Leary et al. Oct 1993 A
5257091 Caicedo, Jr. et al. Oct 1993 A
5268734 Parker et al. Dec 1993 A
5274464 Strolle et al. Dec 1993 A
5274498 Rios-Rivera et al. Dec 1993 A
5277499 Kameyama Jan 1994 A
5278662 Womach et al. Jan 1994 A
5283433 Tsien Feb 1994 A
5296924 de Saint Blancard et al. Mar 1994 A
5299274 Wysocki et al. Mar 1994 A
5300942 Dolgoff Apr 1994 A
5301030 Ko Apr 1994 A
5303043 Glenn Apr 1994 A
5309496 Winsor May 1994 A
5311568 McKee, Jr. et al. May 1994 A
5317405 Kuriki et al. May 1994 A
5321499 Yu et al. Jun 1994 A
5330184 Douglas Jul 1994 A
5358407 Lainer Oct 1994 A
5369433 Baldwin et al. Nov 1994 A
5376978 Bae Dec 1994 A
5380206 Asprey Jan 1995 A
5392080 Galt et al. Feb 1995 A
5394198 Janow Feb 1995 A
5396301 Sasaki et al. Mar 1995 A
5398082 Henderson et al. Mar 1995 A
5414521 Ansley May 1995 A
5424838 Siu Jun 1995 A
5426460 Erving et al. Jun 1995 A
5439408 Wilkinson Aug 1995 A
5448315 Soohoo Sep 1995 A
5451163 Black Sep 1995 A
5455694 Ariki et al. Oct 1995 A
5465132 Mangelsdorf Nov 1995 A
5465144 Parket et al. Nov 1995 A
5467104 Furness, III et al. Nov 1995 A
5497237 Hosohkawa et al. Mar 1996 A
5502505 Nakata et al. Mar 1996 A
5513116 Buckley et al. Apr 1996 A
5533921 Wilkinson Jul 1996 A
5534917 MacDougall Jul 1996 A
5546316 Buckley et al. Aug 1996 A
5554033 Bizzi et al. Sep 1996 A
5559714 Banks et al. Sep 1996 A
5563988 Maes et al. Oct 1996 A
5564698 Honey et al. Oct 1996 A
5579054 Sezan et al. Nov 1996 A
5603617 Light Feb 1997 A
5612716 Chida et al. Mar 1997 A
5617386 Choi Apr 1997 A
5656907 Chainani et al. Aug 1997 A
5657073 Henley Aug 1997 A
5659692 Poggio et al. Aug 1997 A
5668629 Parker et al. Sep 1997 A
5681223 Weinreich Oct 1997 A
5689575 Sako et al. Nov 1997 A
5689799 Dougherty et al. Nov 1997 A
5697829 Chainani et al. Dec 1997 A
5697844 Von Kohorn Dec 1997 A
5717454 Adolphi et al. Feb 1998 A
5724074 Chainani et al. Mar 1998 A
5724519 Kato et al. Mar 1998 A
5740303 Ban Apr 1998 A
5790124 Fischer et al. Aug 1998 A
5825941 Linford et al. Oct 1998 A
5846086 Bizzi et al. Dec 1998 A
5850463 Horii Dec 1998 A
5862517 Honey et al. Jan 1999 A
5892554 DiCicco et al. Apr 1999 A
5897413 Erland Apr 1999 A
5903317 Sharir et al. May 1999 A
5912700 Honey et al. Jun 1999 A
5916024 Von Kohorn Jun 1999 A
5923791 Hanna et al. Jul 1999 A
5954077 Wain et al. Sep 1999 A
5963257 Katata et al. Oct 1999 A
5969715 Dougherty et al. Oct 1999 A
5982352 Pryor Nov 1999 A
5986670 Dries et al. Nov 1999 A
5986708 Katata et al. Nov 1999 A
5993048 Banks et al. Nov 1999 A
5999173 Ubillos Dec 1999 A
6016148 Kang et al. Jan 2000 A
6023299 Katata et al. Feb 2000 A
6023301 Katata et al. Feb 2000 A
6028593 Rosenberg et al. Feb 2000 A
6037989 Kim Mar 2000 A
6075568 Matsuura Jun 2000 A
6084914 Katata et al. Jul 2000 A
6121953 Walker Sep 2000 A
6122017 Taubman Sep 2000 A
6126551 Martin Oct 2000 A
6130677 Kunz Oct 2000 A
6140932 Frank et al. Oct 2000 A
6141060 Honey et al. Oct 2000 A
6147709 Martin et al. Nov 2000 A
6148148 Wain et al. Nov 2000 A
6154250 Honey et al. Nov 2000 A
6157396 Margulis et al. Dec 2000 A
6163322 LaChapelle Dec 2000 A
6167356 Squadron et al. Dec 2000 A
6198509 Dougherty et al. Mar 2001 B1
6206750 Barad et al. Mar 2001 B1
6208386 Wilf et al. Mar 2001 B1
6211941 Erland Apr 2001 B1
6222550 Rosman et al. Apr 2001 B1
6229550 Gloudemans et al. May 2001 B1
6229904 Huang et al. May 2001 B1
6233736 Wolzien May 2001 B1
6243491 Andersson Jun 2001 B1
6252632 Cavallero Jun 2001 B1
6266100 Gloudemans et al. Jul 2001 B1
6272231 Maurer et al. Aug 2001 B1
6287299 Sasnett et al. Sep 2001 B1
6292130 Cavallaro et al. Sep 2001 B1
6297853 Sharir et al. Oct 2001 B1
6298197 Wain et al. Oct 2001 B1
6301370 Steffens et al. Oct 2001 B1
6304665 Cavallaro et al. Oct 2001 B1
6317127 Daily et al. Nov 2001 B1
6330595 Ullman et al. Dec 2001 B1
6333985 Ueda et al. Dec 2001 B1
6356659 Wiskott et al. Mar 2002 B1
6361173 Ulahos et al. Mar 2002 B1
6363525 Daugherty et al. Mar 2002 B1
6366272 Rosenberg et al. Apr 2002 B1
6371862 Reda Apr 2002 B1
6373508 Moengen Apr 2002 B1
6384871 Wilf et al. May 2002 B1
6400374 Lanier Jun 2002 B2
6425825 Sitrick Jul 2002 B1
6430743 Matsuura Aug 2002 B1
6437820 Josefsson Aug 2002 B1
6438508 Tamir et al. Aug 2002 B2
6438753 Fegesch et al. Aug 2002 B1
6441825 Peters Aug 2002 B1
6441846 Carlbom et al. Aug 2002 B1
6449010 Tucker Sep 2002 B1
6449019 Fincher et al. Sep 2002 B1
6454415 Ulahos Sep 2002 B1
6456232 Milnes et al. Sep 2002 B1
6456340 Margulis Sep 2002 B1
6462662 Rundow et al. Oct 2002 B1
6463121 Milnes Oct 2002 B1
6466695 Potzsch et al. Oct 2002 B1
6498619 Yanagi Dec 2002 B1
6513069 Abato et al. Jan 2003 B1
6538676 Peters Mar 2003 B1
6545682 Ventrella et al. Apr 2003 B1
6563950 Wiskott et al. May 2003 B1
6567116 Aman et al. May 2003 B1
6570585 Hines et al. May 2003 B1
6574604 van Rijn Jun 2003 B1
6580811 Maurer et al. Jun 2003 B2
6597406 Gloudemans et al. Jul 2003 B2
6606096 Wang Aug 2003 B2
6614466 Thonas Sep 2003 B2
6624843 Lennon Sep 2003 B2
6631987 Reichow et al. Oct 2003 B2
6634959 Kuesters Oct 2003 B2
6656064 Zielinski Dec 2003 B2
6678641 Gibbs et al. Jan 2004 B2
6707487 Aman et al. Mar 2004 B1
6714660 Ohba Mar 2004 B1
6718104 Lowry Apr 2004 B2
6724442 Zyskowski et al. Apr 2004 B1
6744403 Milnes et al. Jun 2004 B2
6750919 Rosser Jun 2004 B1
6771319 Konuma Aug 2004 B2
6824387 Sakai et al. Nov 2004 B2
6824480 John et al. Nov 2004 B2
6829432 Misumi et al. Dec 2004 B2
6850250 Hoch Feb 2005 B2
6879712 Tuncay et al. Apr 2005 B2
6893127 Reichow et al. May 2005 B2
6903707 Hobgood et al. Jun 2005 B2
6903756 Giannini Jun 2005 B1
6909438 White et al. Jun 2005 B1
6937696 Mostafani Aug 2005 B1
6950130 Qian Sep 2005 B1
7000840 Reiffel Feb 2006 B2
7002551 Azuma et al. Feb 2006 B2
7006154 Dudkowski Feb 2006 B2
7006709 Kang et al. Feb 2006 B2
7020336 Cohen-Solal et al. Mar 2006 B2
7023913 Monroe Apr 2006 B1
7027083 Kanade et al. Apr 2006 B2
7034803 Reiffel Apr 2006 B1
7039222 Simon et al. May 2006 B2
7043056 Edwards et al. May 2006 B2
7053915 Jung et al. May 2006 B1
7062454 Giannini et al. Jun 2006 B1
7071898 Hobgood et al. Jul 2006 B2
7075556 Meier et al. Jul 2006 B1
7079176 Freeman et al. Jul 2006 B1
7085409 Sawhney et al. Aug 2006 B2
7097532 Rolicki et al. Aug 2006 B1
7098891 Pryor Aug 2006 B1
7099070 Reiffel Aug 2006 B2
7102666 Kanade et al. Sep 2006 B2
7106361 Kanade et al. Sep 2006 B2
7116342 Dengler et al. Oct 2006 B2
7120871 Harrington et al. Oct 2006 B1
7209577 McAlpine et al. Apr 2007 B2
20010033675 Maurer et al. Oct 2001 A1
20010036860 Yonezawa Nov 2001 A1
20020032906 Grossman Mar 2002 A1
20020037332 Cohen Mar 2002 A1
20020186221 Bell Dec 2002 A1
20030043153 Buddemeier et al. Mar 2003 A1
20030156134 Kim Aug 2003 A1
20030198384 Vrhel Oct 2003 A1
20030202686 Rowe Oct 2003 A1
20030206171 Kim et al. Nov 2003 A1
20030235341 Gokturk et al. Dec 2003 A1
20040012613 Rast Jan 2004 A1
20040052499 Singh Mar 2004 A1
20040085324 Yao May 2004 A1
20040085334 Reaney May 2004 A1
20040201666 Matsuo et al. Oct 2004 A1
20040239689 Fertig et al. Dec 2004 A1
20050026685 Ruark et al. Feb 2005 A1
20050041867 Loy et al. Feb 2005 A1
20050204287 Wang Sep 2005 A1
20050286799 Huang et al. Dec 2005 A1
20060075448 McAlpine et al. Apr 2006 A1
Foreign Referenced Citations (15)
Number Date Country
09160575 Jun 1997 JP
11-175061 Feb 1999 JP
2000-209500 Jul 2000 JP
2001-307124 Nov 2001 JP
2002-042168 Aug 2002 JP
WO 2004021695 Mar 2004 JP
351795 Feb 1999 TW
563053 Nov 2003 TW
1254569 May 2006 TW
200617701 Jun 2006 TW
WO 9703517 Jan 1997 WO
WO 0115450 Mar 2001 WO
WO 0165854 Sep 2001 WO
WO 0167759 Sep 2001 WO
WO 0203706 Jan 2002 WO
Related Publications (1)
Number Date Country
20070174775 A1 Jul 2007 US
Continuations (1)
Number Date Country
Parent 11183179 Jul 2005 US
Child 11717303 US