Handheld article with movement discrimination

Information

  • Patent Grant
    8212882
  • Patent Number
    8,212,882
  • Date Filed
    Thursday, May 27, 2010
  • Date Issued
    Tuesday, July 3, 2012
Abstract
A digital camera 10 has a pair of angular rate-sensing gyroscopic sensors 130 with mutually perpendicular axes and an electronic circuit 120 responsive to the sensor output signals to discriminate between voluntary and involuntary movements of the article as a function of the number of zero crossings per unit time of the signal and the average of the absolute amplitude of the signal.
Description
FIELD OF THE INVENTION

This invention relates to a handheld article, such as a digital camera, having apparatus to discriminate between voluntary and involuntary movement of the article.


BACKGROUND OF THE INVENTION

Handheld articles such as digital cameras are subject to movement in use, either involuntary (hand-jitter) or voluntary (e.g. panning). It would be useful to discriminate between these two types of movement.


The object of the present invention is to provide apparatus, in a handheld article, to discriminate between voluntary and involuntary movement of the article.


BRIEF SUMMARY OF THE INVENTION

According to the present invention there is provided a handheld article having at least one angular rate-sensing gyroscopic sensor and an electronic circuit responsive to the sensor output signal to discriminate between voluntary and involuntary movements of the article as a function of the number of zero crossings per unit time of the signal and the average of the absolute amplitude of the signal.


Preferably the article includes first and second angular rate-sensing gyroscopic sensors with transverse axes, the electronic circuit being responsive to both sensor output signals to discriminate between voluntary and involuntary movements of the article.


In an embodiment the article is a digital camera.





BRIEF DESCRIPTION OF DRAWINGS

An embodiment of the invention will now be described, by way of example, with reference to the accompanying drawings, in which:



FIG. 1 is a block diagram of a digital camera operating in accordance with an embodiment of the present invention.



FIGS. 2 to 4 are waveforms useful in understanding the operation of the embodiment of the invention.





DESCRIPTION OF PREFERRED EMBODIMENTS


FIG. 1 is a block diagram of a portable digital camera 10 operating in accordance with an embodiment of the present invention. It will be appreciated that many of the processes implemented in the digital camera are implemented in or controlled by software operating on a microprocessor, central processing unit, controller, digital signal processor and/or an application-specific integrated circuit, collectively depicted as processor 120. The user interface and the control of peripheral components such as buttons and the display are handled by a microcontroller 122. The processor 120, in response to a user input at 122, such as half pressing a shutter button (pre-capture mode 32), initiates and controls the digital photographic process.


Ambient light exposure is determined using a light sensor 40 in order to automatically determine if a flash is to be used. The distance to the subject is determined using a focusing mechanism 50 which also focuses the image on an image capture device 60. If a flash is to be used, processor 120 causes a flash device 70 to generate a photographic flash in substantial coincidence with the recording of the image by the image capture device 60 upon full depression of the shutter button.


The image capture device 60 digitally records the image in colour. The image capture device is known to those familiar with the art and may include a CCD (charge coupled device) or CMOS to facilitate digital recording. The flash may be selectively generated either in response to the light sensor 40 or a manual input 72 from the user of the camera. The high resolution image recorded by image capture device 60 is stored in an image store 80 which may comprise computer memory such as dynamic random access memory or a non-volatile memory. The camera is equipped with a display 100, such as an LCD, for preview images.


In the case of preview images, which are generated in the pre-capture mode 32 with the shutter button half-pressed, the display 100 can assist the user in composing the image, as well as being used to determine focusing and exposure. Temporary storage 82 is used to store one or a plurality of the preview images and can be part of the image store 80 or a separate component. The preview image is usually generated by the image capture device 60. For speed and memory efficiency, preview images usually have a lower pixel resolution than the main image taken when the shutter button is fully depressed, and are generated by sub-sampling a raw captured image using software 124 which can be part of the general processor 120, dedicated hardware, or a combination thereof.
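
For illustration only, a minimal sketch of this kind of sub-sampling (Python with NumPy; the function name and the fixed decimation factor are our assumptions, not the patent's):

    import numpy as np

    def make_preview(raw: np.ndarray, factor: int = 4) -> np.ndarray:
        """Generate a low-resolution preview by keeping every `factor`-th
        pixel of the raw captured image in each dimension. Real firmware
        would typically filter before decimating to limit aliasing."""
        return raw[::factor, ::factor]

    # Example: a 1536x2048 raw frame becomes a 384x512 preview.
    preview = make_preview(np.zeros((1536, 2048), dtype=np.uint16))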


Various processing functions 90 carried out on the main, high-resolution, image, and/or low resolution preview images, such as redeye detection and correction 92 and de-blurring 94, can be integral to the camera 10 or part of an external processing device 20 such as a desktop computer.


The camera 10 also includes two angular rate-sensing gyroscopic sensors 130 having vertical and horizontal axes respectively (vertical and horizontal refer to the orientation of the axes when the camera is held perfectly upright and level). In the present embodiment each sensor 130 comprises an Analog Devices ADXRS401 single chip sensor capable of measuring up to 75 degrees per second about its axis. The analog output signals from the two sensors are sampled 40,000 times per second for analog-to-digital conversion and applied to the digital processor 120. This allows the processor 120 to discriminate between voluntary and involuntary movement of the camera, as well as between camera movement and electronic jitter, in the following manner.


In a digital camera one may encounter three situations:


1. There is no movement (FIG. 2). The sensor output signal is solely electronic jitter (sensor noise).


2. There is involuntary hand jitter (FIG. 3). It adds to the sensor noise. Here, the hand jitter is dominant.


3. There is voluntary, desired hand movement (FIG. 4), for example, while panning to follow a moving vehicle. This adds to the involuntary hand jitter and sensor noise, but is dominant.


Sensor noise has a small amplitude and a large percentage of its energy in the high frequency domain. Hand jitter increases the amplitude of the sensor output signals and the energy in the low frequency bands, while voluntary movement has the largest amplitude (due to a non-zero DC component) and most of its signal energy near zero frequency. The present embodiment discriminates between these three situations as a function of the number of zero-crossings per second NZC of each sensor signal, which is a measure of frequency, and the average of the absolute amplitude of each sensor signal |W̄| in volts. The number of zero crossings per second (NZC) of a discrete signal w(n), where n = 1, ..., N, is:

$$\mathrm{NZC} = \sum_{n=1}^{N-1} H\bigl(-w(n)\cdot w(n+1)\bigr),$$

where

$$H(x) = \begin{cases} 0, & x \le 0 \\ 1, & x > 0. \end{cases}$$

The value of NZC need not be established by counting zero crossings over a full second: NZC changes relatively slowly over time, so a count taken over, for example, 1/10th of a second can be scaled up (in this example by multiplying by 10) and expressed as a per-second figure. The absolute amplitude of the signal is preferably averaged over the same period as the zero-crossing count, but since the absolute average also changes relatively slowly over time, exact correspondence is not strictly necessary.
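
For illustration, the two quantities can be computed as follows (a minimal Python/NumPy sketch; the 1/10-second window follows the example above, and the function name is ours):

    import numpy as np

    def nzc_and_abs_mean(w: np.ndarray, window_seconds: float = 0.1):
        """Zero crossings per second and average absolute amplitude of a
        sensor signal `w` sampled over `window_seconds`."""
        # A zero crossing occurs where consecutive samples differ in sign,
        # i.e. where H(-w(n) * w(n+1)) = 1 in the formula above.
        crossings = int(np.sum(w[:-1] * w[1:] < 0))
        nzc = crossings / window_seconds      # scale the count to per-second
        abs_mean = float(np.mean(np.abs(w)))  # |W| averaged over the window
        return nzc, abs_mean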


Our tests, using two Analog Devices ADXRS401 single chip sensors mounted on a test board with mutually perpendicular axes and sensor outputs sampled at 40,000 samples per second, have shown that for the three situations above the following criteria generally apply for both horizontal and vertical components of random movement:


1. No movement: NZC ∈ [180; 250] and |W̄| ∈ [0.01; 0.025]


2. Hand jitter: NZC ∈ [50; 160] and |W̄| ∈ [0.03; 0.2]


3. Voluntary movement: NZC < 10 and |W̄| > 0.5.


These ranges can therefore be used for discriminating between the three situations. However, unless the movement is predominantly in one direction, the discrimination is likely to be less accurate if the criteria for only one direction (horizontal or vertical) are used for a decision, as one can see in the example below (FIG. 4, NZC_hor).


Therefore, we have found that a high degree of discrimination accuracy can be obtained by combining the criteria and computing the term:

$$TH = \frac{\mathrm{NZC}_{hor}}{|\overline{W}|_{hor}} + \frac{\mathrm{NZC}_{vert}}{|\overline{W}|_{vert}}$$


In our tests, we found empirically that if TH were greater than 1200, noise was the predominant factor; if not, hand jitter was present. We found this 1200 threshold to be highly accurate in discriminating between sensor noise and hand jitter: in our tests there were 2% false alarms (noise detected as jitter) and 1.8% misdetections, and in 0.16% of cases the test indicated voluntary movement instead of hand jitter.


To discriminate between hand jitter and voluntary movement we compared TH with 400, which we found empirically to be a useful threshold to differentiate between these two types of movement. A TH of less than 400 indicated voluntary movement while a TH greater than 400 indicated hand jitter. For soft movements it is hard to define the physical threshold between hand jitter and a voluntary movement. For hard movements, however, the tests proved 100% accurate.


Of course, if different sensors are used, there will be different threshold levels.
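
Putting the term and the two empirically determined thresholds together, the decision logic amounts to the following (an illustrative Python sketch; the function names are ours, and the default thresholds are the ADXRS401-specific values reported above):

    def compute_th(nzc_hor, abs_hor, nzc_vert, abs_vert):
        """TH = NZC_hor / |W|_hor + NZC_vert / |W|_vert."""
        return nzc_hor / abs_hor + nzc_vert / abs_vert

    def classify_movement(th, t_voluntary=400.0, t_noise=1200.0):
        """Map TH onto the three situations using the two thresholds."""
        if th > t_noise:
            return "noise"       # TH > 1200: sensor noise predominant
        if th < t_voluntary:
            return "voluntary"   # TH < 400: deliberate movement, e.g. panning
        return "jitter"          # otherwise: involuntary hand jitter

    # With the FIG. 2 values given below, compute_th(321, 0.025, 140, 0.0034)
    # yields 54016.47 ("noise"); the FIG. 4 values yield 82.42 ("voluntary").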



FIGS. 2 to 4 are typical waveforms of the sensor outputs for the three situations referred to above.



FIG. 2: Pure sensor noise. As one can see, both records have many zero-crossings but no significant DC component. High-frequency content is much more prominent than in the other cases, and the energy level is low. The values computed from these records are:


NZC_hor = 321


NZC_vert = 140


|W̄|_hor = 0.025


|W̄|_vert = 0.0034


TH = 54016.47



FIG. 3: Hand jitter is predominant. As one can see, both of the records have insignificant DC components and a rather high number of zero-crossings. The values computed from these records are:


NZC_hor = 78


NZC_vert = 119


|W̄|_hor = 0.093


|W̄|_vert = 0.079


TH = 2345.03



FIG. 4: Voluntary movement is predominant. As one can see, both records have significant DC components, which decrease the number of zero-crossings. The DC component indicates the existence of voluntary movement. The values computed from these records are:


NZC_hor = 15


NZC_vert = 0


|W̄|_hor = 0.182


|W̄|_vert = 0.284


TH = 82.42


The above technique is embodied in the camera 10 by the processor 120 iteratively calculating TH from the output signals of the sensors 130, comparing the calculated value with the thresholds 400 and 1200, and setting or clearing a respective flag according to whether TH is greater than 1200, less than 400, or between 400 and 1200. The state of the flags at any given moment indicates, to a high degree of accuracy, whether the immediately preceding measurement detected sensor noise only, hand jitter or voluntary movement. This is done cyclically at least while the camera is in preview mode with the shutter button half-depressed, right up to the moment that the final full-resolution image is captured. The final state of the flags at the moment the shutter is pressed can be stored as metadata with the image and used in processing it, e.g. as an input to the de-blur function 94; alternatively, the state of the flags can be fed directly to the image processing function. During the preview phase the state of the flags can be used to determine whether to enable or disable an image stabilisation function, or otherwise used as an input to such a function to modify its operation.
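
A schematic rendering of that preview-time cycle, reusing the sketches above (illustrative only; the flag representation and the signature are our assumptions):

    MOVEMENT_FLAGS = {"noise": False, "jitter": False, "voluntary": False}

    def update_movement_flags(w_hor, w_vert, window_seconds=0.1):
        """One cycle of the preview-mode measurement: recompute TH from the
        two sensor signals and set the flag for the detected state."""
        nzc_h, abs_h = nzc_and_abs_mean(w_hor, window_seconds)
        nzc_v, abs_v = nzc_and_abs_mean(w_vert, window_seconds)
        state = classify_movement(compute_th(nzc_h, abs_h, nzc_v, abs_v))
        for key in MOVEMENT_FLAGS:                 # set the matching flag,
            MOVEMENT_FLAGS[key] = (key == state)   # clear the others
        return state  # the state at shutter press can be stored as metadata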


In another application, the image processing functions include a face tracking module 96 as described in U.S. patent application Ser. No. 11/464,083, filed Aug. 11, 2006, now U.S. Pat. No. 7,315,631. Such a module periodically analyses acquired images to detect faces and subsequently tracks these faces from one image to another in an image stream. Face detection is relatively processor intensive and needs to be performed as judiciously as possible. Thus, using the information provided by the present embodiment, the module 96 can, for example, switch off face detection/tracking when the camera is being voluntarily moved, on the presumption that it could not track face movement while the camera is panning. If hand jitter is being experienced, the module can instead use the degree and direction of movement to predict where a face candidate region being tracked may have moved from one image in the stream to the next. If noise rather than hand jitter is being experienced, the module 96 can use the existing location of a face candidate region rather than attempting to predict its location based on camera movement.
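
Schematically, the module 96 might act on the detected movement state as follows (an illustrative sketch only; the tracker interface is invented for this example and is not part of the patent):

    def adapt_face_tracking(state, tracker, frame, measured_motion):
        """Adapt face detection/tracking to the detected movement state."""
        if state == "voluntary":
            tracker.disable()  # panning: face tracking is switched off
        elif state == "jitter":
            # use the degree and direction of jitter to predict where each
            # tracked face candidate region has moved between frames
            tracker.predict_and_match(frame, measured_motion)
        else:  # "noise": camera effectively stationary
            tracker.match_at_existing_locations(frame)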


The invention is not limited to the embodiments described herein which may be modified or varied without departing from the scope of the invention.

Claims
  • 1. A digital camera-enabled portable device, comprising: a lens and image sensor for acquiring digital images; a processor; at least one angular rate-sensing gyroscopic sensor; an electronic circuit responsive to a sensor output signal to discriminate between voluntary and involuntary movements of the article as a function at least of the number of zero crossings per unit time of the signal; and a face tracking module configured to detect and track a face within the images and to select an operating condition based on a determination as to whether the article is undergoing voluntary movement, and wherein the operating condition comprises turning face tracking off upon determining that the article is undergoing voluntary movement.
  • 2. The digital camera enabled portable device of claim 1, wherein the function is proportional to the number of zero crossings of the signal.
  • 3. The digital camera enabled portable device of claim 1, including first and second angular rate-sensing gyroscopic sensors with transverse axes, the electronic circuit being responsive to both sensor output signals to discriminate between voluntary and involuntary movements of the article.
  • 4. The digital camera-enabled portable device of claim 3, wherein the axes of the first and second angular rate-sensing gyroscopic sensors are substantially perpendicular to one another.
  • 5. The digital camera-enabled portable device of claim 1, wherein the electronic circuit is responsive to the sensor output signal to discriminate between voluntary and involuntary movements of the article as a function also of the average of the absolute amplitude of the signal.
  • 6. A digital camera-enabled portable device, comprising: a lens and image sensor for acquiring digital images; a processor; at least one angular rate-sensing gyroscopic sensor; an electronic circuit responsive to a sensor output signal to discriminate between voluntary and involuntary movements of the article as a function at least of the number of zero crossings per unit time of the signal; and a face tracking module configured to detect and track a face within the images and to select an operating condition based on a determination as to whether the article is undergoing voluntary movement, first and second angular rate-sensing gyroscopic sensors with transverse axes, the electronic circuit being responsive to both sensor output signals to discriminate between voluntary and involuntary movements of the article, wherein the function is proportional to the number of zero crossings of the signal and inversely proportional to the average of the absolute amplitude of the signal.
  • 7. The digital camera-enabled portable device of claim 6, wherein the electronic circuit discriminates between involuntary and voluntary movements by determining whether the function falls below a first threshold indicating voluntary movement or above the first threshold indicating no voluntary movement.
  • 8. The digital camera-enabled portable device of claim 7, wherein the electronic circuit further discriminates between involuntary movements and sensor noise by determining whether the function falls below a second predetermined threshold greater than the first threshold indicating involuntary movement or above the second threshold indicating no movement.
  • 9. A digital camera-enabled portable device, comprising: a lens and image sensor for acquiring digital images; a processor; at least one angular rate-sensing gyroscopic sensor; an electronic circuit responsive to a sensor output signal to discriminate between voluntary and involuntary movements of the article as a function at least of the number of zero crossings per unit time of the signal; and a face tracking module configured to detect and track a face within the images and to select an operating condition based on a determination as to whether the article is undergoing voluntary movement, wherein the electronic circuit is responsive to the sensor output signal to discriminate between voluntary and involuntary movements of the article as a function also of the average of the absolute amplitude of the signal, and wherein the function is proportional to the number of zero crossings of the signal and inversely proportional to the average of the absolute amplitude of the signal.
  • 10. The digital camera-enabled portable device of claim 9, wherein the electronic circuit discriminates between involuntary and voluntary movements by determining whether the function falls below a first threshold indicating voluntary movement, or above the first threshold indicating no voluntary movement.
  • 11. The digital camera-enabled portable device of claim 10, wherein the electronic circuit further discriminates between involuntary movements and sensor noise when the function is above the first threshold by determining whether the function falls below a second predetermined threshold greater than the first threshold indicating involuntary movement, or above the second threshold indicating no movement.
  • 12. The digital camera-enabled portable device of claim 11, wherein the face tracking module is further configured to select an operating condition based on a further determination as to whether the article is undergoing no movement.
  • 13. The digital camera-enabled portable device of claim 12, wherein when the device is determined to have no movement, the operating condition comprises using an existing location for a face candidate region.
  • 14. The digital camera-enabled portable device of claim 13, wherein the operating condition comprises not attempting to predict a location for the face based on any camera movement.
  • 15. The digital camera-enabled portable device of claim 9, wherein the operating condition comprises turning face tracking off upon determining that the article is undergoing voluntary movement.
  • 16. One or more non-transitory computer-readable storage device having code embedded therein for programming a processor to perform a method of tracking a face within multiple images of an image stream acquired with a digital camera-enabled portable device equipped with at least one angular rate-sensing gyroscope, wherein the method comprises: receiving an angular rate-sensing gyroscopic sensor output signal; calculating a value of a function at least of the number of zero crossings per unit time of the signal; discriminating between voluntary and involuntary movements of the digital camera-enabled portable device based on the value; and detecting and tracking said face within the multiple images only when the device is not undergoing voluntary movement, wherein the detecting and tracking of said face comprises turning face tracking off upon determining that the device is undergoing voluntary movement.
  • 17. The one or more computer-readable storage devices of claim 16, wherein the function is proportional to the number of zero crossings of the signal.
  • 18. The one or more computer-readable storage devices of claim 16, wherein the device is equipped with first and second angular rate-sensing gyroscopic sensors with transverse axes, and the discriminating between voluntary and involuntary movements of the device comprises analyzing both sensor output signals.
  • 19. The one or more computer-readable storage devices of claim 18, wherein the axes of the first and second angular rate-sensing gyroscopic sensors are substantially perpendicular to one another.
  • 20. The one or more computer-readable storage devices of claim 16, wherein the function is also a function of the average of the absolute amplitude of the signal.
  • 21. One or more non-transitory computer-readable storage device having code embedded therein for programming a processor to perform a method of tracking a face within multiple images of an image stream acquired with a digital camera-enabled portable device equipped with at least one angular rate-sensing gyroscope, wherein the method comprises: receiving an angular rate-sensing gyroscopic sensor output signal; calculating a value of a function at least of the number of zero crossings per unit time of the signal; discriminating between voluntary and involuntary movements of the digital camera-enabled portable device based on the value; detecting and tracking said face within the multiple images only when the device is not undergoing voluntary movement, wherein the device is equipped with first and second angular rate-sensing gyroscopic sensors with transverse axes, and the discriminating between voluntary and involuntary movements of the device comprises analyzing both sensor output signals, wherein the function is proportional to the number of zero crossings of the signal and inversely proportional to the average of the absolute amplitude of the signal.
  • 22. The one or more computer-readable storage devices of claim 21, wherein the discriminating between involuntary and voluntary movements comprises determining whether the function falls below a first threshold indicating voluntary movement or above the first threshold indicating no voluntary movement.
  • 23. The one or more computer-readable storage devices of claim 22, wherein the method further comprises discriminating between involuntary movements and sensor noise, including determining whether the function falls below a second predetermined threshold greater than the first threshold indicating involuntary movement or above the second threshold indicating no movement.
  • 24. The one or more computer-readable storage devices of claim 21, wherein the detecting and tracking of said face comprises turning face tracking off upon determining that the device is undergoing voluntary movement.
  • 25. One or more non-transitory computer-readable storage device having code embedded therein for programming a processor to perform a method of tracking a face within multiple images of an image stream acquired with a digital camera-enabled portable device equipped with at least one angular rate-sensing gyroscope, wherein the method comprises: receiving an angular rate-sensing gyroscopic sensor output signal; calculating a value of a function at least of the number of zero crossings per unit time of the signal; discriminating between voluntary and involuntary movements of the digital camera-enabled portable device based on the value; and detecting and tracking said face within the multiple images only when the device is not undergoing voluntary movement, wherein the function is also a function of the average of the absolute amplitude of the signal, wherein the function is proportional to the number of zero crossings of the signal and inversely proportional to the average of the absolute amplitude of the signal.
  • 26. The one or more computer-readable storage devices of claim 25, wherein the discriminating between involuntary and voluntary movements comprises determining whether the function falls below a first threshold indicating voluntary movement, or above the first threshold indicating no voluntary movement.
  • 27. The one or more computer-readable storage devices of claim 26, wherein the method further comprises discriminating between involuntary movements and sensor noise when the function is above the first threshold by determining whether the function falls below a second predetermined threshold greater than the first threshold indicating involuntary movement, or above the second threshold indicating no movement.
  • 28. The one or more computer-readable storage devices of claim 27, wherein the detecting and tracking of said face further comprises selecting an operating condition based on a further determination as to whether the device is undergoing no movement.
  • 29. The one or more computer-readable storage devices of claim 28, wherein when the device is determined to have no movement, the detecting and tracking of said face comprises using an existing location for a face candidate region.
  • 30. The one or more computer readable storage devices of claim 29, wherein the detecting and tracking of said face does not include predicting its location based on any camera movement.
  • 31. A method of tracking a face within multiple images of an image stream acquired with a digital camera-enabled portable device equipped with at least one angular rate-sensing gyroscope, wherein the method comprises: receiving an angular rate-sensing gyroscopic sensor output signal; calculating a value of a function at least of the number of zero crossings per unit time of the signal; discriminating between voluntary and involuntary movements of the digital camera-enabled portable device based on the value; and detecting and tracking said face within the multiple images only when the device is not undergoing voluntary movement, and wherein the detecting and tracking of said face comprises turning face tracking off upon determining that the device is undergoing voluntary movement.
  • 32. The method of claim 31, wherein the function is proportional to the number of zero crossings of the signal.
  • 33. The method of claim 31, wherein the device is equipped with first and second angular rate-sensing gyroscopic sensors with transverse axes, and the discriminating between voluntary and involuntary movements of the device comprises analyzing both sensor output signals.
  • 34. The method of claim 33, wherein the axes of the first and second angular rate-sensing gyroscopic sensors are substantially perpendicular to one another.
  • 35. The method of claim 31, wherein the function is also a function of the average of the absolute amplitude of the signal.
  • 36. A method of tracking a face within multiple images of an image stream acquired with a digital camera-enabled portable device equipped with at least one angular rate-sensing gyroscope, wherein the method comprises: receiving an angular rate-sensing gyroscopic sensor output signal; calculating a value of a function at least of the number of zero crossings per unit time of the signal; discriminating between voluntary and involuntary movements of the digital camera-enabled portable device based on the value; and detecting and tracking said face within the multiple images only when the device is not undergoing voluntary movement, wherein the device is equipped with first and second angular rate-sensing gyroscopic sensors with transverse axes, and the discriminating between voluntary and involuntary movements of the device comprises analyzing both sensor output signals, wherein the function is proportional to the number of zero crossings of the signal and inversely proportional to the average of the absolute amplitude of the signal.
  • 37. The method of claim 36, wherein the discriminating between involuntary and voluntary movements comprises determining whether the function falls below a first threshold indicating voluntary movement or above the first threshold indicating no voluntary movement.
  • 38. The method of claim 37, further comprising discriminating between involuntary movements and sensor noise, including determining whether the function falls below a second predetermined threshold greater than the first threshold indicating involuntary movement or above the second threshold indicating no movement.
  • 39. A method of tracking a face within multiple images of an image stream acquired with a digital camera-enabled portable device equipped with at least one angular rate-sensing gyroscope, wherein the method comprises: receiving an angular rate-sensing gyroscopic sensor output signal; calculating a value of a function at least of the number of zero crossings per unit time of the signal; discriminating between voluntary and involuntary movements of the digital camera-enabled portable device based on the value; and detecting and tracking said face within the multiple images only when the device is not undergoing voluntary movement, wherein the function is also a function of the average of the absolute amplitude of the signal, wherein the function is proportional to the number of zero crossings of the signal and inversely proportional to the average of the absolute amplitude of the signal.
  • 40. The method of claim 39, wherein the discriminating between involuntary and voluntary movements comprises determining whether the function falls below a first threshold indicating voluntary movement, or above the first threshold indicating no voluntary movement.
  • 41. The method of claim 40, further comprising discriminating between involuntary movements and sensor noise when the function is above the first threshold by determining whether the function falls below a second predetermined threshold greater than the first threshold indicating involuntary movement, or above the second threshold indicating no movement.
  • 42. The method of claim 41, wherein the detecting and tracking of said face further comprises selecting an operating condition based on a further determination as to whether the device is undergoing no movement.
  • 43. The method of claim 42, wherein when the device is determined to have no movement, the detecting and tracking of said face comprises using an existing location for a face candidate region.
  • 44. The method of claim 43, wherein the detecting and tracking of said face does not include predicting its location based on any camera movement.
  • 45. The method of claim 39, wherein the detecting and tracking of said face comprises turning face tracking off upon determining that the device is undergoing voluntary movement.
PRIORITY

This application is a Continuation of U.S. patent application Ser. No. 11/690,836, filed on Mar. 25, 2007, now U.S. Pat. No. 7,773,118, issued on Aug. 10, 2010, which is hereby incorporated by reference.

US Referenced Citations (193)
Number Name Date Kind
5251019 Moorman et al. Oct 1993 A
5374956 D'luna Dec 1994 A
5392088 Abe et al. Feb 1995 A
5428723 Ainscow et al. Jun 1995 A
5510215 Prince et al. Apr 1996 A
5599766 Boroson et al. Feb 1997 A
5686383 Long et al. Nov 1997 A
5747199 Roberts et al. May 1998 A
5751836 Wildes et al. May 1998 A
5756239 Wake May 1998 A
5756240 Roberts et al. May 1998 A
5802220 Black et al. Sep 1998 A
5889277 Hawkins et al. Mar 1999 A
5889554 Mutze Mar 1999 A
5909242 Kobayashi et al. Jun 1999 A
5981112 Roberts Nov 1999 A
6028960 Graf et al. Feb 2000 A
6035072 Read Mar 2000 A
6041078 Rao Mar 2000 A
6061462 Tostevin et al. May 2000 A
6081606 Hansen et al. Jun 2000 A
6114075 Long et al. Sep 2000 A
6122017 Taubman Sep 2000 A
6124864 Madden et al. Sep 2000 A
6134339 Luo Oct 2000 A
6269175 Hanna et al. Jul 2001 B1
6297071 Wake Oct 2001 B1
6297846 Edanami Oct 2001 B1
6326108 Simons Dec 2001 B2
6330029 Hamilton et al. Dec 2001 B1
6360003 Doi et al. Mar 2002 B1
6365304 Simons Apr 2002 B2
6381279 Taubman Apr 2002 B1
6387577 Simons May 2002 B2
6407777 DeLuca Jun 2002 B1
6535244 Lee et al. Mar 2003 B1
6555278 Loveridge et al. Apr 2003 B1
6567536 McNitt et al. May 2003 B2
6599668 Chari et al. Jul 2003 B2
6602656 Shore et al. Aug 2003 B1
6607873 Chari et al. Aug 2003 B2
6618491 Abe Sep 2003 B1
6625396 Sato Sep 2003 B2
6643387 Sethuraman et al. Nov 2003 B1
6741960 Kim et al. May 2004 B2
6863368 Sadasivan et al. Mar 2005 B2
6892029 Tsuchida et al. May 2005 B2
6947609 Seeger et al. Sep 2005 B2
6961518 Suzuki Nov 2005 B2
7019331 Winters et al. Mar 2006 B2
7072525 Covell Jul 2006 B1
7084037 Gamo et al. Aug 2006 B2
7160573 Sadasivan et al. Jan 2007 B2
7177538 Sato et al. Feb 2007 B2
7180238 Winters Feb 2007 B2
7195848 Roberts Mar 2007 B2
7292270 Higurashi et al. Nov 2007 B2
7315324 Cleveland et al. Jan 2008 B2
7315630 Steinberg et al. Jan 2008 B2
7315631 Corcoran et al. Jan 2008 B1
7316630 Tsukada et al. Jan 2008 B2
7316631 Tsunekawa Jan 2008 B2
7317815 Steinberg et al. Jan 2008 B2
7336821 Ciuc et al. Feb 2008 B2
7369712 Steinberg et al. May 2008 B2
7403643 Ianculescu et al. Jul 2008 B2
7453493 Pilu Nov 2008 B2
7453510 Kolehmainen et al. Nov 2008 B2
7460695 Steinberg et al. Dec 2008 B2
7469071 Drimbarean et al. Dec 2008 B2
7489341 Yang et al. Feb 2009 B2
7548256 Pilu Jun 2009 B2
7551755 Steinberg et al. Jun 2009 B1
7565030 Steinberg et al. Jul 2009 B2
7593144 Dymetman Sep 2009 B2
7623153 Hatanaka Nov 2009 B2
7657172 Nomura et al. Feb 2010 B2
7692696 Steinberg et al. Apr 2010 B2
7738015 Steinberg et al. Jun 2010 B2
20010036307 Hanna et al. Nov 2001 A1
20020006163 Hibi et al. Jan 2002 A1
20030052991 Stavely et al. Mar 2003 A1
20030058361 Yang Mar 2003 A1
20030091225 Chen May 2003 A1
20030103076 Neuman Jun 2003 A1
20030151674 Lin Aug 2003 A1
20030152271 Tsujino et al. Aug 2003 A1
20030169818 Obrador Sep 2003 A1
20030193699 Tay Oct 2003 A1
20030219172 Caviedes et al. Nov 2003 A1
20040066981 Li et al. Apr 2004 A1
20040076335 Kim Apr 2004 A1
20040090532 Imada May 2004 A1
20040120598 Feng Jun 2004 A1
20040120698 Hunter Jun 2004 A1
20040130628 Stavely Jul 2004 A1
20040145659 Someya et al. Jul 2004 A1
20040169767 Norita et al. Sep 2004 A1
20040212699 Molgaard Oct 2004 A1
20040218057 Yost et al. Nov 2004 A1
20040218067 Chen et al. Nov 2004 A1
20040247179 Miwa et al. Dec 2004 A1
20050010108 Rahn et al. Jan 2005 A1
20050019000 Lim et al. Jan 2005 A1
20050031224 Prilutsky et al. Feb 2005 A1
20050041121 Steinberg et al. Feb 2005 A1
20050041123 Ansari et al. Feb 2005 A1
20050047672 Ben-Ezra et al. Mar 2005 A1
20050052553 Kido et al. Mar 2005 A1
20050057687 Irani et al. Mar 2005 A1
20050068446 Steinberg et al. Mar 2005 A1
20050068452 Steinberg et al. Mar 2005 A1
20050140801 Prilutsky et al. Jun 2005 A1
20050140829 Uchida et al. Jun 2005 A1
20050195317 Myoga Sep 2005 A1
20050201637 Schuler et al. Sep 2005 A1
20050219391 Sun et al. Oct 2005 A1
20050231625 Parulski et al. Oct 2005 A1
20050248660 Stavely et al. Nov 2005 A1
20050259864 Dickinson et al. Nov 2005 A1
20050270381 Owens et al. Dec 2005 A1
20050281477 Shiraki et al. Dec 2005 A1
20060006309 Dimsdale et al. Jan 2006 A1
20060017837 Sorek et al. Jan 2006 A1
20060038891 Okutomi et al. Feb 2006 A1
20060039690 Steinberg et al. Feb 2006 A1
20060093212 Steinberg et al. May 2006 A1
20060098237 Steinberg et al. May 2006 A1
20060098890 Steinberg et al. May 2006 A1
20060098891 Steinberg et al. May 2006 A1
20060119710 Ben-Ezra et al. Jun 2006 A1
20060120599 Steinberg et al. Jun 2006 A1
20060125938 Ben-Ezra et al. Jun 2006 A1
20060133688 Kang et al. Jun 2006 A1
20060140455 Costache et al. Jun 2006 A1
20060170786 Won Aug 2006 A1
20060171464 Ha Aug 2006 A1
20060187308 Lim et al. Aug 2006 A1
20060204034 Steinberg et al. Sep 2006 A1
20060204054 Steinberg et al. Sep 2006 A1
20060204110 Steinberg et al. Sep 2006 A1
20060285754 Steinberg et al. Dec 2006 A1
20070025714 Shiraki Feb 2007 A1
20070058073 Steinberg et al. Mar 2007 A1
20070083114 Yang et al. Apr 2007 A1
20070086675 Chinen et al. Apr 2007 A1
20070097221 Stavely et al. May 2007 A1
20070110305 Corcoran et al. May 2007 A1
20070147820 Steinberg et al. Jun 2007 A1
20070189748 Drimbarean et al. Aug 2007 A1
20070201724 Steinberg et al. Aug 2007 A1
20070234779 Hsu et al. Oct 2007 A1
20070269108 Steinberg et al. Nov 2007 A1
20070296833 Corcoran et al. Dec 2007 A1
20080012969 Kasai et al. Jan 2008 A1
20080037827 Corcoran et al. Feb 2008 A1
20080037839 Corcoran et al. Feb 2008 A1
20080037840 Steinberg et al. Feb 2008 A1
20080043121 Prilutsky et al. Feb 2008 A1
20080166115 Sachs et al. Jul 2008 A1
20080175481 Petrescu et al. Jul 2008 A1
20080211943 Egawa et al. Sep 2008 A1
20080218611 Parulski et al. Sep 2008 A1
20080219581 Albu et al. Sep 2008 A1
20080219585 Kasai et al. Sep 2008 A1
20080220750 Steinberg et al. Sep 2008 A1
20080231713 Florea et al. Sep 2008 A1
20080232711 Prilutsky et al. Sep 2008 A1
20080240555 Nanu et al. Oct 2008 A1
20080240607 Sun et al. Oct 2008 A1
20080259175 Muramatsu et al. Oct 2008 A1
20080267530 Lim Oct 2008 A1
20080292193 Bigioi et al. Nov 2008 A1
20080309769 Albu et al. Dec 2008 A1
20080309770 Florea et al. Dec 2008 A1
20090003652 Steinberg et al. Jan 2009 A1
20090009612 Tico et al. Jan 2009 A1
20090080713 Bigioi et al. Mar 2009 A1
20090080796 Capata et al. Mar 2009 A1
20090080797 Nanu et al. Mar 2009 A1
20090179999 Albu et al. Jul 2009 A1
20090185041 Kang et al. Jul 2009 A1
20090185753 Albu et al. Jul 2009 A1
20090190803 Neghina et al. Jul 2009 A1
20090196466 Capata et al. Aug 2009 A1
20090284610 Fukumoto et al. Nov 2009 A1
20090303342 Corcoran et al. Dec 2009 A1
20090303343 Drimbarean et al. Dec 2009 A1
20100026823 Sawada Feb 2010 A1
20100053349 Watanabe et al. Mar 2010 A1
20100126831 Ceelen May 2010 A1
20110090352 Wang et al. Apr 2011 A1
20110102642 Wang et al. May 2011 A1
Foreign Referenced Citations (22)
Number Date Country
3729324 Mar 1989 DE
10154203 Jun 2002 DE
10107004 Sep 2002 DE
0944251 Sep 1999 EP
944251 Apr 2003 EP
1583033 Oct 2005 EP
1779322 Jan 2008 EP
1429290 Jul 2008 EP
10285542 Oct 1998 JP
11327024 Nov 1999 JP
2008-520117 Jun 2008 JP
WO9843436 Oct 1998 WO
WO0113171 Feb 2001 WO
WO0245003 Jun 2002 WO
WO2007093199 Aug 2007 WO
WO2007093199 Aug 2007 WO
WO2007142621 Dec 2007 WO
WO2007143415 Dec 2007 WO
WO2008017343 Feb 2008 WO
WO2008131438 Oct 2008 WO
WO2008151802 Dec 2008 WO
WO2009036793 Mar 2009 WO
Related Publications (1)
Number Date Country
20100238309 A1 Sep 2010 US
Continuations (1)
Number Date Country
Parent 11690836 Mar 2007 US
Child 12789300 US