Handheld article with movement discrimination

Information

  • Patent Grant
  • Patent Number
    7,773,118
  • Date Filed
    Sunday, March 25, 2007
  • Date Issued
    Tuesday, August 10, 2010
Abstract
A digital camera has a pair of angular rate-sensing gyroscopic sensors with mutually perpendicular axes and an electronic circuit responsive to the sensor output signals to discriminate between voluntary and involuntary movements of the article as a function of the number of zero crossings per unit time of the signal and the average of the absolute amplitude of the signal.
Description
FIELD OF THE INVENTION

This invention relates to a handheld article, such as a digital camera, including an apparatus which discriminates between voluntary and involuntary movement of the article.


BACKGROUND OF THE INVENTION

Handheld articles such as digital cameras are subject to movement in use, either involuntary (hand-jitter) or voluntary (e.g. panning). It would be useful to discriminate between these two types of movement. It is desired to have an apparatus, e.g., in a handheld article such as a digital camera or other digital camera-enabled portable device, which discriminates between voluntary and involuntary movements of the article.


SUMMARY OF THE INVENTION

According to the present invention there is provided a handheld article having at least one angular rate-sensing gyroscopic sensor and an electronic circuit responsive to the sensor output signal to discriminate between voluntary and involuntary movements of the article as a function of the number of zero crossings per unit time of the signal and the average of the absolute amplitude of the signal.


The article may include first and second angular rate-sensing gyroscopic sensors with transverse axes, and the electronic circuit may be responsive to both sensor output signals to discriminate between voluntary and involuntary movements of the article.


The axes of the first and second angular rate-sensing gyroscopic sensors may be substantially perpendicular to one another.


The function may be proportional to the number of zero crossings of each signal and inversely proportional to the average of the absolute amplitude of each signal.


The electronic circuit may discriminate between voluntary and involuntary movements by determining whether the function, which may be described as:

$\mathrm{NZC}_1/|W|_1 + \mathrm{NZC}_2/|W|_2$,

exceeds a predetermined threshold, where NZC1 and NZC2 are the number of zero crossings per unit time for the output signals from the first and second sensors respectively, and |W|1 and |W|2 are the averages of the absolute amplitude of the output signals from the first and second sensors respectively.


The electronic circuit may discriminate between involuntary movements and sensor noise by determining whether the function falls below a second predetermined threshold less than the first threshold.


In certain embodiments, the article includes a digital camera. The camera may be provided with image processing software responsive to the electronic circuit indicating voluntary and/or involuntary movements to control the processing of the image accordingly. The image processing software may include face detection software which is operable in the absence of the circuit indicating voluntary movement. The image processing software may include de-blur software which is operable in the absence of the circuit indicating voluntary movement.





BRIEF DESCRIPTION OF DRAWINGS

One or more embodiments will now be described, by way of example, with reference to the accompanying drawings, in which:



FIG. 1 is a block diagram of a digital camera operating in accordance with one or more embodiments.



FIGS. 2 to 4 are waveforms useful in understanding the operation of the embodiments described with reference to FIG. 1.





DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS


FIG. 1 is a block diagram of a portable digital camera 10, operating in accordance with an embodiment of the present invention. It will be appreciated that many of the processes in the digital camera are implemented in or controlled by software operating on a microprocessor, central processing unit, controller, digital signal processor and/or an application-specific integrated circuit, collectively depicted as processor 120. The user interface and the control of peripheral components such as buttons and the display are handled by a microcontroller 122. The processor 120, in response to a user input at 122, such as half-pressing a shutter button (pre-capture mode 32), initiates and controls the digital photographic process.


Ambient light exposure is determined using a light sensor 40 in order to automatically determine if a flash is to be used. The distance to the subject is determined using a focusing mechanism 50 which also focuses the image on an image capture device 60. If a flash is to be used, processor 120 causes a flash device 70 to generate a photographic flash in substantial coincidence with the recording of the image by the image capture device 60 upon full depression of the shutter button.


The image capture device 60 digitally records the image in color. The image capture device is known to those familiar with the art and may include a CCD (charge-coupled device) or CMOS sensor to facilitate digital recording. The flash may be selectively generated either in response to the light sensor 40 or in response to a manual input 72 from the user of the camera. The high-resolution image recorded by the image capture device 60 is stored in an image store 80, which may comprise computer memory such as dynamic random-access memory or non-volatile memory. The camera is equipped with a display 100, such as an LCD, for displaying preview images.


In the case of preview images, which are generated in the pre-capture mode 32 with the shutter button half-pressed, the display 100 can assist the user in composing the image, as well as being used to determine focusing and exposure. Temporary storage 82 is used to store one or a plurality of the preview images and can be part of the image store 80 or a separate component. The preview image is usually generated by the image capture device 60. For speed and memory efficiency, preview images usually have a lower pixel resolution than the main image taken when the shutter button is fully depressed, and are generated by sub-sampling a raw captured image using software 124, which can be part of the general processor 120, dedicated hardware, or a combination thereof.


Various processing functions 90 carried out on the main, high-resolution, image, and/or low resolution preview images, such as redeye detection and correction 92 and de-blurring 94, can be integral to the camera 10 or part of an external processing device 20 such as a desktop computer.


The camera 10 also includes two angular rate-sensing gyroscopic sensors 130 having vertical and horizontal axes respectively (vertical and horizontal refer to the orientation of the axes when the camera is held perfectly upright and level). In the present embodiment each sensor 130 comprises an Analog Devices ADXRS401 single-chip sensor capable of measuring up to 75 degrees per second about its axis. The analog output signals from the two sensors are sampled 40,000 times per second for analog-to-digital conversion and applied to the digital processor 120. This allows the processor 120 to discriminate between voluntary and involuntary movement of the camera, as well as between camera movement and electronic jitter, in the following manner.


In a digital camera one may encounter three situations:


1. There is no movement (FIG. 2). The sensor output signal is solely electronic jitter (sensor noise).


2. There is involuntary hand jitter (FIG. 3). It adds to the sensor noise. Here, the hand jitter is dominant.


3. There is voluntary, desired hand movement (FIG. 4), for example, while panning to follow a moving vehicle. This adds to the involuntary hand jitter and sensor noise, but is dominant.


Sensor noise has a small amplitude and a large percentage of its energy in the high-frequency domain. Hand jitter increases the amplitude of the sensor output signals and the energy in the low-frequency bands, while voluntary movement has the largest amplitude (due to a non-zero DC component) and most of the signal energy near zero frequency. The present embodiment discriminates between these three situations as a function of the number of zero crossings per second NZC of each sensor signal, which is a measure of frequency, and the average of the absolute amplitude of each sensor signal |W| in volts. The number of zero crossings per second (NZC) of a discrete signal w(n), where n = 1, . . . , N, is:







$$\mathrm{NZC} = \sum_{n=1}^{N-1} H\bigl(-w(n)\cdot w(n+1)\bigr),$$

where

$$H(x) = \begin{cases} 0, & x \le 0 \\ 1, & x > 0 \end{cases}$$










The value of NZC (which refers to the number of zero crossings per second) need not be established by counting zero crossings over a full second, since NZC changes relatively slowly over time and a count of the number of zero crossings taken over, for example, 1/10th of a second can be scaled up (in this example by multiplying by 10) and expressed as the number per second. The absolute amplitude of the signal is preferably averaged over the same period as the zero crossing count, but again the absolute average changes relatively slowly over time so exact correspondence is not strictly necessary.
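
As a concrete illustration of the counting and scaling just described, the following is a minimal sketch in Python (not part of the patent; NumPy and the function names are assumptions) of how NZC and |W| could be computed from a 0.1-second window of samples:

```python
import numpy as np

def nzc_per_second(w, window_s=0.1):
    """Zero crossings per second of a sampled signal w covering window_s seconds.

    A crossing occurs wherever consecutive samples have opposite signs,
    i.e. H(-w(n) * w(n+1)) = 1 in the formula above.
    """
    crossings = np.sum(w[:-1] * w[1:] < 0)  # sum of H(-w(n)·w(n+1)) over n
    return crossings / window_s             # scale the count up to "per second"

def mean_abs_amplitude(w):
    """Average of the absolute amplitude, |W|, in volts."""
    return float(np.mean(np.abs(w)))
```

For example, a count taken over 1/10th of a second is divided by 0.1, i.e. multiplied by 10, matching the scaling described above.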


Our tests, using two Analog Devices ADXRS401 single-chip sensors mounted on a test board with mutually perpendicular axes and sensor outputs sampled at 40,000 samples per second, have shown that for the three situations above the following criteria generally apply for both the horizontal and vertical components of random movement:


1. No movement: NZC ∈ [180, 250] and |W| ∈ [0.01, 0.025]


2. Hand jitter: NZC ∈ [50, 160] and |W| ∈ [0.03, 0.2]


3. Voluntary movement: NZC < 10 and |W| > 0.5.


These ranges can therefore be used to discriminate between the three situations. However, unless the movement is predominantly in one direction, the discrimination is likely to be less accurate if the criteria for only one direction (horizontal or vertical) are used for a decision, as can be seen in the example below (FIG. 4, NZC_hor).


Therefore, we have found that a high degree of discrimination accuracy can be obtained by combining the criteria and computing the term:






$$TH = \frac{\mathrm{NZC}_{hor}}{\overline{|W|}_{hor}} + \frac{\mathrm{NZC}_{vert}}{\overline{|W|}_{vert}}$$







In our tests, we found empirically that if TH was greater than 1200, noise was the predominant factor; if not, hand jitter was present. We found this 1200 threshold to be highly accurate in discriminating between sensor noise and hand jitter: there were 2% false alarms (noise detected as jitter) and 1.8% misdetections, and in 0.16% of cases the test indicated voluntary movement instead of hand jitter.


To discriminate between hand jitter and voluntary movement we compared TH with 400, which we found empirically to be a useful threshold for differentiating between these two types of movement. A TH of less than 400 indicated voluntary movement, while a TH greater than 400 indicated hand jitter. For soft movements it is hard to define the physical threshold between hand jitter and a voluntary movement; for hard movements, however, the tests proved 100% accurate.
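
Putting the two thresholds together, the decision logic can be expressed compactly. The following is a hedged sketch (the function name and return labels are illustrative, and the 1200/400 values apply only to the ADXRS401 setup described above):

```python
def classify_movement(nzc_hor, w_hor, nzc_vert, w_vert,
                      noise_th=1200.0, voluntary_th=400.0):
    """Classify motion from both gyro axes via
    TH = NZC_hor/|W|_hor + NZC_vert/|W|_vert.

    The thresholds are the empirical values reported for the ADXRS401
    test board; different sensors will require different levels.
    """
    th = nzc_hor / w_hor + nzc_vert / w_vert
    if th > noise_th:
        return "sensor_noise"        # high-frequency, low-amplitude signal
    if th < voluntary_th:
        return "voluntary_movement"  # few zero crossings, large DC component
    return "hand_jitter"

# The FIG. 2 values below give TH = 321/0.025 + 140/0.0034 ≈ 54016,
# well above 1200, so this sketch classifies them as sensor noise.
```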


Of course, if different sensors are used, there will be different threshold levels.



FIGS. 2 to 4 are typical waveforms of the sensor outputs for the three situations referred to above.



FIG. 2: Pure sensor noise. As one can see, both records have many zero crossings but no significant DC component, and high-frequency content is much more prominent than in the other cases. The energy level is low. The values computed from these records are:


NZC_hor = 321

NZC_vert = 140

|W|_hor = 0.025

|W|_vert = 0.0034

TH = 54016.47



FIG. 3: Hand jitter is predominant. As one can see, both records have insignificant DC components and a rather high number of zero crossings. The values computed from these records are:


NZC_hor = 78

NZC_vert = 119

|W|_hor = 0.093

|W|_vert = 0.079

TH = 2345.03



FIG. 4: Voluntary movement is predominant. As one can see, both records have significant DC components, which decrease the number of zero crossings. The DC component indicates the existence of the voluntary movement. The values computed from these records are:


NZC_hor = 15

NZC_vert = 0

|W|_hor = 0.182

|W|_vert = 0.284

TH = 82.42


The above technique is embodied in the camera 10 by the processor 120 iteratively calculating TH from the output signals from the sensors 130, comparing the calculated value with the thresholds 400 and 1200, and setting or clearing a respective flag according to whether TH is greater than 1200, less than 400, or between 400 and 1200. The state of the flags at any given moment will indicate, to a high degree of accuracy, whether the immediately preceding measurement detected sensor noise only, hand jitter or voluntary movement. This is done cyclically at least while the camera is in preview mode with the shutter button half-depressed, right up to the moment that the final full-resolution image is captured. The final state of the flags at the moment the shutter is pressed can be stored as metadata with the image and used in processing the image, e.g. as an input to the de-blur function 94; alternatively, the state of the flags can be fed directly to the image processing function. During the preview phase the state of the flags can be used to determine whether to enable or disable an image stabilization function, or otherwise used as input to such a function to modify its operation.
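
A minimal sketch of this preview-time cycle, building on the helper functions sketched above (the MotionFlags holder and read_window() are assumptions, not part of the patent, which describes the flags only abstractly):

```python
class MotionFlags:
    """Illustrative flag holder updated on each measurement cycle."""
    def __init__(self):
        self.state = None  # "sensor_noise", "hand_jitter" or "voluntary_movement"

def preview_cycle(read_window, flags):
    """One cycle: sample both gyros, compute TH, update the flags.

    read_window() is an assumed helper returning ~0.1 s of samples from
    the horizontal and vertical sensors as two arrays.
    """
    w_h, w_v = read_window()
    flags.state = classify_movement(
        nzc_per_second(w_h), mean_abs_amplitude(w_h),
        nzc_per_second(w_v), mean_abs_amplitude(w_v))
    # At capture time, flags.state can be stored as image metadata or fed
    # to the de-blur (94) or image stabilization functions.
```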


In another application, the image processing functions include a face tracking module 96 as described in co-pending application Ser. No. 11/464,083, filed Aug. 11, 2006. Such a module periodically analyses acquired images to detect faces within the images and subsequently tracks these faces from one image to another in an image stream. Face detection is relatively processor-intensive and needs to be performed as judiciously as possible. Thus, using the information provided by the present embodiment, the module 96 can, for example, switch off face detection/tracking when the camera is being voluntarily moved, on the presumption that it could not track face movement during panning; whereas if hand jitter is being experienced, the module can use the degree and direction of movement to attempt to predict where a face candidate region being tracked may have moved from one image in the stream to the next. If noise rather than hand jitter is being experienced, the module 96 could decide to use an existing location for a face candidate region rather than attempting to predict its location based on camera movement. A sketch of this decision logic follows.
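
The following is a minimal, hypothetical sketch of that gating; the tracker object and its methods are illustrative assumptions, not the interface of module 96:

```python
def update_face_tracking(motion_state, tracker):
    """Gate the face tracking module on the current motion classification.

    tracker is a hypothetical object exposing the three methods used below.
    """
    if motion_state == "voluntary_movement":
        tracker.disable_detection()    # panning: face movement cannot be tracked
    elif motion_state == "hand_jitter":
        tracker.predict_from_motion()  # use jitter degree/direction to predict
    else:  # sensor noise only: camera is effectively still
        tracker.reuse_last_locations() # keep existing face candidate regions
```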


While exemplary drawings and specific embodiments of the present invention have been described and illustrated, it is to be understood that the scope of the present invention is not to be limited to the particular embodiments discussed. Thus, the embodiments shall be regarded as illustrative rather than restrictive, and it should be understood that variations may be made in those embodiments by workers skilled in the arts without departing from the scope of the present invention as set forth in the claims that follow and their structural and functional equivalents.


In addition, in methods that may be performed according to the claims below and/or preferred embodiments herein, the operations have been described in selected typographical sequences. However, the sequences have been selected and so ordered for typographical convenience and are not intended to imply any particular order for performing the operations, unless a particular ordering is expressly provided or understood by those skilled in the art as being necessary.


In addition, all references cited herein, as well as the background, invention summary, abstract and brief description of the drawings, as well as U.S. Pat. Nos. 6,407,777 and 6,035,072, US published patent applications nos. 2005/0041121, 2005/0031224, 2006/0204054, 2005/0140801, 2006/0204110, 2006/0098237, 2006/0093212, 2006/0120599, and 2006/0140455, and U.S. patent applications Nos. 60/773,714, 60/804,546, 60/865,375, 60/865,622, 60/829,127, 60/821,165, 60/803,980, and Ser. Nos. 11/554,539, 11/464,083, 11/462,035, 11/027,001, 10/842,244, 11/024,046, 11/233,513, and 11/460,218, are all incorporated by reference into the detailed description of the preferred embodiments as disclosing alternative embodiments.

Claims
  • 1. A handheld article having at least one angular rate-sensing gyroscopic sensor and an electronic circuit responsive to a sensor output signal to discriminate between voluntary and involuntary movements of the article as a function of the number of zero crossings per unit time of the signal and the average of the absolute amplitude of the signal, and wherein the function is proportional to the number of zero crossings of the signal and inversely proportional to the average of the absolute amplitude of the signal.
  • 2. An article as claimed in claim 1, including first and second angular rate-sensing gyroscopic sensors with transverse axes, the electronic circuit being responsive to both sensor output signals to discriminate between voluntary and involuntary movements of the article.
  • 3. An article as claimed in claim 2, wherein the axes of the first and second angular rate-sensing gyroscopic sensors are substantially perpendicular to one another.
  • 4. A handheld article having first and second angular rate-sensing gyroscopic sensors with transverse axes that are substantially perpendicular to one another, and an electronic circuit responsive to sensor output signals, to discriminate between voluntary and involuntary movements of the article as a function of the number of zero crossings per unit time of the signal and the average of the absolute amplitude of the signal, wherein the electronic circuit discriminates between the voluntary and involuntary movements by determining whether the function NZC1/|W|1+NZC2/|W|2 exceeds a predetermined threshold, where NZC1 and NZC2 are the number of zero crossings per unit time for the output signals from the first and second sensors respectively, and |W|1 and |W|2 are the averages of the absolute amplitude of the output signals from the first and second sensors respectively.
  • 5. An article as claimed in claim 1 or claim 4, wherein the article is a digital camera.
  • 6. An article as claimed in claim 5 wherein the camera is provided with image processing software responsive to said circuit indicating either voluntary or involuntary movement to control the processing of said image accordingly.
  • 7. An article as claimed in claim 6 wherein said image processing software comprises face detection software which is operable in the absence of said circuit indicating voluntary movement.
  • 8. An article as claimed in claim 6 wherein said image processing software comprises de-blur software which is operable in the absence of said circuit indicating voluntary movement.
  • 9. An article as claimed in claim 3 or claim 4, wherein the electronic circuit further discriminates between involuntary movements and sensor noise by determining whether the said function falls below a second predetermined threshold less than the first threshold.
  • 10. An article as in claim 9, wherein the function is proportional to the number of zero crossings of the signals and inversely proportional to the average of the absolute amplitude of the signals.
US Referenced Citations (161)
Number Name Date Kind
5251019 Moorman et al. Oct 1993 A
5374956 D'Luna Dec 1994 A
5392088 Abe et al. Feb 1995 A
5428723 Ainscow et al. Jun 1995 A
5510215 Prince et al. Apr 1996 A
5599766 Boroson et al. Feb 1997 A
5686383 Long et al. Nov 1997 A
5747199 Roberts et al. May 1998 A
5751836 Wildes et al. May 1998 A
5756239 Wake May 1998 A
5756240 Roberts et al. May 1998 A
5802220 Black et al. Sep 1998 A
5889277 Hawkins et al. Mar 1999 A
5889554 Mutze Mar 1999 A
5909242 Kobayashi et al. Jun 1999 A
5981112 Roberts Nov 1999 A
6028960 Graf et al. Feb 2000 A
6035072 Read Mar 2000 A
6061462 Tostevin et al. May 2000 A
6081606 Hansen et al. Jun 2000 A
6114075 Long et al. Sep 2000 A
6124864 Madden et al. Sep 2000 A
6134339 Luo Oct 2000 A
6269175 Hanna et al. Jul 2001 B1
6297071 Wake Oct 2001 B1
6297846 Edanami Oct 2001 B1
6326108 Simons Dec 2001 B2
6330029 Hamilton et al. Dec 2001 B1
6360003 Doi et al. Mar 2002 B1
6365304 Simons Apr 2002 B2
6387577 Simons May 2002 B2
6407777 DeLuca Jun 2002 B1
6535244 Lee et al. Mar 2003 B1
6555278 Loveridge et al. Apr 2003 B1
6567536 McNitt et al. May 2003 B2
6599668 Chari et al. Jul 2003 B2
6602656 Shore et al. Aug 2003 B1
6607873 Chari et al. Aug 2003 B2
6618491 Abe Sep 2003 B1
6625396 Sato Sep 2003 B2
6643387 Sethuraman et al. Nov 2003 B1
6741960 Kim et al. May 2004 B2
6863368 Sadasivan et al. Mar 2005 B2
6892029 Tsuchida et al. May 2005 B2
6947609 Seeger et al. Sep 2005 B2
6961518 Suzuki Nov 2005 B2
7019331 Winters et al. Mar 2006 B2
7072525 Covell Jul 2006 B1
7084037 Gamo et al. Aug 2006 B2
7160573 Sadasivan et al. Jan 2007 B2
7177538 Sato et al. Feb 2007 B2
7180238 Winters Feb 2007 B2
7195848 Roberts Mar 2007 B2
7269292 Steinberg Sep 2007 B2
7292270 Higurashi et al. Nov 2007 B2
7315324 Cleveland et al. Jan 2008 B2
7315630 Steinberg et al. Jan 2008 B2
7315631 Corcoran et al. Jan 2008 B1
7316630 Tsukada et al. Jan 2008 B2
7316631 Tsunekawa Jan 2008 B2
7317815 Steinberg et al. Jan 2008 B2
7336821 Ciuc et al. Feb 2008 B2
7369712 Steinberg et al. May 2008 B2
7403643 Ianculescu et al. Jul 2008 B2
7453493 Pilu Nov 2008 B2
7453510 Kolehmainen Nov 2008 B2
7460695 Steinberg et al. Dec 2008 B2
7469071 Drimbarean et al. Dec 2008 B2
7489341 Yang et al. Feb 2009 B2
7548256 Pilu Jun 2009 B2
7551755 Steinberg et al. Jun 2009 B1
7565030 Steinberg et al. Jul 2009 B2
7593144 Dymetman Sep 2009 B2
7623153 Hatanaka Nov 2009 B2
20010036307 Hanna et al. Nov 2001 A1
20020006163 Hibi et al. Jan 2002 A1
20030052991 Stavely et al. Mar 2003 A1
20030058361 Yang Mar 2003 A1
20030091225 Chen May 2003 A1
20030103076 Neuman Jun 2003 A1
20030151674 Lin Aug 2003 A1
20030152271 Tsujino et al. Aug 2003 A1
20030169818 Obrador Sep 2003 A1
20030193699 Tay Oct 2003 A1
20030219172 Caviedes et al. Nov 2003 A1
20040066981 Li et al. Apr 2004 A1
20040076335 Kim Apr 2004 A1
20040090532 Imada May 2004 A1
20040120598 Feng Jun 2004 A1
20040120698 Hunter Jun 2004 A1
20040145659 Someya et al. Jul 2004 A1
20040169767 Norita et al. Sep 2004 A1
20040212699 Molgaard Oct 2004 A1
20040218057 Yost et al. Nov 2004 A1
20040218067 Chen et al. Nov 2004 A1
20050010108 Rahn et al. Jan 2005 A1
20050019000 Lim et al. Jan 2005 A1
20050031224 Prilutsky et al. Feb 2005 A1
20050041121 Steinberg et al. Feb 2005 A1
20050041123 Ansari et al. Feb 2005 A1
20050047672 Ben-Ezra et al. Mar 2005 A1
20050052553 Kido et al. Mar 2005 A1
20050057687 Irani et al. Mar 2005 A1
20050068446 Steinberg et al. Mar 2005 A1
20050068452 Steinberg et al. Mar 2005 A1
20050140801 Prilutsky et al. Jun 2005 A1
20050201637 Schuler et al. Sep 2005 A1
20050219391 Sun et al. Oct 2005 A1
20050231625 Parulski et al. Oct 2005 A1
20050248660 Stavely et al. Nov 2005 A1
20050259864 Dickinson et al. Nov 2005 A1
20050270381 Owens et al. Dec 2005 A1
20060006309 Dimsdale et al. Jan 2006 A1
20060017837 Sorek et al. Jan 2006 A1
20060038891 Okutomi et al. Feb 2006 A1
20060039690 Steinberg et al. Feb 2006 A1
20060093212 Steinberg et al. May 2006 A1
20060098237 Steinberg et al. May 2006 A1
20060098890 Steinberg et al. May 2006 A1
20060098891 Steinberg et al. May 2006 A1
20060120599 Steinberg et al. Jun 2006 A1
20060125938 Ben-Ezra et al. Jun 2006 A1
20060140455 Costache et al. Jun 2006 A1
20060170786 Won Aug 2006 A1
20060171464 Ha Aug 2006 A1
20060187308 Lim et al. Aug 2006 A1
20060204034 Steinberg et al. Sep 2006 A1
20060204054 Steinberg et al. Sep 2006 A1
20060204110 Steinberg et al. Sep 2006 A1
20060285754 Steinberg et al. Dec 2006 A1
20070025714 Shiraki Feb 2007 A1
20070058073 Steinberg et al. Mar 2007 A1
20070083114 Yang et al. Apr 2007 A1
20070110305 Corcoran et al. May 2007 A1
20070147820 Steinberg et al. Jun 2007 A1
20070189748 Drimbarean et al. Aug 2007 A1
20070201724 Steinberg et al. Aug 2007 A1
20070234779 Hsu et al. Oct 2007 A1
20070269108 Steinberg et al. Nov 2007 A1
20070296833 Corcoran et al. Dec 2007 A1
20080037827 Corcoran et al. Feb 2008 A1
20080037839 Corcoran et al. Feb 2008 A1
20080037840 Steinberg et al. Feb 2008 A1
20080043121 Prilutsky et al. Feb 2008 A1
20080175481 Petrescu et al. Jul 2008 A1
20080219581 Albu et al. Sep 2008 A1
20080220750 Steinberg et al. Sep 2008 A1
20080231713 Florea et al. Sep 2008 A1
20080232711 Prilutsky et al. Sep 2008 A1
20080240555 Nanu et al. Oct 2008 A1
20080292193 Bigioi et al. Nov 2008 A1
20080309769 Albu et al. Dec 2008 A1
20080309770 Florea et al. Dec 2008 A1
20090003652 Steinberg et al. Jan 2009 A1
20090080713 Bigioi et al. Mar 2009 A1
20090080796 Capata et al. Mar 2009 A1
20090080797 Nanu et al. Mar 2009 A1
20090179999 Albu et al. Jul 2009 A1
20090185753 Albu et al. Jul 2009 A1
20090190803 Neghina et al. Jul 2009 A1
20090196466 Capata et al. Aug 2009 A1
Foreign Referenced Citations (23)
Number Date Country
3729324 Mar 1989 DE
10154203 Jun 2002 DE
10107004 Sep 2002 DE
944251 Apr 2003 EP
1583033 Oct 2005 EP
1779322 Jan 2008 EP
1429290 Jul 2008 EP
10285542 Oct 1998 JP
11327024 Nov 1999 JP
2008-520117 Jun 2008 JP
WO-9843436 Oct 1998 WO
WO-0245003 Jun 2002 WO
WO-03071484 Aug 2003 WO
WO-04001667 Dec 2003 WO
WO-2004036378 Apr 2004 WO
WO-2006050782 May 2006 WO
WO-2007093199 Aug 2007 WO
WO-2007093199 Aug 2007 WO
WO-2007142621 Dec 2007 WO
WO-2007143415 Dec 2007 WO
WO-2008017343 Feb 2008 WO
WO-2008131438 Oct 2008 WO
WO-2009036793 Mar 2009 WO
Related Publications (1)
Number Date Country
20080231713 A1 Sep 2008 US