System and method for feature location and tracking in multiple dimensions including depth

Information

  • Patent Grant
  • 7050624
  • Patent Number
    7,050,624
  • Date Filed
    Tuesday, July 24, 2001
  • Date Issued
    Tuesday, May 23, 2006
Abstract
The present invention is directed to a method and related system for determining a feature location in multiple dimensions including depth. The method includes providing left and right camera images of the feature and locating the feature in the left camera image and in the right camera image using bunch graph matching. The feature location is determined in multiple dimensions including depth based on the feature locations in the left camera image and the right camera image.
Description
BACKGROUND OF THE INVENTION

The present invention relates to feature tracking techniques, and more particularly, to an eye tracking technique that determines the location of a person's eyes in three-dimensional space.


Virtual reality systems are able to generate three-dimensional images that a person may view without special glasses using, for example, auto-stereoscopic imaging. Auto-stereoscopic imaging requires real-time determination of the location of a viewer's eyes in depth, i.e., in three dimensions.


Accordingly, there exists a need for a system and related tools for location of a person's features in three-dimensional space. The present invention satisfies these needs.


SUMMARY OF THE INVENTION

The present invention is directed to a method and related system for determining a feature location in multiple dimensions including depth. The method includes providing left and right camera images of the feature and locating the feature in the left camera image and in the right camera image using bunch graph matching. The feature location is determined in multiple dimensions including depth based on the feature locations in the left camera image and the right camera image.





Other features and advantages of the present invention should be apparent from the following description of the preferred embodiments taken in conjunction with the accompanying drawings, which illustrate, by way of example, the principles of the invention.


BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an elevation view of a face for feature tracking, according to the present invention.



FIG. 2 is a plan view of the face of FIG. 1 with respect to two tracking cameras.



FIG. 3 is a schematic diagram of the geometry of the face and tracking cameras of FIG. 2.



FIG. 4A is an image from a left camera of the face of FIG. 1.



FIG. 4B is an image from a right camera of the face of FIG. 1.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The present invention is directed to a method and related system for determining a feature location in multiple dimensions including depth. The method includes providing left and right camera images of the feature and locating the feature in the left camera image and in the right camera image using bunch graph matching. The feature location is determined in multiple dimensions including depth based on the feature locations in the left camera image and the right camera image.


An embodiment of the invention is described using a simple face image 12, shown in FIG. 1. The left eye E of the face image is marked with a small diamond to indicate it as the feature for tracking. For simplicity, tracking of only one feature is described; however, several features may be tracked by performing the analysis below for each feature.


The location and tracking of the left eye may be accomplished using two cameras, a right camera CR and a left camera CL, as shown in FIG. 2. Two cameras are generally required to acquire the location in multiple dimensions including depth using the simple geometrical model shown in FIG. 3. The coordinate system may be selected such that the cameras lie along an x-axis and the depth from the cameras is measured along a z-axis. The distance to the left eye E along the z-axis is a depth D, and the distance along the x-axis is a length L (measured from the location of the right camera CR). A normal ray from each camera, NR and NL, indicates an image ray associated with the approximate center of the camera's imaging area.


The imaging areas of the left and right cameras are shown in FIGS. 4A and 4B, respectively. Each imaging area is a rectangular array of imaging or picture elements (pixels). Each vertical row of pixels in each image area corresponds to a particular slope M of an image ray originating at the origin of the respective camera (based on a pinhole model).
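As a concrete illustration of this pinhole mapping, the following Python sketch converts a pixel column to an image-ray slope. The focal length and principal-point values are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the pinhole-model calibration described above:
# each vertical row (column) of pixels maps to the slope of an image ray
# leaving the camera's origin. Focal length and optical center are
# illustrative values, not from the patent.

def column_to_slope(column: int, focal_px: float, center_col: float) -> float:
    """Return the slope M (dx/dz) of the image ray for a pixel column."""
    return (column - center_col) / focal_px

# Example: a 640-pixel-wide sensor with its optical center at column 320.
M = column_to_slope(column=400, focal_px=800.0, center_col=320.0)
print(M)  # 0.1: the ray drifts 0.1 units in x per unit of depth z
```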


The cameras may be calibrated for the image ray slope associated with each vertical row of pixels. A feature in the image frame may be located and tracked using elastic bunch graph matching. As shown in FIG. 4A, the left eye E is imaged in the left image along vertical pixel row PL and, as shown in FIG. 4B, the left eye E is imaged in the right image along vertical pixel row PR. The pixel rows PL and PR are associated with slopes ML and MR, respectively. Accordingly, the location of the left eye E is readily calculated in the x-z plane. Elastic bunch graph matching, and more sophisticated geometrical models and calibration techniques, are described in U.S. patent application Ser. No. 09/206,195, now U.S. Pat. No. 6,301,370, issued on Oct. 9, 2001.
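The calculation itself reduces to intersecting two rays in the x-z plane. The minimal sketch below assumes the right camera CR at x = 0 and the left camera CL at a baseline distance B along the x-axis, with each ray written as x = x_cam + M·z; the function name and numeric values are illustrative, not the patent's.

```python
# Minimal triangulation sketch for the x-z geometry of FIG. 3, assuming
# the right camera CR sits at x = 0 and the left camera CL at x = B.
# A ray with slope M satisfies x = x_cam + M * z, so intersecting the
# two rays gives the depth D and the offset L measured from CR.

def locate_feature(m_left: float, m_right: float, baseline: float):
    """Intersect the left/right image rays; return (L, D) in the x-z plane."""
    # x_R + M_R * z = x_L + M_L * z  =>  z = (x_L - x_R) / (M_R - M_L)
    depth = baseline / (m_right - m_left)
    length = m_right * depth  # x-offset of the feature from camera CR
    return length, depth

L_x, D = locate_feature(m_left=-0.15, m_right=0.10, baseline=0.20)
print(f"L = {L_x:.3f} m, D = {D:.3f} m")  # L = 0.080 m, D = 0.800 m
```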


In the elastic graph matching technique, an image is transformed into Gabor space using a wavelet transformation based on Gabor wavelets. The transformed image is represented by complex wavelet component values associated with each pixel of the original image.
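A minimal sketch of one such Gabor transformation is shown below: convolving an image with a complex Gabor kernel yields one complex wavelet component per pixel. The kernel parameters (size, wavelength, orientation, bandwidth) are assumptions for illustration; a full bunch graph implementation would apply a family of such kernels at several scales and orientations.

```python
# Illustrative Gabor-space transform: a plane wave windowed by a
# Gaussian envelope, convolved over the image. Parameter values are
# assumptions, not taken from the patent.
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(size=21, wavelength=8.0, theta=0.0, sigma=4.0):
    """Complex Gabor kernel: a plane wave windowed by a Gaussian."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    rot = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.exp(1j * 2.0 * np.pi * rot / wavelength)
    return envelope * carrier

image = np.random.rand(64, 64)          # stand-in for a camera image
coeffs = convolve2d(image, gabor_kernel(), mode="same")
# coeffs[r, c] is the complex wavelet component value at pixel (r, c);
# its magnitude and phase feed the graph-matching similarity measure.
```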


Although the foregoing discloses the preferred embodiments of the present invention, it is understood that those skilled in the art may make various changes to the preferred embodiments without departing from the scope of the invention. The invention is defined only by the following claims.

Claims
  • 1. A method for determining a feature location, comprising: providing left and right camera images of the feature; locating the feature in the left camera image and in the right camera image using bunch graph matching, wherein the feature is an eye of a person's face; and determining the feature location in multiple dimensions including depth based on the feature locations in the left camera image and the right camera image.
  • 2. A method for determining a feature location, comprising: providing left and right camera images of the feature; locating the feature in the left camera image and in the right camera image using image analysis based on wavelet component values generated from wavelet transformations of the camera images, wherein the feature is an eye of a person's face; and determining the feature location in multiple dimensions including depth based on the feature locations in the left camera image and the right camera image.
  • 3. A method for determining a feature location as defined in claim 2, wherein the wavelet transformations use Gabor wavelets.
  • 4. Apparatus for determining a feature location, comprising: means for providing left and right camera images of the feature; means for locating the feature in the left camera image and in the right camera image using image analysis based on wavelet component values generated from wavelet transformations of the camera images, wherein the feature is an eye of a person's face; and means for determining the feature location in multiple dimensions including depth based on the feature locations in the left camera image and the right camera image.
  • 5. Apparatus for determining a feature location as defined in claim 4, wherein the wavelet transformations use Gabor wavelets.
  • 6. A method for determining a feature location, comprising: providing first and second spaced-apart camera images of the feature; locating the feature in the first camera image using image analysis based on wavelet component values generated from wavelet transformations of the first camera image, wherein the feature is an eye of a person's face; and locating the feature in the second camera image using image analysis based on wavelet component values generated from wavelet transformations of the second camera image; and determining the feature location in multiple dimensions including depth based on the feature location in the first camera image and the feature location in the second camera image.
  • 7. A method for determining a feature location as defined in claim 6, wherein the wavelet transformations use Gabor wavelets.
  • 8. Apparatus for determining a feature location, comprising: means for providing left and right camera images of the feature; means for locating the feature in the left camera image and in the right camera image using bunch graph matching, wherein the feature is an eye of a person's face; and means for determining the feature location in multiple dimensions including depth based on the feature locations in the left camera image and the right camera image.
  • 9. Apparatus for determining a feature location, comprising: means for providing first and second spaced-apart camera images of the feature; means for locating the feature in the first camera image using image analysis based on wavelet component values generated from wavelet transformations of the first camera image, and locating the feature in the second camera image using image analysis based on wavelet component values generated from wavelet transformations of the second camera image, wherein the feature is an eye of a person's face; and means for determining the feature location in multiple dimensions including depth based on the feature location in the first camera image and the feature location in the second camera image.
  • 10. Apparatus for determining a feature location as defined in claim 9, wherein the wavelet transformations use Gabor wavelets.
  • 11. A method for real-time determination of the location of a person's eyes in three-dimensions for auto-stereoscopic imaging, comprising: providing left and right spaced-apart camera images of a person's face, the person's face including a left eye and a right eye; locating the left eye and the right eye in the left camera image using image analysis based on wavelet component values generated from wavelet transformations of the left camera image, and locating the left eye and the right eye in the right camera image using image analysis based on wavelet component values generated from wavelet transformations of the right camera image; and determining the feature locations of the left eye and the right eye in three dimensions based on the left and right eye locations in the left camera image and the left and right eye locations in the right camera image.
  • 12. A method for real-time determination of the location of a person's eyes in three-dimensions for auto-stereoscopic imaging as defined in claim 11, wherein the wavelet transformations use Gabor wavelets.
  • 13. A method for real-time determination of the location of a person's eyes in three-dimensions for auto-stereoscopic imaging as defined in claim 12, wherein the image analysis comprises bunch graph matching.
  • 14. Apparatus for real-time determination of the location of a person's eyes in three-dimensions for auto-stereoscopic imaging, comprising: means for providing left and right spaced-apart camera images of a person's face, the person's face including a left eye and a right eye; means for locating the left eye and the right eye in the left camera image using image analysis based on wavelet component values generated from wavelet transformations of the left camera image, and locating the left eye and the right eye in the right camera image using image analysis based on wavelet component values generated from wavelet transformations of the right camera image; and means for determining the feature locations of the left eye and the right eye in three dimensions based on the left and right eye locations in the left camera image and the left and right eye locations in the right camera image.
  • 15. Apparatus for real-time determination of the location of a person's eyes in three-dimensions for auto-stereoscopic imaging as defined in claim 14, wherein the wavelet transformations use Gabor wavelets.
  • 16. Apparatus for real-time determination of the location of a person's eyes in three-dimensions for auto-stereoscopic imaging as defined in claim 15, wherein the image analysis comprises bunch graph matching.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority under 35 U.S.C. § 119(e)(1) and 37 C.F.R. § 1.78(a)(4) to U.S. provisional application Ser. No. 60/220,309, entitled SYSTEM AND METHOD FOR FEATURE LOCATION AND TRACKING IN MULTIPLE DIMENSIONS INCLUDING DEPTH and filed Jul. 24, 2000; and claims priority under 35 U.S.C. § 120 and 37 C.F.R. § 1.78(a)(2) as a continuation-in-part of U.S. patent application Ser. No. 09/206,195, now U.S. Pat. No. 6,301,370, issued on Oct. 9, 2001, entitled FACE RECOGNITION FROM VIDEO IMAGES and filed Dec. 4, 1998. The entire disclosure of U.S. patent application Ser. No. 09/206,195, now U.S. Pat. No. 6,301,370, is incorporated herein by reference.

US Referenced Citations (36)
Number Name Date Kind
4725824 Yoshioka Feb 1988 A
4805224 Koezuka et al. Feb 1989 A
4827413 Baldwin et al. May 1989 A
5159647 Burt Oct 1992 A
5168529 Peregrim et al. Dec 1992 A
5187574 Kosemura et al. Feb 1993 A
5220441 Gerstenberger Jun 1993 A
5280530 Trew et al. Jan 1994 A
5333165 Sun Jul 1994 A
5383013 Cox Jan 1995 A
5430809 Tomitaka Jul 1995 A
5432712 Chan Jul 1995 A
5511153 Azarbayejani et al. Apr 1996 A
5533177 Wirtz et al. Jul 1996 A
5550928 Lu et al. Aug 1996 A
5581625 Connell Dec 1996 A
5588033 Yeung Dec 1996 A
5680487 Markandey Oct 1997 A
5699449 Javidi Dec 1997 A
5714997 Anderson Feb 1998 A
5715325 Bang et al. Feb 1998 A
5719954 Onda Feb 1998 A
5736982 Suzuki et al. Apr 1998 A
5764803 Jacquin et al. Jun 1998 A
5774591 Black et al. Jun 1998 A
5802220 Black et al. Sep 1998 A
5809171 Neff et al. Sep 1998 A
5828769 Burns Oct 1998 A
5905568 McDowell et al. May 1999 A
5917937 Szeliski et al. Jun 1999 A
5982853 Liebermann Nov 1999 A
5995119 Cosatto et al. Nov 1999 A
6011562 Gagné Jan 2000 A
6044168 Tuceryan et al. Mar 2000 A
6052123 Lection et al. Apr 2000 A
6516099 Davison et al. Feb 2003 B1
Foreign Referenced Citations (3)
Number Date Country
4406020 Jun 1995 DE
0807902 Nov 1997 EP
WO9953443 Oct 1999 WO
Related Publications (1)
Number Date Country
20020031253 A1 Mar 2002 US
Provisional Applications (1)
Number Date Country
60220309 Jul 2000 US
Continuation in Parts (1)
Number Date Country
Parent 09206195 Dec 1998 US
Child 09915204 US