Claims
- 1. A method for determining a pose of a camera comprising the steps of:
capturing a video sequence by the camera, the video sequence including a plurality of frames; extracting a plurality of features of an object in the video sequence; estimating a first pose of the camera by an external tracking system; constructing a model of the plurality of features from the estimated first pose; and estimating a second pose of the camera by tracking the model of the plurality of features, wherein after the second pose is estimated, the external tracking system is eliminated.
- 2. The method as in claim 1, wherein the extracting a plurality of features step is performed in real time.
- 3. The method as in claim 1, wherein the extracting a plurality of features step is performed on a recorded video sequence.
- 4. The method as in claim 1, wherein the constructing a model step further comprises the steps of:
tracking the plurality of features over the plurality of frames of the video sequence to construct a 2D-2D match of the plurality of features; and reconstructing 3D locations of the plurality of features by triangulating the 2D-2D match with the first pose.
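The triangulation step in claim 4 can be sketched as a standard linear (DLT) triangulation: given the externally tracked first poses as projection matrices and one 2D-2D feature match, solve for the 3D location. This is an illustrative sketch, not the claimed implementation; the function name and normalized-coordinate convention are assumptions.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 2D-2D feature match.

    P1, P2 : 3x4 projection matrices for two frames, built from the
             first poses supplied by the external tracking system.
    x1, x2 : (u, v) observations of the same feature in each frame.
    Returns the reconstructed 3D location of the feature.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point X: x * (P[2] @ X) - P[0] @ X = 0, etc.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                # null vector = homogeneous 3D point
    return X[:3] / X[3]       # inhomogeneous 3D location
```

Repeating this over every tracked feature yields the 3D model of the plurality of features recited in the claim.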
- 5. The method as in claim 4, wherein the estimating the second pose step further comprises the step of matching 2D locations of the plurality of features in at least one frame of the video sequence to the 3D reconstructed locations of the plurality of features.
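The matching step in claim 5 amounts to pose estimation from 2D-3D correspondences. One plausible sketch, under the assumption of at least six matches in general position, is a linear (DLT) estimate of the camera's 3x4 projection matrix; the function name is hypothetical and a deployed system would typically refine this with a nonlinear method.

```python
import numpy as np

def pose_from_2d3d(pts3d, pts2d):
    """Linear (DLT) estimate of the second pose as a 3x4 projection
    matrix, from 2D feature locations in a frame (pts2d) matched to
    their reconstructed 3D model locations (pts3d).

    Requires >= 6 correspondences not in a degenerate configuration.
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        p = [X, Y, Z, 1.0]
        # Two linear constraints per correspondence.
        rows.append(p + [0.0] * 4 + [-u * c for c in p])
        rows.append([0.0] * 4 + p + [-v * c for c in p])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)   # projection matrix, up to scale
```

Once this feature-based pose is available for each new frame, the external tracking system is no longer needed.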
- 6. The method as in claim 4, further comprising the steps of:
extracting additional features from the video sequence; matching 2D locations of the additional features to the 3D reconstructed locations of the plurality of features; and updating the second pose of the camera.

- 7. The method as in claim 5, wherein an initial matching is performed by object recognition.
- 8. The method as in claim 1, further comprising the step of evaluating correspondences of the plurality of features over the plurality of frames of the video sequence to determine whether the plurality of features are stable.
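Claim 8 does not fix a specific stability criterion. One simple sketch, assuming per-feature 2D tracks ordered by frame, flags a feature as stable when its correspondences persist over enough frames with bounded frame-to-frame motion; the thresholds and function name are illustrative assumptions.

```python
import math

def stable_features(tracks, min_frames=10, max_step=20.0):
    """Evaluate feature correspondences over the frames of the video
    sequence and flag each feature as stable or not.

    tracks : {feature_id: [(u, v), ...]} 2D locations, one per frame.
    A feature is deemed stable here if it is tracked for at least
    min_frames frames and never jumps more than max_step pixels
    between consecutive frames (one plausible test among many).
    """
    stable = {}
    for fid, pts in tracks.items():
        steps = [math.dist(a, b) for a, b in zip(pts, pts[1:])]
        stable[fid] = len(pts) >= min_frames and all(s <= max_step for s in steps)
    return stable
```

Only features judged stable would then be kept in the model used for pose estimation.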
- 9. The method as in claim 1, further comprising the steps of:
comparing the second pose to the first pose; and eliminating the external tracking system if the second pose is within an acceptable range of the first pose.
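The comparison in claim 9 can be sketched as checking that the feature-based pose lies within an acceptable range of the externally tracked pose, with rotation and translation thresholds. The specific thresholds and the function name below are illustrative assumptions, not values from the claims.

```python
import numpy as np

def poses_agree(R1, t1, R2, t2, max_deg=2.0, max_dist=0.01):
    """Compare the feature-based (second) pose (R2, t2) against the
    externally tracked (first) pose (R1, t1).

    Returns True when the relative rotation angle and the translation
    distance both fall inside the acceptable range, at which point the
    external tracking system can be eliminated.
    """
    dR = R1.T @ R2                                    # relative rotation
    cos_angle = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_angle))          # rotation error, deg
    dist = np.linalg.norm(np.asarray(t1) - np.asarray(t2))
    return angle <= max_deg and dist <= max_dist
```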
- 10. A system for determining a pose of a camera comprising:
an external tracker for estimating a reference pose; a camera for capturing a video sequence; a feature extractor for extracting a plurality of features of an object in the video sequence; a model builder for constructing a model of the plurality of features from the estimated reference pose; and a pose estimator for estimating a pose of the camera by tracking the model of the plurality of features.
- 11. The system as in claim 10, further comprising an augmentation engine operatively coupled to a display for displaying the constructed model over the plurality of features.
- 12. The system as in claim 10, wherein the feature extractor extracts the plurality of features in real time.
- 13. The system as in claim 10, wherein the feature extractor extracts the plurality of features from a recorded video sequence.
- 14. The system as in claim 10, further comprising a processor for comparing the pose of the camera to the reference pose and, if the camera pose is within an acceptable range of the reference pose, eliminating the external tracker.
- 15. The system as in claim 10, wherein the external tracker is a marker-based tracker wherein the reference pose is estimated by tracking a plurality of markers placed in a workspace.
- 16. The system as in claim 15, further comprising a processor for comparing the pose of the camera to the reference pose and, if the camera pose is within an acceptable range of the reference pose, instructing a user to remove the markers.
- 17. A program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform method steps for determining a pose of a camera, the method steps comprising:
capturing a video sequence by the camera, the video sequence including a plurality of frames; extracting a plurality of features of an object in the video sequence; estimating a first pose of the camera by an external tracking system; constructing a model of the plurality of features from the estimated first pose; and estimating a second pose of the camera by tracking the model of the plurality of features, wherein after the second pose is estimated, the external tracking system is eliminated.
- 18. The program storage device as in claim 17, wherein the constructing a model step further comprises the steps of:
tracking the plurality of features over the plurality of frames of the video sequence to construct a 2D-2D match of the plurality of features; and reconstructing 3D locations of the plurality of features by triangulating the 2D-2D match with the first pose.
- 19. The program storage device as in claim 18, wherein the estimating the second pose step further comprises the step of matching 2D locations of the plurality of features in at least one frame of the video sequence to the 3D reconstructed locations of the plurality of features.
- 20. An augmented reality system comprising:
an external tracker for estimating a reference pose; a camera for capturing a video sequence; a feature extractor for extracting a plurality of features of an object in the video sequence; a model builder for constructing a model of the plurality of features from the estimated reference pose; a pose estimator for estimating a pose of the camera by tracking the model of the plurality of features; an augmentation engine operatively coupled to a display for displaying the constructed model over the plurality of features; and a processor for comparing the pose of the camera to the reference pose and, if the camera pose is within an acceptable range of the reference pose, eliminating the external tracker.
Parent Case Info
[0001] This application claims priority to an application entitled “AN AUTOMATIC SYSTEM FOR TRACKING AND POSE ESTIMATION: LEARNING FROM MARKERS OR OTHER TRACKING SENSORS IN ORDER TO USE REAL FEATURES” filed in the United States Patent and Trademark Office on Jul. 10, 2001 and assigned Ser. No. 60/304,395, the contents of which are hereby incorporated by reference.
Provisional Applications (1)

| Number | Date | Country |
| --- | --- | --- |
| 60304395 | Jul 2001 | US |