Claims
- 1. A method comprising:
building a tree classifier including a plurality of parent nodes; and
for a parent node, selecting between a monolithic classifier as a child node and a plurality of specialized classifiers as child nodes for said parent node.
- 2. The method of claim 1, wherein said selecting comprises:
determining a computational complexity of a monolithic classifier trained with a plurality of positive and negative samples; and
determining a computational complexity of a plurality of specialized classifiers trained with the plurality of positive and negative samples, each of the specialized classifiers being trained with the plurality of negative samples and a different subset of the plurality of positive samples.
- 3. The method of claim 2, wherein said determining a computational complexity of the monolithic classifier comprises determining a number of features used by the monolithic classifier, and
wherein said determining a computational complexity of the plurality of specialized classifiers comprises determining a number of features used by the plurality of specialized classifiers.
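As a concrete (and purely illustrative) reading of claims 2 and 3, the complexity comparison can be sketched as counting the weak learners of each boosted classifier and preferring the specialized set when its total feature count is smaller. The data layout and function names below are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the feature-count comparison in claims 2-3.
# A boosted classifier's complexity is taken to be the number of weak
# learners (features) it evaluates; representation is illustrative.

def count_features(classifier):
    # Each weak learner evaluates one feature, so complexity = learner count.
    return len(classifier["weak_learners"])

def prefer_specialized(monolithic, specialized_set):
    # Choose the specialized classifiers when their combined feature
    # count undercuts that of the single monolithic classifier.
    total = sum(count_features(c) for c in specialized_set)
    return total < count_features(monolithic)
```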
- 4. The method of claim 1, further comprising:
training the monolithic classifier and the plurality of specialized classifiers with a plurality of positive samples and a plurality of negative samples.
- 5. The method of claim 4, wherein said training comprises training with a boosting algorithm.
- 6. The method of claim 4, wherein said training comprises training the monolithic classifier and the plurality of specialized classifiers to have a selected hit rate and a selected false alarm rate.
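One way such a selected hit rate can be enforced, in the spirit of cascade training (a sketch under assumed names, not the patented procedure itself), is to sweep the boosted stage's decision threshold from strictest to loosest until the hit rate on the positive samples reaches the target, then report the false alarm rate that threshold yields on the negatives.

```python
# Assumed sketch: tune a boosted stage's threshold to a selected hit
# rate, then measure the resulting false alarm rate on negatives.

def tune_threshold(pos_scores, neg_scores, min_hit_rate=0.99):
    # Candidate thresholds, strictest (highest) first.
    for t in sorted(set(pos_scores), reverse=True):
        hit = sum(s >= t for s in pos_scores) / len(pos_scores)
        if hit >= min_hit_rate:
            fa = sum(s >= t for s in neg_scores) / len(neg_scores)
            return t, hit, fa
    return None  # only reached when pos_scores is empty
```

Scanning from the highest threshold downward returns the strictest threshold that still meets the hit-rate target, which keeps the false alarm rate as low as possible for that stage.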
- 7. A method comprising:
identifying a plurality of positive samples and a plurality of negative samples in a plurality of patterns;
passing the plurality of positive samples and the plurality of negative samples to a node in a tree classifier;
determining a number of features used by a monolithic classifier trained with said plurality of positive samples and said plurality of negative samples;
clustering the plurality of positive samples into a plurality of subsets;
training each of a plurality of specialized classifiers with the plurality of negative samples and a different one of said plurality of subsets;
determining a number of features used by the plurality of specialized classifiers; and
selecting the plurality of specialized classifiers in response to the number of features used by the plurality of specialized classifiers being smaller than the number of features used by the monolithic classifier.
- 8. The method of claim 7, further comprising:
training each of the plurality of specialized classifiers with a boosting algorithm.
- 9. The method of claim 7, further comprising repeating the elements of the method until a desired depth is achieved.
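The node-expansion loop of claims 7-9 can be sketched end to end. Everything below is a speculative illustration: `train` and `cluster` are caller-supplied stubs (the patent specifies boosting and clustering but not these interfaces), a classifier is a plain dict, and `n_features` stands in for the feature count being compared.

```python
# Speculative sketch of claims 7-9: at each node, train a monolithic
# classifier, cluster the positives and train one specialized classifier
# per subset, keep whichever alternative uses fewer features in total,
# and recurse until the desired depth is reached. All names are
# illustrative, not from the source.

def expand_node(positives, negatives, train, cluster, max_depth, depth=0):
    if depth >= max_depth:
        return None  # desired depth achieved
    monolithic = train(positives, negatives)
    subsets = cluster(positives)
    specialized = [train(subset, negatives) for subset in subsets]
    if sum(c["n_features"] for c in specialized) < monolithic["n_features"]:
        # Specialized set is cheaper: one child node per specialized classifier.
        return [{"classifier": c,
                 "children": expand_node(s, negatives, train, cluster,
                                         max_depth, depth + 1)}
                for c, s in zip(specialized, subsets)]
    # Otherwise a single child node holds the monolithic classifier.
    return [{"classifier": monolithic,
             "children": expand_node(positives, negatives, train, cluster,
                                     max_depth, depth + 1)}]
```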
- 10. An apparatus comprising:
a tree classifier including
a first parent node having a single child node, the child node including a monolithic classifier, and a second parent node having a plurality of child nodes, each child node including a specialized classifier.
- 11. The apparatus of claim 10, wherein each specialized classifier is trained with a different subset of positive samples.
- 12. The apparatus of claim 10, wherein the monolithic classifier and the plurality of specialized classifiers comprise boosted classifiers.
- 13. The apparatus of claim 10, wherein the monolithic classifier and the plurality of specialized classifiers have a selected hit rate and a selected false alarm rate.
- 14. The apparatus of claim 13, wherein the selected hit rate is greater than about 99%.
- 15. The apparatus of claim 13, wherein the selected false alarm rate is about 50%.
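The rates in claims 14 and 15 compose multiplicatively along a path of stages, which is why a per-stage hit rate above 99% and a per-stage false alarm rate near 50% are workable targets. The 20-stage figure below is an assumption chosen for illustration, not a number from the patent.

```python
# Arithmetic illustration (assumed stage count): per-stage rates
# multiply across a chain of n classifier stages.

def cascade_rates(hit, false_alarm, n_stages):
    return hit ** n_stages, false_alarm ** n_stages

overall_hit, overall_fa = cascade_rates(0.99, 0.50, 20)
# With 20 stages: overall hit rate ~0.82, overall false alarm rate ~1e-6.
```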
- 16. An article comprising a machine-readable medium including machine-executable instructions operative to cause a machine to:
build a tree classifier including a plurality of parent nodes; and
for a parent node, select between a monolithic classifier as a child node and a plurality of specialized classifiers as child nodes for said parent node.
- 17. The article of claim 16, wherein the instructions operative to cause the machine to select comprise instructions operative to cause the machine to:
determine a computational complexity of a monolithic classifier trained with a plurality of positive and negative samples; and
determine a computational complexity of a plurality of specialized classifiers trained with the plurality of positive and negative samples, each of the specialized classifiers being trained with the plurality of negative samples and a different subset of the plurality of positive samples.
- 18. An article comprising a machine-readable medium including machine-executable instructions operative to cause a machine to:
identify a plurality of positive samples and a plurality of negative samples in a plurality of patterns;
pass the plurality of positive samples and the plurality of negative samples to a node in a tree classifier;
determine a number of features used by a monolithic classifier trained with said plurality of positive samples and said plurality of negative samples;
cluster the plurality of positive samples into a plurality of subsets;
train each of a plurality of specialized classifiers with the plurality of negative samples and a different one of said plurality of subsets;
determine a number of features used by the plurality of specialized classifiers; and
select the plurality of specialized classifiers in response to the number of features used by the plurality of specialized classifiers being smaller than the number of features used by the monolithic classifier.
- 19. The article of claim 18, further comprising instructions operative to cause the machine to:
train each of the plurality of specialized classifiers with a boosting algorithm.
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of priority of the U.S. Provisional Application filed Mar. 17, 2003 and entitled “A Detector Tree of Boosted Classifiers for Real-Time Object Detection and Tracking.”
Provisional Applications (1)

| Number | Date | Country |
| --- | --- | --- |
| 60456033 | Mar 2003 | US |