This US non-provisional patent application claims priority under 35 USC §119 to Korean Patent Application No. 10-2011-0134351, filed on Dec. 14, 2011, the entirety of which is hereby incorporated by reference.
The present general inventive concept relates to feature classifiers and recognition devices using the same.
Image recognition technology is used in a variety of applications such as disease diagnosis, object tracking, and autonomous navigation of robots. Generally, in image recognition technology, search windows are set for respective regions of an input image signal, and feature vectors are extracted for the set search windows, respectively. A feature vector includes features for use in image recognition. The extracted feature vector is discriminated based on previously learned information, and an image is recognized through the discrimination of the extracted feature vector.
In the image recognition technology, a feature vector extraction operation and a feature vector classification operation are performed on each search window. Accordingly, a large amount of computation is required. As a result, a large amount of hardware resources is required to process high-speed input images in real time.
Embodiments of the inventive concept provide a feature vector classifier and a recognition device using the same.
According to an aspect of the inventive concept, a recognition device may include a feature vector extractor configured to generate a feature vector and a normalized value from an input image and output the feature vector and the normalized value; and a feature vector classifier configured to normalize the feature vector based on the normalized value and classify the normalized feature vector to recognize the input image.
In an exemplary embodiment, the feature vector extractor may include a feature extractor configured to extract a feature value from a search window of the input image; and a feature vector generator configured to generate a feature vector having the feature value as an element, compute the normalized value based on the feature value, and output the generated feature vector and the computed normalized value.
In an exemplary embodiment, the feature vector classifier may classify the feature vector using a linear support vector machine (LSVM) algorithm.
In an exemplary embodiment, the feature vector classifier may include a dot-product unit configured to perform dot product of the feature vector and a predetermined weighted vector; and an index classifier configured to classify an index of the feature vector based on a value of the dot product to classify the feature vector.
In an exemplary embodiment, the dot product of the feature vector and the weighted vector may be performed by parallel computing.
In an exemplary embodiment, the dot-product unit may normalize the feature vector based on the normalized value during the dot product of the feature vector and the weighted vector.
According to another aspect of the inventive concept, the feature vector classifier may include a dot-product unit configured to receive a feature vector and a normalized value extracted from an image and normalize the feature vector based on the normalized value; and an index classifier configured to classify the normalized feature vector depending on an index.
The inventive concept will become more apparent in view of the attached drawings and accompanying detailed description. The embodiments depicted therein are provided by way of example, not by way of limitation, wherein like reference numerals refer to the same or similar elements. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating aspects of the inventive concept.
The inventive concept will now be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. The inventive concept, however, may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Reference is made to
The feature vector extractor 10 generates a feature vector from an input search window. The search window input to the feature vector extractor 10 is a partial image signal corresponding to a specific region of an image signal. A feature vector generated by the feature vector extractor 10 is a vector including features for use in image recognition as an element.
The feature vector classifier 20 classifies an index of a feature vector input from the feature vector extractor 10. A search window input to the recognition device 1 is classified depending on the classified index of the feature vector. There are various classification algorithms for classifying an index of a feature vector. In this embodiment, a linear support vector machine (LSVM) algorithm, which is one of the simplest classification algorithms, is used. However, this is merely exemplary and the inventive concept is not limited thereto.
In the LSVM algorithm, a feature vector is subjected to a dot-product operation with a predetermined weight vector. An index of the feature vector is determined by the sign of the sum of the dot-product result and a predetermined offset constant.
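As an illustrative sketch only (the function name, vector values, and two-class mapping are assumptions, not part of the disclosed embodiments), this classification rule may be expressed as follows:

```python
# Illustrative sketch: LSVM classification of a feature vector x with a learned
# weight vector w and an offset constant b. The class index is determined by
# the sign of (dot product + offset). Names and values are hypothetical.
def lsvm_classify(x, w, b):
    # Dot product of the feature vector and the weight vector.
    score = sum(xi * wi for xi, wi in zip(x, w))
    # Positive sum -> index 1, otherwise index 0 (two-class case).
    return 1 if score + b >= 0 else 0

# Example with a toy 4-element feature vector and weight vector.
print(lsvm_classify([0.2, 0.5, 0.1, 0.7], [1.0, -0.5, 0.3, 0.8], b=-0.4))
```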
Reference is made to
The feature extractor 11 extracts a feature value from a search window. The feature value extracted by the feature extractor 11 may vary. For example, the feature value may be a histogram of oriented gradients (HOG) feature, a Haar-like feature, a wavelet feature, or a combination thereof. However, this is merely exemplary and the inventive concept is not limited thereto.
Exemplarily, a procedure of generating a feature vector using an HOG feature as a feature value will now be described. HOG describes the distribution of brightness gradient orientations in a local area as a histogram. In order to compute an HOG feature, a block including a plurality of cells is extracted from a search window image. The brightness, gradient orientation, and gradient magnitude are computed for the respective cells constituting the extracted block.
In this embodiment, the gradient orientation is quantized by dividing the range from 0 degrees to 180 degrees into nine equal parts of 20 degrees each. That is, a single cell has nine orientation histogram bins. Also, in this embodiment, each block includes 2×2 cells. Accordingly, in this embodiment, the number of computed feature values per block, i.e., the number of elements in a feature vector, is 36. However, this is merely exemplary and the inventive concept is not limited thereto.
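A minimal sketch of one way such a 36-element block feature could be computed is shown below; the 8-pixel cell size, the use of numpy, and the function names are assumptions for illustration only, not the disclosed implementation:

```python
# Illustrative sketch: one HOG block feature with 2x2 cells and 9 orientation
# bins of 20 degrees each, giving 4 x 9 = 36 elements per block.
import numpy as np

def hog_block_feature(block, cell_size=8, n_bins=9):
    """block: 2D grayscale array of shape (2*cell_size, 2*cell_size)."""
    gy, gx = np.gradient(block.astype(float))
    magnitude = np.hypot(gx, gy)
    # Fold orientations into [0, 180) degrees and quantize into 20-degree bins.
    orientation = np.degrees(np.arctan2(gy, gx)) % 180.0
    bin_idx = np.minimum((orientation // (180.0 / n_bins)).astype(int), n_bins - 1)

    feature = []
    for cy in range(2):                      # 2x2 cells per block
        for cx in range(2):
            ys = slice(cy * cell_size, (cy + 1) * cell_size)
            xs = slice(cx * cell_size, (cx + 1) * cell_size)
            hist = np.zeros(n_bins)
            # Accumulate gradient magnitude into the orientation histogram.
            np.add.at(hist, bin_idx[ys, xs].ravel(), magnitude[ys, xs].ravel())
            feature.extend(hist)
    return np.asarray(feature)               # 4 cells x 9 bins = 36 elements

block = np.random.rand(16, 16)
print(hog_block_feature(block).shape)        # (36,)
```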
The feature normalizer 12 normalizes feature values extracted by the feature extractor 11. When a feature vector is classified by a feature vector classifier, the environment in which the feature vector was extracted and the environment of the previously learned information may not match each other. For this reason, a normalization procedure is required to prevent distortion of the classification result.
The normalization procedure may include various normalization methods. For example, the normalization procedure may be a mean normalization method or a mean and variance normalization (MVN) method. In some embodiments, applying the mean normalization method, which is the simplest normalization method, yields equation (1) and equation (2).
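One plausible form of equation (1) and equation (2), inferred from the variable definitions that follow, is:

$$x_i = \frac{x_i'}{\sum_{j=1}^{n} x_j' + \varepsilon} \quad (1), \qquad \frac{1}{N} = \frac{1}{\sum_{j=1}^{n} x_j' + \varepsilon} \quad (2)$$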
In equation (1) and equation (2), xi′ represents an i-th feature value prior to normalization, xi represents an i-th normalized feature value, 1/N represents the normalized value, n represents the total number of elements in the feature vector, and ε represents a constant that prevents the result from diverging to infinity when the denominator of equation (1) would otherwise become zero. That is, the normalization procedure using the mean normalization method is a procedure of multiplying each feature value by the reciprocal of the sum of the feature values, i.e., the normalized value.
In other embodiments, the normalization procedure may be a mean square normalization method. Then, a normalized value is computed by equation (3), as follows.
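One plausible form of equation (3), inferred from the description that follows, is:

$$\frac{1}{N} = \frac{1}{\sqrt{\sum_{j=1}^{n} (x_j')^2} + \varepsilon} \quad (3)$$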
Similar to equation (2), in equation (3), 1/N represents the normalized value, n represents the total number of elements in the feature vector, and ε represents a constant that prevents the result from diverging to infinity when the denominator of equation (3) would otherwise become zero. The normalization procedure using the mean square normalization method is also a procedure of multiplying a feature value by the normalized value.
In this case, each of the n elements constituting the feature vector must be multiplied by the normalized value. Thus, one normalized-value computation and n multiplications are required to generate a single normalized feature vector.
The feature vector generator 13 generates a feature vector having a feature value normalized by the feature normalizer 12 as an element. The feature vector generator 13 outputs the generated feature vector.
The dot-product unit 21 performs a dot-product operation on a feature vector input from the feature vector extractor 10 and a weight vector. The weight vector is a vector having weights on the feature values as elements and has the same number of elements as the feature vector. A dot-product result value output from the dot-product unit 21 is added to an offset b and transferred to the index classifier 22.
The index classifier 22 classifies an index of a feature vector, based on the input addition result value. For example, the index classifier 22 may classify an index of a feature vector by computing a sign of the input addition result value. In this case, the feature vector is classified into two types. However, this is merely exemplary and the inventive concept is not limited thereto. The index classifier 22 classifies a search window image input to a recognition device, based on the classification result of the feature vector.
Reference is made to
In
Accordingly, the recognition device in
Reference is made to
The feature vector extractor 110 generates a feature vector from an input search window. Unlike the feature vector extractor 10 in
The feature extractor 111 extracts a feature value from the input search window. The feature value extracted by the feature extractor 111 may vary. For example, the feature value may be a histogram of oriented gradients (HOG) feature, a Haar-like feature, a wavelet feature, or a combination thereof. However, this is merely exemplary and the inventive concept is not limited thereto.
The feature vector generator 112 generates a feature vector having a feature value extracted by the feature extractor 111 as an element. In addition, the feature vector generator 112 computes a normalized value from the feature value. The normalized value may be computed by various methods. For example, the normalized value may be computed by a mean normalization method or a mean and variance normalization (MVN) method. The feature vector generator 112 outputs the generated feature vector and the computed normalized value.
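An illustrative sketch of this behavior, assuming mean normalization and hypothetical names, returns the raw feature vector together with its normalized value instead of multiplying each element:

```python
# Illustrative sketch: a feature vector generator that outputs the raw feature
# vector together with the normalized value 1/N (mean normalization assumed),
# without performing any per-element multiplication. Names are hypothetical.
EPSILON = 1e-6  # assumed small constant preventing a zero denominator

def generate_feature_vector(feature_values):
    normalized_value = 1.0 / (sum(feature_values) + EPSILON)
    return list(feature_values), normalized_value  # no normalization multiplications

x, inv_n = generate_feature_vector([3.0, 1.0, 4.0, 1.0, 5.0])
print(x, inv_n)
```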
In the feature vector extractor according to this embodiment, the operation of multiplying by the normalized value is not performed. Thus, the hardware size and the computation time are reduced.
The feature vector classifier 120 classifies the feature vector, based on the feature vector and the normalized value that are input from the feature vector extractor 110. The operation of the feature vector classifier will now be described.
Reference is made to
The dot-product unit 221 receives a feature vector and a normalized value. The dot-product unit 221 multiplies by the normalized value while performing the multiplications for the dot product of the feature vector and a weight vector. A result value computed by the dot-product unit 221 is added to an offset and then transferred to the index classifier 222.
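A minimal sketch of this fused operation, with hypothetical names and assuming the normalized value 1/N comes from mean normalization, folds the normalized value into each dot-product term:

```python
# Illustrative sketch: the normalized value 1/N is applied during the
# dot-product multiplications, so no separate normalization pass over the
# feature vector is required. Names and offset handling are assumptions.
def classify_fused(x, w, inv_n, b):
    # Each term is x_i * (1/N) * w_i, i.e., a normalized feature times its weight.
    score = sum(xi * inv_n * wi for xi, wi in zip(x, w))
    return 1 if score + b >= 0 else 0

print(classify_fused([3.0, 1.0, 4.0], [0.5, -1.0, 0.25], inv_n=0.125, b=-0.1))
```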
The index classifier 222 classifies an index of the feature vector, based on the input addition result value. For example, the index classifier 222 may classify an index of the feature vector by computing a sign of the input addition result value. In this case, the feature vector is classified into two types. However, this is merely exemplary and the inventive concept is not limited thereto. The index classifier 222 classifies a search window image input to a recognition device, based on the classification result of the feature vector.
Reference is made to
In
Accordingly, in the recognition devices in
Reference is made to
The dot-product unit 321 receives a feature vector and a normalized value. The dot-product unit 321 performs the dot product of the feature vector and a weight vector. A dot-product result value obtained by the dot-product unit 321 is multiplied by the normalized value, added to an offset, and then transferred to the index classifier 322.
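A minimal sketch of this variant, in which the normalized value is applied with a single multiplication on the scalar dot-product result (names are hypothetical, not the disclosed implementation), is:

```python
# Illustrative sketch: the dot product is accumulated first, and the normalized
# value 1/N is applied once to the scalar result before the offset is added.
def classify_post_normalized(x, w, inv_n, b):
    score = sum(xi * wi for xi, wi in zip(x, w))  # plain dot product
    return 1 if score * inv_n + b >= 0 else 0     # single multiplication by 1/N

print(classify_post_normalized([3.0, 1.0, 4.0], [0.5, -1.0, 0.25], inv_n=0.125, b=-0.1))
```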
The index classifier 322 classifies an index of the feature vector, based on the input addition result value. For example, the index classifier 322 may classify an index of the feature vector by computing a sign of the input addition result value. In this case, the feature vector is classified into two types. However, this is merely exemplary and the inventive concept is not limited thereto. The index classifier 322 classifies a search window image input to a recognition device, based on a classification result of the feature vector.
In
Accordingly, in the recognition devices in
In the feature vector classifier, the multiplication by the normalized value is performed only once. Therefore, the number of operations and the size of hardware are reduced as compared to the case where a normalization procedure is performed in the recognition device in
Reference is made to
According to the feature vector classifier and the recognition device using the same described above, the time required for extracting and classifying a feature vector and the size of the hardware required are significantly reduced.
While the inventive concept has been particularly shown and described with reference to exemplary embodiments thereof, it will be apparent to those of ordinary skill in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the inventive concept as defined by the following claims.