Claims
- 1. A method of incrementally forming and adaptively updating a neural net model, comprising:
(a) incrementally adding to the neural net model a function approximation node; and (b) determining function parameters for the function approximation node and updating function parameters of other nodes in the neural net model, by using the function parameters of the other nodes prior to addition of the function approximation node to the neural net model.
- 2. The method of claim 1, wherein if a model accuracy of the neural net model with the function approximation node added thereto is below a predetermined accuracy level, steps (a) and (b) are repeated.
- 3. The method of claim 1, wherein a set of sample data patterns is used to form a list of function approximation node candidates, and the function approximation node is selected from the list of function approximation node candidates.
- 4. The method of claim 3, wherein the list of function approximation node candidates is formed by
splitting the set of sample data patterns into a plurality of clusters in a first level of a cluster hierarchy, determining that a selected cluster in the first level has a population exceeding a predetermined size, and splitting the selected cluster into two or more clusters and replacing the selected cluster with the two or more clusters in a next level of the cluster hierarchy.
- 5. The method of claim 4 further comprising sorting the clusters on each level of the cluster hierarchy based on cluster size, to form a sorted list of function approximation node candidates.
- 6. The method of claim 3, wherein the neural net model is adaptively updated by incrementally adding one or more additional nodes to the neural net model, to represent new data corresponding to a data range not represented in the set of sample data patterns.
- 7. The method of claim 1 further comprising:
monitoring a model accuracy of the neural net model while the neural net model is used on-line; and adaptively updating the neural net model, if the model accuracy of the neural net model is below a predetermined threshold.
- 8. The method of claim 7, wherein the adaptive update includes incrementally adding one or more additional nodes to the neural net model, to represent new data.
- 9. The method of claim 8, wherein the new data corresponds to a change in system dynamics.
- 10. The method of claim 7, wherein the adaptive update includes updating the function parameters of the nodes in the neural net model.
- 11. The method of claim 7, wherein if the adaptive updating reaches a limit, a full retrain of the neural net model is performed.
- 12. The method of claim 1 further comprising adaptively updating the neural net model by adding one or more additional nodes to the neural net model, based on new data patterns.
- 13. The method of claim 12, wherein the additional nodes are formed by applying a clustering methodology to the new data patterns.
- 14. The method of claim 13, wherein the clustering methodology includes
clustering the new data patterns into a number of clusters which is approximately a number of the nodes in the neural net model; determining that a selected cluster is far away from positions associated with the respective nodes in the neural net model; and adding to the neural net model an additional node associated with the selected cluster and a center of the selected cluster.
- 15. The method of claim 12, wherein
a set of initial weights is determined for the nodes in the neural net model when the neural net model is formed, and when the additional nodes are added during adaptive update, a set of new weights for the nodes in the neural net model is computed, and the initial weights are combined with the new weights for the nodes based on a forgetting factor.
- 16. The method of claim 15, wherein the forgetting factor is determined based on a cause of model degradation.
- 17. The method of claim 1 further comprising applying an orthogonal least squares methodology to determine a set of weights for the neural net model.
- 18. The method of claim 17, wherein the set of weights is adaptively updated by using new data patterns.
- 19. The method of claim 17, wherein the set of weights is updated to compensate for system drift.
- 20. The method of claim 1, wherein the function parameters for the nodes in the neural net model are determined by applying a hierarchical k-means clustering methodology to a set of sample data patterns.
- 21. The method of claim 1, wherein the function approximation node is a radial basis node, and a center and radius of the radial basis node are determined through a hierarchical k-means clustering methodology.
- 22. The method of claim 1, wherein the function approximation node is a Gaussian node.
- 23. The method of claim 1, wherein the function approximation node is a sigmoidal basis node.
- 24. The method of claim 1, wherein the function approximation node is a wavelet basis node.
- 25. The method of claim 1, wherein the function approximation node is non-linear.
- 26. A method of incrementally forming a neural net model, comprising:
applying a hierarchical clustering methodology to a set of sample data patterns to form a list of function approximation node candidates; and incrementally adding one or more function approximation nodes to the neural net model until a model with an accuracy at or above a predetermined accuracy level is formed, wherein the function approximation nodes are selected from the list of function approximation node candidates.
- 27. A computer system, comprising:
a processor; and a program storage device readable by the computer system, tangibly embodying a program of instructions executable by the processor to perform a method of incrementally forming and adaptively updating a neural net model, the method comprising: (a) incrementally adding to the neural net model a function approximation node; and (b) determining function parameters for the function approximation node and updating function parameters of other nodes in the neural net model, by using the function parameters of the other nodes prior to addition of the function approximation node to the neural net model.
- 28. A program storage device readable by a machine, tangibly embodying a program of instructions executable by the machine to perform a method of incrementally forming and adaptively updating a neural net model, the method comprising:
(a) incrementally adding to the neural net model a function approximation node; and (b) determining function parameters for the function approximation node and updating function parameters of other nodes in the neural net model, by using the function parameters of the other nodes prior to addition of the function approximation node to the neural net model.
- 29. A computer data signal embodied in a transmission medium which embodies instructions executable by a computer for incrementally forming and adaptively updating a neural net model, comprising:
a first segment including node addition code to incrementally add to the neural net model a function approximation node; and a second segment including parameter determination code to determine function parameters for the function approximation node and update function parameters of other nodes in the neural net model, by using the function parameters of the other nodes prior to addition of the function approximation node to the neural net model.
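The incremental formation recited in claims 1-5, 20-21, and 26 can be illustrated with a minimal sketch. The function names, the 2-way split, the radius heuristic, and the stopping tolerance below are all illustrative assumptions, not taken from the specification: hierarchical k-means splits the sample patterns into a cluster hierarchy, over-populated clusters are re-split on the next level, each level is sorted by cluster size, and Gaussian radial-basis nodes are added one at a time (with all output weights refit after each addition) until model error reaches a threshold.

```python
import numpy as np

def candidate_nodes(X, max_size=20, seed=0):
    """Form the candidate list of claims 3-5: split the sample patterns with
    hierarchical 2-means, replace any cluster whose population exceeds
    max_size with its sub-clusters on the next level (claim 4), and sort
    each level by cluster size (claim 5)."""
    rng = np.random.default_rng(seed)
    candidates, level = [], [X]
    while level:
        nxt = []
        for data in sorted(level, key=len, reverse=True):
            center = data.mean(axis=0)
            radius = np.linalg.norm(data - center, axis=1).mean() + 1e-6
            candidates.append((center, radius))
            if len(data) > max_size:
                c = data[rng.choice(len(data), 2, replace=False)]
                for _ in range(10):  # a few k-means iterations
                    lbl = np.linalg.norm(data[:, None] - c[None], axis=2).argmin(1)
                    c = np.array([data[lbl == j].mean(0) if np.any(lbl == j)
                                  else c[j] for j in range(2)])
                # keep only genuine (non-empty, strictly smaller) sub-clusters
                nxt += [data[lbl == j] for j in range(2)
                        if 0 < np.sum(lbl == j) < len(data)]
        level = nxt
    return candidates

def build_model(X, y, tol=0.05, max_size=20):
    """Claims 1 and 26: add one Gaussian radial-basis node at a time and
    refit all output weights by least squares after each addition, stopping
    once model error is at or below tol (claim 2)."""
    nodes, H = [], np.ones((len(X), 1))  # bias column
    w = np.zeros(1)
    for center, radius in candidate_nodes(X, max_size):
        nodes.append((center, radius))
        phi = np.exp(-np.linalg.norm(X - center, axis=1) ** 2 / (2 * radius ** 2))
        H = np.column_stack([H, phi])
        w, *_ = np.linalg.lstsq(H, y, rcond=None)
        if np.sqrt(np.mean((H @ w - y) ** 2)) <= tol:
            break
    return nodes, w
```

The full least-squares refit stands in for the orthogonal least squares methodology of claim 17; an OLS variant would instead orthogonalize each new basis column against those already selected.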
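The adaptive update of claims 12-16 can likewise be sketched. The distance threshold, the blending rule, and all names below are illustrative assumptions: the new patterns are clustered into roughly as many clusters as existing nodes, a node is added for any cluster far from every existing center (claim 14), and weights refit on the new data are blended with the initial weights through a forgetting factor (claims 15-16).

```python
import numpy as np

def adapt(nodes, w_initial, X_new, y_new, dist_factor=2.0, forgetting=0.5):
    """Adaptive update per claims 12-16. A forgetting factor near 1
    discounts the old model heavily (suited to changed system dynamics);
    near 0 it favors the old model (suited to slow drift) -- claim 16."""
    nodes = list(nodes)
    centers = np.array([c for c, _ in nodes])
    radii = np.array([r for _, r in nodes])
    # cluster the new patterns, k ~ number of existing nodes (claim 14)
    rng = np.random.default_rng(0)
    c = X_new[rng.choice(len(X_new), min(len(nodes), len(X_new)), replace=False)]
    for _ in range(10):
        lbl = np.linalg.norm(X_new[:, None] - c[None], axis=2).argmin(1)
        c = np.array([X_new[lbl == j].mean(0) if np.any(lbl == j) else c[j]
                      for j in range(len(c))])
    lbl = np.linalg.norm(X_new[:, None] - c[None], axis=2).argmin(1)
    # "far away" heuristic: beyond dist_factor radii from every existing center
    for j in range(len(c)):
        if np.any(lbl == j) and np.all(
                np.linalg.norm(centers - c[j], axis=1) > dist_factor * radii):
            members = X_new[lbl == j]
            nodes.append((c[j],
                          np.linalg.norm(members - c[j], axis=1).mean() + 1e-6))
    # refit on the new data, then blend old and new weights (claim 15)
    H = np.column_stack(
        [np.ones(len(X_new))] +
        [np.exp(-np.linalg.norm(X_new - ci, axis=1) ** 2 / (2 * ri ** 2))
         for ci, ri in nodes])
    w_new, *_ = np.linalg.lstsq(H, y_new, rcond=None)
    w_old = np.concatenate([w_initial, np.zeros(len(w_new) - len(w_initial))])
    return nodes, (1.0 - forgetting) * w_old + forgetting * w_new
```

Initial weights for newly added nodes are taken as zero before blending, so new nodes are driven entirely by the new data while existing nodes retain a fraction of their original weights.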
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of the following commonly assigned, co-pending provisional applications:
[0002] (a) Serial No. 60/374,064, filed Apr. 19, 2002 and entitled “PROCESSING MIXED NUMERIC AND/OR NON-NUMERIC DATA”;
[0003] (b) Serial No. 60/374,020, filed Apr. 19, 2002 and entitled “AUTOMATIC NEURAL-NET MODEL GENERATION AND MAINTENANCE”;
[0004] (c) Serial No. 60/374,024, filed Apr. 19, 2002 and entitled “VIEWING MULTI-DIMENSIONAL DATA THROUGH HIERARCHICAL VISUALIZATION”;
[0005] (d) Serial No. 60/374,041, filed Apr. 19, 2002 and entitled “METHOD AND APPARATUS FOR DISCOVERING EVOLUTIONARY CHANGES WITHIN A SYSTEM”;
[0006] (e) Serial No. 60/373,977, filed Apr. 19, 2002 and entitled “AUTOMATIC MODEL MAINTENANCE THROUGH LOCAL NETS”; and
[0007] (f) Serial No. 60/373,780, filed Apr. 19, 2002 and entitled “USING NEURAL NETWORKS FOR DATA MINING”.
Provisional Applications (6)

| Number   | Date     | Country |
|----------|----------|---------|
| 60373780 | Apr 2002 | US      |
| 60373977 | Apr 2002 | US      |
| 60374020 | Apr 2002 | US      |
| 60374024 | Apr 2002 | US      |
| 60374041 | Apr 2002 | US      |
| 60374064 | Apr 2002 | US      |