Mathematical Sciences: A Neural Network, Modeled by a Nonlinear Dynamical System

Information

  • Award Id
    9103575
  • Award Effective Date
9/1/1991
  • Award Expiration Date
2/28/1994
  • Award Amount
$25,500.00
  • Award Instrument
    Standard Grant

Mathematical Sciences: A Neural Network, Modeled by a Nonlinear Dynamical System

This project studies the properties of a continuous neural network that has already been shown to solve certain pattern-classification problems successfully. The mathematical model for this network was derived from only the most basic assumptions: namely, that a neural network is a collection of interacting elements, each one affected positively or negatively by some subset of the others, and that the system as a whole is capable of adapting its behavior over time. The resulting model is an autonomous system of ordinary differential equations and can therefore be analyzed with known analytic and numerical techniques. For appropriate values of the parameters, an N-cell network can be shown to have N stable attractors. One of the major goals of this project is to determine the basins of attraction of these attractors and to describe how they depend on the various system parameters. This information will then be used to determine the properties of a larger neural network built from these N-cell networks as components.

Neural networks are currently being applied to problems that have not yielded to traditional techniques of artificial intelligence, for example, speech recognition and computer vision. As yet there is no general agreement on which neural network architectures work best, and hybrid systems consisting of interconnected components with different architectures may be one solution. It is in this spirit that the network proposed here is being studied. It has some useful properties not possessed by other networks; for example, it automatically identifies the component of an input vector that is most distinct from all of the others, whether that component is smaller or larger than the rest. As one of the few models in which the connection weights vary continuously in time as a function of the activity levels, it has in these weights a built-in short-term memory capability. A careful comparison of this model with other neural networks in current use is needed to determine whether its properties make it significantly better in particular applications.
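The abstract does not give the model's explicit equations, so the following is only a minimal numerical sketch, assuming a standard continuous-time additive (Hopfield-type) network dx/dt = (-x + W tanh(x))/tau with self-excitation and mutual inhibition; the function simulate_network, the weight matrix, and all parameter values are hypothetical choices for illustration, not the model studied in the project.

    import numpy as np

    def simulate_network(x0, W, tau=1.0, dt=0.01, steps=5000):
        """Forward-Euler integration of dx/dt = (-x + W @ tanh(x)) / tau."""
        x = np.array(x0, dtype=float)
        for _ in range(steps):
            x += dt * (-x + W @ np.tanh(x)) / tau
        return x

    # A 3-cell network with self-excitation (+2) and mutual inhibition (-1);
    # this kind of coupling produces N competing stable attractors.
    N = 3
    W = 2.0 * np.eye(N) - (np.ones((N, N)) - np.eye(N))

    # Different initial conditions fall into different basins of attraction,
    # the structures whose parameter dependence the project sets out to describe.
    for x0 in ([0.5, 0.1, 0.1], [0.1, 0.5, 0.1], [0.1, 0.1, 0.5]):
        print(x0, "->", np.round(simulate_network(x0, W), 3))

Sweeping the initial condition over a grid and recording which attractor each trajectory reaches is the most direct numerical way to map basins of attraction of this kind.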

  • Program Officer
    Michael H. Steuerwalt
  • Min Amd Letter Date
9/24/1991
  • Max Amd Letter Date
9/24/1991
  • ARRA Amount

Institutions

  • Name
    University of Hartford
  • City
    West Hartford
  • State
    CT
  • Country
    United States
  • Address
    200 Bloomfield Avenue
  • Postal Code
06117-1545
  • Phone Number
    8607685938

Investigators

  • First Name
    Virginia
  • Last Name
    Noonburg
  • Email Address
    Noonburg@Hartford
  • Start Date
9/1/1991

FOA Information

  • Name
    Other Applications NEC
  • Code
    99