The present invention relates, in general, to computational devices and, more particularly, to a recurrent neural network computer based on phase modulation of Phase-Locked Loop (PLL) nonlinear oscillators.
Neural network computers are biologically inspired; that is, they are composed of elements that perform in a manner analogous to the most elementary functions of the biological neuron. In one methodology, a neural network computer is composed of a number (n) of processing elements, which may be switches or nonlinear amplifiers. These elements are then organized in a way that may be related to the anatomy of the brain. The configuration of connections, and thus communication routes, between these elements determines how the neural network computer will function, much as a program determines the behavior of a digital computer. Despite only a superficial resemblance to their biological counterparts, such artificial neural networks exhibit a surprising number of the brain's characteristics. For example, they learn from experience, generalize from previous examples to new ones, and abstract essential characteristics from inputs containing irrelevant data. Unlike a von Neumann computer, such a neural network computer does not execute a list of commands (a program). Rather, it performs pattern recognition and associative recall via self-organization of connections between elements.
Artificial neural networks can modify their behavior in response to their environment. Shown a set of inputs (perhaps with desired outputs), they self-adjust to produce consistent responses. A network is trained so that application of a set of inputs produces the desired (or at least consistent) set of outputs. Each such input (or output) set is referred to as a vector. Training can be accomplished by sequentially applying input vectors, while adjusting network weights according to a predetermined procedure, or by setting weights a priori. During training, the network weights gradually converge to values such that each input vector produces the desired output vector.
Because of their ability to simulate the apparently oscillatory nature of brain neurons, oscillatory neural network computers are among the more promising types of neural network computers. Simply stated, an oscillatory neural network computer includes oscillators. Oscillators are mechanical, chemical, or electronic devices that are described by an oscillatory signal (a periodic, quasi-periodic, or almost periodic function, etc.). Usually the output is a scalar function of the form V(ωt+φ), where V is a fixed waveform (sinusoid, sawtooth, or square wave), ω is the frequency of oscillation, and φ is the phase deviation (lag or lead).
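The scalar form V(ωt+φ) described above can be sketched directly. In this hedged example, the function names and the particular waveform definitions are illustrative choices, not taken from the specification:

```python
import math

def oscillator(V, omega, phi):
    """Return a scalar oscillator output t -> V(omega*t + phi)."""
    return lambda t: V(omega * t + phi)

# Three common 2*pi-periodic waveforms V (names are illustrative):
def sinusoid(x):
    return math.sin(x)

def square(x):
    return 1.0 if math.sin(x) >= 0.0 else -1.0

def sawtooth(x):
    # Rises linearly from -1 to +1 over each 2*pi period.
    return (x / math.pi) % 2.0 - 1.0

# A phase lead of pi/2 turns the sine into a cosine, so osc(0) is 1.
osc = oscillator(sinusoid, omega=2.0, phi=math.pi / 2)
```

Any of the three waveforms may be passed to `oscillator`; only the pair (ω, φ) distinguishes one oscillator of a given waveform from another.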
Recurrent neural networks have feedback paths from their outputs back to their inputs. The response of such networks is dynamic: after a new input is applied, the output is calculated and fed back to modify the input. The output is then recalculated, and the process repeats. Ideally, successive iterations produce smaller and smaller output changes until the outputs either settle into steady oscillations or reach a steady state. Although these techniques have provided a means for recognizing signals, to date they have not been able to do so using associative memory.
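The settling process described above can be illustrated with a minimal sketch. The tanh units, the particular weight matrix, and the convergence tolerance below are assumptions for illustration only; the specification itself uses phase-locked loops, not amplifier units:

```python
import math

def settle(W, x, steps=100, tol=1e-6):
    """Iterate a recurrent network x <- tanh(W x) until outputs stop changing."""
    n = len(x)
    for _ in range(steps):
        y = [math.tanh(sum(W[i][j] * x[j] for j in range(n))) for i in range(n)]
        if max(abs(y[i] - x[i]) for i in range(n)) < tol:
            return y
        x = y
    return x

# Symmetric weights favor the pattern (+1, -1); repeated feedback pulls a
# noisy input toward that stored pattern.
W = [[0.0, -2.0], [-2.0, 0.0]]
out = settle(W, [0.9, -0.4])
```

Each pass through the loop is one feedback iteration: the output is recomputed from the previous output until the change falls below the tolerance, i.e., until a steady state is reached.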
Accordingly, a need exists for a neural network computer with fully recurrent capabilities and a method that incorporates the periodic nature of neurons in the pattern recognition process.
In accordance with the present invention, an oscillatory neural network computer is disclosed that exhibits pattern recognition using the phase relationships between a learned pattern and an incoming pattern, i.e., the pattern to be recognized.
In one aspect of the present invention, the oscillatory neural network computer comprises a weighting circuit having phase-based connection strengths. A plurality of phase-locked loop circuits are operably coupled to the weighting circuit.
In another aspect of the present invention, a method for programming an oscillatory neural network computer is provided wherein programming comprises encoding connection coefficients of the oscillatory neural network computer in accordance with phase relationships of a pattern to be learned.
In yet another aspect of the present invention, a method for recognizing an incoming pattern using an oscillatory neural network computer is provided wherein the method comprises using the phase deviation between a learned pattern and the incoming pattern to create an output signal indicative of the learned pattern.
The oscillatory neural network computer of the present invention learns or memorizes information in terms of periodic waveforms having an amplitude, a frequency, and a phase. This information is encoded as connection strengths, S_{k,j}, using a learning rule such as, for example, the Hebbian learning rule. The connection strengths, in combination with phase-locked loop circuitry, are used to recognize information from signals transmitted to the oscillatory neural network computer.
Copending U.S. patent application Serial No. PCT/US99/26698, entitled “OSCILLATORY NEUROCOMPUTER WITH DYNAMIC CONNECTIVITY” and filed Nov. 12, 1999 by Frank Hoppensteadt and Eugene Izhikevich is hereby incorporated herein by reference in its entirety.
Weighting circuits C_{1,1}, C_{1,2}, . . . , C_{1,N-1}, C_{1,N} are connected to the input terminals of adder circuit 31_1 and to the respective output terminals OUT_1, OUT_2, . . . , OUT_{N-1}, OUT_N. Weighting circuits C_{2,1}, C_{2,2}, . . . , C_{2,N-1}, C_{2,N} are connected to the input terminals of adder circuit 31_2, to the input terminals of the respective weighting elements C_{1,1}, C_{1,2}, . . . , C_{1,N-1}, C_{1,N}, and to the respective output terminals OUT_1, OUT_2, . . . , OUT_{N-1}, OUT_N. Weighting circuits C_{N-1,1}, C_{N-1,2}, . . . , C_{N-1,N-1}, C_{N-1,N} are connected to the input terminals of adder circuit 31_{N-1} and to the respective output terminals OUT_1, OUT_2, . . . , OUT_{N-1}, OUT_N. Weighting circuits C_{N,1}, C_{N,2}, . . . , C_{N,N-1}, C_{N,N} are connected to the input terminals of adder circuit 31_N and to the respective output terminals OUT_1, OUT_2, . . . , OUT_{N-1}, OUT_N.
Further, initialization input terminals IN_1, IN_2, . . . , IN_{N-1}, IN_N are coupled to initialization input terminals of adder circuits 31_1, 31_2, . . . , 31_{N-1}, 31_N, respectively. The output terminals of adder circuits 31_1, 31_2, . . . , 31_{N-1}, 31_N are connected to the input terminals of bandpass filter circuits 35_1, 35_2, . . . , 35_{N-1}, 35_N, respectively.
The output terminals of bandpass filter circuits 35_1, 35_2, . . . , 35_{N-1}, 35_N are connected to the input terminals of phase-locked loop circuits 25_1, 25_2, . . . , 25_{N-1}, 25_N, respectively. The output terminals of phase-locked loop circuits 25_1, 25_2, . . . , 25_{N-1}, 25_N are connected to the respective output terminals OUT_1, OUT_2, . . . , OUT_{N-1}, OUT_N.
S_{k,j} · V(θ + ψ_{k,j})   (1)
where
Referring to
for k = 1, . . . , N, where:
θ_k is the phase of the VCO embedded in the kth PLL circuit;
θ_j is the phase of the VCO embedded in the jth PLL circuit;
Ω is the natural frequency of the VCO in megahertz (MHz);
S_{k,j} are the connection strengths; and
V(θ) is a 2π-periodic waveform function.
If PLL neural network computer 20 has an arbitrary waveform function, V, that satisfies "odd-even" conditions, and if connection strengths S_{k,j} are equal to connection strengths S_{j,k} for all k and j, then the network converges to an oscillatory phase-locked pattern, i.e., the neurons or phase-locked loop circuits oscillate with equal frequencies and constant, but not necessarily zero, phase relationships. Thus, the phase relationships between the oscillators can be used to determine the connection strengths of a neural network computer.
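The convergence behavior described above can be sketched numerically. The network equation of the specification is not reproduced in this excerpt, so the sketch below assumes the standard Kuramoto-style phase model θ_k' = Ω + Σ_j S_{k,j} sin(θ_j − θ_k), which uses the same symbols (Ω, S_{k,j}, θ_k) and, with symmetric S, converges to a phase-locked pattern:

```python
import math

def simulate(S, theta, Omega=1.0, dt=0.01, steps=4000):
    """Euler-integrate theta_k' = Omega + sum_j S[k][j]*sin(theta_j - theta_k)."""
    n = len(theta)
    for _ in range(steps):
        dtheta = [
            Omega + sum(S[k][j] * math.sin(theta[j] - theta[k]) for j in range(n))
            for k in range(n)
        ]
        theta = [theta[k] + dt * dtheta[k] for k in range(n)]
    return theta

# Symmetric strengths (S[k][j] == S[j][k]): oscillators 0 and 1 attract each
# other, and both repel oscillator 2.
S = [[0.0, 1.0, -1.0],
     [1.0, 0.0, -1.0],
     [-1.0, -1.0, 0.0]]
theta = simulate(S, [0.3, 0.1, 0.7])
```

All three oscillators end up running at the common frequency Ω; oscillators 0 and 1 lock in phase, while oscillator 2 locks in anti-phase to them, i.e., the phase differences become constant but not necessarily zero.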
An example of using phase relationships to train neural network computer 20 is described with reference to
ξ^m = (ξ^m_1, ξ^m_2, . . . , ξ^m_N), ξ^m_k = ±1, m = 0, . . . , r, and k = 1, . . . , N   (3)
where
Still referring to
The key vectors are used in conjunction with the learning rule to determine the connection coefficients of oscillatory neural network computer 20. In the example of using the Hebbian learning rule to memorize the images, the connection coefficients, S_{k,j}, are given by:
An advantage of using the Hebbian learning rule to determine the connection coefficients is that it produces symmetric connections S_{k,j}, so that the network always converges to an oscillatory phase-locked pattern, i.e., the neurons oscillate with equal frequencies and constant, but not necessarily zero, phase relations. It should be understood that some information about each memorized image is included in each connection coefficient.
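The Hebbian formula itself is not reproduced in this excerpt; as a hedged sketch, the classical Hebbian outer-product rule S_{k,j} = (1/N) Σ_m ξ^m_k ξ^m_j is assumed below. It matches the properties the text relies on: each key vector contributes to every coefficient, and the result is symmetric by construction:

```python
def hebbian_strengths(keys):
    """Hebbian outer-product rule: S[k][j] = (1/N) * sum_m keys[m][k]*keys[m][j]."""
    N = len(keys[0])
    return [[sum(xi[k] * xi[j] for xi in keys) / N for j in range(N)]
            for k in range(N)]

# Two +/-1 key vectors (hypothetical 4-pixel patterns).
keys = [[1, -1, 1, -1],
        [1, 1, -1, -1]]
S = hebbian_strengths(keys)
# Symmetry S[k][j] == S[j][k] is what guarantees convergence to an
# oscillatory phase-locked pattern.
```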
After the initial strengths are memorized, neural network computer 20 is ready for operation, which operation is described with reference to
I_k(t) = A_k cos(Ωt + φ_0)   (4)
for k = 1, . . . , 60, where:
Ω is the same as the center frequency of the PLL;
φ_0 is an arbitrary constant; and
A_k are large numbers that are positive if the input for the kth channel is to be initialized at +1 and negative if the kth channel is to be initialized at −1.
After an initialization interval, the external inputs are turned off and the network proceeds to perform its recognition duties.
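Equation (4) above can be sketched directly. The amplitude magnitude of 10 and the choice φ_0 = 0 below are illustrative; the specification requires only that A_k be large and carry the sign of the pixel:

```python
import math

def init_input(pixel, t, Omega=1.0, phi0=0.0, amplitude=10.0):
    """I_k(t) = A_k*cos(Omega*t + phi0), A_k > 0 for a +1 pixel, < 0 for -1."""
    A = amplitude if pixel == +1 else -amplitude
    return A * math.cos(Omega * t + phi0)

# A +1 pixel and a -1 pixel drive their channels in anti-phase, so the two
# PLLs start the recognition process with opposite phases.
i_plus = init_input(+1, t=0.0)
i_minus = init_input(-1, t=0.0)
```

During the initialization interval each channel is driven by its I_k(t); the inputs are then switched off and the network relaxes under its own dynamics.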
Another suitable method for initializing neural network computer 20 is to start the PLL circuits of PLL 25 such that they have different phases that represent pattern 44. Yet another suitable method is to start the PLL circuits of PLL 25 such that they have the same phase and then shift the phase in accordance with pattern 44. Yet another suitable method is to set the initial voltages of the loop filters associated with each PLL circuit of PLL 25. It should be understood that the method for initializing oscillatory neural network computer 20 is not a limitation of the present invention.
Still referring to
Although not shown, it should be understood that there are corresponding output signals V(θ_4), . . . , V(θ_60) that occur for each of the respective PLL circuits 25_4, . . . , 25_60.
Plots 50 and 55 further illustrate pattern recognition in accordance with an embodiment of the present invention. Because each pixel of a learned pattern is either black or white, output signals V(θ_1), . . . , V(θ_60) lock in phase or in anti-phase to each other depending on the pattern being recognized. For example, in the pattern for a "1" (
Briefly referring to
Further, it is expected that at times t=8.0, 8.3, 8.6, 8.9, 9.2, and 9.5, the output signals for pixels P1 and P3 would be in anti-phase and have amplitudes of substantially the same magnitude but opposite polarity. Comparing these times with the output signals shown in
The output signals of neural network computer 20 can be monitored by multiplying each output signal with a reference output signal. In the example of recognizing pattern 44 from
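The monitoring step described above, multiplying each output signal by a reference output signal, can be sketched as follows. The sketch assumes sinusoidal outputs and averages the product over one period: an in-phase pair yields a positive mean, an anti-phase pair a negative mean, so the sign of the average recovers the pixel's color relative to the reference channel:

```python
import math

def phase_compare(theta_k, theta_ref, samples=1000):
    """Average of cos(t + theta_k)*cos(t + theta_ref) over one 2*pi period.

    Positive mean -> in phase with the reference channel;
    negative mean -> anti-phase (opposite pixel color)."""
    total = 0.0
    for i in range(samples):
        t = 2 * math.pi * i / samples
        total += math.cos(t + theta_k) * math.cos(t + theta_ref)
    return total / samples

same = phase_compare(0.0, 0.0)          # in phase: mean is +1/2
opposite = phase_compare(math.pi, 0.0)  # anti-phase: mean is -1/2
```

In a hardware realization this product-and-average operation is what an analog multiplier followed by a low-pass filter computes.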
By now it should be appreciated that a method for recognizing patterns and an oscillatory neural network computer for implementing the method have been provided. An important aspect of this invention is the discovery by the inventors that the output signals for a PLL neural network computer oscillate with equal frequencies and constant, but not necessarily zero, phase relationships. Thus, the phase relationships of the neural network computer are used to determine the connection strengths of the neural network computer. This provides an increased immunity to noise. Another advantage of the present invention is that the type of learning rule used to train the neural network computer is not a limitation of the present invention.
Although certain preferred embodiments and methods have been disclosed herein, it will be apparent from the foregoing disclosure to those skilled in the art that variations and modifications of such embodiments and methods may be made without departing from the spirit and scope of the invention. Accordingly, it is intended that the invention shall be limited only to the extent required by the appended claims and the rules and principles of applicable law.
The present patent application claims priority rights on U.S. Provisional Patent Application Ser. No. 60/178,640 filed Jan. 28, 2000.
| Number | Name | Date | Kind |
|---|---|---|---|
| 4815475 | Burger | Mar 1989 | A |
| 5072130 | Dobson | Dec 1991 | A |
| 5263122 | Nunally | Nov 1993 | A |
| 5446828 | Woodall | Aug 1995 | A |
| 5479577 | Yang | Dec 1995 | A |
| 5705956 | Neely | Jan 1998 | A |
| 6581046 | Ahissar | Jun 2003 | B1 |
| Number | Date | Country |
|---|---|---|
| PCT/US99/26698 | Nov 1999 | WO |
| Number | Date | Country | |
|---|---|---|---|
| 60178640 | Jan 2000 | US |