The present invention relates to a processing system and computer-readable medium.
An emotion generating apparatus including a neural net that receives an input of user information, equipment information, and a current emotional state of a user him/herself to output a next emotional state has been known (please see Patent Document 1, for example). Also, a technique to store spatiotemporal patterns in an associative memory including a plurality of electronic neurons having a layered neural net relationship with directed artificial synapse connectivity has been known (please see Patent Document 2, for example).
[Patent Document 1] Japanese Patent Application Publication No. H10-254592
[Patent Document 2] Japanese Translation of PCT International Patent Application No. 2013-535067
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams whose blocks may represent (1) steps of processes in which operations are performed or (2) units of apparatuses responsible for performing operations. Certain steps and units may be implemented by dedicated circuitry, programmable circuitry supplied with computer-readable instructions stored on computer-readable media, and/or processors supplied with computer-readable instructions stored on computer-readable media. Dedicated circuitry may include digital and/or analog hardware circuits and may include integrated circuits (IC) and/or discrete circuits. Programmable circuitry may include reconfigurable hardware circuits comprising logical AND, OR, XOR, NAND, NOR, and other logical operations, flip-flops, registers, memory elements, etc., such as field-programmable gate arrays (FPGA), programmable logic arrays (PLA), etc.
Computer-readable media may include any tangible device that can store instructions for execution by a suitable device, such that the computer-readable medium having instructions stored therein comprises an article of manufacture including instructions which can be executed to create means for performing operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, etc. More specific examples of computer-readable media may include a floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a BLU-RAY(registered trademark) disc, a memory stick, an integrated circuit card, etc.
Computer-readable instructions may include assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, JAVA (registered trademark), C++, etc., and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
Computer-readable instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, or to programmable circuitry, locally or via a local area network (LAN), wide area network (WAN) such as the Internet, etc., to execute the computer-readable instructions to create means for performing operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, etc.
Hereinafter, (some) embodiment(s) of the present invention will be described. The embodiment(s) do(es) not limit the invention according to the claims, and all the combinations of the features described in the embodiment(s) are not necessarily essential to means provided by aspects of the invention.
Note that a user 30a is a user of the robot 40a and the user terminal 100a. A user 30b is a user of the robot 40b and the user terminal 100b. The robot 40b has approximately identical functions as those of the robot 40a. Also, the user terminal 100b has approximately identical functions as those of the user terminal 100a. Therefore, the system 20 is explained, referring to the robot 40a and the robot 40b collectively as a robot 40, and to the user terminal 100a and the user terminal 100b collectively as a user terminal 100.
The system 20 processes parameters of a neural network for determining the state of the robot 40. Parameters of a neural network include parameters of a plurality of artificial neurons and a plurality of artificial synapses constituting the neural network.
Specifically, the user terminal 100 sets initial values of parameters of a neural network based on an input from the user 30, and transmits them to the server 200. The robot 40 transmits, to the server 200, sensor information obtained through detection by a sensor provided to the robot 40. The server 200 uses the neural network based on the initial value information of the neural network and the sensor information acquired from the robot 40 to determine the state of the robot 40. For example, the server 200 uses the neural network to calculate a situation around the robot 40, an emotion of the robot 40 itself, and the state of generation of an endocrine substance of the robot 40 itself. Then, the server 200 determines action details of the robot 40 based on the situation around the robot 40, the emotion of the robot 40 itself, and the state of generation of the endocrine substance of the robot 40 itself. Note that an endocrine substance means a substance that is secreted in a body and conveys signals, such as a neurotransmitter, a hormone or the like. Also, “endocrine” means that such an endocrine substance is secreted in a body.
For example, if the server 200 has judged that the robot is in a state where an endocrine substance corresponding to sleepiness is generated, the server 200 causes the robot 40 to take the action it takes when it is sleepy. Also, if the server 200 has judged that an emotion of pleasantness has occurred, it causes the robot 40 to produce a phrase representing the pleasantness.
Note that an endocrine substance of the robot 40 itself is one form of information that influences action of the robot 40, but does not mean that the robot 40 actually generates such an endocrine substance. An emotion of the robot 40 itself is likewise one form of information that influences action of the robot 40, but does not mean that the robot 40 is actually feeling such an emotion.
In the user terminal 100, the input device 106 accepts an input of an initial value of a parameter of a neural network from the user 30 and outputs it to the processing unit 102. The processing unit 102 is formed of a processor such as a CPU. The processing unit 102 causes the initial value of the parameter acquired from the input device 106 to be transmitted from the communicating unit 108 to the server 200. The communicating unit 108 receives the parameter of the neural network from the server 200. The processing unit 102 causes the parameter received by the communicating unit 108 to be displayed on the display unit 104.
In the robot 40, the sensor unit 156 includes various types of sensors such as a camera, a 3D depth sensor, a microphone, a touch sensor, a laser range finder, and an ultrasonic range finder. Sensor information obtained through detection by the sensor unit 156 is output to the processing unit 152. The processing unit 152 is formed of a processor such as a CPU. The processing unit 152 causes the sensor information acquired from the sensor unit 156 to be transmitted from the communicating unit 158 to the server 200. The communicating unit 158 receives information indicating operation details from the server 200. The processing unit 152 controls the control target 155 based on the operation details received by the communicating unit 158. The control target 155 includes a speaker, motors that drive respective units of the robot 40, a display device, a light-emitting device, and the like. As one example, if information indicating details about a phrase to be produced is received from the server 200, the processing unit 152 causes a sound or voice to be output from the speaker according to the received details about the phrase to be produced.
At the server 200, the communicating unit 208 outputs, to the processing unit 202, the information received from the user terminal 100 or robot 40. The initial value setting unit 210 stores the initial value of the parameter received at the communicating unit 208 in the parameter initial values 286 in the storing unit 280. The external input data generating unit 230 processes the sensor information received by the communicating unit 208 to generate input information from the outside of the neural network, and outputs it to the parameter processing unit 240.
The parameter processing unit 240 performs a process based on the neural network, using the parameters 288 and the definition information 284 of the neural network that are stored in the storing unit 280. The neural network is a model for artificially realizing some brain functions of a living form by means of computational processing. First, the technical background and problems concerning neural networks are explained here.
A brain is considered as having two broadly classified functions. One of them is a function to perform various information processing so as to memorize, learn, predict, plan, and so on, and the other is an information processing regulatory function.
Information processing in a brain is considered as being realized by a vast number of neurons that are linked by synaptic connection. A human brain is considered as having more than 100 billion neurons present therein overall. On the other hand, the information processing regulatory function is considered as being realized by a relatively small number of neurons that are present at a particular region of a human brain like, for example, a wide range regulatory system of the brain. Specifically, neurons at a particular region of a brain have axons that do not have particular, well-defined destination neurons, but are branched toward a wide range of regions of the brain, and the information processing regulatory function is considered as being realized due to effects of various neurotransmitters released from the axons. The wide range regulatory system of a human is considered as having approximately several thousand neurons present therein. That is, each of a relatively small number of neurons that are present in a particular region of a brain is in contact with more than one hundred thousand other neurons, and the information processing regulatory function is considered as being realized due to neurotransmitters released by neurons of the particular region of the brain having effects not only on synapse gaps but also on numerous neurons in the brain.
Examples of information processing in a brain include processing of visual information in the visual cortex of a human. Visual information of a human is considered as being transmitted from the retina through the optic nerve to the primary visual cortex. Starting there, information processing about movement is performed in the dorsal pathway, and information processing about information other than movement, such as facial recognition, is performed in the ventral pathway. On the other hand, examples of the information processing regulatory function include the information processing performed when a human is feeling sleepiness. The occurrence of sleepiness is considered as being related to a wide range regulatory system that releases neurotransmitters such as acetylcholine, noradrenaline, or serotonin. Thereby, a command like sleepiness can be a message received by a wide range of regions of a brain, as in decision-making.
Here, in order to artificially realize some brain functions, it is assumed that, as an example of a neural network, a network consists of a plurality of artificial neurons connected by artificial synapses. Application examples of this type of neural network include pattern recognition on the basis of deep learning, data clustering using a self-organizing map, and the like, and it can be said that these artificially realize information processing of a brain such as image recognition or vocabulary classification.
Hebbian theory or a learning rule on the basis of spike timing-dependent plasticity (STDP) can be applied to a neural network. According to Hebbian theory, if firing of a neuron causes another neuron to fire, the connection between these two neurons is strengthened. Based on Hebbian theory, the process of strengthening connection by an artificial synapse if simultaneous firing occurs to artificial neurons prior and posterior to the artificial synapse can be incorporated into a neural network. STDP is a phenomenon in which strengthening/weakening of a synapse is dependent on the order of spike generation timing of neurons prior and posterior to the synapse. Based on STDP, a process of: strengthening connection of an artificial synapse if a prior neuron to the artificial synapse fires preceding firing of a posterior neuron to the artificial synapse; and weakening connection of the artificial synapse if the posterior artificial neuron to the artificial synapse fires preceding firing of the prior artificial neuron to the artificial synapse can be incorporated into a neural network. Also, there is a learning rule about a self-organizing map in which, in a neural network formed of a plurality of artificial neurons, a winner vector closest to an input vector is selected from weight vectors, and weighting is updated so that it becomes closer to the input vector.
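As an illustrative sketch, not the claimed implementation, Hebbian- and STDP-style updates of a coefficient of connection might look like the following; the function names, constants, and the exponential STDP window are assumptions.

```python
# Hedged sketch of Hebbian- and STDP-style updates of a connection coefficient.
# The names (hebbian_update, stdp_update) and all constants are illustrative only.
import math

def hebbian_update(bs: float, pre_fired: bool, post_fired: bool,
                   lr: float = 0.1) -> float:
    """Strengthen the connection if the artificial neurons prior and posterior
    to the artificial synapse fired simultaneously (Hebbian theory)."""
    if pre_fired and post_fired:
        return bs + lr
    return bs

def stdp_update(bs: float, t_pre: float, t_post: float,
                a_plus: float = 0.1, a_minus: float = 0.12,
                tau: float = 20.0) -> float:
    """STDP: if the prior neuron fires before the posterior neuron, strengthen
    the connection; if it fires after, weaken it.  The exponential window is a
    common choice, not one prescribed by the present document."""
    dt = t_post - t_pre
    if dt > 0:      # prior neuron fired first -> strengthening
        return bs + a_plus * math.exp(-dt / tau)
    elif dt < 0:    # posterior neuron fired first -> weakening
        return bs - a_minus * math.exp(dt / tau)
    return bs

# Example: simultaneous firing strengthens; reversed spike order weakens.
print(hebbian_update(1.0, True, True))            # 1.1
print(stdp_update(1.0, t_pre=5.0, t_post=8.0))    # > 1.0
print(stdp_update(1.0, t_pre=8.0, t_post=5.0))    # < 1.0
```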
Note that in a neural network like that of Patent Document 1, in which an emotion label is output from a plurality of pieces of sensory information, it could be possible, by feeding back emotion labels, to output different emotion labels for identical inputs depending on the fed-back emotion labels and the inputs; however, the neural network of Patent Document 1 is not configured to be able to incorporate such a process. Also, in the neural network of Patent Document 1, there are no relations between emotions and endocrine substances such as neurotransmitters, and information processing is never regulated by emotions.
Apart from the information processing realized by the neural network described in Patent Document 1, or the various information processing such as pattern recognition or data clustering realized by the above-mentioned example of a neural network, there are three problems that should be solved in order to realize a function of regulating information processing while the properties of artificial neurons or artificial synapses dynamically change at part of a neural network, in the manner of an artificial endocrine substance such as a neurotransmitter being secreted over a wide range of regions in a brain. First, in a situation where most of the operation principles of brain functions have not been made clear and many hypotheses about them exist, the behavior of a neural network cannot be confirmed efficiently, in the manner of an analog computer, by connecting artificial neurons with artificial synapses through trial and error. Second, although several equation models having different hysteresis characteristics have been proposed for the action potential or synaptic connection of neurons at various brain regions, equations having hysteresis, or the parameters of such equations, cannot be described efficiently for each artificial neuron or artificial synapse. Third, the behavior of the parameters of numerous artificial neurons or artificial synapses dynamically changing at part of a neural network, due to an artificial endocrine substance being secreted over a wide range of regions in a brain, cannot be simulated efficiently by large-scale calculation, and it cannot be processed efficiently even by multiprocess-multithread processing or distributed computing. In the following, the operation of the system 20 is explained in more detail in relation to the above-mentioned technical background and problems concerning neural networks.
The artificial synapse 301 connects the artificial neuron 4 and the artificial neuron 1. The artificial synapse 301 is an artificial synapse connecting them unidirectionally. The artificial neuron 4 is an artificial neuron connected to an input of the artificial neuron 1. The artificial synapse 302 connects the artificial neuron 1 and the artificial neuron 2. The artificial synapse 302 is an artificial synapse connecting them bidirectionally. The artificial neuron 1 is an artificial neuron connected to an input of the artificial neuron 2. The artificial neuron 2 is an artificial neuron connected to an input of the artificial neuron 1.
Note that in the present embodiment, an artificial neuron is represented by N, and an artificial synapse is represented by S, in some cases. Also, each artificial neuron is discriminated by a superscript number as the discrimination character. A given artificial neuron is in some cases represented using an integer i or j as the discrimination number. For example, Ni represents a given artificial neuron.
Also, an artificial synapse is in some cases discriminated using respective discrimination numbers i and j of two artificial neurons connected to the artificial synapse. For example, S41 represents an artificial synapse connecting N1 and N4. Generally, Sij represents an artificial synapse that inputs an output of Ni to Nj. Note that Sji represents an artificial synapse that inputs an output of Nj to Ni.
In
N1 and N3 are emotion artificial neurons for which emotions of the robot 40 are defined. N1 is an emotion artificial neuron to which an emotion “pleased” is allocated. N3 is an emotion artificial neuron to which an emotion “sad” is allocated.
N2 and N5 are endocrine artificial neurons for which endocrine states of the robot 40 are defined. N5 is an endocrine artificial neuron to which a dopamine-generated state is allocated. Dopamine is one example of endocrine substances concerning reward system. That is, N5 is one example of endocrine artificial neurons concerning reward system. N2 is an endocrine artificial neuron to which a serotonin-generated state is allocated. Serotonin is one example of endocrine substances concerning sleep system. That is, N2 is one example of endocrine artificial neurons concerning sleep system.
Information defining the state of the robot 40 like the ones mentioned above is stored in the definition information 284 in the storing unit 280, for each artificial neuron of the plurality of artificial neurons constituting the neural network. In this manner, the neural network 300 includes concept artificial neurons, emotion artificial neurons, and endocrine artificial neurons. The concept artificial neurons, emotion artificial neurons and endocrine artificial neurons are artificial neurons for which meanings such as concepts, emotions or endocrines are defined explicitly. Such artificial neurons are in some cases called explicit artificial neurons.
In contrast to this, N8 and N9 are artificial neurons for which the state of the robot 40 is not defined. Also, N8 and N9 are artificial neurons for which meanings such as concepts, emotions or endocrines are not defined explicitly. Such artificial neurons are in some cases called implicit artificial neurons.
Parameters of the neural network 300 include Iti, which is an input to each Ni of the neural network, Eti, which is an input from the outside of the neural network to Ni, the parameters of Ni, and the parameters of Sij.
The parameters of Ni include Sti representing the status of Ni, Vmti representing an output of the artificial neuron represented by Ni, Tti representing a threshold for firing of Ni, tf representing a last firing clock time which is a clock time when Ni fired last time, Vmtfi representing an output of the artificial neuron Ni at the last firing clock time, and ati, bti and hti which are increase-decrease parameters of outputs. The increase-decrease parameters of outputs are one example of parameters specifying time evolution of outputs at the time of firing of an artificial neuron. Note that in the present embodiment, a subscript t represents that the parameter provided with the subscript is a parameter that can be updated along with the lapse of clock time.
The parameters of Sij include BStij representing a coefficient of connection of the artificial synapse Sij, tcf representing a last simultaneous firing clock time which is a clock time when Ni and Nj connected by Sij fired simultaneously last time, BStcfij representing the coefficient of connection at the last simultaneous firing clock time, and atij, btij and htij which are increase-decrease parameters of the coefficient of connection. The increase-decrease parameters of the coefficient of connection are one example of parameters specifying time evolution of the coefficient of connection after the two artificial neurons connected by an artificial synapse fired simultaneously last time.
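For illustration only, the per-artificial-neuron and per-artificial-synapse parameters listed above might be held in structures such as the following; the field names are hypothetical renderings of the symbols in the text, and the default values are arbitrary.

```python
# Hypothetical containers for the parameters described above.  Field names
# mirror the symbols in the text (Vm, T, a, b, h, BS, ...), but the exact
# layout is an assumption, not the claimed data structure.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class ArtificialNeuron:                   # parameters of Ni
    status: str = "unfired"               # Sti: "unfired", "rising" or "falling"
    vm: float = 0.0                       # Vmti: current output
    threshold: float = 1.0                # Tti: firing threshold
    t_last_fire: Optional[float] = None   # tf: last firing clock time
    vm_at_last_fire: float = 0.0          # Vmtfi: output at the last firing
    a: float = 1.0                        # increase parameter of the output
    b: float = -1.0                       # decrease parameter of the output
    h: float = 0.0                        # additional increase-decrease parameter

@dataclass
class ArtificialSynapse:                  # parameters of Sij
    bs: float = 0.0                       # BStij: coefficient of connection
    t_last_cofire: Optional[float] = None # tcf: last simultaneous firing time
    bs_at_last_cofire: float = 0.0        # BStcfij
    a: float = 0.1                        # increase parameter of the coefficient
    b: float = -0.05                      # decrease parameter of the coefficient
    h: float = 0.0

# A tiny network keyed so that each artificial neuron's data forms one unit.
neurons: Dict[int, ArtificialNeuron] = {1: ArtificialNeuron(), 4: ArtificialNeuron()}
synapses: Dict[Tuple[int, int], ArtificialSynapse] = {(4, 1): ArtificialSynapse(bs=1.0)}
```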
The parameter processing unit 240 updates the above-mentioned parameters based on an input from the external input data generating unit 230 and the neural network to determine the activation state of each artificial neuron. The operation determining unit 250 determines operation of the robot 40 based on: the activation states of at least some artificial neurons specified by values of parameters of at least some artificial neurons among a plurality of artificial neurons in the neural network; and states defined for at least some artificial neurons by the definition information 284. Note that an activation state may either be an activated state or an inactivated state. In the present embodiment, to be activated is called “to fire” and being inactivated is called “unfiring”, in some cases. Note that, as mentioned below, the “firing” state is classified into a “rising phase” and a “falling phase” depending on whether or not an output is on the rise. “Unfiring”, and a “rising phase” and a “falling phase” are represented by a status Sti.
For each Ni, the parameter edit screen 400 includes entry fields for inputting values for the threshold and increase-decrease parameters of Ni, as well as, for each artificial neuron connected to Ni, its discrimination information and the coefficient of connection and increase-decrease parameters of the connecting artificial synapse. Also, the parameter edit screen 400 includes a save button and a reset button. The user 30 can input an initial value to each entry field using the input device 106.
If the save button is pressed, the processing unit 102 causes initial values set in the parameter edit screen 400 to be transmitted to the server 200 through the communicating unit 108. In the server 200, the initial values transmitted from the user terminal 100 are stored in the parameter initial values 286 in the storing unit 280. Also, if the reset button of the parameter edit screen 400 is pressed, the processing unit 102 sets values set in the entry fields to initial values specified in advance.
In this manner, the processing unit 102 presents to a user, in a format in which the plurality of artificial neurons are associated with a plurality of rows of a table, the parameter values of each artificial neuron of the plurality of artificial neurons and the parameter values of one or more artificial synapses connected to inputs of each artificial neuron. Then, the processing unit 102 accepts a user input to the table for altering the presented parameter values. In this manner, the processing unit 102 can present, to the user 30, the parameter values of each artificial neuron and of the one or more artificial synapses connected to its inputs using a data access structure that is accessible data unit by data unit, each data unit being collective for one artificial neuron, and can accept inputs of values from the user 30.
At S510, the parameter processing unit 240 calculates parameters corresponding to a change due to electrical influence of an artificial synapse at a temporal step tn+1. Specifically, it calculates BStij of a given Sij.
At S520, the parameter processing unit 240 calculates parameters corresponding to a change due to chemical influence caused by an endocrine substance at the temporal step tn+1. Specifically, changes in the parameters of Ni and Sij that the endocrine artificial neuron influences are calculated. More specifically, it calculates, at the temporal step tn+1, the increase-decrease parameters of the output or the threshold of the artificial neuron Ni that the endocrine artificial neuron influences, and the increase-decrease parameters of the coefficient of connection, or the coefficient of connection itself, of the Sij that the endocrine artificial neuron influences.
At S530, the parameter processing unit 240 acquires an input from the outside of the neural network. Specifically, the parameter processing unit 240 acquires an output of the external input data generating unit 230.
At S540, the parameter processing unit 240 calculates an output of Ni at the temporal step tn+1. Specifically, it calculates Vmtn+1i and a status Stn+1i. Then, at S550, it stores each parameter value at the clock time tn+1 in the parameters 288 of the storing unit 280. Also, it transmits each parameter value at the clock time tn+1 to the user terminal 100.
At S560, the parameter processing unit 240 judges whether or not to terminate the loop. For example, if the clock time represented by the temporal step has reached a predetermined clock time or if it is instructed by the user terminal 100 to stop calculation of parameter update, it is judged to terminate the loop. If the loop is not to be terminated, the process returns to S510, and calculation for a still next temporal step is performed. If the loop is to be terminated, this flow is terminated.
If both Ni and Nj at the two ends of Sij are firing at the temporal step of a clock time tn, the parameter processing unit 240 calculates BStn+1ij at the clock time tn+1 according to BStn+1ij=BStnij+atnij×(tn+1−tn). On the other hand, if Ni and Nj are not both firing at the temporal step of the clock time tn, it calculates the coefficient of connection BStn+1ij at the clock time tn+1 according to BStn+1ij=BStnij+btnij×(tn+1−tn). Also, if BStn+1ij becomes a negative value, BStn+1ij is regarded as 0. Note that for Sij for which BSij is a positive value, atij is a positive value and btij is a negative value. For Sij for which BSij is a negative value, atij is a negative value and btij is a positive value.
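The update just described can be sketched as follows; the numeric values are illustrative, and the handling of a coefficient that would cross zero is one reading of the rule stated above.

```python
# Sketch of the update of BSij between the temporal steps tn and tn+1.
# a_ij applies while both connected artificial neurons are firing, b_ij
# otherwise; a value that would cross zero is regarded as 0 (one reading
# of the clamping rule in the text).

def update_bs(bs: float, a_ij: float, b_ij: float,
              both_firing: bool, dt: float) -> float:
    if both_firing:
        bs_next = bs + a_ij * dt   # BS_{tn+1} = BS_{tn} + a * (tn+1 - tn)
    else:
        bs_next = bs + b_ij * dt   # BS_{tn+1} = BS_{tn} + b * (tn+1 - tn)
    # Do not let the coefficient change sign; a negative result for a
    # positive-coefficient synapse is regarded as 0, and symmetrically for
    # a negative-coefficient synapse (assumption).
    if bs > 0 and bs_next < 0:
        bs_next = 0.0
    elif bs < 0 and bs_next > 0:
        bs_next = 0.0
    return bs_next

print(update_bs(1.0, a_ij=0.2, b_ij=-0.05, both_firing=True, dt=1.0))   # 1.2
print(update_bs(1.0, a_ij=0.2, b_ij=-0.05, both_firing=False, dt=1.0))  # 0.95
```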
Because as shown in
A function 700 shown in
The function 910 is a function of the coefficient of connection BStcfij at the clock time tcf and of Δt. The function 910 gives the value BStcfij at Δt=0. Also, the function 910 monotonically increases while Δt is in a range lower than a predetermined value, and monotonically decreases, gradually approaching 0, when Δt is larger than the predetermined value.
The function 920 is a function only of Δt. The function 920 gives the value 0 at Δt=0. Also, the function 920 monotonically increases while Δt is in a range lower than a predetermined value, and monotonically decreases, gradually approaching 0, when Δt is larger than the predetermined value. In this manner, because htij can be defined relatively freely according to the present embodiment, a learning effect can be controlled relatively freely.
In the example of
Also, the endocrine artificial neuron N5 is an endocrine artificial neuron to which an endocrine substance of the reward system is allocated. Examples of the endocrine substance of the reward system may include dopamine and the like. First definition information about the endocrine artificial neuron N5 specifies: the condition “Vmtn5>Ttn5 and Vmtn4>Ttn4”; “S49 and S95” as artificial synapses that the endocrine artificial neuron N5 has influence on; and “atn+1ij=atnij×1.1” as an equation specifying influence details. Thereby, if Vmtn5 exceeds Ttn5 and additionally Vmtn4 exceeds Ttn4, the parameter processing unit 240 increases the increase-decrease parameters of the artificial synapses S49 and S95 by 10% at the clock time tn+1.
Thereby, if the endocrine artificial neuron of the reward system fires while the concept artificial neuron N4, for which the situation “a bell rang” is defined, is firing, the connection between the concept artificial neuron N4 and the endocrine artificial neuron N5 through the implicit artificial neuron N9 can be strengthened. Thereby, it becomes easier for the endocrine artificial neuron N5 of the reward system to fire when “a bell rang”.
Also, second definition information about the endocrine artificial neuron N5 specifies: the condition “Vmtn5>Ttn5”; “N1” as an artificial neuron that the endocrine artificial neuron N5 has influence on; and “Ttn+1i=Ttni×0.9” as an equation specifying influence details. Thereby, if Vmtn5 exceeds Ttn5, the parameter processing unit 240 lowers the threshold of the artificial neuron N1 by 10% at the clock time tn+1. Thereby, it becomes easier for the emotion “pleased” to fire when the endocrine artificial neuron N5 of the reward system has fired.
According to such definitions specifying influence about an endocrine artificial neuron of reward system, an implementation becomes possible in which if an act of charging the robot 40 while ringing a bell is repeated, simply ringing a bell causes the robot 40 to take action representing pleasantness.
Note that the influence definition information is not limited to the example of
The influence definition information is stored in the definition information 284 of the storing unit 280. In this manner, the storing unit 280 stores the influence definition information specifying influence of at least one of an output and firing state of an endocrine artificial neuron on a parameter of at least one of an artificial synapse and another artificial neuron not directly connected to the endocrine artificial neuron by an artificial synapse. Then, the parameter processing unit 240 updates parameters of the at least one of the artificial synapse and the other artificial neuron not directly connected to the endocrine artificial neuron by the artificial synapse based on the at least one of the output and firing state of the endocrine artificial neuron and the influence definition information. Also, parameters of the other artificial neuron that the at least one of the output and firing state of the endocrine artificial neuron has influence on can include at least one of parameters specifying a threshold, firing state and time evolution of an output at the time of firing of the other artificial neuron. Also, parameters of the artificial synapse that the at least one of the output and firing state of the endocrine artificial neuron has influence on can include at least one of parameters specifying a coefficient of connection of the artificial synapse, and time evolution of the coefficient of connection after two artificial neurons connected by the artificial synapse simultaneously fired last time. Also, the influence definition information includes information specifying influence that the firing state of an endocrine artificial neuron related with reward system has on a threshold of an emotion artificial neuron, and the parameter processing unit 240 updates the threshold of the emotion artificial neuron according to the influence definition information if the endocrine artificial neuron fired.
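As an illustrative sketch only, influence definition information of the kind described for the endocrine artificial neuron N5 might be represented and applied as follows; the dictionary-based structure and function names are assumptions, while the conditions and update rules follow the example above.

```python
# Hypothetical representation of the influence definition information for the
# reward-system endocrine artificial neuron N5.  The values follow the example
# in the text; the structure itself is an illustrative assumption.

# state: {"Vm": {i: output}, "T": {i: threshold}, "a_syn": {(i, j): a_ij}}
def apply_first_definition(state):
    # condition: Vm5 > T5 and Vm4 > T4
    if state["Vm"][5] > state["T"][5] and state["Vm"][4] > state["T"][4]:
        # influence: the increase-decrease parameters of S49 and S95 are
        # increased by 10% at the clock time tn+1
        for ij in [(4, 9), (9, 5)]:
            state["a_syn"][ij] *= 1.1

def apply_second_definition(state):
    # condition: Vm5 > T5
    if state["Vm"][5] > state["T"][5]:
        # influence: the threshold of N1 is lowered by 10%, so that the
        # emotion "pleased" fires more easily
        state["T"][1] *= 0.9

state = {
    "Vm": {4: 1.2, 5: 1.5},
    "T": {1: 1.0, 4: 1.0, 5: 1.0},
    "a_syn": {(4, 9): 0.2, (9, 5): 0.2},
}
apply_first_definition(state)
apply_second_definition(state)
print(state["a_syn"], state["T"][1])   # coefficients raised, threshold lowered
```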
If Stni indicates unfiring, the parameter processing unit 240 calculates an input Itn+1i to Ni (S1110). Specifically, if an input from the outside of the neural network is not connected to Ni, it is calculated according to Itn+1i=ΣjBStn+1ji×Vmtnj×f(Stnj). If an input from the outside of the neural network is connected to Ni, it is calculated according to Itn+1i=ΣjBStn+1ji×Vmtnj×f(Stnj)+Etn+1i. Here, Etn+1i is the input from the outside of the neural network at the clock time tn+1.
Also, f(S) gives 0 if S is a value representing unfiring, and gives 1 if S is a value indicating a rising phase or a falling phase. This corresponds to a model in which a synapse conveys action potential only when a neuron has fired. Note that f(S) may be defined to always give 1. This corresponds to a model in which membrane potential is conveyed regardless of the firing state of a neuron.
At S1112, the parameter processing unit 240 judges whether or not Itn+1i exceeds Ttn+1i. If Itn+1i exceeds Ttn+1i, the parameter processing unit 240 calculates Vmtn+1i based on an increase-decrease parameter, sets Stn+1i to a value indicating a rising phase or falling phase depending on Vmtn+1i (S1114), and terminates this flow.
At S1100, if Stni is in a rising phase or a falling phase, the parameter processing unit 240 calculates Vmtn+1i (S1120). Then, the parameter processing unit 240 sets Stn+1i to a value of unfiring if Vmti reached Vmin before tn+1, sets Stn+1i to a value of a rising phase or a falling phase if Vmti has not reached Vmin before tn+1, and terminates this flow. Note that the parameter processing unit 240 sets a value of a falling phase to Stn+1i if Vmti reached Vmax before tn+1, and sets a value of a rising phase to Stn+1i if Vmti has not reached Vmax before tn+1.
In this manner, if Ni is firing, an output of Ni is not dependent on an input even if the output becomes equal to or lower than a threshold. Such a time period corresponds to an absolute refractory phase in a neuron of a living form.
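A compact, assumed rendering of the input calculation and firing judgment described in S1110 to S1114 is shown below; the summation, f(S), and the threshold test follow the text, while the function names and numbers are illustrative.

```python
# Sketch of S1110-S1114: compute the input Ii at tn+1 for an unfired artificial
# neuron and judge whether it starts firing.  Names are illustrative.

def f(status: str) -> float:
    """f(S): 0 while unfired, 1 in a rising or falling phase."""
    return 0.0 if status == "unfired" else 1.0

def compute_input(incoming, vm, status, external: float = 0.0) -> float:
    """I_i(tn+1) = sum_j BS_ji * Vm_j * f(S_j) (+ E_i if an external input is connected)."""
    total = sum(bs_ji * vm[j] * f(status[j]) for j, bs_ji in incoming.items())
    return total + external

def step_unfired(i_next: float, threshold: float):
    """Return the new status and output for an unfired artificial neuron."""
    if i_next > threshold:
        return "rising", i_next   # starts firing; the output then evolves by a and b
    return "unfired", i_next      # while unfired, the output simply follows the input

incoming = {4: 1.0, 2: 0.5}       # BS_4i, BS_2i
vm = {4: 1.2, 2: 0.8}
status = {4: "rising", 2: "unfired"}
i_next = compute_input(incoming, vm, status, external=0.3)
print(step_unfired(i_next, threshold=1.0))   # ('rising', 1.5)
```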
At the temporal step of the clock time t0, Ni is unfiring. If It1i at the clock time t1 is equal to or lower than Tt1i, the parameter processing unit 240 calculates Vt1i at the clock time t1 according to Vt1i=It1i, and calculates Vti during the time period from the clock time t0 to t1 according to Vti=It0i. Also, likewise, the parameter processing unit 240 maintains the value Vtni calculated at the clock time step tn until the next clock time step, and changes it to Itn+1i at the clock time tn+1.
At the temporal step of the clock time t0, Ni is unfiring. If It1i at the clock time t1 exceeds Tt1i, the parameter processing unit 240 calculates Vt1i at the clock time t1 according to Vt1i=It1i, and calculates Vti during the time period from the clock time t0 to t1 according to Vti=It0i. Note that it is assumed here that It1i at the clock time t1 is equal to or lower than Vmax. If It1i at the clock time t1 exceeds Vmax, It1i=Vmax.
As shown in
Also, upon Vti reaching Vmax, Vti is decreased by |bti| per unit time until Vti reaches Vmin. Also, the parameter processing unit 240 determines the status of Ni in this time period as a falling phase. Then, upon Vti reaching Vmin, Vt6i at a next clock time is calculated according to Vt6i=It6i. Also, the status after Vti reached Vmin is determined as unfiring.
Note that if the status of Ni is a falling phase, Vmti is not dependent on Iti even if the calculated Vmti falls below Tti. Even if Vmti falls below Tti, the parameter processing unit 240 calculates Vmti according to an increase-decrease parameter until Vmti reaches Vmin.
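The rise and fall of the output described above can be sketched as follows; Vmax, Vmin, and the per-unit-time increments |a| and |b| are given illustrative values.

```python
# Sketch of the output time evolution after firing: rise by |a| per unit time
# up to Vmax, then fall by |b| per unit time down to Vmin, then unfire.
# All numeric values are illustrative.

V_MAX, V_MIN = 2.0, 0.0

def evolve_output(vm: float, status: str, a: float, b: float, dt: float = 1.0):
    if status == "rising":
        vm = min(vm + abs(a) * dt, V_MAX)
        if vm >= V_MAX:
            status = "falling"
    elif status == "falling":
        vm = max(vm - abs(b) * dt, V_MIN)
        if vm <= V_MIN:
            status = "unfired"   # the absolute-refractory-like period ends here
    return vm, status

vm, status = 1.1, "rising"
for _ in range(6):
    vm, status = evolve_output(vm, status, a=0.5, b=0.4)
    print(round(vm, 2), status)
```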
A function 1400 shown in
The function 1510 is a function of the output Vmtfi at the clock time tf and of Δt. The function 1510 gives the value Vmtfi at Δt=0. Also, the function 1510 monotonically increases while Δt is in a range lower than a predetermined value, and monotonically decreases when Δt is larger than the predetermined value.
The function 1520 is a function only of Δt. The function 1520 gives the value Vmin at Δt=0. Also, the function 1520 monotonically increases while Δt is in a range lower than a predetermined value, and monotonically decreases when Δt is larger than the predetermined value.
As explained above, the parameter processing unit 240 can calculate an output modeled on a change in the action potential of a neuron. Therefore, the rise and fall of an output can be expressed. Also, a change in an output after firing can be expressed relatively freely by the increase-decrease parameters. Thereby, the range of expression of the state can be widened.
Note that as shown in
Note that in a neural network, a phenomenon may occur in which the firing state of an artificial neuron keeps being promoted unidirectionally along with the lapse of time. For example, if artificial neurons linked in a loop by strongly connecting artificial synapses are present in a neural network, the artificial neurons in the loop fire consecutively; this causes adjacent artificial neurons in the loop to fire simultaneously and raises the coefficients of connection of the artificial synapses between them, so that firing of the artificial neurons may be kept promoted. The same applies to a case where the threshold of an artificial neuron is lowered due to the influence of firing of an endocrine artificial neuron and the promoted firing of that artificial neuron in turn promotes firing of the endocrine artificial neuron, and to other similar cases. Conversely, where an artificial synapse provides suppressive connection, or where a process to raise the threshold of an artificial neuron in response to firing of an endocrine artificial neuron is defined, firing of an artificial neuron may be kept suppressed unidirectionally along with the lapse of time. In view of this, if the parameter processing unit 240 monitors temporal changes in the firing state of an artificial neuron, the coefficient of connection of an artificial synapse, or the like and detects the presence of an artificial neuron whose firing state receives positive feedback or negative feedback, it may prevent the firing state from being kept promoted unidirectionally by regulating the threshold of the artificial neuron or the coefficient of connection of an artificial synapse. For example, continuous promotion of firing may be suppressed by raising the thresholds of artificial neurons forming a positive feedback system or by lowering the coefficients of connection of artificial synapses forming a positive feedback system. Also, continuous suppression of firing may be counteracted by lowering the thresholds of artificial neurons forming a negative feedback system or by raising the coefficients of connection of artificial synapses forming a negative feedback system.
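One possible, assumed way to implement the monitoring described above is to track the recent firing rate of each artificial neuron and regulate its threshold when the rate stays extremely high or extremely low; the window size and adjustment factor below are illustrative.

```python
# Hedged sketch of suppressing runaway positive/negative feedback: if an
# artificial neuron keeps firing (or keeps failing to fire) over a monitoring
# window, its threshold is regulated.  Window size and factor are assumptions.
from collections import deque

class FeedbackRegulator:
    def __init__(self, window: int = 50, factor: float = 1.05):
        self.history = deque(maxlen=window)
        self.factor = factor

    def observe(self, fired: bool) -> None:
        self.history.append(1 if fired else 0)

    def adjust_threshold(self, threshold: float) -> float:
        if len(self.history) < self.history.maxlen:
            return threshold
        rate = sum(self.history) / len(self.history)
        if rate > 0.9:                       # firing promoted unidirectionally
            return threshold * self.factor   # raise the threshold to damp it
        if rate < 0.1:                       # firing suppressed unidirectionally
            return threshold / self.factor   # lower the threshold to recover
        return threshold

reg = FeedbackRegulator(window=10)
for _ in range(10):
    reg.observe(True)                        # a loop of neurons keeps firing
print(reg.adjust_threshold(1.0))             # threshold raised above 1.0
```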
As shown in
at least one of parameters specifying: a coefficient of connection to a connected artificial neuron; a last simultaneous firing clock time which is a clock time when two artificial neurons that the artificial synapse connects fired simultaneously last time; a coefficient of connection at the last simultaneous firing clock time; and time evolution of a coefficient of connection after simultaneous firing occurred; and discrimination information of the artificial synapse.
On this edit screen, a user can add or delete artificial neurons, and edit parameters by mouse operation or keyboard operation, for example. Also, a user can add or delete artificial synapses, and edit parameter values by mouse operation or keyboard operation, for example.
Note that after calculation of a neural network is started, the server 200 causes the user terminal 100 to graphically display a neural network on the basis of the parameter values altered by the parameter processing unit 240. In this case, the connection relation between artificial neurons and artificial synapses of the neural network is displayed graphically in a similar manner to this edit screen. Display examples representing how it appears when parameters are altered are explained in relation to
The edit screen 1800 includes manipulation portions for altering: meanings specified for two artificial neurons connected by the selected artificial synapse; directions toward which outputs of the artificial neurons are output; the names and current values of the parameters of the artificial synapse; and the parameters. The parameters of the artificial synapse include the initial value of the coefficient of connection, and the initial value of each of increase-decrease parameters a and b. Also, the edit screen includes: a cancel button to instruct to cancel editing; an update button to instruct to update the initial value with the parameter value having been edited; and a delete button to instruct to delete the artificial synapse.
The initial values of parameters of a neural network can be edited visually. Therefore, even an unskilled user can relatively easily edit the neural network.
Also, the processing unit 202 causes the user terminal 100 to display lines representing artificial synapses while changing their widths based on the magnitude of BStij of each Sij. For example, the processing unit 202 increases the width of a line representing Sij as BStij increases. Thereby, a user can recognize at a glance the degree of connection between artificial neurons by an artificial synapse.
Note that if bidirectional artificial synapses are defined between artificial neurons, respective artificial synapses may be displayed with separate lines. Also, artificial synapses may be given marks such as arrows representing directions of an input and output of the artificial synapses so that they can be discriminated.
Here, distances represent the degrees of connection between artificial neurons. The calculated distance between artificial neurons may decrease as the coefficient of connection of an artificial synapse interposed between the artificial neuron pair increases. Also, the calculated distance between an artificial neuron pair may decrease as the number of artificial synapses interposed in series between the artificial neuron pair decreases. Also, the calculated distance between artificial neurons may decrease as the number of artificial synapses interposed in parallel between the artificial neuron pair increases. Also, if one or more artificial neurons are connected between an artificial neuron pair, a distance may be calculated based on an effective coefficient of connection, taking an average value, a minimum value, or the like of BStij of all the artificial synapses interposed in series between the artificial neuron pair as the effective coefficient of connection.
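For illustration only, the distance heuristics enumerated above could be combined as in the following sketch; the particular combining formula is an assumption that merely satisfies the stated monotonic tendencies.

```python
# Sketch of an inter-artificial-neuron distance that (a) decreases as the
# coefficients of connection grow, (b) decreases with fewer artificial
# synapses in series, and (c) decreases with more parallel paths.  The
# combining formula is an assumption; the text only fixes these tendencies.

def path_distance(bs_values):
    """One series path: hop count divided by an effective coefficient (mean |BS|)."""
    effective_bs = sum(abs(b) for b in bs_values) / len(bs_values)
    return len(bs_values) / max(effective_bs, 1e-9)

def neuron_distance(parallel_paths):
    """Several parallel paths combined like parallel resistances, so more
    paths (and stronger paths) shorten the distance."""
    inv = sum(1.0 / path_distance(p) for p in parallel_paths)
    return 1.0 / inv

# Two parallel routes between an artificial neuron pair: a strong direct
# synapse and a weaker two-synapse route.
print(neuron_distance([[1.5], [0.4, 0.6]]))
```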
For example, if an object of N2 is selected, the processing unit 202 displays, in red, a range 2310 surrounding N1 and N3 firing of which is suppressed by N2. Also, the processing unit 202 displays, in blue, a range 2320 surrounding lines of artificial synapses and an object influenced by N2 in a direction to promote firing. Thereby, a user can easily recognize which artificial neurons or artificial synapses a selected endocrine artificial neuron influences chemically.
Note that related artificial neurons may be set at initial setting based on a connection relation of artificial neurons in a neural network. For example, the parameter processing unit 240 sets, as a related artificial neuron, an endocrine artificial neuron that influences a threshold or the like of a preferential artificial neuron. Also, the parameter processing unit 240 may identify one or more artificial neurons that influence an input of a preferential artificial neuron through an artificial synapse and store it in related artificial neurons by following artificial synapses in a reverse order of the input direction of a signal from the preferential artificial neuron.
If a preferential artificial neuron is treated as a parameter update target, the parameter processing unit 240 treats a related artificial neuron corresponding to the preferential artificial neuron as a parameter update target. Here, the parameter processing unit 240 determines an upper limit value of the number of update target artificial neurons the parameters of which are to be treated as update targets, based on an available resource amount at the server 200. Then, the parameter processing unit 240 may determine update target artificial neurons by selecting preferential artificial neurons in a descending order of a preference order so that the number of artificial neurons the parameters of which are to be treated as update targets becomes equal to or smaller than the determined upper limit value.
Then, for example if BStn+1ij is calculated at S510 in
Thereby, even if the amount of resources available at the server 200 becomes small, the update frequency can be maintained high for important artificial neurons. For example, even if the amount of resources available at the server 200 becomes small, the function of judging the presence or absence of danger can be maintained. Note that if the resources available at the server 200 are abundant, the parameter processing unit 240 may update the parameters of all the artificial neurons and all the artificial synapses.
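A sketch of the selection logic described above is given below; the mapping from an available resource amount to an upper limit is an illustrative assumption.

```python
# Sketch of choosing update-target artificial neurons: preferential artificial
# neurons are taken in descending preference order, each pulling in its related
# artificial neurons, until an upper limit derived from the available resource
# amount is reached.  The resource-to-limit mapping is an illustrative assumption.

def upper_limit(available_mem_mb: float, per_neuron_kb: float = 2.0) -> int:
    return int(available_mem_mb * 1024 / per_neuron_kb)

def select_update_targets(preferential, related, limit):
    """preferential: artificial neuron ids in descending preference order.
    related: dict mapping an id to the set of its related artificial neurons."""
    targets = set()
    for nid in preferential:
        candidate = targets | {nid} | related.get(nid, set())
        if len(candidate) > limit:
            break                      # adding this neuron would exceed the limit
        targets = candidate
    return targets

preferential = [7, 3, 11]              # e.g. danger-judging artificial neurons first
related = {7: {8, 9}, 3: {2}, 11: {12, 13, 14}}
print(select_update_targets(preferential, related, upper_limit(0.01)))
```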
At the server 200, a plurality of update agents 2400 that are in charge of functions of the parameter processing unit 240, and input/output agents 2450a and 2450b that are in charge of data input and output to and from the user terminal 100 are implemented in the processing unit 202. The input/output agent 2450a receives an initial value of a parameter from an editor function unit implemented in the processing unit 102 of the user terminal 100 to perform a process of storing it in the data structure 2500. The input/output agent 2450a performs a process of transmitting, to the user terminal 100, a parameter updated by the parameter processing unit 240 and causing a viewer function unit implemented in the processing unit 102 to display it. The editor function unit and the viewer function unit are implemented in the processing unit 102 for example by a Web browser. Data to be exchanged between the user terminal 100 and the server 200 may be transferred according to the HTTP protocol.
The plurality of update agents 2400 each access the data structure 2500 on an artificial neuron-by-artificial neuron basis to perform calculation of updating a parameter on an artificial neuron-by-artificial neuron basis. The plurality of update agents 2400 each can access the data structure 2500 storing a parameter of a neural network. Also, the plurality of update agents 2400 each can perform calculation of updating parameters. Processes of the plurality of update agents 2400 may be executed respectively by separate processes. Also, the plurality of update agents 2400 may be executed respectively in a plurality of threads in a single process.
The data structure 2500 is generated in a format that is accessible collectively on an artificial neuron-by-artificial neuron basis, in a similar manner to information explained in relation to
At the clock time t5, upon completion of the calculation of the parameters of N1, the process 1, after confirming that the parameters of N1 are still uncalculated, locks the data in the row of N1, writes in the calculation result, and unlocks the data in the row of N1. Likewise, upon completion of the calculation about each artificial neuron, the process 2 and the process 3 also write their calculation results into the data in the row of each artificial neuron.
Here, with reference to
In this manner, according to the data structure 2500, an implementation is possible in which, by multiprocessing, an uncalculated artificial neuron is selected for each process and calculation is started, and only a process that has completed the calculation earliest writes in its calculation result.
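The write discipline described above can be sketched as follows; threads stand in for processes here for brevity, and the per-row lock and the “uncalculated” flag follow the description above.

```python
# Sketch of the per-artificial-neuron update scheme: each worker picks an
# uncalculated artificial neuron, computes its parameters, then locks that
# neuron's row, writes the result only if the row is still uncalculated, and
# unlocks.  The computation itself is a stand-in.
import random
import threading
import time

rows = {i: {"lock": threading.Lock(), "calculated": False, "vm": 0.0}
        for i in range(1, 7)}

def worker() -> None:
    # Each worker (a stand-in for one process) scans for uncalculated neurons.
    for row in rows.values():
        if row["calculated"]:
            continue                      # another worker already wrote this row
        result = random.random()          # stand-in for the real parameter update
        time.sleep(0.001)                 # simulate calculation time
        with row["lock"]:                 # lock the row of this artificial neuron
            if not row["calculated"]:     # only the earliest finisher writes
                row["vm"] = result
                row["calculated"] = True  # the row is unlocked on exiting the block

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print({nid: round(r["vm"], 3) for nid, r in rows.items()})
```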
Note that a process similar to the above-mentioned one, in which each process separately selects an artificial neuron and calculates the related parameters, can be applied to each of S510, S520, and S540 in
Also, according to multiprocessing, the process of S510 and process of S520 in
Also, a similar process can be performed not only by multiprocessing but also in a multithread system. In the multithread system, the similar process may be realized by replacing each of the above-mentioned processes with a thread.
The neural network 2900 is formed of a sub neural network 2910, a sub neural network 2920 and a sub neural network 2930. Calculation for the sub neural network 2910, the sub neural network 2920 and the sub neural network 2930 is performed by mutually different servers.
Here, an artificial neuron 2914 of the sub neural network 2910 is an artificial neuron for which the same concept as an artificial neuron 2921 of the sub neural network 2920 and an artificial neuron 2931 of the sub neural network 2930 is defined. Also, an artificial neuron 2923 of the sub neural network 2920 is an artificial neuron for which the same concept as an artificial neuron 2934 of the sub neural network 2930 is defined. Also, an artificial neuron 2915 of the sub neural network 2910 is an artificial neuron for which the same concept as an artificial neuron 2932 of the sub neural network 2930 is defined.
The artificial neuron 2914 is connected to the artificial neuron 2931 by an artificial synapse 2940. Also, the artificial neuron 2914 is connected to the artificial neuron 2921 by an artificial synapse 2960. Also, the artificial neuron 2915 is connected to the artificial neuron 2932 by an artificial synapse 2950. Also, the artificial neuron 2923 is connected to the artificial neuron 2934 by an artificial synapse 2970. The artificial synapse 2940, the artificial synapse 2950, the artificial synapse 2960 and the artificial synapse 2970 are realized by communication through a network.
For example, if the artificial neuron 2915 is a concept artificial neuron for which a situation “there is Mr. A in sight” is defined, the artificial neuron 2932 is also a concept artificial neuron for which the situation “there is Mr. A in sight” is defined. If the artificial neuron 2915 fires, an output of the artificial neuron 2915 is transmitted from the sub neural network 2910 to the sub neural network 2930 through a network.
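As a sketch under assumptions, an artificial synapse realized by communication through a network can be imitated as below; an in-process queue stands in for the actual network transport, and the artificial neuron numbers follow the example above.

```python
# Sketch of an "artificial synapse realized by communication through a
# network": when a concept artificial neuron fires in one sub neural network,
# its output is forwarded to the artificial neuron carrying the same concept
# in another sub neural network.  A queue stands in for the real network
# transport here; an actual deployment would use sockets or HTTP.
import queue

channel_2910_to_2930 = queue.Queue()       # plays the role of artificial synapse 2950

def sub_network_2910_step(vm_2915: float, fired: bool) -> None:
    if fired:                              # "there is Mr. A in sight" fired
        channel_2910_to_2930.put(("2915->2932", vm_2915))

def sub_network_2930_step(inputs: dict) -> dict:
    while not channel_2910_to_2930.empty():
        _, vm = channel_2910_to_2930.get()
        inputs[2932] = inputs.get(2932, 0.0) + vm   # feed into artificial neuron 2932
    return inputs

sub_network_2910_step(vm_2915=1.4, fired=True)
print(sub_network_2930_step({}))           # {2932: 1.4}
```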
Note that a plurality of artificial neurons constituting a sub neural network that should be constructed by a single server preferably have shorter inter-artificial neuron distances than a distance specified in advance. Also, a neural network may be divided into sub neural networks on a function-by-function basis. For example, the sub neural network 2910 may be a neural network of a function part that is in charge of spatial recognition on the basis of a camera image.
Note that the respective sub neural networks may perform the processes of the neural network asynchronously. Also, if a first sub neural network detects that an output received from a second sub neural network is highly likely to be erroneous, the server performing the process of the first sub neural network may inform the server performing the process of the second sub neural network that the output is erroneous. For example, if an output indicating that “there is Mr. B in sight” is acquired suddenly after consecutive outputs indicating that “there is Mr. A in sight”, it may be judged that the output is erroneous.
If an error in an output is reported, the second sub neural network may recalculate the output of the clock time at which the error was reported, and may output it to the first sub neural network. At this time, in the second sub neural network, the calculation result that was most likely to be accurate and was output earlier may be excluded, and the calculation result that is second most likely to be accurate may be output.
Note that if the neural network according to the above-mentioned embodiment is seen as an electrical circuit, operation of the neural network realized by processes of the above-mentioned server 200 or the server explained in relation to
In the embodiments explained above, a server different from the robot 40 is in charge of processes of a neural network. However, the robot 40 itself may be in charge of processes of a neural network.
Note that the robot 40 is one example of an electronic device to be a control target. The electronic device to be a control target is not limited to the robot 40. Various electronic devices can be applied as control targets.
While the embodiments of the present invention have been described, the technical scope of the invention is not limited to the above described embodiments. It is apparent to persons skilled in the art that various alterations and improvements can be added to the above-described embodiments. It is also apparent from the scope of the claims that the embodiments added with such alterations or improvements can be included in the technical scope of the invention.
The operations, procedures, steps, and stages of each process performed by an apparatus, system, program, and method shown in the claims, embodiments, or diagrams can be performed in any order as long as the order is not indicated by “prior to,” “before,” or the like and as long as the output from a previous process is not used in a later process. Even if the process flow is described using phrases such as “first” or “next” in the claims, embodiments, or diagrams, it does not necessarily mean that the process must be performed in this order.
The contents of the following patent application are incorporated herein by reference: International Patent Application PCT/JP2015/061840 filed on Apr. 17, 2015.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2015/061840 | Apr 2015 | US
Child | 15785413 | — | US