The present invention relates to a control system, a system and a computer-readable medium.
A terminal is known that studies conversations between a user and another person the user is talking to, and that accumulates, in a reply table, replies from the other person to questions from the user (please see Patent Document 1, for example). Also, an emotion generating apparatus is known that includes a neural net which receives an input of user information, equipment information and a current emotional state of a user him/herself to output a next emotional state (please see Patent Document 2, for example). Also, a technique is known for storing spatiotemporal patterns in an associative memory including a plurality of electronic neurons having a layered neural net relationship with directed artificial synapse connectivity (please see Patent Document 3, for example).
[Patent Document 1] Japanese Patent Application Publication No. 2011-253389
[Patent Document 2] Japanese Patent Application Publication No. H10-254592
[Patent Document 3] Japanese Translation of PCT International Patent Application No. 2013-535067
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams whose blocks may represent (1) steps of processes in which operations are performed or (2) units of apparatuses responsible for performing operations. Certain steps and units may be implemented by dedicated circuitry, programmable circuitry supplied with computer-readable instructions stored on computer-readable media, and/or processors supplied with computer-readable instructions stored on computer-readable media. Dedicated circuitry may include digital and/or analog hardware circuits and may include integrated circuits (IC) and/or discrete circuits. Programmable circuitry may include reconfigurable hardware circuits comprising logical AND, OR, XOR, NAND, NOR, and other logical operations, flip-flops, registers, memory elements, etc., such as field-programmable gate arrays (FPGA), programmable logic arrays (PLA), etc.
Computer-readable media may include any tangible device that can store instructions for execution by a suitable device, such that the computer-readable medium having instructions stored therein comprises an article of manufacture including instructions which can be executed to create means for performing operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, etc. More specific examples of computer-readable media may include a floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a BLU-RAY (registered trademark) disc, a memory stick, an integrated circuit card, etc.
Computer-readable instructions may include assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, JAVA (registered trademark), C++, etc., and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
Computer-readable instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, or to programmable circuitry, locally or via a local area network (LAN), wide area network (WAN) such as the Internet, etc., to execute the computer-readable instructions to create means for performing operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, etc.
Hereinafter, (some) embodiment(s) of the present invention will be described. The embodiment(s) do(es) not limit the invention according to the claims, and all the combinations of the features described in the embodiment(s) are not necessarily essential to means provided by aspects of the invention.
A user 30a is a user of the robot 40a. A user 30b is a user of the robot 40b. The robot 40b has approximately the same functions as those of the robot 40a. Therefore, the system 20 is explained below referring to the robot 40a and the robot 40b collectively as the robot 40.
The robot 40 performs various types of operation according to situations, including moving the head or limbs according to situations, having a conversation with a user 30, providing a video to a user 30, and so on. At this time, the robot 40 determines an operation in cooperation with the server 200. For example, the robot 40 transmits, to the server 200, detection information such as a facial image of a user 30 acquired by means of a camera function, or sound or voice of a user 30 acquired by means of a microphone function. The server 200 analyzes the detection information received from the robot 40, determines an operation to be performed by the robot 40, and transmits, to the robot 40, operation information representing the determined operation. The robot 40 performs the operation according to the operation information received from the server 200.
The robot 40 has emotion values representing emotions of itself. For example, the robot 40 has emotion values representing intensities of respective emotions such as “pleased”, “fun”, “sad”, “scared” or “excited”. Emotion values of the robot 40 are determined by the server 200. The server 200 causes the robot 40 to perform an operation corresponding to a determined emotion. For example, if the robot 40 has a conversation with a user 30 when an emotion value of excitation is high, the server 200 causes the robot 40 to utter at a rapid pace. In this manner, the robot 40 can express its emotion through its actions or the like.
Based on detection information received from the robot 40, the server 200 uses a neural network to update the current state of the robot 40. The state of the robot 40 includes emotions of the robot 40. Accordingly, the server 200 uses the neural network to determine the emotions of the robot 40.
Also, the robot 40 causes the server 200 to record video data of a user 30 acquired by means of a camera function, or the like. The robot 40, as necessary, acquires the video data or the like from the server 200 and provides it to a user 30. The amount of information of the video data that the robot 40 generates and causes the server 200 to record increases as the intensity of an emotion becomes higher. For example, while recording information in a high compression format such as skeletal data, the robot 40 switches to recording information in a low compression format such as HD moving images in response to an emotion value of excitation exceeding a threshold. According to the system 20, the high definition video data generated when an emotion of the robot 40 intensifies can be kept as a record.
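As a rough sketch, the threshold-based switching described above can be expressed as follows. The function name, the threshold value and the format labels are illustrative assumptions, not details of the embodiment.

```python
# Sketch of emotion-triggered recording-format selection (illustrative;
# EXCITEMENT_THRESHOLD and choose_recording_format are assumed names).
EXCITEMENT_THRESHOLD = 0.8

def choose_recording_format(excitement):
    """Return a recording format for the given emotion intensity.

    Below the threshold the robot records high-compression skeletal data;
    once the intensity of excitation exceeds the threshold it switches to
    low-compression HD moving images, as in the description above.
    """
    if excitement > EXCITEMENT_THRESHOLD:
        return "hd_video"       # low compression, large amount of information
    return "skeletal_data"      # high compression, small amount of information
```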
In the robot 40, the sensor unit 156 has sensors such as a microphone 161, a 2D camera 163, a 3D depth sensor 162 and a distance sensor 164. The respective sensors provided to the sensor unit 156 detect information continuously. Sensor information detected by the sensor unit 156 is output to the processing unit 152. The 2D camera 163 is one example of an image sensor that captures images of objects continuously; it captures images using visible light and generates video information. The 3D depth sensor 162 emits infrared ray patterns continuously and analyzes the patterns in infrared images captured continuously by an infrared camera, thereby detecting the outlines of objects. Note that, other than the above-mentioned ones, the sensor unit 156 may include various sensors such as a clock, a gyro sensor, a touch sensor, a sensor for motor feedback, and a sensor to detect the remaining capacity of a battery.
The processing unit 152 is formed of a processor such as a CPU. The processing unit 152 causes sensor information detected continuously by the respective sensors provided to the sensor unit 156 to be transmitted to the server 200 through the communicating unit 158. Also, the processing unit 152 processes at least part of sensor information detected continuously by the respective sensors provided to the sensor unit 156, and generates information for recording. The processing unit 152 generates first recording format information or second recording format information having an amount of information larger than that of the first recording format information. The first recording format information means, for example, information in a high compression format, and the second recording format information means, for example, information in a low compression format. For example, based on skeletal information detected continuously by the 3D depth sensor 162, the processing unit 152 generates, as first recording format information, shape data such as skeletal data of an object. Also, based on video information captured by the 2D camera 163 and audio information detected by the microphone 161, the processing unit 152 generates full HD video data and audio data. Full HD video data is one example of moving image data having more information than that of shape data of an object.
The communicating unit 158 transmits, to the server 200, first recording format information or second recording format information generated by the processing unit 152. At the server 200, the recording control unit 270 stores, in the recording data 292, the first recording format information or second recording format information received by the communicating unit 208 from the robot 40. The recording control unit 270 stores, in the recording data 292, information received from each robot 40, in association with information discriminating each of the robots 40.
Also, at the robot 40, the communicating unit 158 acquires, from the server 200, information stored in the recording data 292. The communicating unit 158 functions as a recording information receiving unit that acquires second recording format information including moving image data recorded by the recording control unit 270. Based on the moving image data included in the second recording format information received by the communicating unit 158, the processing unit 152 generates a video presented to a user 30. The processing unit 152 functions as a video generating unit that generates a video to be presented to a user 30.
Also, the communicating unit 158 receives operation information indicating an operation detail from the server 200. The processing unit 152 controls the control target 155 based on the operation detail received by the communicating unit 158. The control target 155 includes a speaker, motors that drive respective units of the robot 40 such as limbs, a light emitting device, and the like. If having received information indicating an utterance content from the server 200, the processing unit 152 causes a sound or voice to be output from a speaker according to the received utterance content. Also, the processing unit 152 can control some of actions of the robot 40 by controlling drive motors of limbs. Also, the processing unit 152 can express some of emotions of the robot 40 by controlling these motors.
At the server 200, the communicating unit 208 outputs, to the processing unit 202, information received from the robot 40. The initial value setting unit 210 stores, in the parameter initial values 286 in the storing unit 280, an initial value of a parameter indicating an initial state of the neural network received at the communicating unit 208. Note that the initial value of the parameter of the neural network may be specified in advance at the server 200 or may be able to be altered by a user 30 through the communication network 90.
The external input data generating unit 230 processes at least part of sensor information received by the communicating unit 208, generates input information from the outside of the neural network, and outputs it to the parameter processing unit 240. Based on the input information, and the current parameter 288 of the neural network and the definition information 284 stored in the storing unit 280, the parameter processing unit 240 performs calculation about the neural network.
Artificial neurons that the neural network has include: a plurality of artificial neurons for which situations of the robot 40 are defined; a plurality of emotion artificial neurons for which a plurality of emotions of the robot 40 itself are defined; and a plurality of endocrine artificial neurons for which states of generation of endocrine substances of the robot 40 itself are defined. Based on the input information generated by the external input data generating unit 230, the parameter processing unit 240 calculates a parameter representing the internal state of the plurality of artificial neurons in the neural network. For example, based on the input information generated by the external input data generating unit 230, the parameter processing unit 240 updates a parameter of the current internal state of the plurality of artificial neurons for which situations of the robot 40 and the like are defined. Also, the parameter processing unit 240 calculates a parameter of the internal state of the other artificial neurons in the neural network. Thereby, for example, a parameter of the internal state of an emotion artificial neuron for which an emotion of being “pleased” is defined is calculated. This parameter of the internal state of the emotion artificial neuron is one example of an index representing the intensity of the emotion of being “pleased”. Accordingly, based on the internal state of an emotion artificial neuron, the parameter processing unit 240 can determine the intensity of an emotion in a control system. In this manner, the parameter processing unit 240 functions as an emotion determining unit that, based on at least part of the information detected by the sensors provided to the sensor unit 156, determines the intensity of an emotion using the neural network.
The parameter of the neural network calculated by the parameter processing unit 240 is supplied to the switching control unit 260 and the operation determining unit 250. Based on the parameter supplied from the parameter processing unit 240, the switching control unit 260 determines a recording format for information generated by the processing unit 152 of the robot 40. If it is necessary to switch the recording format for information generated by the processing unit 152, the switching control unit 260 causes an instruction to switch the recording format to be transmitted to the robot 40 through the communicating unit 208. At the robot 40, the processing unit 152 switches the recording format according to the instruction received from the server 200.
For example, if the processing unit 152 is being caused to generate first recording format information, the switching control unit 260 transmits, to the robot 40, an instruction to switch the recording format for information to be generated by the processing unit 152 from the first recording format to the second recording format, in response to an increase in the intensity of an emotion determined by the parameter processing unit 240. In this manner, the switching control unit 260 switches the recording format for information to be recorded by the recording control unit 270 from the first recording format to the second recording format in response to an increase in the intensity of the emotion determined by the parameter processing unit 240. Thereby, information from the time when an emotion of the robot 40 intensified can be recorded in detail. Also, at the robot 40, the processing unit 152 acquires the moving image data in the second recording format from the server 200 and generates a video to be presented to a user 30. Accordingly, the user 30 can enjoy, as a video, information from the time when an emotion of the robot 40 intensified.
Note that if the processing unit 152 is being caused to generate second recording format information, the switching control unit 260 transmits, to the robot 40, an instruction to switch the recording format for information to be generated by the processing unit 152 from the second recording format to the first recording format, in response to a decrease in the intensity of an emotion determined by the parameter processing unit 240. In this manner, the switching control unit 260 switches the recording format for information to be recorded by the recording control unit 270 from the second recording format to the first recording format in response to a decrease in the intensity of the emotion determined by the parameter processing unit 240.
The operation determination rule 282 specifies an operation to be performed by the robot 40 in association with a state of the robot 40. For example, the operation determination rule 282 specifies an operation to be performed by the robot 40 in association with an internal state of an artificial neuron of the neural network. For example, the operation determination rule 282 specifies an operation to cause the robot 40 to utter a phrase representing pleasedness in association with a condition that the internal state of an emotion artificial neuron for which the emotion of being “pleased” is defined is high. Also, the operation determination rule 282 specifies an operation to be performed when the robot 40 gets sleepy in association with a condition that the internal state of an endocrine artificial neuron for which an endocrine substance corresponding to sleepiness is defined is high.
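The operation determination rule 282 can be pictured as a table of condition-operation pairs. The following Python sketch is illustrative only; every name and threshold value in it is an assumption, not part of the embodiment.

```python
# Illustrative encoding of an operation determination rule: each entry pairs
# a condition on internal states of the neural network with an operation.
# All names and threshold values here are assumptions.
operation_determination_rule = [
    # high internal state of the "pleased" emotion artificial neuron
    (lambda s: s["pleased"] > 0.7, "utter_pleased_phrase"),
    # high internal state of the sleepiness-related endocrine artificial neuron
    (lambda s: s["sleepiness_endocrine"] > 0.7, "sleepy_behavior"),
]

def determine_operation(states):
    """Return the first operation whose condition holds, or None."""
    for condition, operation in operation_determination_rule:
        if condition(states):
            return operation
    return None
```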
Note that an endocrine substance means a substance, such as a neurotransmitter or a hormone, that conveys signals and is secreted in the body. Also, being “endocrine” means that endocrine substances are secreted in the body. However, an endocrine substance of the robot 40 itself is one form of information that influences operations of the robot 40; it does not mean that the robot 40 actually generates an endocrine substance. An emotion of the robot 40 itself is likewise one form of information that influences operations of the robot 40; it does not mean that the robot 40 actually feels an emotion.
The operation determining unit 250 determines an operation of the robot 40 based on an operation specified in the operation determination rule 282 in association with the activation state or internal state of each artificial neuron determined by the parameter processing unit 240. Operation information indicating an operation determined by the operation determining unit 250 is transmitted from the communicating unit 208 to the robot 40. At the robot 40, by controlling the control target 155, the processing unit 152 causes the control target 155 to perform the operation indicated by the information received from the server 200. Thereby, the robot 40 can perform an appropriate operation corresponding to the current emotion of the robot 40.
The artificial synapse 301 connects the artificial neuron 4 and the artificial neuron 1. The artificial synapse 301 is an artificial synapse connecting them unidirectionally, as indicated by the arrow of the artificial synapse 301. The artificial neuron 4 is an artificial neuron connected to an input of the artificial neuron 1. The artificial synapse 302 connects the artificial neuron 1 and the artificial neuron 2. The artificial synapse 302 is an artificial synapse connecting them bidirectionally, as indicated by the double arrow of the artificial synapse 302. The artificial neuron 1 is an artificial neuron connected to an input of the artificial neuron 2. The artificial neuron 2 is an artificial neuron connected to an input of the artificial neuron 1.
Note that in the present embodiment, an artificial neuron is represented by N, and an artificial synapse is represented by S, in some cases. Also, each artificial neuron is discriminated using a superscript reference symbol as the discrimination character. Also, a given artificial neuron is in some cases represented using i or j as a discrimination character. For example, Ni represents a given artificial neuron.
Also, an artificial synapse is in some cases discriminated using respective discrimination numbers i and j of two artificial neurons connected to the artificial synapse. For example, S41 represents an artificial synapse connecting N1 and N4. Generally, Sij represents an artificial synapse that inputs an output of Ni to Nj. Note that Sji represents an artificial synapse that inputs an output of Nj to Ni.
In the neural network 300, N1, N3, Nb and Nc are emotion artificial neurons for which emotions of the robot 40 are defined. N1 is an emotion artificial neuron to which an emotion of being “pleased” is allocated. N3 is an emotion artificial neuron to which an emotion of being “sad” is allocated. Nb is an emotion artificial neuron to which an emotion of being “scared” is allocated. Nc is an emotion artificial neuron to which an emotion of having “fun” is allocated.
N2, N5 and Na are endocrine artificial neurons for which endocrine states of the robot 40 are defined. N5 is an endocrine artificial neuron to which a dopamine-generated state is allocated. Dopamine is one example of endocrine substances related to the reward system. That is, N5 is one example of endocrine artificial neurons related to the reward system. N2 is an endocrine artificial neuron to which a serotonin-generated state is allocated. Serotonin is one example of endocrine substances related to the sleep system. That is, N2 is one example of endocrine artificial neurons related to the sleep system. Na is an endocrine artificial neuron to which the state of generation of noradrenaline is allocated. Noradrenaline is one example of an endocrine substance related to the sympathetic nervous system. That is, Na is an endocrine artificial neuron related to the sympathetic nervous system.
Information like that mentioned above, defining a state of the robot 40, is stored in the definition information 284 in the storing unit 280 for each artificial neuron of the plurality of artificial neurons constituting the neural network. In this manner, the neural network 300 includes concept artificial neurons, emotion artificial neurons, and endocrine artificial neurons. The concept artificial neurons, emotion artificial neurons and endocrine artificial neurons are artificial neurons for which meanings such as concepts, emotions or endocrine states are defined explicitly. In contrast to this, N8 and N9 are artificial neurons for which states of the robot 40 are not defined. Also, N8 and N9 are artificial neurons for which meanings such as concepts, emotions or endocrine states are not defined explicitly.
Parameters of the neural network 300 include Iti which is an input to each Ni of the neural network, Eti which is an input from the outside of the neural network to Ni, parameters of Ni and parameters of Sij.
The parameters of Ni include Sti representing the status of Ni, Vimt representing an internal state of the artificial neuron represented by Ni, Tit representing a threshold for firing of Ni, tf representing a last firing clock time which is a clock time when Ni fired last time, Vimtf representing an internal state of the artificial neuron Ni at the last firing clock time, and ati, bti and hti which are increase-decrease parameters of outputs. The increase-decrease parameters of outputs are one example of parameters specifying time evolution of outputs at the time of firing of an artificial neuron. Note that in the present embodiment, a subscript t represents that the parameter provided with the subscript is a parameter that can be updated along with the lapse of clock time. Also, Vimt is information corresponding to a membrane potential of an artificial neuron, and is one example of a parameter representing the internal state or output of the artificial neuron.
The parameters of Sij include BStij representing a coefficient of connection of an artificial synapse of Sij, tcf representing a last simultaneous firing clock time which is a clock time when Ni and Nj connected by Sij fired simultaneously last time, BSijtcf representing a coefficient of connection at the last simultaneous firing clock time, and atij, btij and htij which are increase-decrease parameters of the coefficients of connection. The increase-decrease parameters of the coefficients of connection are one example of parameters specifying time evolution of the coefficients of connection after two artificial neurons connected by an artificial synapse fired simultaneously last time.
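The parameters of Ni and Sij listed above can be grouped, for illustration, into container classes like the following. The field names merely paraphrase the symbols in the text, and the default values are assumptions.

```python
from dataclasses import dataclass

# Illustrative containers for the parameters of Ni and Sij described above.
@dataclass
class NeuronParams:
    status: str = "unfiring"     # Sti: "unfiring", "rising" or "falling"
    vm: float = 0.0              # Vimt: internal state (membrane potential)
    threshold: float = 1.0       # Tit: threshold for firing of Ni
    t_last_fire: float = None    # tf: last firing clock time (None = never)
    vm_at_fire: float = 0.0      # Vimtf: internal state at last firing
    a: float = 0.1               # ati, bti, hti: increase-decrease
    b: float = -0.1              # parameters of the output
    h: float = 0.0

@dataclass
class SynapseParams:
    bs: float = 1.0              # BStij: coefficient of connection
    t_last_cofire: float = None  # tcf: last simultaneous firing clock time
    bs_at_cofire: float = 1.0    # BSijtcf: coefficient at that clock time
    a: float = 0.05              # atij, btij, htij: increase-decrease
    b: float = -0.05             # parameters of the coefficient of connection
    h: float = 0.0
```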
The parameter processing unit 240 updates the above-mentioned parameters based on an input from the external input data generating unit 230 and the neural network to determine the activation state of each artificial neuron. The operation determining unit 250 determines an operation of the robot 40 based on: internal states or activation states of at least some artificial neurons among the plurality of artificial neurons in the neural network, specified by the values of the parameters of the at least some artificial neurons; and states defined for the at least some artificial neurons by the definition information 284. Note that an activation state may be either an activated state or an inactivated state. In the present embodiment, being activated is called “firing” and being inactivated is called “unfiring”, in some cases. Note that, as mentioned below, the “firing” state is classified into a “rising phase” and a “falling phase” depending on whether or not the internal state is on the rise. The “unfiring”, “rising phase” and “falling phase” states are represented by the status Sti.
At S510, the parameter processing unit 240 calculates parameters corresponding to a change due to electrical influence of an artificial synapse at a temporal step tn+1. Specifically, it calculates BStn+1ij of each Sij.
At S520, the parameter processing unit 240 calculates parameters corresponding to a change due to chemical influence caused by an endocrine substance at the temporal step tn+1. Specifically, changes in parameters of Ni and Sij that the endocrine artificial neuron has influence on are calculated. More specifically, it calculates an increase-decrease parameter or threshold of an internal state of the artificial neuron Ni that the endocrine artificial neuron has influence on and an increase-decrease parameter of a coefficient of connection or the coefficient of connection of Sij that the endocrine artificial neuron has influence on at the temporal step tn+1.
At S530, the parameter processing unit 240 acquires an input from the outside of the neural network. Specifically, the parameter processing unit 240 acquires an output of the external input data generating unit 230.
At S540, the parameter processing unit 240 calculates an internal state of Ni at the temporal step tn+1. Specifically, it calculates Vimtn+1 and the status Stn+1i. Then, at S550, it stores each parameter value at the clock time tn+1 in the parameters 288 of the storing unit 280. Also, it outputs the value of each parameter at the clock time tn+1 to the operation determining unit 250 and the switching control unit 260.
At S560, the switching control unit 260 judges whether or not the parameter of Ni at the temporal step tn+1 meets a condition for switching a format in which data to be stored in the recording data 292 is recorded. If the parameter of Ni at the temporal step tn+1 meets the recording format switching condition, the switching control unit 260 instructs the robot 40 to switch the recording format (S570), and the process proceeds to S506. On the other hand, if at S560, the parameter of Ni at the temporal step tn+1 does not meet the recording format switching condition, the process proceeds to S506.
At S506, the parameter processing unit 240 judges whether or not to terminate the loop. For example, if the clock time represented by temporal steps has reached a predetermined clock time or if sensor information from the robot 40 has not been received for a length of time specified in advance, it judges to terminate the loop. If the loop is not to be terminated, the process returns to S510, and calculation for a still next temporal step is performed. If the loop is to be terminated, this flow is terminated.
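The processing from S510 to S506 can be summarized as the following loop skeleton, in which each step is injected as a callable. All callables are placeholders (assumptions) standing in for the calculations described above, not an actual implementation of the server 200.

```python
def run_parameter_loop(steps, synapse_step, endocrine_step, external_input,
                       internal_state_step, store, needs_switch,
                       switch_format, should_terminate):
    """Skeleton of the S510-S570/S506 loop over temporal steps."""
    for t in steps:
        synapse_step(t)            # S510: electrical influence of synapses
        endocrine_step(t)          # S520: chemical influence of endocrines
        e = external_input(t)      # S530: input from outside the network
        internal_state_step(t, e)  # S540: internal state Vm and status S
        store(t)                   # S550: store and output parameter values
        if needs_switch(t):        # S560: recording-format switching condition
            switch_format(t)       # S570: instruct the robot to switch
        if should_terminate(t):    # S506: loop termination judgment
            break
```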
If both Ni and Nj at both ends of Sij are firing at the temporal step of a clock time tn, the parameter processing unit 240 calculates BStn+1ij at the clock time tn+1 according to BStn+1ij=BStnij+atnij×(tn+1−tn). On the other hand, if Ni and Nj are not both firing at the temporal step of the clock time tn, it calculates the coefficient of connection BStn+1ij at the clock time tn+1 according to BStn+1ij=BStnij+btnij×(tn+1−tn). Also, if BStn+1ij becomes a negative value, BStn+1ij is regarded as 0. Note that for Sij for which BSij is a positive value, atij is a positive value and btij is a negative value. For Sij for which BSij is a negative value, atij is a negative value and btij is a positive value.
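The update rule for the coefficient of connection described above can be sketched as follows, including the treatment of a value that would become negative. The function and argument names are assumptions.

```python
def update_bs(bs, a, b, cofired, dt):
    """Update the coefficient of connection BS over one temporal step dt.

    If both artificial neurons at the ends of the synapse are firing,
    BS changes by a*dt (strengthening); otherwise it changes by b*dt
    (decay; b is negative for a positive BS). A BS that would become a
    negative value is regarded as 0, per the description above.
    """
    bs = bs + (a if cofired else b) * dt
    return max(bs, 0.0)
```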
Because as shown in
A function 700 shown in
In the example of
Also, the endocrine artificial neuron N5 is an endocrine artificial neuron to which dopamine is allocated. First definition information about the endocrine artificial neuron N5 specifies: the condition “Vmtn5>Ttn5 and Vmtn4>Ttn4”; “S49 and S95” as artificial synapses that the endocrine artificial neuron N5 has influence on; and “atn+1ij=atnij×1.1” as an equation specifying influence details. Thereby, if Vmtn5 exceeds Ttn5 and additionally Vmtn4 exceeds Ttn4, the parameter processing unit 240 increases the increase-decrease parameters of the artificial synapses S49 and S95 by 10% at the clock time tn+1.
Thereby, if the endocrine artificial neuron of the reward system fires while the concept artificial neuron N4, for which the situation “a bell rang” is defined, is firing, the connection between the artificial neurons N4 and N5 through the implicit artificial neuron N9 can be strengthened. Thereby, it becomes easier for the endocrine artificial neuron N5 of the reward system to fire when “a bell rang”.
Also, second definition information about the endocrine artificial neuron N5 specifies: the condition “Vmtn5>Ttn5”; “N1” as an artificial neuron that the endocrine artificial neuron N5 has influence on; and “Ttn+1i=Ttni×0.9” as an equation specifying influence details. Thereby, if Vmtn5 exceeds Ttn5, the parameter processing unit 240 lowers the threshold of the artificial neuron N1 by 10% at the clock time tn+1. Thereby, it becomes easier for the emotion artificial neuron for which the emotion “pleased” is defined to fire if the endocrine artificial neuron N5 of the reward system fired.
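The two pieces of definition information above can be encoded, for illustration, as condition-update pairs. The dictionary layout and key names are assumptions; the threshold of N1 is multiplied by 0.9 here so that the "pleased" emotion artificial neuron fires more easily, as the surrounding text intends.

```python
# Illustrative encoding of influence definition information about N5.
# Each entry pairs a firing condition with a parameter update.
influence_definitions = [
    {   # first definition: strengthen the increase-decrease parameters of
        # S49 and S95 when both N5 and N4 exceed their thresholds
        "condition": lambda p: p["vm5"] > p["t5"] and p["vm4"] > p["t4"],
        "apply": lambda p: p.update(a49=p["a49"] * 1.1, a95=p["a95"] * 1.1),
    },
    {   # second definition: lower the threshold of N1 when N5 fires, so
        # the "pleased" emotion artificial neuron fires more easily
        "condition": lambda p: p["vm5"] > p["t5"],
        "apply": lambda p: p.update(t1=p["t1"] * 0.9),
    },
]

def apply_influences(params):
    """Apply every influence definition whose condition holds."""
    for d in influence_definitions:
        if d["condition"](params):
            d["apply"](params)
    return params
```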
According to such definitions specifying influence about an endocrine artificial neuron of reward system, an implementation becomes possible in which if an act of charging the robot 40 while ringing a bell is repeated, simply ringing a bell causes the robot 40 to take an action representing pleasedness.
Note that the influence definition information is not limited to the example of
The influence definition information is stored in the definition information 284 of the storing unit 280. In this manner, the storing unit 280 stores the influence definition information specifying influence of at least one of an internal state and firing state of an endocrine artificial neuron on a parameter of at least one of an artificial synapse and another artificial neuron not directly connected to the endocrine artificial neuron by an artificial synapse. Then, the parameter processing unit 240 updates parameters of the at least one of the artificial synapse and the other artificial neuron not directly connected to the endocrine artificial neuron by the artificial synapse based on the at least one of the internal state and firing state of the endocrine artificial neuron and the influence definition information. Also, parameters of the other artificial neuron that the at least one of the internal state and firing state of the endocrine artificial neuron has influence on can include at least one of parameters specifying a threshold, firing state and time evolution of an output at the time of firing of the other artificial neuron. Also, parameters of the artificial synapse that the at least one of the internal state and firing state of the endocrine artificial neuron has influence on can include at least one of parameters specifying a coefficient of connection of the artificial synapse, and time evolution of the coefficient of connection after two artificial neurons connected by the artificial synapse simultaneously fired last time. Also, the influence definition information includes information specifying influence that the firing state of an endocrine artificial neuron related to the reward system has on a threshold of an emotion artificial neuron, and the parameter processing unit 240 updates the threshold of the emotion artificial neuron according to the influence definition information if the endocrine artificial neuron fired.
If S_i^{tn} indicates unfiring, the parameter processing unit 240 calculates the input I_i^{tn+1} to N_i (S1110). Specifically, if no input from outside the neural network is connected to N_i, it is calculated as I_i^{tn+1} = Σ_j BS_ji^{tn+1} × Vm_j^{tn} × f(S_j^{tn}). If an input from outside the neural network is connected to N_i, it is calculated as I_i^{tn+1} = Σ_j BS_ji^{tn+1} × Vm_j^{tn} × f(S_j^{tn}) + E_i^{tn+1}. Here, E_i^{tn} is the input at the clock time tn from outside the neural network.
Also, f(S) gives 0 if S is a value representing unfiring, and gives 1 if S is a value indicating a rising phase or a falling phase. This corresponds to a model in which a synapse conveys action potential only when a neuron has fired. Note that f(S) = 1 may be used regardless of S; this corresponds to a model in which membrane potential is conveyed regardless of the firing state of a neuron.
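The input calculation above can be sketched as follows. This is a minimal sketch under assumed data structures (a dictionary of synapse coefficients BS keyed by (source, target) pairs); the names and values are illustrative, not the embodiment's actual ones.

```python
UNFIRING, RISING, FALLING = 0, 1, 2

def f(s):
    # 1 while the presynaptic neuron is firing (rising or falling phase),
    # 0 while unfiring: the synapse conveys action potential only when
    # its source neuron has fired.
    return 1 if s in (RISING, FALLING) else 0

def input_to(i, bs, vm, status, ext=None):
    """I_i = sum over j of BS_ji * Vm_j * f(S_j), plus the external
    input E_i when one is connected to N_i."""
    total = sum(bs[(j, k)] * vm[j] * f(status[j])
                for (j, k) in bs if k == i)
    if ext is not None:
        total += ext
    return total

# Two presynaptic neurons feed neuron "i"; only the firing one contributes.
bs = {("a", "i"): 0.5, ("b", "i"): 2.0}
vm = {"a": 1.0, "b": 1.0}
status = {"a": RISING, "b": UNFIRING}
print(input_to("i", bs, vm, status))            # 0.5
print(input_to("i", bs, vm, status, ext=0.25))  # 0.75
```

Note how the f(S) gate zeroes out the contribution of the unfiring neuron "b" despite its large connection coefficient; setting f to return 1 unconditionally would instead convey membrane potential regardless of firing state.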
At S1112, the parameter processing unit 240 judges whether or not I_i^{tn+1} exceeds T_i^{tn+1}. If I_i^{tn+1} exceeds T_i^{tn+1}, the parameter processing unit 240 calculates Vm_i^{tn+1} based on an increase-decrease parameter, sets S_i^{tn+1} to a value indicating a rising phase or a falling phase according to Vm_i^{tn+1} (S1114), and terminates this flow.
At S1100, if S_i^{tn} is in a rising phase or a falling phase, the parameter processing unit 240 calculates Vm_i^{tn+1} (S1120). Then, the parameter processing unit 240 sets S_i^{tn+1} to the value of unfiring if Vm_i^{t} reached Vmin before tn+1, sets S_i^{tn+1} to the value of a rising phase or a falling phase if Vm_i^{t} has not reached Vmin before tn+1, and terminates this flow. Note that the parameter processing unit 240 sets the value of a falling phase to S_i^{tn+1} if Vm_i^{t} reached Vmax before tn+1, and sets the value of a rising phase to S_i^{tn+1} if Vm_i^{t} has not reached Vmax before tn+1.
In this manner, while N_i is firing, the output of N_i does not depend on the input, even if the output becomes equal to or lower than the threshold. Such a time period corresponds to the absolute refractory period of a neuron in a living organism.
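The per-step flow described above (S1100 through S1114/S1120) can be sketched as a small state machine. This is a simplified sketch: the increase-decrease parameters a and b, the Vmax/Vmin values, and the single-step update rule are all assumed for illustration rather than taken from the embodiment.

```python
UNFIRING, RISING, FALLING = "unfiring", "rising", "falling"
VMAX, VMIN = 1.0, -0.5

def step(status, vm, inp, threshold, a=0.5, b=0.5):
    """One clock-time step for a single artificial neuron (a sketch;
    a and b stand in for the increase-decrease parameters)."""
    if status == UNFIRING:
        if inp > threshold:            # input exceeds threshold -> fire
            return RISING, min(vm + a, VMAX)
        return UNFIRING, inp           # stays unfiring; output follows input
    # Firing (rising or falling): the output no longer depends on the
    # input, corresponding to the absolute refractory period.
    if status == RISING:
        vm = vm + a
        if vm >= VMAX:
            return FALLING, VMAX       # reached Vmax -> falling phase
        return RISING, vm
    vm = vm - abs(b)
    if vm <= VMIN:
        return UNFIRING, VMIN          # reached Vmin -> unfiring again
    return FALLING, vm

# Drive the neuron with a single supra-threshold input and watch the
# status traverse unfiring -> rising -> falling -> unfiring.
status, vm = UNFIRING, 0.0
trace = []
for inp in [0.2, 1.5, 0.0, 0.0, 0.0, 0.0, 0.0]:
    status, vm = step(status, vm, inp, threshold=1.0)
    trace.append(status)
print(trace)
```

The trace shows the refractory behavior: once the neuron has fired, the zero inputs at later steps have no effect until Vm has descended to Vmin and the status returns to unfiring.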
At the temporal step of the clock time t0, N_i is unfiring. If I_i^{t1} at the clock time t1 is equal to or lower than T_i^{t1}, the parameter processing unit 240 calculates V_i^{t1} at the clock time t1 as V_i^{t1} = I_i^{t1}, and calculates V_i^{t} during the time period from the clock time t0 to t1 as V_i^{t} = I_i^{t0}. Likewise, the parameter processing unit 240 maintains the value of V^{tn} calculated at the clock time step tn until the next clock time step, and changes it to I^{tn+1} at tn+1.
At the temporal step of the clock time t0, N_i is unfiring. If I_i^{t1} at the clock time t1 exceeds T_i^{t1}, the parameter processing unit 240 calculates V_i^{t1} at the clock time t1 as V_i^{t1} = I_i^{t1}, and calculates V_i^{t} during the time period from the clock time t0 to t1 as V_i^{t} = I_i^{t0}. Note that it is assumed here that I_i^{t1} at the clock time t1 is equal to or lower than Vmax; if I_i^{t1} at the clock time t1 exceeds Vmax, then I_i^{t1} = Vmax is used.
As shown in
Also, upon V_i^{t} reaching Vmax, V_i^{t} is decreased by |b_i^{t}| per unit time until V_i^{t} reaches Vmin. The parameter processing unit 240 determines the status of N_i in this time period as a falling phase. Then, upon V_i^{t} reaching Vmin, V_i^{t6} at the next clock time is calculated as V_i^{t6} = I_i^{t6}. The status after V_i^{t} reached Vmin is determined as unfiring.
Note that if the status of N_i is a falling phase, Vm_i^{t} does not depend on I_i^{t}, even if the calculated Vm_i^{t} falls below T_i^{t}. Even if Vm_i^{t} falls below T_i^{t}, the parameter processing unit 240 calculates Vm_i^{t} according to the increase-decrease parameter until Vm_i^{t} reaches Vmin.
A function 1300 shown in
Also, the rule 1400 specifies an operation to “switch” the data recording format “to a low compression format” if at least a second condition, that the total value of Vm_i^{t} of N5 and Na exceeds a threshold, is met. Thereby, if information is being recorded in a high compression format and there is a transition from a state where the second condition is not met to a state where it is met, the switching control unit 260 judges to switch the information recording format to the low compression format. Note that a value obtained by multiplying the total value of Vmax of the respective Nj by the constant 0.9 is shown as an example of the threshold. The threshold may be higher than the total value of the thresholds T_j^{t} of the respective Nj.
N1, N3, Nb and Nc are emotion artificial neurons for which the emotions “pleased”, “sad”, “scared” and “fun” are defined, respectively. Accordingly, the parameter processing unit 240 determines the intensity of an emotion based on the internal state of an emotion artificial neuron, and in response to the determined intensity of the emotion exceeding a threshold specified in advance, the recording format can be switched to a low compression format.
N5 and Na are endocrine artificial neurons for which the endocrine substances “dopamine” and “noradrenaline” are defined, respectively. The total value of the parameters of the internal states of these endocrine artificial neurons is one example of an index representing the intensity of the emotion of being “excited”. Accordingly, the parameter processing unit 240 determines the intensity of an emotion based on the internal state of an endocrine artificial neuron, and in response to the determined intensity of the emotion exceeding a threshold specified in advance, the recording format can be switched to a low compression format.
Also, the rule 1400 specifies an operation to “switch” the data recording format “to a high compression format” if a third condition, that Vm_i^{t} of N1, N3, Nb and Nc are all equal to or lower than a first threshold and the total value of Vm_i^{t} of N5 and Na is equal to or lower than a second threshold, is met. Accordingly, if information is being recorded in a low compression format and there is a transition from a state where the third condition is not met to a state where it is met, the switching control unit 260 judges to switch the information recording format to the high compression format. In this manner, in response to the intensity of an emotion becoming equal to or lower than a threshold specified in advance, the recording format can be switched to a high compression format.
Note that the first threshold of the third condition is a value obtained by multiplying Vmax of the respective Nj by the constant 0.8, and the second threshold of the third condition is a value obtained by multiplying the total value of Vmax of the respective Nj by the constant 0.8. In this manner, a case where the first threshold of the third condition is lower than the threshold of the first condition, and the second threshold of the third condition is lower than the threshold of the second condition, is shown as an example. However, the first threshold may be equal to the threshold of the first condition, and the second threshold may be equal to the threshold of the second condition. Also, the first threshold of the third condition may be higher than T_j^{t} of the respective Nj, and the second threshold of the third condition may be higher than the total value of T_j^{t} of the respective Nj. Not being limited to these examples, various values can be applied to the thresholds of the respective conditions.
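The switching behavior of the rule 1400 can be sketched as follows. This is an illustrative sketch only: the neuron names, the 0.9/0.8 constants, and the Vmax value are assumptions drawn from the examples above, and the real rule 1400 may differ.

```python
VMAX = 1.0  # assumed common Vmax for each Nj

def choose_format(current, vm):
    """Sketch of rule-1400-style format switching with hysteresis.
    vm maps names to Vm values of the emotion artificial neurons
    (pleased/sad/scared/fun) and endocrine artificial neurons
    (dopamine/noradrenaline)."""
    emotions = [vm["pleased"], vm["sad"], vm["scared"], vm["fun"]]
    excitement = vm["dopamine"] + vm["noradrenaline"]
    # First/second conditions: an emotion, or the total excitement,
    # exceeds the 0.9-based threshold -> low compression format.
    if max(emotions) > 0.9 * VMAX or excitement > 0.9 * 2 * VMAX:
        return "low_compression"
    # Third condition: all emotions and the excitement are at or below
    # the lower 0.8-based thresholds -> back to high compression format.
    if max(emotions) <= 0.8 * VMAX and excitement <= 0.8 * 2 * VMAX:
        return "high_compression"
    return current  # inside the hysteresis band: keep the current format

calm = {"pleased": 0.1, "sad": 0.1, "scared": 0.1, "fun": 0.1,
        "dopamine": 0.2, "noradrenaline": 0.2}
excited = dict(calm, dopamine=1.0, noradrenaline=0.9)
print(choose_format("high_compression", excited))  # low_compression
print(choose_format("low_compression", calm))      # high_compression
```

Using 0.8-based thresholds for switching back that are lower than the 0.9-based thresholds for switching to the low compression format gives hysteresis, so the format does not oscillate when an intensity hovers near a single threshold.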
According to the system 20, the robot 40 continuously transmits, to the server 200, information in a high compression format, such as skeletal data, for time periods during which the emotion of the robot 40 is not significantly intense, and causes the server 200 to record the information. The continuous information such as skeletal data recorded in the server 200 can be used when analyzing a memory of the robot 40. Then, if the emotion of the robot 40 intensifies significantly, the robot 40 starts transmission of full HD video data and audio data, and causes the server 200 to record information in a low compression format, including full HD video data and audio data in addition to skeletal data, for the time period during which the emotion remains at or above a certain intensity. Then, if for example the robot 40 is requested by a user 30 to provide a video of a memory of the robot 40, the robot 40 requests the server 200 to transmit the full HD video data and audio data, and provides the video data and audio data received from the server 200 to the user 30.
In this manner, according to the system 20, high-image-quality video data of scenes in which the robot 40 felt a strong emotion can be accumulated in the server 200. On the other hand, while the robot 40 is not feeling a strong emotion, summarized information such as skeletal data can be accumulated in the server 200. In this manner, like a human, the robot 40 can keep a vivid memory of times when it felt a strong emotion while keeping only a summarized memory of times when it did not.
Note that although in the present embodiment, the emotions explained are “pleased”, “sad”, “scared”, “fun” and “excited”, emotions that the system 20 handles are not limited to them. Also, although in the present embodiment, the endocrine substances explained are “dopamine”, “serotonin” and “noradrenaline”, endocrine substances that the system 20 handles are not limited to them.
Also, the functions of the server 200 may be implemented by one or more computers. At least some of the functions of the server 200 may be implemented by a virtual machine, and at least some of them may be implemented in a cloud. Also, among the functions of the server 200, the functions of the components other than the storing unit 280 can be realized by a CPU operating based on a program. For example, at least some of the processes explained as operations of the server 200 can be realized by a processor controlling each piece of hardware (for example, a hard disk, a memory and the like) provided in a computer according to a program. In this manner, at least some of the processes of the server 200 can be realized by the processor operating according to the program to control the respective pieces of hardware, so that the respective pieces of hardware, including the processor, the hard disk, the memory and the like, and the program operate in cooperation with each other. That is, the program can cause a computer to function as each component of the server 200. Likewise, among the components of the robot 40, the functions of the components other than the control target 155 and the sensor unit 156 can be realized by a CPU operating based on a program. That is, the program can cause a computer to function as each component of the robot 40. Note that the computer may read in a program to control execution of the above-mentioned processes, operate according to the program read in, and execute the processes. The computer can read in the program from a computer-readable recording medium having the program stored thereon. Also, the program may be supplied to the computer through a communication line, and the computer may read in the program supplied through the communication line.
In the embodiments explained above, the server 200, not the robot 40, is in charge of the processes of the neural network, and the server 200, not the robot 40, stores information such as video data. However, the robot 40 itself may take on functions of the server 200, such as the processes of the neural network, and the robot 40 itself may store information such as video data. Also, the robot 40 is one example of equipment to be a target of control by the server 200. Equipment to be a control target is not limited to the robot 40; various types of equipment such as home appliances, vehicles or toys may serve as control targets.
While the embodiments of the present invention have been described, the technical scope of the invention is not limited to the above described embodiments. It is apparent to persons skilled in the art that various alterations and improvements can be added to the above-described embodiments. It is also apparent from the scope of the claims that the embodiments added with such alterations or improvements can be included in the technical scope of the invention.
The operations, procedures, steps, and stages of each process performed by an apparatus, system, program, and method shown in the claims, embodiments, or diagrams can be performed in any order as long as the order is not indicated by “prior to,” “before,” or the like and as long as the output from a previous process is not used in a later process. Even if the process flow is described using phrases such as “first” or “next” in the claims, embodiments, or diagrams, it does not necessarily mean that the process must be performed in this order.
Number | Date | Country | Kind
---|---|---|---
2015-122406 | Jun 2015 | JP | national

Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2016/066311 | Jun 2016 | US
Child | 15841172 | | US