Various embodiments described herein relate to apparatus and methods for simulating behavior and emotion state(s) for a mammal model and an artificial nervous system.
It may be desirable to simulate dynamics between behavior and emotional states for a mammal model and an artificial nervous system. Embodiments herein provide architecture, systems, and methods for same.
In an embodiment, the dynamics between emotional state(s) or behavior(s) of an artificial nervous system may be simulated. In one embodiment, the artificial nervous system is a standalone artificial nervous system that is not modeled on an organism. In another embodiment, the artificial nervous system may represent or simulate an avatar, such as a virtual organism representing any kind of animal or creature. In one embodiment, the artificial nervous system represents or simulates a mammal model.
In another embodiment, the artificial nervous system animates a physical robot. The robot may include sensors tied to the real world (such as a camera, microphone, touch sensors or any other suitable sensors). The robot may include effectors/motors/actuators such as limbs, an animatable facial structure, speakers for audible output, or any other suitable actuators/effectors.
In an embodiment, an avatar, such as a mammal model, may be presented to or perceived by a User via a user-perceptible format such as image 62A on a screen 60A as shown in
In any of these embodiments, emotional state(s) or behavior(s) of an artificial nervous system may vary or change due to dynamics between the states and behavior. In an embodiment, the dynamics between states or behavior may be affected or influenced by a projected or perceived environment where such perceptions may be projected onto the avatar. The perceptions may include various sensory perceptions including visual, auditory, olfactory (smell), taste, and tactile (touch) perceptions. Perceptions from a real-world environment may be provided to the avatar through sensors such as a camera, microphone, touch sensors, or any other suitable sensors.
In an embodiment, the behavior may be any agent-driven process. Behavior may include any conduct or actions including but not limited to facial expression(s) 72A, body language 74A, speech projected via speaker 66A, and so on. More generally, behavior may comprise any mathematical solver which activates different routines based on progress measures that are dynamically monitored, modulated, or controlled by an emotion system. The emotion system may be affected by the density of neural firing and in one embodiment is affected by the density of prediction error from a plurality of levels or differentials.
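By way of non-limiting illustration, the following Python sketch shows one way a behavior system could act as a solver that activates different routines based on dynamically monitored progress measures modulated by an emotion system. The function and routine names, the gain values, and the multiplicative form of the modulation are assumptions made for illustration only.

```python
# Illustrative sketch: behavior as a solver that activates routines based on
# emotion-modulated progress measures. All names and values are hypothetical.

def select_routine(routines, progress, emotion_gain):
    """Pick the routine whose emotion-modulated progress score is highest.

    routines     -- dict mapping routine name -> callable behavior
    progress     -- dict mapping routine name -> progress measure in [0, 1]
    emotion_gain -- dict mapping routine name -> modulation from the emotion
                    system (e.g., fear might down-weight exploration)
    """
    scored = {name: progress[name] * emotion_gain.get(name, 1.0)
              for name in routines}
    return routines[max(scored, key=scored.get)]

# Hypothetical usage:
routines = {"explore": lambda: "exploring", "avoid": lambda: "avoiding"}
progress = {"explore": 0.4, "avoid": 0.7}
emotion_gain = {"explore": 1.2, "avoid": 0.5}  # interest dominating fear
print(select_routine(routines, progress, emotion_gain)())  # -> "exploring"
```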
In an embodiment, an anatomical representation of the avatar 60B may include sensors or a system 50B (
In an embodiment, a digital input module 56 of digital system 50B may include visual sensors to collect and provide visual sensory data (normal or broad spectrum). The input module 56 may also include one or more microphones to collect and provide audio sensory data (within or beyond the normal mammal audio range). The input module 56 may further include a device to receive air samples to be chemically tested to detect olfactory signals that may be provided in a digital representation. The input module 56 may include a device to receive physical samples to be chemically tested to detect the presence of elements that a mammal can taste and provide a digital representation of such tastes, including a detected sodium chloride level (tastes salty), sugar compound level (tastes sweet), acid level (tastes sour), pepper level (causes pain), and others. The input module 56 may also include a touchpad or other device that enables a User to provide an indication of level(s) of touch at various locations on the avatar 70A, 60B.
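By way of non-limiting illustration, the sensory channels of the input module 56 described above might be collected into a single structure that flattens into an input data vector, as in the following Python sketch; the field names and the flattening scheme are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative grouping of the digital input module 56 channels; the field
# names are assumptions, not reference numerals from the disclosure.
@dataclass
class SensoryInput:
    visual: List[float] = field(default_factory=list)     # camera data
    audio: List[float] = field(default_factory=list)      # microphone data
    olfactory: List[float] = field(default_factory=list)  # air-sample tests
    taste: List[float] = field(default_factory=list)      # e.g., [NaCl, sugar, acid, pepper]
    touch: List[float] = field(default_factory=list)      # per-location touch levels

    def as_vector(self) -> List[float]:
        """Flatten all channels into one input data vector I."""
        return self.visual + self.audio + self.olfactory + self.taste + self.touch
```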
In an embodiment, the avatar's 70A, 70B current emotional state(s), combination of sensory inputs, and their intensity may be evaluated to determine or simulate dynamics between emotional state(s) or behavior(s). As noted, in an embodiment one or more sensory inputs themselves may be part of a digital reality of an avatar 60A. In an embodiment, an avatar 60A, 60B may have numerous simulated emotional state(s) or behavior(s) similar to a physical mammal, where the level of each simulated emotional state or behavior is determined in part based on analysis of physical mammals. Such analysis may include analysis of physical mammalian brain function and of perceived dynamics between certain emotional states and behavior in physical mammals.
In a physical mammalian brain, various neural activations of motor behavior circuits and visceromotor circuits driving the release of neurochemicals may be generated in different brain regions in response to sensory inputs. The level of neurochemical generation may also vary as a function of the sensory input intensity. In addition, certain sensory inputs may affect different cortical and subcortical regions of the brain (conscious and sub-conscious regions) and create perceived dynamics between certain emotional states and behavior in physical mammals. For example, the amygdala and hypothalamus may be involved in the creation of several emotional states (an emotional state as discussed here is a combination of visceromotor and motor activity induced by external and internal context) or behaviors including fear responses, emotional responses, hormonal secretions, arousal, and memory.
In addition, the hippocampus, a small organ located within the brain's medial temporal lobe, may form an important part of the limbic system and may regulate a physical mammal's emotions. The hippocampus may also encode emotional context from the amygdala and cortex. The hypothalamus links to brain glands such as the pituitary gland; these glands, and others controlled by brainstem nuclei (e.g., the locus coeruleus), may generate neurochemicals that help regulate the dynamics between emotional states or behaviors. The neurochemicals released by upstream activity of the amygdala, hypothalamus, and brainstem nuclei may include dopamine, serotonin, norepinephrine (NE), and oxytocin.
In an embodiment, the avatar's emotional state(s) may be simulated by creating a dynamic model between various emotional states or behavior such as shown in
As shown in
In an embodiment, the dynamic between the various ES 12A-C as shown in
A similar module may be created for any number of ES. For example,
In an embodiment, the change in level of each state 12A-H may vary based on the attempted direction of the change (higher or lower) and the simulated neurochemical or sensory inputs, their intensity, duration, and rate of change.
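By way of non-limiting illustration, one possible form of such a direction-dependent update is sketched below in Python; the separate rise and decay gains, the time step, and the clamping of levels to [0, 1] are assumptions rather than requirements of the disclosure.

```python
# Illustrative update of an emotional state level 12A-12H where the change
# depends on the attempted direction of the change. Gains are hypothetical.

def update_state_level(level, drive, rise_gain=0.3, decay_gain=0.1, dt=0.05):
    """Move a state level toward a drive signal derived from simulated
    neurochemical or sensory inputs; rising and falling use different gains."""
    gain = rise_gain if drive > level else decay_gain
    level += gain * (drive - level) * dt
    return min(max(level, 0.0), 1.0)  # keep the level within [0, 1]
```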
As shown in
Input data or a data vector I may be provided to the first layer 42A of data processing modules (DPM) A1 to N1, where the input data vector I may be generated by a multiple-input data processing module network 3A, 3B. As noted, the signals generated by the modules 30A, 30B may form the input data vector I in an embodiment. In an embodiment, each DPM A1 to N1, A2 to N2, and A3 to N3 of a layer 42A, 42B, 42C may be fully connected to the DPM A1 to N1, A2 to N2, and A3 to N3 of adjacent layer(s) 42A, 42B, 42N. For example, DPM A1 of layer 42A may be connected to each DPM A2 to N2 of layer 42B.
In an embodiment the network 40 may represent a neural network and each DPM A1 to N1, A2 to N2, and A3 to N3 may represent a neuron. Further, each DPM A1 to N1, A2 to N2, and A3 to N3 may receive multiple data elements in a vector and combine same using a weighting algorithm to generate a single datum. The single datum may then be constrained or squashed to a maximum magnitude of 1.0 in an embodiment. The network may receive one or more data vectors that represent a collection of features where the features may represent an instant in time.
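By way of non-limiting illustration, a single DPM of the kind described above might be sketched in Python as follows. The use of tanh as the squashing function and the helper names are assumptions; the disclosure does not prescribe a particular weighting or squashing algorithm.

```python
import math

# Illustrative DPM: combine an input vector with weights into a single datum,
# then squash the datum to a maximum magnitude of 1.0.

def dpm_output(inputs, weights, bias=0.0):
    datum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return math.tanh(datum)  # constrained to (-1.0, 1.0)

# A fully connected layer is then one such computation per DPM, each row of
# the weight matrix connecting a DPM to every output of the previous layer.
def layer_output(inputs, weight_matrix, biases):
    return [dpm_output(inputs, row, b) for row, b in zip(weight_matrix, biases)]
```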
In an embodiment the network 40 may receive input training vectors with a label or expected result or prediction, such as staying in the current emotional state 12A-12G or sending control to another emotional state 12A-12G. In another embodiment, the network 40 may receive input training vectors with a label or expected result or prediction of the desired control signals for states 12A-12G based on the signals generated by the modules 30A, 30B. The network 40 may employ or modulate weighting matrices to reduce a difference between the expected result or label and a result or label predicted by the network, instance, or model 40. An error or distance E may be determined by a user-defined distance function in an embodiment. The network or model 40 may further include functions that constrain the magnitude of each layer's DPM A1 to N1, A2 to N2, and A3 to N3 to attempt to train the model or network 40 to correctly predict a result when corresponding input vectors are presented to the network or model 40 as input(s) I. In the network 40, each DPM A3 to N3 of the final layer 42N may provide output data, a predicted result, or a data vector O1 to ON. In an embodiment, the data vector may determine which state should have control.
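By way of non-limiting illustration, the following Python sketch shows a single training step of the kind described above, using a squared-error distance E and plain gradient descent for one DPM; both choices, and the example training pair, are assumptions (the disclosure permits any user-defined distance function).

```python
import math

# Illustrative training step: adjust a DPM's weights to reduce the distance E
# between the predicted output and a labeled expected result.

def train_step(inputs, target, weights, lr=0.1):
    datum = sum(x * w for x, w in zip(inputs, weights))
    out = math.tanh(datum)                 # forward pass through one DPM
    error = out - target                   # signed difference from the label
    grad = error * (1.0 - out * out)       # derivative of tanh squashing
    new_weights = [w - lr * grad * x for w, x in zip(weights, inputs)]
    return new_weights, error ** 2         # updated weights and distance E

weights = [0.1, -0.2, 0.05]
for _ in range(100):
    # Hypothetical pair: signal vector from modules 30A, 30B -> control label.
    weights, E = train_step([0.9, 0.1, 0.4], target=1.0, weights=weights)
```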
Triggering areas 43A-D may comprise innate triggers, hardwired triggers, and learned triggers. Innate triggers may include triggers based on neural firing, such as modality-independent activity patterns 43A. The triggers may be based on the connections between neurons and the activity on those connections, as modulated via attention, neurochemicals, and so on. Innate triggers may also be hardwired, going directly to a behavioral response circuit without being modeled with neural firing. In one embodiment, interoceptive patterns 43B and exteroceptive patterns 43C are hardwired. One example of hardwiring may be pain, which may be hardwired to a behavioral response. However, in other embodiments, interoceptive patterns 43B and exteroceptive patterns 43C are also based on neural firing through neural networks. For example, a pain or reward stimulus may induce a burst of neural firing that causes a behavior. Thus, innate triggers may be either hardwired or based on neural firing, and a combination of both approaches may be used.
Learned triggers are based on a mapping between stimulus and an emotion. In this manner, an arbitrary pattern 43D may be connected to an emotion. For example, a bell may be associated with a negative emotion if the bell is presented just before a negative stimulus is presented. A neural network, associative map, or other model may develop an association between the otherwise arbitrary stimulus, the bell, and an emotion that has been perceived in the presence of that stimulus.
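By way of non-limiting illustration, a learned trigger of the kind just described might be modeled with a simple associative map, as in the following Python sketch; the learning rate, threshold, and class interface are assumptions for illustration only.

```python
# Illustrative associative map for learned triggers: repeated pairing of an
# arbitrary stimulus (the bell) with an emotion strengthens an association.

class LearnedTriggers:
    def __init__(self, learning_rate=0.2, threshold=0.5):
        self.weights = {}   # (stimulus, emotion) -> association strength
        self.lr = learning_rate
        self.threshold = threshold

    def observe(self, stimulus, emotion, emotion_level):
        """Strengthen the stimulus-emotion association when they co-occur."""
        key = (stimulus, emotion)
        w = self.weights.get(key, 0.0)
        self.weights[key] = w + self.lr * emotion_level * (1.0 - w)

    def trigger(self, stimulus, emotion):
        """Once learned, the stimulus alone triggers the emotion."""
        return self.weights.get((stimulus, emotion), 0.0) >= self.threshold

triggers = LearnedTriggers()
for _ in range(5):  # bell presented just before a negative stimulus
    triggers.observe("bell", "negative", emotion_level=0.8)
print(triggers.trigger("bell", "negative"))  # -> True
```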
Triggers may be transmitted to the mapping circuit 44A, which for simplification is shown here to collectively model the hypothalamus and/or other subcortical nuclei causing visceromotor and motor activity. The mapping circuit may comprise a plurality of neurons, such as a neural network, and model a hypothalamus/subcortical nuclei response for a single emotion. The mapping circuit 44A may trigger one or more of a plurality of (motor) behavioral responses 46A. More complex embodiments may involve further mapping in cortical regions potentially involved in emotional processing and behavioral response, such as the anterior cingulate.
After processing the signal, the mapping circuit 44A transmits a signal output to a modulatory (visceromotor) neurochemical response model 45A. The neurochemical response model 45A models the release of neurochemicals in a body or artificial nervous system. Neurochemical response model 45A may comprise a plurality of neurons, such as a neural network. The mapping circuit 44A also transmits a signal output to behavioral response 46A. Behavioral response 46A may comprise one or more neurons, such as a neural network. The behavioral response 46A may comprise any of the emotional states 12A-12H. The behavioral response may trigger motor execution by the body or artificial nervous system such as facial expression(s) 72A, body language 74A, or speech.
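By way of non-limiting illustration, the signal flow just described might be sketched in Python as follows, with simple linear stand-ins for the neural networks the text contemplates; the gain, release rates, and threshold are assumptions for illustration only.

```python
# Illustrative flow: trigger -> mapping circuit 44A -> neurochemical response
# model 45A (visceromotor) and behavioral response 46A (motor).

def mapping_circuit(trigger_level, gain=1.0):
    return max(0.0, min(1.0, gain * trigger_level))

def neurochemical_response(mapped, release_rates):
    """Simulated releases, e.g., dopamine, norepinephrine, oxytocin."""
    return {chem: rate * mapped for chem, rate in release_rates.items()}

def behavioral_response(mapped, threshold=0.4):
    """Trigger motor execution (expression, body language, speech) if strong."""
    return "execute_behavior" if mapped >= threshold else "idle"

mapped = mapping_circuit(0.7)
chems = neurochemical_response(mapped, {"norepinephrine": 0.9, "dopamine": 0.3})
action = behavioral_response(mapped)  # -> "execute_behavior"
```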
The behavioral response models 46 may be connected to each other in multiple ways. First, a behavioral response model may inhibit another behavioral response model, such as interest inhibiting fear. Moreover, in some embodiments, mutual inhibition may be modeled, with each of two behavioral response models inhibiting the other. In some embodiments, the behavioral response model for each emotion inhibits the behavioral response models for all of the other emotions, as the behaviors compete for attention and try to displace each other. Second, a behavioral response model may interact with other behavioral response models through a recurrent dynamical circuit. A recurrent dynamical circuit may model a catastrophe network, where catastrophe refers to an abrupt switch from one state to another (which may be positive or negative). The recurrent dynamical circuit may model complex behavior such as illustrated in dynamics or pathology 10, where behavior depends not just on the trigger but also on the most recent state.
Behavioral response models 46 may also be modulated by the neurochemical state 45. The neurochemical state 45 may process the input from mapping circuits 44 and transmit an output signal to the behavioral response models 46. The output signal may modulate the response of the behavioral response models 46. The modulation may model modulation occurring in a mammal due to the release of certain neurochemicals.
The connections between behavioral response models 46 may be modeled for example by modules 20A, 20B, or 20C, where each behavioral response model for a single emotion corresponds to one of states 12A-12H. The connections between nodes of modules 20A, 20B, and 20C may model the inhibition and recurrent relationships between the behavioral response models 46. Each connection may model the potential transmission of a signal from one behavioral response model to another. Moreover, modulation by neurochemical state 45 may also be modeled by neural network inputs to the behavioral response models 46.
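By way of non-limiting illustration, mutual inhibition between behavioral response models 46 might be modeled as in the following Python sketch, where every state suppresses every other state each recurrent step; the inhibition weight, time step, and update rule are assumptions for illustration only.

```python
# Illustrative mutual inhibition: each emotion's activation is reduced by the
# others' activations, so behaviors compete for attention and control.

def inhibition_step(activations, inhibition=0.5, dt=0.1):
    """One recurrent update in which every state inhibits every other state."""
    updated = {}
    for name, a in activations.items():
        competitors = sum(v for k, v in activations.items() if k != name)
        updated[name] = max(0.0, min(1.0, a - inhibition * competitors * a * dt))
    return updated

states = {"interest": 0.8, "fear": 0.4, "anger": 0.1}
for _ in range(50):
    states = inhibition_step(states)
# The strongest state (interest) tends to suppress the others and win control.
```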
The rate of change of the stimulus affects the emotional state that is induced. When the density of neural firing increases quickly, the artificial nervous system is induced into a startled emotion 90A. If the rate of increase is somewhat slower, then the emotion induced is fear 90B. Finally, if the rate of increase is slow enough to be manageable for the artificial nervous system then the emotion induced is interest 90C.
As an example, a very sudden increase in density of neural firing may cause the experience of being startled 90A. When the density of neural firing is increasing over time at a slower rate an emotional state of fear 90B may be induced because the artificial nervous system is unable to cope with the increasing stimuli. Meanwhile, if the density of neural firing increases over time but at a rate that is manageable, then an emotional state of interest 90C is induced.
Sustained stimuli also affect emotional state. A sustained stimulus input at a high level may induce the emotional state of anger 90D. A sustained stimulus input at a somewhat lower level may induce the emotional state of frustration 90E. In some embodiments, sustained stimulus input of any kind leads to negative emotional states, regardless of the character of the stimulus input. For example, even a pleasant melody may induce anger or frustration if played continuously over a long period of time. Likewise, receiving the same compliment over and over again may also lead the artificial nervous system to experience feelings of anger and frustration.
In an embodiment, a decrease in density of neural firing is associated with positive emotions. For example, an emotional state of joy 90F may be induced. The artificial nervous system may experience joy at the decreasing neural firing that causes it to feel less overwhelmed with stimuli.
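By way of non-limiting illustration, the mapping of graph 90 from the level and rate of change of neural firing density to an induced emotion might be sketched in Python as follows; every numeric threshold here is a hypothetical value chosen for illustration, not taken from the disclosure.

```python
# Illustrative mapping from neural firing density dynamics to emotions 90A-90F.

def classify_emotion(density, prev_density, dt=1.0, eps=0.05,
                     startle_rate=0.8, fear_rate=0.3,
                     sustained_high=0.8, sustained_mid=0.5):
    rate = (density - prev_density) / dt
    if rate > startle_rate:
        return "startle"        # 90A: very sudden increase
    if rate > fear_rate:
        return "fear"           # 90B: fast but somewhat slower increase
    if rate > eps:
        return "interest"       # 90C: manageable increase
    if rate < -eps:
        return "joy"            # 90F: decreasing density
    # Roughly constant density corresponds to a sustained stimulus.
    if density >= sustained_high:
        return "anger"          # 90D: sustained high level
    if density >= sustained_mid:
        return "frustration"    # 90E: sustained somewhat lower level
    return "neutral"
```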
In an embodiment, the artificial nervous system receives, at a first time, a first observation 80A that is an input. The input may be of any modality such as tactile, visual, aural, olfactory, or gustatory. The input may be of a positive, negative, or neutral valence, and examples include a tap, wave, noise, speaking, shouting, a physically approaching object or body part, taste, smell, sound, music or melody, and so on. The input may include observations about the environment. The input 80A may be input to predictor 82, which may be a machine learning model that makes a prediction 84 based on the input 80A. In some embodiments, the predictor 82 comprises one or more neurons, such as a neural network. In some embodiments, the prediction 84 comprises a prediction of what will happen based on the input 80A. The predictor 82 may optionally take into consideration the behavior of the artificial nervous system in generating prediction 84.
At a second time, the artificial nervous system may receive a second observation 80B. The second observation 80B may comprise a ground truth observation about the state of the world. The artificial nervous system may perform an error calculation 86 to compute the error between the prediction 84 of what the ground truth would be and the second observation 80B. For example, the error calculation 86 may be a subtraction operation of the second observation 80B from the prediction 84 or other error calculations such as least squares. This computes a prediction error 88, which is a value measuring the error in the prediction 84.
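By way of non-limiting illustration, the prediction-error loop just described might be sketched in Python as follows; the trivial "predict more of the same" predictor and the least-squares error calculation are stand-ins for the machine learning model and error function contemplated by the text.

```python
# Illustrative loop: observation 80A -> predictor 82 -> prediction 84; then
# observation 80B arrives and error calculation 86 yields prediction error 88.

def predictor(observation):
    """Hypothetical stand-in for predictor 82: predict 'more of the same'."""
    return observation

def error_calculation(prediction, ground_truth):
    """Error calculation 86, here a least-squares style difference."""
    return sum((p - g) ** 2 for p, g in zip(prediction, ground_truth))

obs_a = [0.2, 0.5, 0.1]          # first observation 80A
prediction = predictor(obs_a)     # prediction 84
obs_b = [0.9, 0.1, 0.7]          # second observation 80B (ground truth)
prediction_error = error_calculation(prediction, obs_b)  # prediction error 88
```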
The prediction error 88 comprises a stimulus input for the modality-independent activity patterns 43A triggering area.
As a stimulus input, prediction error 88 may affect the density of neural firing, and thereby emotion, in an artificial nervous system according to modality-independent activity patterns. With respect to graph 90, in the context of prediction error 88, the more the prediction 84 differs from the ground truth observation 80B, the denser the activity of neural firing.
The rate of change of the prediction error 88 over time may affect the emotional state that is induced. When the prediction error 88 increases quickly, the artificial nervous system is induced into a startled emotion 90A. If the rate of increase is somewhat slower, then the emotion induced is fear 90B. Finally, if the rate of increase is slow enough to be manageable for the artificial nervous system then the emotion induced is interest 90C.
As an example, a very sudden difference between prediction and observed ground truth may cause the experience of being startled 90A. When the difference between prediction and observed reality is increasing over time at a slower rate, an emotional state of fear 90B may be induced because the artificial nervous system observes rapidly increasing differences between expectation and reality and is unable to understand or control observed reality. Meanwhile, if the difference between prediction and observed ground truth increases over time but at a rate that is manageable, then an emotional state of interest 90C is induced. The artificial nervous system is interested in and curious about the differences experienced between the predicted and real outcomes.
Sustained prediction error 88 also affects emotional state. A sustained prediction error 88 at a high level may induce the emotional state of anger 90D. A sustained prediction error 88 at a somewhat lower level may induce the emotional state of frustration 90E. Consistently making incorrect predictions about the world may lead to anger or frustration.
In an embodiment, a decrease in prediction error 88 is associated with positive emotions. A decrease in prediction error 88 over time may induce joy 90F where the artificial nervous system feels that the environment has become more predictable or that it has a better understanding and ability to predict outcomes.
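By way of non-limiting illustration, a stream of prediction errors 88 can be fed directly into the illustrative classify_emotion mapping sketched earlier, treating prediction error as the density signal; the error values below are hypothetical.

```python
# A decreasing prediction error 88 over time induces joy 90F under the
# illustrative classify_emotion mapping sketched above.

errors = [0.9, 0.7, 0.5, 0.3, 0.1]  # prediction error 88 decreasing over time
for prev, curr in zip(errors, errors[1:]):
    print(classify_emotion(curr, prev))  # -> "joy" at each step
```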
The invention(s) disclosed herein may be used within the context of a neurobehavioral modelling framework to create and animate an embodied agent or avatar, such as that disclosed in U.S. Ser. No. 10/181,213B2, which is assigned to the assignee of the present invention and incorporated by reference herein.
It should be understood that while an emotion system and behavior have been described in the context of a mammal model, the emotion system and behavior may be abstracted and used in models of other organisms or avatars or separately from an organism model. That is, they may be used in abstracted neural systems, such as a completely artificial nervous system that is not connected to an avatar.
The modules may include hardware circuitry, single or multi-processor circuits, memory circuits, software program modules and objects, firmware, and combinations thereof, as desired by the architect of the architecture 10 and as appropriate for particular implementations of various embodiments. The apparatus and systems of various embodiments may be useful in applications other than a sales architecture configuration. They are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein.
Applications that may include the novel apparatus and systems of various embodiments include electronic circuitry used in high-speed computers, communication and signal processing circuitry, modems, single or multi-processor modules, single or multiple embedded processors, data switches, and application-specific modules, including multilayer, multi-chip modules. Such apparatus and systems may further be included as sub-components within and couplable to a variety of electronic systems, such as televisions, cellular telephones, personal computers (e.g., laptop computers, desktop computers, handheld computers, tablet computers, etc.), workstations, radios, video players, audio players (e.g., mp3 players), vehicles, medical devices (e.g., heart monitor, blood pressure monitor, etc.) and others. Some embodiments may include a number of methods.
It may be possible to execute the activities described herein in an order other than the order described. Various activities described with respect to the methods identified herein can be executed in repetitive, serial, or parallel fashion. A software program may be launched from a computer-readable medium in a computer-based system to execute functions defined in the software program. Various programming languages may be employed to create software programs designed to implement and perform the methods disclosed herein. The programs may be structured in an object-orientated format using an object-oriented language such as Java or C++. Alternatively, the programs may be structured in a procedure-orientated format using a procedural language, such as assembly or C. The software components may communicate using a number of mechanisms well known to those skilled in the art, such as application program interfaces or inter-process communication techniques, including remote procedure calls. The teachings of various embodiments are not limited to any particular programming language or environment.
The accompanying drawings that form a part hereof show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
Such embodiments of the inventive subject matter may be referred to herein individually or collectively by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept, if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
The Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In the foregoing Detailed Description, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted to require more features than are expressly recited in each claim. Rather, inventive subject matter may be found in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.
Number | Date | Country | Kind
---|---|---|---
755124 | Jul 2019 | NZ | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/IB2020/056280 | 7/3/2020 | WO |