1. Field of the Invention
The subject invention relates to computer programs that generate virtual environments having agents acting therein.
2. Description of Related Art
Over the past several years, computers have increasingly promoted collaborative activities between groups of users. The collaboration between users can be as simple as an instant messaging discussion group, or as complex as an engineering design being developed by a group of engineers dispersed at locations around the world. Computer interfaces have matured as well, changing from the primitive text-based user interfaces of the early days of computing to multimedia-rich browser environments, as well as complex virtual environments. For example, virtual environments are used today to provide realistic scenarios to train personnel in occupations requiring quick decision-making, such as police work and aircraft and ship piloting. Coupled with the maturation of the user interface has been the trend towards sophisticated multi-user systems that support collaboration amongst large groups of users.
Concurrent with the rise of the Internet, software agents have become a necessary tool to manage the volume and flow of information available to an Internet user. Software agents execute various tasks as required by a particular user, and are guided by their particular programming. For example, a software agent can operate autonomously and, within that autonomous operation, react to certain events, capture and filter information and communicate the filtered information back to a user. A software agent can be designed to control its own activities, and one of skill can easily design software agents that communicate and interact with other software agents.
The types of software agents are limited only by the imagination of a software designer. A software agent can be designed to be a pedagogical agent that has speech capability (Lester, et al. 1997) and can adapt its behavior to its environment (Johnson, 1998). A well-designed software agent can produce cognitive responses as well as affect, and its outward behavior is adapted to its particular role (Lester and Stone, 1997; Andre et al., 1998).
An avatar is defined as "the representation of a user's identity within a multi-user computer environment; a proxy for the purposes of simplifying and facilitating the process of inter-human communication in a virtual world" (Gerhard and Moore 1998). Within a virtual environment, avatars have a plurality of attractive traits, such as identity, presence and social interaction. Within a virtual world, an avatar is used to establish a user's presence, and it may take on an assumed persona of the user. For example, in a gaming virtual world, a mild-mannered accountant may use an avatar with the persona of a mercenary soldier. It is well known that avatars can be aware of each other within a given virtual world. Moreover, an avatar can be under the direct control of its underlying user, or may have a great deal of freedom with respect to its internal state and actions. A group of avatars can initiate and continue social and business encounters in a virtual world and foster the impression that they are acting as virtual agents and have authority derived from the underlying user.
The invention has been made in view of the above circumstances and prior art.
Various aspects and advantages of the invention will be set forth in part in the description that follows and in part will be obvious from the description, or may be learned by practice of the invention. The aspects and advantages of the invention may be realized and attained by means of the instrumentalities and combinations particularly pointed out in the appended claims.
In one embodiment of the present invention, a socially intelligent agent (SIA) platform enables interactions with various different applications, thereby enabling easier programming of various applications and injecting socially intelligent agents thereto. Specifically, an application adapter is provided to enable interaction between any application and the SIA platform. A plurality of adapters can be provided to enable interactions with various applications. In operation, the user provides input via the user interface and the input is applied to the application via the application interface. The application processes the input and provides a social event indication to the SIA platform, via the application adapter. The SIA platform then processes the social event and outputs an emotional response. The emotional response is sent to the application via the application adapter. The application processes the emotional response and, when appropriate, outputs a corresponding response to a user interface, such as a display (image output), game pad (vibration output), etc.
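The adapter arrangement above can be sketched in code. This is a minimal illustration only; the class and method names (`SIAPlatform`, `ApplicationAdapter`, `process_social_event`, and the dictionary fields) are assumptions introduced for the example, not identifiers from the specification, and the appraisal logic is a placeholder.

```python
# Illustrative sketch of the application-adapter pattern described above.
# All names and the appraisal rule are assumptions, not the spec's API.

class SIAPlatform:
    """Receives social events and produces emotional responses."""
    def process_social_event(self, event):
        # Placeholder appraisal: positive valence yields happiness.
        return {"emotion": "happiness" if event.get("valence", 0) > 0 else "sadness"}

class ApplicationAdapter:
    """Mediates between one application and the shared SIA platform."""
    def __init__(self, platform):
        self.platform = platform

    def on_user_input(self, app_event):
        # Translate the application's event into a platform social event,
        # then translate the emotional response back for the application.
        social_event = {"valence": app_event.get("score", 0)}
        response = self.platform.process_social_event(social_event)
        return {"display": response["emotion"]}

adapter = ApplicationAdapter(SIAPlatform())
print(adapter.on_user_input({"score": 1}))   # {'display': 'happiness'}
```

Because each application talks only to its adapter, a new application can be attached to the same platform by writing a new adapter rather than modifying the platform itself.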
According to another embodiment of the present invention, a virtual environment is provided for one or more socially intelligent agents (SIA's). The environment comprises a scenario environment that receives user inputs and outputs stimulus messages. The stimulus messages are received by a stimulus interpreter, which interprets the stimulus messages and outputs social facts as input events to the socially intelligent agents. The socially intelligent agents, in turn, process the social facts and output one or more emotion response messages as emotion state/desire messages. An emotion manifester receives the emotion state/desire messages output from the socially intelligent agents and converts the emotion state/desire messages into action messages. The scenario environment receives the action messages and converts them into graphical representations of the socially intelligent agents' emotional responses so that the users are able to visually interpret the actions/responses of the socially intelligent agents.
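The message flow of this embodiment (scenario environment, stimulus interpreter, agents, emotion manifester) can be sketched as a simple pipeline. The function names and message fields below are illustrative assumptions; the mapping from social fact to emotion and from emotion to action is a stand-in for the agent and manifester logic.

```python
# Illustrative pipeline for the virtual-environment embodiment above.
# Names and mappings are assumptions for the sketch.

def stimulus_interpreter(stimulus):
    # Interpret a raw stimulus message into a "social fact" input event.
    return {"social_fact": stimulus["action"], "target": stimulus["target"]}

def socially_intelligent_agent(input_event):
    # Appraise the social fact and emit an emotion state/desire message.
    emotion = "happiness" if input_event["social_fact"] == "praise" else "sadness"
    return {"agent": input_event["target"], "emotion": emotion}

def emotion_manifester(emotion_message):
    # Convert the emotion state/desire message into a displayable action.
    action = "smile" if emotion_message["emotion"] == "happiness" else "frown"
    return {"agent": emotion_message["agent"], "action": action}

# Scenario environment: user input -> stimulus -> social fact -> emotion -> action
stimulus = {"action": "praise", "target": "AGENT1"}
action = emotion_manifester(socially_intelligent_agent(stimulus_interpreter(stimulus)))
print(action)   # {'agent': 'AGENT1', 'action': 'smile'}
```

The final action message is what the scenario environment would render graphically so that users can interpret the agent's response.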
The virtual environment may further comprise a scenario database, coupled to the scenario environment, for providing a cyberspace context that allows the socially intelligent agents to interact with each other. The cyberspace contexts can be quite varied and are limited only by the imagination of the software programmers creating the virtual environment of the application that is coupled to the SIA platform.
An SIA comprises a social response generator coupled to an emotion generator. The social response generator receives and processes an input event. The input event is processed according to a plurality of predefined personality trait indices stored in a personality trait register and a plurality of emotional state indices stored in an emotional state register. Each of the registers is associated with an agent. Subsequent to processing the input event, the social response generator outputs at least one social response message based on the predefined personality trait index that is output from the predefined personality trait register and the emotional state index output from the emotional state register. The social response message is output to an event buffer. The emotion generator captures the social response message from the event buffer, and outputs an emotion response message. The emotion generator creates the emotion response message based on at least one of a personality trait index that is output from the predefined personality trait register, the emotional state index output from the emotional state register and/or the plurality of emotional state indices. This embodiment of the present invention may, for example, be realized in computer firmware or electronic circuitry, or some combination of both hardware and firmware.
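The internal layout of one agent can be sketched as follows: a static personality trait register, a mutable emotional state register, a social response generator that writes to an event buffer, and an emotion generator that reads from it. Field and class names are assumptions for illustration; the generator logic is a placeholder for the rules described later in the specification.

```python
# Hedged sketch of one agent's internal structure as described above.
# Field names and the trivial generator rules are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class PersonalityTraitRegister:      # predefined; never modified at run time
    extraversion: float = 0.5
    agreeableness: float = 0.5

@dataclass
class EmotionalStateRegister:        # updated as input events are processed
    emotional_state_index: float = 0.0
    current_confidence_index: float = 0.0

class Agent:
    def __init__(self):
        self.traits = PersonalityTraitRegister()
        self.state = EmotionalStateRegister()
        self.event_buffer = []

    def social_response_generator(self, input_event):
        # Produce a social response message from traits + current state
        # and deposit it in the event buffer.
        message = {"event": input_event, "mood": self.state.emotional_state_index}
        self.event_buffer.append(message)

    def emotion_generator(self):
        # Consume the buffered social response and emit an emotion response.
        message = self.event_buffer.pop(0)
        return {"emotion": "neutral" if message["mood"] == 0.0 else "nonneutral"}

agent = Agent()
agent.social_response_generator("greeting")
print(agent.emotion_generator())   # {'emotion': 'neutral'}
```

Freezing the trait register mirrors the specification's point that personality traits are predefined and not affected by the generators' outputs.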
An agent uses a current emotional state index that has predefined thresholds indicative of the emotions of neutrality, happiness, sadness and anger. The agent also uses a current confidence index, wherein the confidence level of the agent is represented as a numerical value. In addition, in a virtual environment, agents have to be aware of each other and be able to react to each other as dictated by their emotional states and personality traits. Therefore, the emotional state register of an agent can further comprise one or more agent interrelationship indices, each indicative of the relationship between agents. An agent interrelationship index is used with another agent that receives an input event, with another agent that outputs an emotion response message, or with an agent that observes an input event or an emotion response message.
An additional refinement of the agents of the present invention is that the social response generator modifies the current state of the emotional state register based on the input event or the output from the predefined personality state register. For example, if an agent is in a virtual environment and the agent responds incorrectly to a particular input event (e.g., a school environment where a student agent gives an incorrect answer in response to a question from a professor agent), the emotional state register may be updated based on an output from the agent personality trait register, as well as the current emotion index in the emotional state register.
With respect to the predefined personality trait register, the personality of an agent comprises at least one of an intelligence index, a conscientiousness index, an extraversion index, an agreeableness index and an emotional stability index. Since a human being's personality traits are fairly stable and do not generally change, the personality traits of an agent according to the present invention are predefined in a particular agent's programming and are not affected by the outputs from the social response generator or the emotion generator. The predefined personality trait register may also comprise an agent social status index. The agent social status index is used to define relationships between agents that receive an input event, agents that output an emotion response message and/or agents that observe an input event or an emotion response message. These indices are useful in establishing a social hierarchy between individual agents and/or groups of agents.
As indicated above, an agent comprises an event buffer for storing social response messages. In an embodiment of the present invention, the event buffer comprises a first buffer and a second buffer. The social response messages are sorted into the first and second buffers depending upon the type of social response message that is output by the social response generator. For example, the social response generator generates an unexpected response flag, which is stored in the first buffer. The social response generator also generates a danger response flag that is stored in the second buffer. In addition, the social response generator generates a sensory input flag, which is stored in the second buffer. In human beings, different responses to external events are active for differing lengths of time. When a person is surprised, that response lasts only a short time. When a person senses danger, however, that response/awareness will likely last until the person no longer perceives a dangerous situation. In the present invention, the differing time lengths for these types of responses are implemented with event buffers having different validity lengths. Specifically, a social response message that is stored in the first buffer is maintained for a predetermined first period of time, and a social response message that is stored in the second buffer is maintained for a predetermined second period of time. In the present invention, a social response message that is stored in the first buffer is maintained for a shorter period of time than a social response message stored in the second buffer.
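The two-buffer scheme with different validity lengths can be sketched with simple expiry timestamps. The concrete durations below are assumptions chosen only to show the first buffer expiring before the second; the specification fixes only their ordering, not their values.

```python
# Sketch of the dual event buffer: unexpected-response flags (first
# buffer) expire quickly; danger/sensory flags (second buffer) persist.
# The TTL values are illustrative assumptions.
import time

FIRST_BUFFER_TTL = 0.5    # surprise fades quickly (seconds, assumed)
SECOND_BUFFER_TTL = 5.0   # danger awareness persists (seconds, assumed)

class EventBuffer:
    def __init__(self):
        self.first = []   # (expiry_time, message) for unexpected-response flags
        self.second = []  # (expiry_time, message) for danger/sensory flags

    def store(self, message, flag, now=None):
        now = time.monotonic() if now is None else now
        if flag == "unexpected":
            self.first.append((now + FIRST_BUFFER_TTL, message))
        else:  # "danger" or "sensory"
            self.second.append((now + SECOND_BUFFER_TTL, message))

    def valid_messages(self, now=None):
        # Only messages whose validity length has not elapsed are returned.
        now = time.monotonic() if now is None else now
        return [m for exp, m in self.first + self.second if exp > now]

buf = EventBuffer()
buf.store("startled", "unexpected", now=0.0)
buf.store("threat nearby", "danger", now=0.0)
print(buf.valid_messages(now=1.0))   # ['threat nearby']
```

One second after both messages were stored, the surprise has expired while the danger awareness remains, matching the human analogy in the text.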
After the social response generator has processed the input event and output a social response message (if dictated by the agent programming) and updated the emotional state register and/or the current emotion index (if dictated by the agent programming), the emotion generator creates and outputs an emotion response message. There are different emotion response messages, and the emotion response messages are output into emotion categories. As with the event buffer, the emotion categories have differing validity lengths. The generated emotion response messages are based on at least one or more outputs from the predefined personality trait register, one or more outputs from the emotional state register and/or the social response message stored in the event buffer.
The emotion categories comprise at least a lasting emotion category, a short-lived emotion category and a momentary emotion category. For the momentary emotion category, its validity length is determined by an unexpected response flag generated by the social response generator. Accordingly, the emotion generator generates an emotion response message for the momentary emotion category that comprises at least a surprise indicator. For the short-lived emotion category, its validity length is determined by a danger response flag or a sensory input response flag generated by the social response generator. The emotion generator generates an emotion response message for the short-lived emotion category that comprises at least indices indicative of disgust or fear. Finally, the emotion generator generates an emotion response message for the lasting emotion category that comprises indices indicative of neutrality, happiness, sadness or anger.
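The routing from response flag to emotion category can be sketched as a lookup. The mappings follow the text (unexpected response → momentary/surprise; danger or sensory input → short-lived/fear or disgust; otherwise lasting); the function name and dictionary layout are assumptions for the example.

```python
# Sketch of emotion-category routing as described above. The flag set by
# the social response generator selects the category and its indicator.

CATEGORY_BY_FLAG = {
    "unexpected_response": "momentary",    # surprise
    "danger_response":     "short_lived",  # fear
    "sensory_input":       "short_lived",  # disgust
}

EMOTION_BY_FLAG = {
    "unexpected_response": "surprise",
    "danger_response":     "fear",
    "sensory_input":       "disgust",
}

def emotion_response(flag=None, lasting_emotion="neutrality"):
    """Route a flag (or the absence of one) to its emotion category."""
    if flag is None:
        # No flag: a lasting emotion (neutrality, happiness, sadness, anger).
        return {"category": "lasting", "emotion": lasting_emotion}
    return {"category": CATEGORY_BY_FLAG[flag], "emotion": EMOTION_BY_FLAG[flag]}

print(emotion_response("unexpected_response"))
# {'category': 'momentary', 'emotion': 'surprise'}
```

Combined with the event-buffer validity lengths, this gives momentary emotions the shortest lifetime and lasting emotions an open-ended one.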
In an alternative embodiment of the present invention, an agent comprises a social response state machine coupled to an emotion state machine. The social response state machine receives and processes an input event. The input event is processed according to a plurality of predefined personality trait indices stored in a personality trait register and a plurality of emotional state indices stored in an emotional state register. Each of the registers is associated with an agent. Subsequent to processing the input event, the social response state machine outputs at least one social response message based on the predefined personality trait index that is output from the predefined personality trait register and the emotional state index output from the emotional state register. The social response message is output to an event buffer. The emotion state machine captures the social response message from the event buffer, and outputs an emotion response message. The emotion state machine creates the emotion response message based on at least one of a personality trait index that is output from the predefined personality trait register, the emotional state index output from the emotional state register and/or the plurality of emotional state indices. This embodiment of the present invention includes all the features of the first embodiment described above with respect to the characteristics and operation of the social response state machine, the emotion state machine, the emotional state register, the personality trait register, the event buffer, the emotion categories, etc.
In another alternative embodiment of the present invention, an article of manufacture is provided, comprising a computer readable medium having stored therein a computer program for a software agent, the computer program comprising first and second code portions that are executed on a computer and/or computer system. The first code portion receives and processes an input event. The input event is processed according to a plurality of predefined personality trait indices stored in a personality trait register and a plurality of emotional state indices stored in an emotional state register. Each of the registers is associated with an agent. Subsequent to processing the input event, the first code portion outputs at least one social response message based on the predefined personality trait index that is output from the predefined personality trait register and the emotional state index output from the emotional state register. The social response message is output to an event buffer. The second code portion captures the social response message from the event buffer, and outputs an emotion response message. The second code portion creates the emotion response message based on at least one of a personality trait index that is output from the predefined personality trait register, the emotional state index output from the emotional state register and/or the plurality of emotional state indices. This embodiment of the present invention includes all the features of the first embodiment described above with respect to the characteristics and operation of the first code portion, the second code portion, the emotional state register, the personality trait register, the event buffer, the emotion categories, etc.
As discussed in the background section, agents need a virtual environment for operation. According to another embodiment of the present invention, such a virtual environment would be suitable for a plurality of agents as described above. The environment may comprise a scenario environment that receives user inputs and outputs stimulus messages. The stimulus messages are received by a stimulus interpreter, which interprets the stimulus messages and outputs social facts as input events to the plurality of software agents. The software agents, in turn, process the social facts as discussed above and output one or more emotion response messages. An emotion manifester receives the emotion response messages output from the plurality of agents and converts the emotion response messages into action messages. Finally, the scenario environment receives the action messages and converts them into graphical representations of the software agents' emotional responses so that the users are able to visually interpret the actions/responses of the software agents.
The virtual environment may further comprise a scenario database, coupled to the scenario environment, for providing a cyberspace context that allows the plurality of agents to interact with each other. The cyberspace contexts can be quite varied and are limited only by the imagination of the software programmers creating the virtual environment.
The virtual environment can further comprise a first role database coupled to the stimulus interpreter. The first role database comprises social characteristics used by the stimulus interpreter to create input events that are sent to the plurality of software agents. The virtual environment can further comprise a second role database coupled to the emotion manifester. The second role database comprises information used to convert the emotion response messages received from the plurality of software agents into action messages.
In an alternative embodiment, the virtual environment further comprises a scenario database, coupled to the scenario environment, for providing a cyberspace context that graphically depicts the interaction between the plurality of software agents based on the action messages received from the emotion manifester. The scenario environment sends a first type of command to the stimulus interpreter, which outputs a stimulus message to the plurality of software agents and forwards the command to the emotion manifester. The scenario environment can also send a second type of command to the stimulus interpreter, which outputs a stimulus message only to the plurality of software agents.
In another alternative embodiment, the present invention provides an article of manufacture that comprises a computer readable medium having stored therein a computer program. The computer program comprises a first code portion which, when executed on a computer, provides a plurality of software agents. The computer program further comprises a second code portion which, when executed on a computer, provides a scenario environment that receives user inputs and outputs stimulus messages. The computer program further comprises a third code portion which, when executed on a computer, provides a stimulus interpreter that interprets the stimulus messages and outputs social facts as input events to the plurality of software agents. The computer program further comprises a fourth code portion which, when executed on a computer, provides an emotion manifester that receives the emotion response messages output from the plurality of agents and converts the emotion response messages into action messages. The scenario environment receives the action messages and converts them into graphical representations of the software agents' emotional responses.
The above and other aspects and advantages of the invention will become apparent from the following detailed description and with reference to the accompanying drawing figures.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the aspects, advantages and principles of the invention. In the drawings,
Hereinafter, an illustrative, non-limiting embodiment of the present invention will be described in detail with reference to the accompanying drawings.
Referring to
In the context of the socially intelligent agent 10, states are variable information that each agent has at initialization. For example, states include the emotions of a socially intelligent agent. An emotional state is given to a socially intelligent agent at initialization, and updated according to social rules that are used for the generation of behavior of the socially intelligent agent 10.
Referring to
The ranges illustrated in TABLE 1 are exemplary in nature and can be adapted/changed to suit a particular type of socially intelligent agent. For the negative emotions, TABLE 1 illustrates that a particular range of the emotional state index 21 is interpreted as both sadness and anger. For example, if the emotional state index 21 contains a value in the range of −0.25 to −0.11, the emotional state of the socially intelligent agent can be interpreted as slightly sad or slightly angry. The dual interpretations are based on the indices of the predefined personality trait register 11 (See
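The dual interpretation of the negative band can be sketched as a threshold lookup. The −0.25 to −0.11 band follows TABLE 1 as quoted above; the positive-side cutoffs and the rule that low agreeableness tilts the reading toward anger are assumptions introduced for the example.

```python
# Sketch of interpreting the emotional state index 21 against TABLE 1
# ranges. The negative band -0.25..-0.11 is from the text; the other
# cutoffs and the agreeableness-based disambiguation are assumptions.

def interpret_emotional_state(index, agreeableness=0.5):
    if index >= 0.11:                       # assumed happiness threshold
        return "happy"
    if index > -0.11:
        return "neutral"
    if index >= -0.25:
        # Band shared by sadness and anger; disambiguate via a trait
        # from the predefined personality trait register (assumed rule).
        return "slightly angry" if agreeableness < 0.5 else "slightly sad"
    return "angry" if agreeableness < 0.5 else "sad"

print(interpret_emotional_state(-0.2, agreeableness=0.7))   # slightly sad
print(interpret_emotional_state(-0.2, agreeableness=0.3))   # slightly angry
```

The same index value thus yields different emotion labels for agents with different personalities, which is the point of consulting the trait register.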
A. Bandura (See Self-efficacy (1994) (V. S. Ramachaudran (Ed.), Encyclopedia of human behavior (Vol. 4, pp. 71-81), New York: Academic Press (Reprinted in H. Friedman [Ed.], Encyclopedia of mental health, San Diego: Academic Press, 1998))) describes self-efficacy (i.e., confidence) as a fundamental psychological construct, defining it as “people's beliefs about their capabilities to produce designated levels of performance that exercise influence over events that affect their lives.” Bandura further explains that perceived self-efficacy exerts influence over the character of emotion experienced when encountering a task. For example, if a person has a low confidence level when encountering a task, that person may become nervous or afraid. This concept fits well within the framework of appraisal theories of emotion discussed earlier. Bandura further discloses that a person boosts their self-confidence by successfully accomplishing tasks. Conversely, failing to accomplish tasks lowers a person's confidence level.
In an embodiment of the present invention, the socially intelligent agent 10 uses a current confidence index 22, wherein the confidence level of the agent is represented as a numerical value. For a socially intelligent agent 10 having a neutral confidence state, the current confidence index 22 will be approximately 0.0. If the current confidence index 22 exceeds a predefined threshold (e.g., 0.45), the socially intelligent agent 10 will exhibit confident behavior. For example, a socially intelligent agent 10 having a high positive value in its current confidence index 22 may comment on the current relationship between other agents, comment on the behavior of another socially intelligent agent, or act in an assured manner with socially intelligent agents in the context of a virtual environment. Conversely, a socially intelligent agent 10 having a high negative value in its current confidence index 22 (e.g., −0.63) that exceeds a predefined threshold will manifest unconfident behavior. As with the emotional state index, the thresholds of the current confidence index 22 can be manipulated based on the type of socially intelligent agent that is desired.
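The confidence thresholds above can be sketched directly. The upper threshold 0.45 is the text's example; the symmetric lower threshold is an assumption (the text gives −0.63 only as an instance of a value exceeding it).

```python
# Sketch of the current confidence index 22 thresholds described above.
# The upper threshold follows the text's example; the lower one is an
# assumed symmetric value.

UPPER_THRESHOLD = 0.45    # from the example in the text
LOWER_THRESHOLD = -0.45   # assumed; the text cites -0.63 as an instance beyond it

def confidence_behavior(current_confidence_index):
    if current_confidence_index > UPPER_THRESHOLD:
        return "confident"      # e.g. comments on other agents' relationships
    if current_confidence_index < LOWER_THRESHOLD:
        return "unconfident"
    return "neutral"

print(confidence_behavior(0.5))     # confident
print(confidence_behavior(-0.63))   # unconfident
print(confidence_behavior(0.0))     # neutral
```

Tuning the two thresholds widens or narrows the neutral band, which is how the agent designer shapes how readily the agent displays confident or unconfident behavior.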
In addition, in a virtual environment, socially intelligent agents have to be aware of each other and be able to react to each other as dictated by their emotional states and personality traits. Therefore, the emotional state register 12 of a socially intelligent agent 10 can further comprise one or more agent interrelationship indices 23m, 23n, 23x for another agent that receives an input event 16 and/or that outputs an emotion response message 17. Depending upon the complexity of the social context within a given virtual environment, a socially intelligent agent 10 may comprise one or more agent interrelationship indices in various combinations. For example, if a particular virtual environment is supporting four socially intelligent agents (AGENT1, AGENT2, AGENT3 and AGENT4), each of the agents will have multiple agent interrelationship indices. For example, AGENT1 will have an agent interrelationship index(AGENT2) directed towards AGENT2, an agent interrelationship index(AGENT3) directed towards AGENT3 and an agent interrelationship index(AGENT4) directed towards AGENT4. Of course, the agent interrelationship index 23 can be implemented as individual registers or memory locations, or as an array with an identifier of a particular socially intelligent agent acting as the index into the array. The social response generator 13 can call on these various indices as its programming warrants.
A discussion of the values used in the various interrelationship indices, and how they indicate relationships between socially intelligent agents, follows. AGENT1 has a neutral relationship with AGENT2, i.e., the agent interrelationship index of AGENT1 for AGENT2 is approximately 0.0. This value is the default value for the agent interrelationship index 23 between two socially intelligent agents. If the AGENT1/AGENT2 interrelationship index becomes more positive, it means that AGENT1 likes AGENT2, and the absolute value of the index represents how much AGENT1 likes AGENT2. In addition, AGENT1 has an unpleasant relationship with AGENT3 when the AGENT1/AGENT3 interrelationship index is less than 0.0. If the AGENT1/AGENT3 interrelationship index becomes more negative, it means AGENT1 dislikes AGENT3, and the absolute value of the AGENT1/AGENT3 interrelationship index represents how much AGENT1 dislikes AGENT3. Of course, AGENT2, AGENT3 and AGENT4 each have their own interrelationship indices with AGENT1, as well as with each other.
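The sign-and-magnitude convention above can be sketched with the array-style implementation the text mentions, keyed by agent identifier. The concrete values below are illustrative assumptions.

```python
# Sketch of interrelationship indices as a per-agent mapping (the "array
# indexed by agent identifier" implementation the text mentions).
# Values are illustrative: 0.0 neutral, positive likes, negative dislikes.

interrelationship = {
    "AGENT1": {"AGENT2": 0.0, "AGENT3": -0.4, "AGENT4": 0.7},
}

def relationship(holder, other, table=interrelationship):
    value = table[holder][other]
    if value > 0:
        return "likes (strength %.1f)" % abs(value)
    if value < 0:
        return "dislikes (strength %.1f)" % abs(value)
    return "neutral"

print(relationship("AGENT1", "AGENT2"))   # neutral
print(relationship("AGENT1", "AGENT3"))   # dislikes (strength 0.4)
print(relationship("AGENT1", "AGENT4"))   # likes (strength 0.7)
```

Each agent would hold its own such mapping, so the AGENT1→AGENT3 index and the AGENT3→AGENT1 index can differ, allowing asymmetric relationships.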
Human beings naturally recognize and respond to the personalities of other individuals. Since an individual's personality is one of their more consistent mental aspects, it is typically used as a predictive indicator of the emotional state and possible behaviors of an individual. Although an individual's personality does not radically transform within short time intervals, prolonged exposure to a particular environment can induce some change. In the realm of psychology, five indicia are known to characterize some of the major attributes of personality. See D. Moffat, Personality Parameters and Programs, Creating Personalities for Synthetic Actors, Springer (1997). The indicia are openness/intellect, conscientiousness, extraversion, agreeableness and emotional stability. Reeves and Nass claim that friendliness and dominance are two major attributes of personality, especially that of mediated agents. See B. Reeves and C. Nass, The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places, CSLI Publications and Cambridge University Press, New York, 1996.
In the context of the present invention, traits are static information that does not change during a session in a virtual environment. For a socially intelligent agent 10, traits include personality and social status. Referring to
The predefined personality trait register 11 will now be described in greater detail. The intelligence index 31 represents the degree of openness to experience and/or intellect of the socially intelligent agent 10. In the exemplary embodiment, the intelligence index 31 ranges from 0.0 to 1.0. An average socially intelligent agent 10 will have an intelligence index 31 of approximately 0.5. If the intelligence index 31 is greater than 0.5, the socially intelligent agent 10 will be imaginative, curious, creative, adventurous, original, artistic, etc. Conversely, if the intelligence index 31 is less than 0.5, the socially intelligent agent 10 will act in a conventional manner, will avoid the unfamiliar, will be inartistic, will lack imagination, etc.
The conscientiousness index 32 represents the degree of conscientiousness of a socially intelligent agent. In the exemplary embodiment, the conscientiousness index 32 ranges from 0.0 to 1.0. An average socially intelligent agent 10 will have a conscientiousness index 32 of approximately 0.5. If the conscientiousness index 32 is greater than 0.5, a socially intelligent agent will be cautious, disciplined, organized, neat, ambitious, goal-oriented, etc. On the other hand, if the conscientiousness index 32 is less than 0.5, a socially intelligent agent will be unreliable, lazy, careless, negligent, low on need for achievement, etc.
The extraversion index 33 represents the degree of extraversion of a socially intelligent agent. In the exemplary embodiment, the extraversion index 33 ranges from 0.0 to 1.0. An average socially intelligent agent 10 will have an extraversion index 33 of approximately 0.5. If the extraversion index 33 is greater than 0.5, a socially intelligent agent will be talkative, optimistic, sociable, friendly, high in need for stimulation, etc. Conversely, if the extraversion index 33 is less than 0.5, a socially intelligent agent will be quiet, conventional, less assertive, aloof, etc.
The agreeableness index 34 represents the degree of agreeableness of a socially intelligent agent. In the exemplary embodiment, the agreeableness index 34 ranges from 0.0 to 1.0. An average socially intelligent agent 10 will have an agreeableness index 34 of approximately 0.5. If the agreeableness index 34 is greater than 0.5, a socially intelligent agent will be compassionate, caring, good-natured, trusting, cooperative, helpful, etc. On the other hand, if the agreeableness index 34 is less than 0.5, a socially intelligent agent will be irritable, rude, competitive, unsympathetic, self-centered, etc.
The emotional stability index 35 represents the degree of neuroticism and/or emotional stability of a socially intelligent agent. In the exemplary embodiment, the emotional stability index 35 ranges from 0.0 to 1.0. An average socially intelligent agent 10 will have an emotional stability index 35 of approximately 0.5. If the emotional stability index 35 is greater than 0.5, a socially intelligent agent will be relaxed, calm, secure, unemotional, even-tempered, etc. Conversely, if the emotional stability index 35 is less than 0.5, a socially intelligent agent will be anxious, nervous, worrying, insecure, emotional, etc.
The predefined personality trait register 11 may also comprise an agent social status index 36m, 36n, 36x. These indices are useful in establishing a social hierarchy between individual agents and/or groups of agents. In the exemplary embodiment, the agent social status index 36m, 36n, 36x is an integer value that is greater than zero. Each socially intelligent agent will have its own social status index, and each socially intelligent agent can refer to the social status index of other socially intelligent agents. For example, if a socially intelligent AGENT1 wants to refer to the social status of socially intelligent AGENT2, AGENT1 can refer to the social status index of AGENT2, i.e., agent social status index(AGENT2). Depending upon the complexity of the social context within a given virtual environment, a socially intelligent agent 10 may comprise one or more agent social status indices in various combinations. Of course, the agent social status index 36 can be implemented as individual registers or memory locations, or as an array with an identifier of a particular socially intelligent agent acting as the index into the array. The social response generator 13 can call on these various indices as its programming warrants.
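The complete trait register described above (the five indices in [0.0, 1.0] with 0.5 as average, plus integer social status indices keyed by agent) can be sketched as an immutable record, reflecting the specification's point that traits are static for the session. Field names and the example values are assumptions.

```python
# Sketch of the predefined personality trait register 11: five trait
# indices (0.0..1.0, 0.5 = average) plus per-agent social status values.
# Frozen because traits are static; field names are assumptions.
from dataclasses import dataclass, field
from types import MappingProxyType

@dataclass(frozen=True)
class PredefinedPersonalityTraitRegister:
    intelligence: float = 0.5
    conscientiousness: float = 0.5
    extraversion: float = 0.5
    agreeableness: float = 0.5
    emotional_stability: float = 0.5
    # Social status indices of other agents, keyed by agent identifier
    # (the array-style implementation the text mentions).
    social_status: MappingProxyType = field(
        default_factory=lambda: MappingProxyType({}))

traits = PredefinedPersonalityTraitRegister(
    extraversion=0.8,
    social_status=MappingProxyType({"AGENT2": 3, "AGENT3": 1}))

print(traits.extraversion > 0.5)        # True: talkative, sociable, ...
print(traits.social_status["AGENT2"])   # 3
```

Using a frozen dataclass and a read-only mapping makes it impossible for the social response generator or emotion generator to mutate the traits, enforcing the invariant in code rather than by convention.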
Referring to
An additional refinement of the agents of the present invention is that the social response generator 13 modifies the current state of the emotional state register 12 based on the input event 16 or the output from the predefined personality trait register 11. For example, if a socially intelligent agent 10 is in a virtual environment and the agent responds incorrectly to a particular input event 16 (e.g., a school environment where the agent gives an incorrect answer in response to a question from a professor agent), the emotional state register 12 may be updated based on an output from the agent personality trait register 11, as well as the emotional state index 21 in the emotional state register 12.
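The classroom example above can be sketched as a state update. The update formula below is purely an assumption for illustration; the specification does not disclose the actual formula, only that the emotional state register 12 is updated based on the input event and the personality trait register output.

```python
def update_emotional_state(emotional_state_index, emotional_stability, correct):
    # Hypothetical update rule: an incorrect answer lowers the emotional
    # state, and a more emotionally stable agent reacts less strongly.
    delta = 0.2 * (1.0 - emotional_stability)
    if correct:
        emotional_state_index += delta
    else:
        emotional_state_index -= delta
    # Clamp to the range used elsewhere for emotional indices.
    return max(-1.0, min(1.0, emotional_state_index))

# A neutral agent of average stability gives an incorrect answer.
new_state = update_emotional_state(0.0, 0.5, correct=False)
```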
Referring to
In the exemplary embodiment, the TASK_FEEDBACK function is divided into two sub-functions that are executed based on how the input event 16 is directed to the socially intelligent agent 10. Specifically, if the socially intelligent agent 10 is the receiver of the input event 16, then the first of the two sub-functions is executed. If the socially intelligent agent 10 is the sender of the input event 16 or is observing the input event 16 (observing in this context means that one socially intelligent agent is aware that another socially intelligent agent is sending/receiving the input event 16, but the observing socially intelligent agent neither receives nor sends the input event), then the second of the two sub-functions is executed.
If a socially intelligent agent is receiving an input event 16 that requires feedback, the first sub-function of the TASK_FEEDBACK function is executed. The TASK_FEEDBACK function includes three input parameters: the identifier of the agent that sent the task (sender), the identifier of the agent that receives the task, and the degree of the task. The degree parameter is an indication of the strength of the behavior. The first sub-function of the TASK_FEEDBACK function first executes the processes for the event buffer 15 for storing social response messages 18. In the exemplary embodiment, if the agent's current confidence index 22 is above a predefined threshold (i.e., Threshold-1) and the degree of behavior is greater than zero, the function UNEXPECTED_EVENT is called to store an unexpected event indication in the event buffer 15. If the agent's current confidence index 22 is below a predefined threshold (i.e., Threshold-2) and the degree of behavior is less than zero, the function UNEXPECTED_EVENT is called to store an unexpected event indication in the event buffer 15. The Threshold-1 and Threshold-2 factors can be manipulated to fine-tune the social responses of the socially intelligent agent. After determining whether to set an unexpected event indication, the first sub-function then updates the emotional state index 21, the current confidence index 22 and the agent interrelationship index 23. First, the sub-function calculates an emotion delta (i.e., delta-emotion in
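The threshold test of the first sub-function can be sketched as follows. The direction of the comparisons follows the text above; the specific threshold values and the function names in lower case are assumptions for illustration.

```python
THRESHOLD_1 = 0.7   # tunable confidence boundary (Threshold-1)
THRESHOLD_2 = 0.3   # tunable confidence boundary (Threshold-2)

def unexpected_event(event_buffer):
    # Store an unexpected event indication in the event buffer 15.
    event_buffer.append("UNEXPECTED_EVENT")

def task_feedback_receiver(confidence_index, degree, event_buffer):
    # Confidence above Threshold-1 with a positive degree of behavior,
    # or confidence below Threshold-2 with a negative degree of behavior,
    # triggers an unexpected event indication.
    if confidence_index > THRESHOLD_1 and degree > 0:
        unexpected_event(event_buffer)
    elif confidence_index < THRESHOLD_2 and degree < 0:
        unexpected_event(event_buffer)

buf = []
task_feedback_receiver(confidence_index=0.9, degree=0.5, event_buffer=buf)

# A mid-range confidence index triggers neither branch.
buf2 = []
task_feedback_receiver(confidence_index=0.5, degree=0.5, event_buffer=buf2)
```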
The TASK_FEEDBACK function uses several sub-functions to accomplish its desired results. In the exemplary embodiment, since several of the personality trait indices and the emotional indices are restricted to values in the range of −1.0 to 1.0, the capAt1 function range-limits the calculations performed by the TASK_FEEDBACK function. For example, capAt1(0.3) would return 0.3, capAt1(1.3) would return 1.0 and capAt1(−1.6) would return −1.0. The sub-function setDesiredBehavior(x, y, z) sets an index for performing the behavior identified by parameter x towards another socially intelligent agent identified by y with a degree of behavior z. For example, the function call "setDesiredBehavior(SOCIAL_FEEDBACK, AGENT3, 0.5)" means give social feedback towards the agent with the identifier AGENT3 with a degree of behavior equal to 0.5.
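These two sub-functions can be sketched directly from the worked examples above. The clamping behavior of capAt1 follows the three examples given in the text; the storage structure used by setDesiredBehavior is an assumption, since the specification only describes it as setting an index.

```python
def cap_at_1(value):
    # Clamp a calculation result to the range [-1.0, 1.0], matching
    # capAt1(0.3) -> 0.3, capAt1(1.3) -> 1.0, capAt1(-1.6) -> -1.0.
    return max(-1.0, min(1.0, value))

# Hypothetical buffer for the index related to behavior.
desired_behavior = {}

def set_desired_behavior(behavior, target_agent, degree):
    # Set an index for performing `behavior` towards `target_agent`
    # with a degree of behavior `degree`.
    desired_behavior.update(behavior=behavior, target=target_agent,
                            degree=degree)

set_desired_behavior("SOCIAL_FEEDBACK", "AGENT3", 0.5)
```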
The second sub-function of the TASK_FEEDBACK function is called if the socially intelligent agent 10 has sent the input event 16 or is observing the input event 16. First, the emotional state index 21 and the current confidence index 22 of the agent are updated in a similar manner as discussed above with respect to an agent that receives an input event 16, although the formulas used are different. Next, the emotional state index 21, the various social status indices and the interrelationship index for the agent receiving the input event 16 are examined. In the exemplary embodiment, different combinations of the emotional state index 21 and the agent social status index 36 of the sending/observing agent, and the interrelationship index 23 and agent social status index 36 of the receiving agent are examined, and based on their results, the sub-function setDesiredBehavior(x, y, z) is called to set an index for a behavior directed to a particular agent. Depending upon the values of the various indices, the giving of social feedback in response to an input event 16 may or may not occur. The invocation of setDesiredBehavior(null, null, null) clears the buffer for storing the index related to behavior.
Referring to
Referring to
As indicated above, a socially intelligent agent 10 comprises an event buffer 15 for storing social response messages 18. In an embodiment of the present invention, the event buffer 15 comprises a first buffer 15A and a second buffer 15B. The social response messages 18 are sorted into the first and second buffers dependent upon the type of social response message 18 that is output by the social response generator 13. For example, the social response generator 13 generates an unexpected response flag, which is stored in the first buffer 15A. The social response generator 13 also generates a danger response flag that is stored in the second buffer 15B. In addition, the social response generator 13 generates a sensory input flag, which is stored in the second buffer 15B. In human beings, different responses to external events are active for differing lengths of time. When a person is surprised, that response only lasts a short time. When a person senses danger, however, that response/awareness will likely last until the person no longer feels a sense of danger. In the present invention, the differing time lengths for these types of responses are implemented with event buffers 15A, 15B having different validity lengths. Specifically, a social response message 18 that is stored in the first buffer 15A is maintained for a predetermined first period of time, and a social response message 18 that is stored in the second buffer 15B is maintained for a predetermined second period of time. In the present invention, a social response message 18 that is stored in the first buffer 15A is maintained for a shorter period of time than a social response message 18 stored in the second buffer 15B.
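The two buffers with differing validity lengths can be sketched as timestamped queues. The class name, the timestamp mechanism and the specific durations below are assumptions; the specification fixes only the relative relationship (the first buffer's validity period is shorter than the second's).

```python
import time

class TimedEventBuffer:
    def __init__(self, validity_seconds):
        self.validity = validity_seconds
        self.entries = []   # list of (timestamp, message) pairs

    def store(self, message, now=None):
        self.entries.append((time.monotonic() if now is None else now,
                             message))

    def valid_messages(self, now=None):
        # Discard messages older than this buffer's validity length.
        now = time.monotonic() if now is None else now
        self.entries = [(t, m) for t, m in self.entries
                        if now - t < self.validity]
        return [m for _, m in self.entries]

# First buffer 15A: short validity (e.g. surprise); second buffer 15B:
# longer validity (e.g. danger, sensory input). Durations are illustrative.
buffer_15a = TimedEventBuffer(validity_seconds=2.0)
buffer_15b = TimedEventBuffer(validity_seconds=30.0)
buffer_15a.store("unexpected_response", now=0.0)
buffer_15b.store("danger_response", now=0.0)

# Ten seconds later the surprise has expired but the danger persists.
at_10s_a = buffer_15a.valid_messages(now=10.0)
at_10s_b = buffer_15b.valid_messages(now=10.0)
```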
Referring to
Referring to
Referring to
Referring to
The emotion categories comprise at least a lasting emotion category, a short-lived emotion category and a momentary emotion category. For the momentary emotion category, its validity length is determined by an unexpected response flag generated by the social response generator 13. Accordingly, the emotions generator 14 generates an emotion response message 17 for the momentary emotion category that comprises at least a surprise value. For the short-lived emotion category, its validity length is determined by a danger response flag or a sensory input response flag generated by the social response generator 13. The emotions generator 14 generates an emotion response message 17 for the short-lived emotion category that comprises at least a disgust value or a fear value. Finally, the emotions generator 14 generates an emotion response message 17 for the lasting emotion category that comprises at least a neutrality value, a happiness value, a sadness value or an anger value.
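The three emotion categories and their governing flags can be summarized in a lookup table. The dictionary layout and the function below are illustrative assumptions; the category memberships and flag associations follow the paragraph above.

```python
EMOTION_CATEGORIES = {
    "momentary": {
        # Validity length governed by the unexpected response flag.
        "flags": ("unexpected_response",),
        "emotions": ["surprise"],
    },
    "short_lived": {
        # Validity length governed by the danger response flag or the
        # sensory input response flag.
        "flags": ("danger_response", "sensory_input_response"),
        "emotions": ["disgust", "fear"],
    },
    "lasting": {
        # Not governed by a buffered flag; persists in the emotional state.
        "flags": (),
        "emotions": ["neutrality", "happiness", "sadness", "anger"],
    },
}

def category_for_emotion(emotion):
    # Reverse lookup: which category does a given emotion belong to?
    for name, spec in EMOTION_CATEGORIES.items():
        if emotion in spec["emotions"]:
            return name
    raise KeyError(emotion)

cat = category_for_emotion("fear")
```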
More specifically, in
With respect to the category of short-lived emotions, in the exemplary embodiment of the present invention, the emotional responses of fear and disgust are defined. As shown in
For the lasting emotion category, the emotion generator 14 examines the emotional state index 41 and the agreeableness index 64 of the socially intelligent agent to output the emotions of neutrality, sadness, happiness or anger. The emotions of sadness, happiness and anger are further shaded with the modifiers of slightly and extremely. In the exemplary embodiment of the emotion generator 14, the various constants are defined as follows in Table 3:
Referring to
The socially intelligent agent 150 further comprises an interpreter 156, which receives an Input_Event message. The cyberspace context that the socially intelligent agent 150 is operating within generates an application dependent task event according to the cyberspace context and sends the application dependent task event to all the socially intelligent agents attached to the context. For example, in a cyberspace context involving a plurality of socially intelligent agents as students and an additional socially intelligent agent acting as a professor, a typical application dependent task event might be PROFESSOR_CALL_STUDENT, which is sent to all the socially intelligent agents. When the interpreter 156 receives an application dependent task event as an Input_Event message, the application dependent task event is processed based on information from the emotional state register 152, the personality trait register 151 and a role database 158, which contains social characteristic information. Alternatively, the interpreter 156 forwards the Input_Event message to the social response generator 153 as a Social_Event message without any further processing using information from the emotional state register 152, the personality trait register 151 and the role database 158. After the social response generator 153 has processed the Social_Event message received from the interpreter 156, the social response generator 153 sends an Event_Processed flag to the interpreter 156.
The socially intelligent agent 150 further comprises a manifester 157 that coordinates the manifestation of the socially intelligent agent's emotional response to the Social_Event message. After receiving the Event_Processed flag from the social response generator 153, the interpreter 156 outputs a Manifest_Emotion flag to the manifester 157 to begin the process of manifesting the socially intelligent agent's emotional response. Based on the social characteristics in the role database 158, the manifester 157 sends a Generate_Emotion flag to the emotion generator 154. The emotion generator 154 uses information from the emotional state register 152, the personality register 151 and the event buffers 155A, 155B to generate the socially intelligent agent's emotional response (if one is required) to the Social_Event message. It might be, based on the social characteristics in the role database 158, that no emotional response is required and the manifester 157 does not issue a Generate_Emotion flag to the emotion generator 154. If the manifester 157 issues a Generate_Emotion flag, the emotion generator 154 outputs an Emotion_Response message to the manifester 157 based on information from the emotional state register 152, the personality register 151 and the event buffers 155A, 155B. The manifester 157 uses the Emotion_Response message, plus information from the emotional state register 152, the personality register 151 and the role database 158 to formulate an Agent_Behavior message that is indicative of the socially intelligent agent's response to a Social_Event message.
Referring to
Referring to
In the context of the socially intelligent agent 150, states are variable information that each agent has at initialization. For example, states include the emotions of a socially intelligent agent. An emotional state is given to a socially intelligent agent at initialization, and updated according to social rules that are used for the generation of behavior of the socially intelligent agent 150.
As discussed in the background section, agents need a virtual environment for operation. According to another embodiment of the present invention, such a virtual environment would be suitable for one or more socially intelligent agents as described above. Referring to
The virtual environment may further comprise a scenario database 185, coupled to the scenario environment 181, for providing a cyberspace context that allows the socially intelligent agents 183 to interact with each other. The cyberspace contexts can be quite varied and are limited only by the imagination of the software programmers creating the virtual environment.
The virtual environment can further comprise a first role database 186 coupled to the stimulus interpreter 182. The first role database 186 comprises social characteristics used by the stimulus interpreter 182 to create input events that are sent to the socially intelligent agents 183. The virtual environment can further comprise a second role database 187 coupled to the emotion manifester 184. The second role database 187 comprises information used to convert the emotion states/desires received from the socially intelligent agents 183 into action messages.
Referring to FIG. 18 in further detail, the virtual environment comprises a scenario database 185, coupled to the scenario environment 181, for providing a cyberspace context that graphically depicts the interaction between the socially intelligent agents 183 based on the action messages received from the emotion manifester 184. The scenario environment sends a first type of command to the stimulus interpreter 182, which outputs a stimulus message to the socially intelligent agents 183 and forwards the command to the emotion manifester 184. This first type of command is based upon a user input that is inputted into the scenario environment 181. This command also incorporates a response mechanism, whereby the socially intelligent agents 183 respond back to the emotion manifester 184, which outputs the result of the command back to the scenario environment 181 as action messages. The scenario environment 181 converts the action messages into graphical representations of the socially intelligent agents' emotional responses so that the users are able to visually interpret the actions/responses of the socially intelligent agents 183. The scenario environment 181 can also send a second type of command to the stimulus interpreter 182, which outputs a stimulus message only to the socially intelligent agents 183. There is no response from the socially intelligent agents 183 in response to this type of command.
In another alternative embodiment, the present invention provides an article of manufacture that comprises a computer readable medium having stored therein a computer program. The computer program comprises a first code portion which, when executed on a computer, provides the socially intelligent agents 183. The computer program further comprises a second code portion which, when executed on a computer, provides a scenario environment 181 that receives user inputs and outputs stimulus messages. The computer program further comprises a third code portion which, when executed on a computer, provides a stimulus interpreter 182 that interprets the stimulus messages and outputs social facts as input events to the socially intelligent agents 183. The computer program further comprises a fourth code portion which, when executed on a computer, provides an emotion manifester 184 that receives the emotion response messages output from the socially intelligent agents 183 as emotion state/desire messages and converts the emotion state/desire messages into action messages. The scenario environment 181 receives the action messages and converts them into graphical representations of the socially intelligent agents' emotional responses so that the users are able to visually interpret the actions/responses of the socially intelligent agents 183.
As can be understood, according to the embodiments described herein, including the embodiment depicted in
In a similar manner, other intelligent agents participating in the scenario environment can also participate according to their personality traits and emotional state even if no user provides input to these agents. For example, in the classroom environment exemplified above, other agents may be present, some of which may be avatars of other users, while others are present without being an avatar of any user. Assume that the avatar that answered the question provided a wrong answer. An intelligent agent present in the environment and having a high extraversion index value, a high confidence index value, and a negative agent interrelationship to the answering avatar may, without any input from a user, turn to the avatar and exclaim: "this is wrong! You have no idea what you are talking about!" On the other hand, another agent having a high agreeableness index and a positive agent interrelationship to the answering avatar may try to comfort the answering avatar without any input from a user by stating: "it's ok, you'll probably get the next one right." Here again, it can be seen that the various embodiments of the invention provide emotional responses that correlate to the character and emotional state of the agent, even when no input is provided by users.
The advantages of the embodiment of
As can be seen from the above description, the SIA platform can be used by many different applications. Additionally, designers of applications need not “re-design” or implement a social response engine for their particular application. Rather, designers of the application may focus on the particular graphics and scenarios of the application, and simply couple to the SIA platform via an adapter to implement behavioral responses of actors within the application.
When the application 204 generates an event, the event is sent to all of the agents. The structure of each agent is similar to that shown for agent 200. Each agent has application specific rules stored in the event interpreter 210. The application specific rules are written to be specific to each application 204. Each agent also has common rules listed in the output generator 212, which are used for each event from each application and are common to all applications. That is, the common rules are written by the entity generating the SIA platform and are static for all applications, while the application specific rules may be written by the entity programming the application and change from application to application. One function of the event interpreter 210 is to receive an application event and determine which generic event is most appropriate to issue in view of the received application event. That is, each agent has a list of generic events. The list of generic events is normally static and does not change between applications. The event interpreter 210 uses the application event to determine which generic event to issue. When the output generator 212 receives a generic event, it looks up the status of the dynamic register 216 and the static register 214, and uses the common rules to provide an update to the dynamic register 216. The event interpreter 210 then looks up the entries in the updated dynamic register 216 and the entries in the static register 214 and issues a response based on these entries. It should be appreciated that for a beneficial operation of the embodiment of
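The mapping from application specific events to the static list of generic events can be sketched as follows. The event names, the rule table and the function are hypothetical assumptions for illustration; the specification describes only the division of labor, not this implementation.

```python
# Static list of generic events, shared by all applications.
GENERIC_EVENTS = ["TASK_FEEDBACK", "SOCIAL_FEEDBACK", "GREETING"]

# Application specific rules (event interpreter 210): map an application
# event onto the most appropriate generic event. This table would change
# from application to application; the entries here are illustrative.
CLASSROOM_RULES = {
    "PROFESSOR_CALL_STUDENT": "TASK_FEEDBACK",
    "STUDENT_SAYS_HELLO": "GREETING",
}

def interpret_event(application_event, rules):
    # Determine which generic event is most appropriate to issue in
    # view of the received application event.
    generic = rules.get(application_event)
    if generic not in GENERIC_EVENTS:
        raise ValueError(f"no generic event for {application_event!r}")
    return generic

generic_event = interpret_event("PROFESSOR_CALL_STUDENT", CLASSROOM_RULES)
```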
Another optional feature shown in
The social response model 216B has a set of rules for handling each detected event 212B. An example of implementation of the social response model is depicted in
Notably, in this example each rule in the social response model 216B is created specifically for the particular agent to determine how to update values in the states, and how to put/remove desires and emotional events to/from buffers in the states. On the other hand, according to another embodiment, each rule in the social response model 216B is created to be common for all of the agents. Of course, when a common rule is invoked for any particular agent, it can have different values of traits and/or states for the specific agent. Regardless of which implementation is chosen, each value in the states 211B and traits 213B is used by application developers to specify the behaviors of the agents. Each desire stored in the desire buffer 217B is used for triggering specific behavior of the agents when specified conditions are met. In this example, desires are employed for providing social feedback, i.e., emotion 215B. Because the response model 205B maintains and updates the values of the dynamic and static buffers, 211B and 213B, and the desire buffer 217B, it is not necessary for developers of applications to know how these values and buffers should be maintained and updated. This makes it easier for developers of applications to add emotional behavior to agents in the application.
In the example of
The categorical emotion model 218B is a mechanism for generating categorical emotions of agents referencing their states, traits, and event buffers. An example of the categorical emotion model is depicted in
A general example of a computer (not shown) that can be used in accordance with the described embodiment will be described below.
The computer comprises one or more processors or processing units, a system memory and a bus that couples various system components comprising the system memory to the processors. The bus can be one or more of any of several types of bus structures, comprising a memory bus or memory controller, a peripheral bus, an accelerated graphics port and a processor or local bus using any of a variety of bus architectures. The system memory comprises read only memory (ROM) and random access memory (RAM) 140. A basic input/output system (BIOS) containing the routines that help to transfer information between elements within the computer, such as during boot up, is stored in the ROM or in a separate memory.
The computer further comprises a hard drive for reading from and writing to one or more hard disks (not shown). Some computers can comprise a magnetic disk drive for reading from and writing to a removable magnetic disk and an optical disk drive for reading from or writing to a removable optical disk, such as a CD ROM or other optical media. The hard drive, the magnetic disk drive and the optical disk drive are connected to the bus by an appropriate interface. The drives and their associated computer-readable media provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computer. Although the exemplary environment described herein employs a hard disk, a removable magnetic disk and a removable optical disk, it should be appreciated by those skilled in the art that other types of computer-readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROMs), etc. may also be used in the exemplary operating environment.
A number of program modules may be stored on the hard disk, magnetic disk, optical disk, ROM or RAM, comprising an operating system, at least one or more application programs, other program modules and program data. In some computers, a user might enter commands and information into the computer through input devices such as a keyboard and a pointing device. Other input devices (not shown) may comprise a microphone, a joystick, a game pad, a satellite dish and/or a scanner. In some instances, however, a computer might not have these types of input devices. These and other input devices are connected to the processing unit through an interface coupled to the bus. In some computers, a monitor or other type of display device might also be connected to the bus via an interface, such as a video adapter. Some computers, however, do not have these types of display devices. In addition to the monitor, the computers might comprise other peripheral output devices (not shown) such as speakers and printers.
The computer can, but need not, operate in a networked environment using logical connections to one or more remote computers, such as a remote computer. The remote computer may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically comprises many or all of the elements described above relative to the computer. The logical connections to the computer may comprise a local area network (LAN) and a wide area network (WAN). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.
When used in a LAN networking environment, the computer is connected to the local network through a network interface or adapter. When used in a WAN networking environment, the computer typically comprises a modem or other means for establishing communications over the wide area network, such as the Internet. The modem, which may be internal or external, is connected to the bus via a serial port interface. In a networked environment, program modules depicted relative to the computer, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
Generally, the data processors of the computer are programmed by means of instructions stored at different times in the various computer-readable storage media of the computer. Programs and operating systems are typically distributed, for example, on floppy disks or CD-ROMs. From there, they are installed or loaded into the secondary memory of the computer. At execution, they are loaded at least partially into the computer's primary electronic memory. The invention described herein comprises these and other various types of computer-readable storage media when such media contain instructions or programs for implementing the steps described above in conjunction with a microprocessor or other data processor. The invention also comprises the computer itself when programmed according to the methods and techniques described above.
The foregoing description of the preferred and other embodiments of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. The embodiments were chosen and described in order to explain the principles of the invention and its practical application to enable one skilled in the art to utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated.
Thus, while only certain embodiments of the invention have been specifically described herein, it will be apparent that numerous modifications may be made thereto without departing from the spirit and scope of the invention. Further, acronyms are used merely to enhance the readability of the specification and claims. It should be noted that these acronyms are not intended to lessen the generality of the terms used and they should not be construed to restrict the scope of the claims to the embodiments described therein.
This application claims priority from U.S. Provisional Patent Application Ser. No. 60/676,016 filed Apr. 29, 2005, the disclosure of which is incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
60676016 | Apr 2005 | US