The present invention relates to virtual worlds, such as simulations of the real world or real life and the like, and more particularly to managing and presenting avatar mood effects in a virtual world.
Computer-based simulations are becoming more ubiquitous. Simulations may be used for training purposes, for entertainment, for commerce or for other purposes. Computer simulations such as Second Life or similar simulations present a virtual world which allows users or players to be represented by characters known as avatars. Second Life is an Internet-based virtual world launched in 2003 by Linden Research, Inc. A downloadable client program called the Second Life Viewer enables users, called “Residents”, to interact with others in the virtual world through motional avatars. The virtual world basically simulates the real world or environment. The users or residents, via their avatars, can explore the virtual world, meet other users or residents, socialize, participate in individual and group activities, and create and trade items (virtual property) and services with one another. The challenge with respect to such simulations or virtual worlds is to make them as realistic, or as much like the real world or real life, as possible. This increases the utility of such simulations as a training tool and the enjoyment of participants or users of such simulations as an entertainment medium. Current virtual worlds enable only certain limited capabilities for simulating real-world interactions, such as personalization of avatars based on clothing, facial features and physique. More engaging experiences, such as moods or emotions, are typically not taken into account. For example, a definition of how moods affect personal features, such as dress, facial expressions or other features, and personal interactions is lacking. Second Life is a trademark of Linden Research, Inc. in the United States, other countries or both.
In accordance with an embodiment of the present invention, a method for managing and presenting avatar moods or mood effects in a virtual world may include allowing a mood or mood effect to be associated with a user's avatar in the virtual world from a plurality of predefined moods or mood effects. The method may also include presenting the associated mood or mood effect to other users of the virtual world.
In accordance with another embodiment of the present invention, a method for managing and presenting avatar moods or mood effects in a virtual world may include profiling a set of mood or mood effect changes to portray different real-world emotions and moods by a user's avatar to other users in the virtual world. The method may also include defining a predetermined action based on each mood or mood effect change, the predetermined action to be performed by the user's avatar in the virtual world in response to the user's avatar being associated with a mood corresponding to the mood effect change. The method may further include triggering the change in the mood or mood effect in response to occurrence of a predetermined event.
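Purely by way of a non-limiting illustration, the profiling, predetermined-action definition, and event triggering described in this embodiment may be sketched as follows. All names below (MoodProfile, define, trigger, and the example event and action strings) are assumptions for illustration only and are not part of the embodiments described herein.

```python
class MoodProfile:
    """Illustrative sketch: maps predetermined events to mood changes,
    and each mood to the predetermined action the avatar performs."""

    def __init__(self):
        self.event_to_mood = {}   # predetermined event -> mood name
        self.mood_to_action = {}  # mood name -> predetermined action

    def define(self, event, mood, action):
        # Profile a mood change: which event triggers it, and what
        # action the avatar performs when it takes effect.
        self.event_to_mood[event] = mood
        self.mood_to_action[mood] = action

    def trigger(self, event):
        # On occurrence of a predetermined event, change the mood and
        # return the action the avatar should perform (None if no
        # profile matches the event).
        mood = self.event_to_mood.get(event)
        return mood, self.mood_to_action.get(mood)


profile = MoodProfile()
profile.define("won_auction", "happy", "jump up and down")
mood, action = profile.trigger("won_auction")
```

A trigger for an unprofiled event simply yields no mood change and no action, which mirrors the optional nature of each profiled mood effect.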
In accordance with another embodiment of the present invention, a method for managing and presenting a mood of a user's avatar in a virtual world may include presenting a mood or mood effect of the user's avatar to other users in the virtual world. Presenting the mood or mood effect may include presenting a predefined script in association with the user's avatar to other users in the virtual world to indicate the mood of the user's avatar. Presenting the mood or mood effect may also include performing a predefined action by the user's avatar to indicate the mood of the user's avatar. Presenting the mood or mood effect may additionally include presenting the user's avatar with a predetermined appearance to indicate the mood of the user's avatar.
Other aspects and features of the present invention, as defined solely by the claims, will become apparent to those ordinarily skilled in the art upon review of the following non-limiting detailed description of the invention in conjunction with the accompanying figures.
The following detailed description of embodiments refers to the accompanying drawings, which illustrate specific embodiments of the invention. Other embodiments having different structures and operations do not depart from the scope of the present invention.
As will be appreciated by one of skill in the art, the present invention may be embodied as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, the present invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
Any suitable computer-usable or computer-readable medium may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), or other tangible optical or magnetic storage device; or transmission media such as those supporting the Internet or an intranet. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer-usable program code may be transmitted using any appropriate medium, including but not limited to the Internet, wireline, optical fiber cable, radio frequency (RF) or other means.
Computer program code for carrying out operations of the present invention may be written in an object oriented programming language such as Java, Smalltalk, C++ or the like. However, the computer program code for carrying out operations of the present invention may also be written in conventional procedural programming languages, such as the “C” programming language or similar programming languages, or in functional programming languages, such as Haskell, Standard Meta Language (SML) or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The present invention is described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Referring also to
As illustrated in block 102, mood effects may be attributes standardized by a virtual world owner or operator. The standardized mood effect attributes may be shared by other avatars in the virtual world. As described herein, the standardized moods or mood effect attributes may be stored on a system or server to permit the mood effect attributes to be shared or associated with other avatars. In accordance with an embodiment of the present invention, users or participants in the virtual world may query or request the mood effect attributes of another avatar. Responses to such queries or requests may assist in automated responses, may provide a better understanding of the person or avatar with which a user's avatar may be engaged in the virtual world, and may facilitate interactions between users and users' avatars in the virtual world.
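As a hedged, non-limiting sketch, a server-side store of standardized mood attributes that other users may query might look like the following. The names (MoodRegistry, set_mood, query) and the example attribute values are illustrative assumptions, not part of the described embodiments.

```python
class MoodRegistry:
    """Illustrative server-side store of standardized mood effect
    attributes, shared by all avatars and queryable by other users."""

    def __init__(self, standardized_moods):
        # Mood definitions standardized by the virtual world operator.
        self.standardized_moods = standardized_moods
        # Current mood name associated with (tagged to) each avatar.
        self.avatar_moods = {}

    def set_mood(self, avatar, mood):
        # Only standardized moods may be associated with an avatar.
        if mood not in self.standardized_moods:
            raise ValueError(f"{mood!r} is not a standardized mood")
        self.avatar_moods[avatar] = mood

    def query(self, avatar):
        # Another user's request for this avatar's mood attributes;
        # returns None if the avatar has no mood tagged.
        mood = self.avatar_moods.get(avatar)
        return self.standardized_moods.get(mood)


registry = MoodRegistry({"happy": {"script": "Great to see you!"}})
registry.set_mood("AvatarA", "happy")
attributes = registry.query("AvatarA")
```

Such a query response could then feed the automated responses mentioned above.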
In block 104, a profile defining each mood or mood effect may be received by a virtual world system or stored on a user's computer system for managing and presenting mood effects. An identification or characterization of the mood effect, such as happy, sad, angry or a similar mood or emotion, may be associated with each profile. As previously discussed, the profiles can be relatively complex and may include different human moods and emotions, as well as different levels for each mood or emotion.
In block 106, any scripts, gestures, actions, or appearances of an avatar, or other attributes, may be received and stored by a system to be associated with each mood effect as part of the mood effect profile to define the mood effect. The GUI presented to define each mood effect or mood effect profile in block 102 may permit a script to be entered and stored in association with each mood effect. The script may be a visual or verbal utterance or other form of communication. The GUI may also permit an action, such as a gesture or other action, to be entered and stored in association with each mood effect. A particular action may be performed by a user's avatar while the mood effect corresponding to the particular action is associated with the user's avatar. The GUI may also permit an appearance of a user's avatar to be associated with each mood effect. For example, clothing worn by the user's avatar may be different depending upon a mood of the user's avatar.
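The mood effect profile of block 106, holding a script, an action, and an appearance per mood, may be sketched as a simple record type. This is illustration only; the field names and example values are assumptions.

```python
from dataclasses import dataclass


@dataclass
class MoodEffectProfile:
    """Illustrative mood effect profile as described in block 106."""
    name: str         # identification, e.g. "happy", "sad", "angry"
    script: str       # utterance presented in association with the avatar
    action: str       # gesture performed while in this mood
    appearance: str   # e.g. clothing worn while in this mood


# Two example profiles, as might be entered through the GUI.
profiles = {
    "happy": MoodEffectProfile("happy", "What a great day!",
                               "arms held overhead", "bright clothing"),
    "sad": MoodEffectProfile("sad", "*sigh*",
                             "head down, shoulders slumped",
                             "gray clothing"),
}

# Associating a profile with an avatar would present all three attributes.
current = profiles["sad"]
```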
In block 108, any action or actions received and stored by the virtual world system or user's computer system in association with each mood or mood effect may be configured to be triggered by occurrence of a predetermined event, such as a virtual world event or other occurrence. For example, an action may be performed by a user's avatar in response to entering an event or location, leaving an event or location, entering a mood, or leaving a mood. A particular action may also be performed by the user's avatar in response to the user's avatar coming into contact with another avatar with predetermined matched or correlated rules. For example, each of the avatars may be in a particular mood, may have a particular company or organization affiliation in the virtual world, may have the same or similar virtual world experiences, or may have other characteristics relative to one another that prompt a predefined action. The action may be triggered in response to the avatars coming within a predetermined proximity range of one another in the virtual world.
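A minimal sketch of the proximity-and-matched-rules trigger of block 108 follows, purely for illustration. The range value, rule (same affiliation and same mood), and all names are assumptions.

```python
import math

PROXIMITY_RANGE = 5.0  # assumed virtual-world distance units


def rules_match(a, b):
    # Example matched/correlated rule: same affiliation and same mood.
    return (a["affiliation"] == b["affiliation"]
            and a["mood"] == b["mood"])


def maybe_trigger(a, b, action="wave"):
    # Trigger the predefined action only when the avatars are within
    # the predetermined proximity range AND the rules match.
    dx = a["pos"][0] - b["pos"][0]
    dy = a["pos"][1] - b["pos"][1]
    within_range = math.hypot(dx, dy) <= PROXIMITY_RANGE
    return action if within_range and rules_match(a, b) else None


a = {"pos": (0, 0), "affiliation": "AcmeCorp", "mood": "happy"}
b = {"pos": (3, 4), "affiliation": "AcmeCorp", "mood": "happy"}
c = {"pos": (30, 40), "affiliation": "AcmeCorp", "mood": "happy"}
```

Here avatars a and b are within range and match, so the action fires; avatar c matches the rule but is out of range, so no action is triggered.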
In block 110, a user may be allowed to select a mood effect from a plurality of predefined mood effects and to associate the selected mood effect with the user's avatar in the virtual world. Each of the plurality of mood effects may be defined as previously discussed. The mood effects may be defined by the user, or the mood effects may be standardized by the virtual world system operator or owner. The mood effect may be selected from a menu or dropdown list using a computer pointing device, or may be selected by some other mechanism known in relation to virtual world systems, simulations or other computer applications.
In accordance with another embodiment of the present invention, the mood effect may be manually set based on inputs from the user or selection of criteria by the user. The mood effect may be tagged to the avatar by a tagging mechanism or visual identifier to present the mood or mood effect of the user's avatar to users of other avatars in the virtual world. Referring also to
Referring back to
In block 114, a mood effect associated with the user's avatar may be changed in response to an input of some criteria corresponding to the mood effect or selection of a different mood effect, similar to that previously described. A change in the mood effect may also be triggered by an occurrence, an event, an entry into a location in the virtual world by the user's avatar or other stimuli, similar to that previously discussed. An example of a method and system for automated avatar moods in a virtual world is described in U.S. patent application Ser. No. ______ (IBM Docket No. RSW920070211US1), entitled “Automated Avatar Moods in a Virtual World”, by Steven K. Speicher et al., which is assigned to the same assignee as the present invention and is incorporated herein by reference.
In block 116, changes in mood effects may be mined or recorded and tracked along with any actions taken by avatars in response to a mood effect or change in mood effect. This data may be analyzed by the virtual world owner or operator or other entities providing services in the virtual world for market data intelligence or for other purposes.
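The mining and tracking of block 116 may be sketched as a simple change log plus an aggregate, for illustration only; all names and example data below are assumptions.

```python
from collections import Counter

# Illustrative log of mood effect changes and any responder actions.
log = []


def record(avatar, old_mood, new_mood, responder_action=None):
    # Record one mood effect change, optionally with the action another
    # avatar took in response to it.
    log.append({"avatar": avatar, "from": old_mood, "to": new_mood,
                "action": responder_action})


record("AvatarA", "neutral", "happy", "smiled back")
record("AvatarB", "happy", "sad")
record("AvatarC", "neutral", "happy")

# Aggregate the data, e.g. for market-intelligence style analysis of
# which moods avatars enter most often.
by_new_mood = Counter(entry["to"] for entry in log)
```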
The moods or mood effects can be defined as simple profiles, such as a “Happy Avatar” 408 and a “Sad Avatar” 410, either of which may be associated with, or tagged to the user's avatar 404 to identify the avatar's mood, visually, audibly or both, to other users in the virtual world 406. The avatar mood or mood effect may be tagged to the avatar 404 using a tagging mechanism, visual identifier or other mechanism. The tagging mechanism or visual identifier may be any such mechanism used in virtual worlds or simulations to associate information or attributes to an avatar and to present such information to other users or participants in the virtual world.
The system 400 may also include a change mood component 412 or module to cause changes in moods or mood effects associated with or tagged to an avatar. As previously discussed, the change in moods may be in response to some event or condition or actions by the avatar's user, such as selecting a different mood or mood effect from a plurality of moods.
The system 400 may also include subsystems for defining the mood and changes in the mood or mood effects. For example, the system 400 may include a scripts subsystem 414, a gesture or action subsystem 416, and an appearance subsystem 418. The scripts subsystem 414 may permit a script to be entered and associated with a mood or mood effect in defining the mood or mood effect. The scripts subsystem 414 may also control presentation of the script in association with an avatar tagged with the mood corresponding to the script. The script may be words or sounds, such as grunts, groans, laughs or other utterances, which may be spoken or expressed by an avatar that has been tagged with the particular mood or mood effect. The script may be presented in visual or audible form. For example, the script may be presented in a balloon, similar to balloons 316 and 318 illustrated in
The gestures or actions subsystem 416 may permit a specific gesture or action to be entered and associated with a mood or mood effect in defining the mood or mood effect. The gestures subsystem 416 may also control performance of the gesture or action by an avatar tagged with the mood corresponding to the gesture or action. Examples of avatar actions or gestures that may be associated with a mood or mood effect may include the avatar's head being down and/or shoulders slumped forward to indicate a sad mood, clenched fists to indicate an angry mood, arms held overhead or the avatar jumping up and down to express a happy mood, other movements of the arms, legs or other body parts or body language that may connote a particular mood or emotion, or any other actions or gestures that may express a particular type of mood or emotion.
The appearance subsystem 418 may permit a specific appearance of an avatar to be entered and associated with a mood or mood effect in defining the mood or mood effect. The appearance subsystem 418 may also control the appearance of an avatar tagged with the particular mood corresponding to the appearance. Examples of avatar appearances that may express a mood or emotion may include avatar facial expressions, bright colored clothing to express a happy mood, dark, black or gray colored clothing in association with a sad mood or any other visual effects associated with appearance of an avatar in the virtual world that may suggest a mood of the avatar or user associated with the avatar.
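The cooperation of the scripts, gesture, and appearance subsystems when an avatar is tagged with a mood may be sketched as follows, purely as a non-limiting illustration; the function names, mood definitions, and rendered strings are all assumptions.

```python
# Illustrative standardized mood definitions shared by the subsystems.
MOOD_DEFINITIONS = {
    "happy": {"script": "Hello there!",
              "gesture": "jumping up and down",
              "appearance": "bright colored clothing"},
    "angry": {"script": "Hmph.",
              "gesture": "clenched fists",
              "appearance": "dark clothing"},
}


def present_script(avatar, text):
    # Scripts subsystem: present the utterance, e.g. in a balloon.
    return f'{avatar} says (in a balloon): "{text}"'


def perform_gesture(avatar, gesture):
    # Gestures subsystem: have the avatar perform the gesture.
    return f"{avatar} performs: {gesture}"


def apply_appearance(avatar, appearance):
    # Appearance subsystem: change the avatar's look.
    return f"{avatar} now wears: {appearance}"


def tag_mood(avatar, mood):
    # Each subsystem renders its part of the tagged mood effect.
    d = MOOD_DEFINITIONS[mood]
    return [present_script(avatar, d["script"]),
            perform_gesture(avatar, d["gesture"]),
            apply_appearance(avatar, d["appearance"])]


effects = tag_mood("AvatarA", "happy")
```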
The module 502 of managing and presenting avatar mood effects in the virtual world may include a define mood sub-module or component 510. The define mood component 510 may be similar to the define mood component 402 described with reference to
In accordance with another embodiment of the present invention, as previously described, moods or mood effects may be standardized by the virtual world owner and/or operator. Accordingly, the virtual world owner/operator may define the different mood or mood effects. The different predefined mood or mood effects may then be shared by all of the users of the virtual world and may be associated with or tagged to respective users' avatars to express or present the avatar's mood to other users in the virtual world.
The module for managing and presenting avatar mood effects in a virtual world may also include a mood inventory 512. The mood inventory 512 may store all of the moods and associated mood effects or attributes, such as scripts, gestures or actions, appearances or other attributes associated with each mood, for application to the user's avatar. As previously discussed, the moods and associated mood effects or attributes may be defined by the user, or the virtual world owner and/or operator may define standardized moods and associated mood effects or attributes which may be shared among all avatars in the virtual world.
The module for managing and presenting avatar mood effects in a virtual world may also include a component 514 to control mood changes in response to predetermined stimuli. For example, mood changes may occur in response to selection of another mood by the user of the avatar, certain events in the virtual world or real world, entering or leaving a location in the virtual world, interaction with another avatar or any other stimuli that may elicit a mood change. Predetermined rules may be created or defined to evoke specific mood changes. As previously discussed, an example of a system and method for automated changes in avatar moods in a virtual world is described in U.S. patent application Ser. No. ______ (IBM Docket No. RSW920070211US1).
The computer system 506 may also include a processor 516. The module for managing and presenting avatar mood effects 502 may be accessed from the file system 504 and run on the processor 516.
The computer system may also include a display 518, a speaker system 520, and one or more input devices, output devices or combination input/output devices, collectively I/O devices 522. The I/O devices 522 may include a keyboard, pointing device, such as a mouse, disk drives and any other devices to permit a user to interface with and control operation of the computer system and to access and participate in the virtual world through the user's avatar. The display 518 may present the user's avatar and other users' avatars in the virtual world and present the features described herein for defining and managing moods or mood effects. The speaker system 520 may present any sounds associated with the virtual world, such as audible mood effects or other sounds.
The system 500 may also include a server 524. A virtual world system 526 may reside and operate on the server 524. Users 508 via browsers (not shown in
Other embodiments of the present invention are not limited to only a server, and the system and features described herein may be provided in one of many forms. Examples may include a client, configurations that support peer-to-peer communications, a wireless solution or other arrangements.
In accordance with an embodiment of the present invention, a repository or inventory 528 of standardized moods, mood effects and any associated attributes may be associated with the server 524. The standardized mood inventory 528 may be contained on the server 524 or may be a separate component from the server 524. A mood or visual identification tagging mechanism 530 may also be operable on the server 524 to tag mood effects to respective avatars and to control and maintain any mood changes. Control, tracking and recording of moods and mood changes may be coordinated between the mood change component 514, which may be operable on the computer system 506 of each user 508, and the mood or visual identification tagging mechanism 530. In accordance with an embodiment of the present invention, the module 502 for managing and presenting avatar mood effects may be operable on the server 524, or some of the features or operations described with respect to module 502 may be performed in the computer system 506 and others on the server 524.
The flowcharts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art appreciate that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown and that the invention has other applications in other environments. This application is intended to cover any adaptations or variations of the present invention. The following claims are in no way intended to limit the scope of the invention to the specific embodiments described herein.