The present application claims priority to U.S. Patent Application Ser. No. 61/080,626, filed on Jul. 14, 2008.
The present application may be related in subject matter to the following applications: U.S. patent application Ser. No. 12/178,535, U.S. patent application Ser. No. 12/188,953.
A portion of the disclosure of this document may contain material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever. The following notice shall apply to this document: Copyright © 2008 Microsoft Corp.
The presently disclosed subject matter relates to the field of computing, and more particularly, to fields such as gaming, although this is merely an exemplary and non-limiting field.
Avatars can be graphical images that represent real persons in virtual or game space. They can be embodiments or personifications of the persons themselves, or of principles, attitudes, or views of life held by such persons. Hitherto, avatars have been static or, at most, have had limited animation capabilities. However, what are needed are systems, methods, and/or computer-readable media that allow for creative animation based on input from users and/or games.
Various mechanisms are provided herein for accessorizing avatars with animated items. Avatars may be accessorized with items whose animations are item specific, or with items whose animations apply to the avatar's entire body. In addition, such item accessories may apply to avatars across different game titles, and they may be added during the execution of a game, whether through a user interface or through the game itself (e.g., receiving accessories for achieving certain milestones in a game). Such accessories may also be obtained from remote sources as packages and then applied locally to avatars.
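By way of illustration only, the following sketch (in Python) shows one possible shape for such an accessory package; the names AnimationItem and AccessoryPackage, and the fields shown, are chosen for this example and are not taken from the disclosure.

```python
# Hypothetical data shapes for an accessory package of animation items.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AnimationItem:
    name: str                 # e.g., "propeller_hat"
    scope: str                # "item" (item-specific animation) or "body" (whole-avatar animation)
    titles: List[str] = field(default_factory=list)  # empty list => applies across all game titles

@dataclass
class AccessoryPackage:
    source: str               # remote source the package was obtained from
    items: List[AnimationItem] = field(default_factory=list)

# Example: a package received for reaching a milestone in a game.
package = AccessoryPackage(
    source="https://accessories.example",   # placeholder address
    items=[AnimationItem("propeller_hat", scope="item"),
           AnimationItem("victory_dance", scope="body")],
)
```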
It should be noted that this Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The foregoing Summary, as well as the following Detailed Description, is better understood when read in conjunction with the appended drawings. In order to illustrate the present disclosure, various aspects of the disclosure are shown. However, the disclosure is not limited to the specific aspects shown. The following figures are included:
In the case of online game playing, other users, such as user B 162 and user C 164 could interact with the first user 160 if such online game playing was mediated by a central server. In the prior art shown in
In stark contrast to
In any event,
For example, if a user wins a hat animation, such a hat animation may be applied to the avatar across several games, whether racing games, strategy games, fantasy games, and so on. The animation may be the same for an avatar across several games, or it may change depending on the context of the game. Such details are implementation specific, and all aspects of such animation items are contemplated herein.
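A minimal sketch of how such context-dependent behavior might be expressed follows; the genre names, clip names, and fallback behavior are assumptions made for illustration, not details taken from the disclosure.

```python
# One possible way a single hat item could map to per-context animation clips.
HAT_VARIANTS = {
    "racing":   "hat_flutter_in_wind",
    "strategy": "hat_subtle_bob",
    "fantasy":  "hat_sparkle_loop",
}
DEFAULT_HAT_CLIP = "hat_idle_loop"   # used when the game's context is not recognized

def hat_animation_for(genre: str) -> str:
    # Same accessory item, possibly a different clip per game context.
    return HAT_VARIANTS.get(genre, DEFAULT_HAT_CLIP)

print(hat_animation_for("racing"))   # hat_flutter_in_wind
print(hat_animation_for("puzzle"))   # hat_idle_loop
```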
In another aspect of the present disclosure,
There may be various triggering mechanisms for applying such item content to avatars. For example, one such mechanism can include a user interface 306 that allows users to select which items to apply to which avatars. Alternatively (or in addition), game titles 304 themselves can trigger the application of item content to avatars, depending on the implementation and on user interaction with a gaming console hosting such an avatar engine 302. In yet other aspects of the present disclosure, the avatar engine 302 may reside on a hosting server, and any processing may be performed upstream from users.
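The following sketch illustrates, with assumed interfaces, how both triggering paths could converge on the same avatar engine call; none of the function names below come from the disclosure.

```python
# Two hypothetical triggers routing item content to an avatar engine (302).
class AvatarEngine:
    def __init__(self):
        self.applied = {}                      # avatar id -> list of applied item names

    def apply_item(self, avatar_id: str, item: str) -> None:
        self.applied.setdefault(avatar_id, []).append(item)

def on_ui_selection(engine: AvatarEngine, avatar_id: str, item: str) -> None:
    # Trigger 1: the user picks an item for an avatar through the user interface (306).
    engine.apply_item(avatar_id, item)

def on_game_milestone(engine: AvatarEngine, avatar_id: str, reward_item: str) -> None:
    # Trigger 2: a running game title (304) awards an item for reaching a milestone.
    engine.apply_item(avatar_id, reward_item)

engine = AvatarEngine()                        # could equally run on a hosting server
on_ui_selection(engine, "player_1", "propeller_hat")
on_game_milestone(engine, "player_1", "gold_trophy_spin")
print(engine.applied)                          # {'player_1': ['propeller_hat', 'gold_trophy_spin']}
```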
In another aspect of the presently disclosed subject matter,
Thus, a system may be used for animating gaming console avatars in a plurality of different ways, where the system may be loaded in local computer memory and where it may be executed by a physical processor. In such a system, a processing module 402 may be configured to process an accessory package 310 containing a plurality of animation items. Another module, the identification module 404, may be configured to identify avatars and animation items from the plurality of animation items. And still another module, the application module 406, may be configured to apply the animation items to the avatars.
This system may accommodate various other aspects, such as having animation items applied to the avatars 410 at least in a first game title 420 and a second game title 422. Moreover, the animation items may be configured to be applied to the avatars 410 while the first game title 420 is playing 407. In other aspects, the animation items may be further configured to be applied to the avatars 410 while the second game title 422 is playing 407. The first game title 420 and the second game title 422 may be distinct from each other in that they may be loaded separately, at time t1 424 and at time t2 426.
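A minimal sketch, with assumed names and signatures, of the three modules described above and of applying the same items across two separately loaded game titles follows; nothing in it is taken verbatim from the disclosure.

```python
# Hypothetical processing (402), identification (404), and application (406) modules.
class ProcessingModule:            # 402
    def process(self, package):
        # Unpack the accessory package (310) into its animation items.
        return list(package["items"])

class IdentificationModule:        # 404
    def identify(self, avatars, items):
        # Pair avatars with items; a real implementation would filter on
        # ownership, the current game title, or item metadata.
        return [(avatar, item) for avatar in avatars for item in items]

class ApplicationModule:           # 406
    def apply(self, pairs, title):
        # Attach each item to its avatar in the context of the running title.
        return [f"{item} -> {avatar} (in {title})" for avatar, item in pairs]

def accessorize(package, avatars, title):
    items = ProcessingModule().process(package)
    pairs = IdentificationModule().identify(avatars, items)
    return ApplicationModule().apply(pairs, title)

package = {"items": ["propeller_hat"]}
# The same package is applied while a first title is playing (loaded at t1)...
print(accessorize(package, ["avatar_410"], "racing_title_420"))
# ...and again, unchanged, while a second title is playing (loaded at t2).
print(accessorize(package, ["avatar_410"], "strategy_title_422"))
```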
It will be readily appreciated that other instructions may be used in conjunction with the aspects discussed above, such as: instructions configured to carry at least one item between or across different game titles 825; instructions configured to present to a user the option of accessorizing any avatars during the execution of games 830; instructions configured to trigger and manage animation in different ways 870, such as continuous 850, event 855, user 860, and idle 865 animation; and instructions configured to apply animation in specific ways 810, such as item-based 804 and avatar-based 802, or using different mechanisms, such as the user interface 806 and game titles 808. It should be noted that other instructions 835 that embody the various aspects discussed above may also be used.
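The groupings above can be made explicit with a short sketch; the enumerations mirror the reference numerals in the text, while the values and the describe helper are illustrative assumptions.

```python
# Hypothetical enumerations of the trigger modes, application styles, and mechanisms listed above.
from enum import Enum, auto

class TriggerMode(Enum):        # animation triggering and management 870
    CONTINUOUS = auto()         # 850
    EVENT = auto()              # 855
    USER = auto()               # 860
    IDLE = auto()               # 865

class ApplicationStyle(Enum):   # ways of applying animation 810
    AVATAR_BASED = auto()       # 802
    ITEM_BASED = auto()         # 804

class Mechanism(Enum):
    USER_INTERFACE = auto()     # 806
    GAME_TITLE = auto()         # 808

def describe(item: str, mode: TriggerMode, style: ApplicationStyle, via: Mechanism) -> str:
    return f"{item}: {style.name.lower()} animation, {mode.name.lower()} trigger, via {via.name.lower()}"

# An item carried between titles (825) could keep settings like these per title.
print(describe("propeller_hat", TriggerMode.IDLE, ApplicationStyle.ITEM_BASED, Mechanism.GAME_TITLE))
```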
The computing devices and accessories discussed above can be embodied as gaming consoles, music players, personal computers, controllers, remote control devices, and other such devices having different, similar, or the same platforms. Referring to
This console, which may be a game-oriented console or a PC, can comprise, for example, digital audio processing functionality. Specifically, in
A graphics processing unit (GPU) 108 and a video encoder/video codec (coder/decoder) 114 can form a video processing pipeline for high speed and high resolution graphics processing. Data can be carried from the graphics processing unit 108 to the video encoder/video codec 114 via a bus. The video processing pipeline can output data to an A/V (audio/video) port 140 for transmission to a television or other display. A memory controller 110 can be connected to the GPU 108 and CPU 101 to facilitate processor access to various types of memory 112, such as, but not limited to, a RAM (Random Access Memory). Thus, various types of information, whether sensitive or not, or even portions of such information, can be stored in the memories discussed above, depending on need.
The multimedia console 100 can include an I/O controller 120, a system management controller 122, an audio processing unit 123, a network interface controller 124, a first USB host controller 126, a second USB controller 128, and a front panel I/O subassembly 130, which can preferably be implemented on a module 118. The USB controllers 126 and 128 can serve as hosts for peripheral controllers 142(1)-142(2), a wireless adapter 148, and an external memory unit 146 (e.g., flash memory, an external CD/DVD-ROM drive, removable media, etc.). Such peripheral controllers 142(1)-142(2) can have various types of lighting displays that are triggered by proximity and motion. Moreover, the network interface 124 and/or the wireless adapter 148 can provide access to a network (e.g., the Internet, a home network, etc.) and can be any of a wide variety of wired or wireless interface components, including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like.
System memory 143 can be provided to store application data that is loaded during the boot process. A media drive 144 can be provided and can comprise a DVD/CD drive, hard drive, or other removable media drive, etc. The media drive 144 can be internal or external to the multimedia console 100. Application data can be accessed via the media drive 144 for execution, playback, etc. by the multimedia console 100. The media drive 144 can be connected to the I/O controller 120 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394). In addition to such application data, other information can be stored on the console 100 that aids in the communication between peripheral/accessory device controllers and the console 100 itself.
The system management controller 122 can provide a variety of service functions to assure the availability of the multimedia console 100. The audio processing unit 123 and an audio codec 132 can form a corresponding audio processing pipeline with high fidelity, 3D, surround, and stereo audio processing according to aspects of the presently disclosed subject matter above. Audio data can be carried between the audio processing unit 123 and the audio codec 132 via a communication link. The audio processing pipeline can output data to the A/V port 140 for reproduction by an external audio player or device having audio capabilities.
The front panel I/O subassembly 130 can support the functionality of the power button 150 and the eject button 152, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the multimedia console 100. A system power supply module 136 can provide power to the components of the multimedia console 100. A fan 138 can cool the circuitry within the multimedia console 100.
The CPU 101, GPU 108, memory controller 110, and various other components within the multimedia console 100 can be interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures.
When the multimedia console 100 is powered on or rebooted, application data can be loaded from the system memory 143 into memory 112 and/or caches 102, 104 and executed on the CPU 101. Such application data can include some of the online derived data, including the avatar packages discussed above. The application can also present a graphical user interface that provides a consistent user experience when navigating to different media types available on the multimedia console 100. Users can accessorize avatars using such a user interface. In operation, applications and/or other media contained within the media drive 144 can be launched or played from the media drive 144 to provide additional functionalities to the multimedia console 100. And such media, including game titles, can be the basis for accessorizing avatars.
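Purely as an illustrative sketch (the function names and data layout are assumptions, not details from the disclosure), the launch-time flow described above might look like the following:

```python
# Hypothetical boot-time flow: cached avatar packages load with the application data.
def load_application_data(system_memory: dict) -> dict:
    # Copy application data (including any previously downloaded avatar packages)
    # from system memory (143) into working memory (112) for execution.
    return dict(system_memory)

def boot_console(system_memory: dict) -> list:
    app_data = load_application_data(system_memory)
    packages = app_data.get("avatar_packages", [])
    # The console's user interface would now let users accessorize avatars with these.
    return packages

cached = {"avatar_packages": ["summer_pack", "racing_trophies"]}
print(boot_console(cached))    # ['summer_pack', 'racing_trophies']
```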
The multimedia console 100 can be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the multimedia console 100 can allow one or more users to interact with the system, watch movies, listen to music, and the like. However, with the integration of broadband connectivity made available through the network interface 124 or the wireless adapter 148, the multimedia console 100 can further be operated as a participant in a larger network community of computing devices. As such a participant, it can interact with computing devices, whether PCs or servers, and receive information that can be eventually stored.
Next,
It should also be noted that the various techniques described herein can be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the presently disclosed subject matter, or certain aspects or portions thereof, can take the form of program code (i.e., instructions) embodied in tangible storage media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, where, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the subject matter.
In the case of program code execution on programmable computers, the computing device can generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs that can utilize the creation and/or implementation of the aspects of the present disclosure, e.g., through the use of a data processing application programming interface (API) or the like, are preferably implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language can be a compiled or interpreted language, and combined with hardware implementations.
Finally, while the present disclosure has been described in connection with a plurality of exemplary aspects, as illustrated in the various figures and discussed above, it is understood that other similar aspects can be used, or modifications and additions can be made to the described aspects, for performing the same function of the present disclosure without deviating therefrom. For example, in various aspects of the disclosure, methods, systems, and computer readable media were described that are configured to provide animation accessorization of avatars. However, other mechanisms equivalent to these described aspects are also contemplated by the teachings herein. Therefore, the present disclosure should not be limited to any single aspect, but rather should be construed in breadth and scope in accordance with the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
5880731 | Liles | Mar 1999 | A |
6227974 | Eilat | May 2001 | B1 |
6229533 | Farmer | May 2001 | B1 |
6268872 | Matsuda et al. | Jul 2001 | B1 |
6385642 | Chlan et al. | May 2002 | B1 |
6545682 | Ventrella et al. | Apr 2003 | B1 |
6692359 | Williams et al. | Feb 2004 | B1 |
6697072 | Russell et al. | Feb 2004 | B2 |
6910186 | Kim | Jun 2005 | B2 |
7006098 | Bickmore et al. | Feb 2006 | B2 |
7275987 | Shimakawa et al. | Oct 2007 | B2 |
7342587 | Danzig et al. | Mar 2008 | B2 |
7425169 | Ganz | Sep 2008 | B2 |
7568004 | Gottfried | Jul 2009 | B2 |
7636755 | Blattner et al. | Dec 2009 | B2 |
7690997 | Van Luchene et al. | Apr 2010 | B2 |
7824253 | Thompson et al. | Nov 2010 | B2 |
7840903 | Amidon et al. | Nov 2010 | B1 |
7849043 | Woolf et al. | Dec 2010 | B2 |
7913176 | Blattner et al. | Mar 2011 | B1 |
8047915 | Lyle et al. | Nov 2011 | B2 |
8099338 | Betzler et al. | Jan 2012 | B2 |
8187067 | Hamilton et al. | May 2012 | B2 |
20010019337 | Kim | Sep 2001 | A1 |
20020067362 | Nocera et al. | Jun 2002 | A1 |
20020068626 | Takeda et al. | Jun 2002 | A1 |
20020151364 | Suchocki | Oct 2002 | A1 |
20030228908 | Caiafa et al. | Dec 2003 | A1 |
20040152512 | Collodi et al. | Aug 2004 | A1 |
20040221224 | Blattner | Nov 2004 | A1 |
20040250210 | Huang et al. | Dec 2004 | A1 |
20050137015 | Rogers et al. | Jun 2005 | A1 |
20050143174 | Goldman et al. | Jun 2005 | A1 |
20050216558 | Flesch et al. | Sep 2005 | A1 |
20050248574 | Ashtekar et al. | Nov 2005 | A1 |
20060046820 | Inamura | Mar 2006 | A1 |
20060121991 | Borinik et al. | Jun 2006 | A1 |
20060143569 | Kinsella et al. | Jun 2006 | A1 |
20060184355 | Ballin et al. | Aug 2006 | A1 |
20060188144 | Sasaki et al. | Aug 2006 | A1 |
20060293103 | Mendelsohn | Dec 2006 | A1 |
20060294465 | Ronen et al. | Dec 2006 | A1 |
20070002057 | Danzig et al. | Jan 2007 | A1 |
20070074114 | Adjali et al. | Mar 2007 | A1 |
20070110298 | Graepel et al. | May 2007 | A1 |
20070111789 | Van Deursen et al. | May 2007 | A1 |
20070113181 | Blattner et al. | May 2007 | A1 |
20070167204 | Lyle et al. | Jul 2007 | A1 |
20070168863 | Blattner et al. | Jul 2007 | A1 |
20070178966 | Pohlman et al. | Aug 2007 | A1 |
20070197296 | Lee | Aug 2007 | A1 |
20070259713 | Fiden et al. | Nov 2007 | A1 |
20070260984 | Marks et al. | Nov 2007 | A1 |
20070273711 | Maffei | Nov 2007 | A1 |
20070293319 | Stamper et al. | Dec 2007 | A1 |
20070298866 | Gaudiano et al. | Dec 2007 | A1 |
20080001951 | Marks et al. | Jan 2008 | A1 |
20080045283 | Stamper et al. | Feb 2008 | A1 |
20080059570 | Bill | Mar 2008 | A1 |
20080076519 | Chim | Mar 2008 | A1 |
20080081701 | Shuster | Apr 2008 | A1 |
20080091692 | Keith et al. | Apr 2008 | A1 |
20080120558 | Nathan et al. | May 2008 | A1 |
20080158232 | Shuster | Jul 2008 | A1 |
20080215974 | Harrison et al. | Sep 2008 | A1 |
20080215975 | Harrison et al. | Sep 2008 | A1 |
20080220876 | Mehta et al. | Sep 2008 | A1 |
20080250315 | Eronen et al. | Oct 2008 | A1 |
20080301556 | Williams et al. | Dec 2008 | A1 |
20080303830 | Fleury et al. | Dec 2008 | A1 |
20080309675 | Fleury et al. | Dec 2008 | A1 |
20080309677 | Fleury et al. | Dec 2008 | A1 |
20090063983 | Amidon et al. | Mar 2009 | A1 |
20090069084 | Reece et al. | Mar 2009 | A1 |
20090106671 | Olson et al. | Apr 2009 | A1 |
20090198741 | Cooper | Aug 2009 | A1 |
20090267960 | Finn et al. | Oct 2009 | A1 |
20090312080 | Hamilton et al. | Dec 2009 | A1 |
20100009747 | Reville et al. | Jan 2010 | A1 |
20100023885 | Reville et al. | Jan 2010 | A1 |
20100035692 | Reville et al. | Feb 2010 | A1 |
20100203968 | Gill et al. | Aug 2010 | A1 |
20100233667 | Wilson et al. | Sep 2010 | A1 |
Number | Date | Country |
---|---|---|
10-2008-0033781 | Apr 2008 | KR |
WO 01059709 | Aug 2001 | WO |
WO 2004053799 | Jun 2004 | WO |
WO 2006107182 | Oct 2006 | WO |
Entry |
---|
Andre, E. et al., “Exploiting Models of Personality and Emotions to Control the Behavior of Animated Interactive Agents”, Downloaded from Internet Dec. 19, 2008, http://www.dfki.de, 5 pages. |
Lisetti, C.L. et al., “MAUI: A Multimodal Affective User Interface”, Multimedia, 2002, http://delivery.acm.org, 161-170. |
Hayes-Roth, B. et al., “Improvisational Puppets, Actors, and Avatars”, Computer Science Department, 2000, 10 pages. |
Vilhjalmsson, H. H., “Autonomous Communicative Behaviors in Avatars”, Submitted to the Program in Media Arts and Sciences, School of Architecture and Planning, MIT, Jun. 1997, 50 pages. |
PCT Application No. PCT/US2009/050606: International Search Report and Written Opinion of the International Searching Authority, Feb. 25, 2010, 11 pages. |
Kafai et al., “Your Second Selves: Avatar Designs and Identity Play in a Teen Virtual World,” http://www.gseis.ucla.edu/faculty/kafai/paper/whyville_pdfs/DIGRA07_avatar.pdf, downloaded 2008, 1-9. |
Vasalou et al., “Constructing My Online Self: Avatars that Increase Self-focused Attention,” http://www.luminainteractive.com/research/downloads/note137-vasalou.pdf, downloaded 2008, 1-4. |
Folea et al., “MPEG-4 SDK: From Specifications to Real Applications,” http://www-artemis.int-evry.fr/Publications/library/Folea-WSEAS-ICC2005.pdf, downloaded 2008, 1-6. |
Bacon, S., “Avatars,” http://www.ymessengerblog.com/blog/category/avatars/feed, downloaded 2008, 1-7. |
Fischer, “Skript: An extendable language flexible avatar control framework for VR applications”, Thesis, 2005. |
U.S. Appl. No. 12/189,067, Final Office Action dated Jul. 6, 2012, 12 pages. |
U.S. Appl. No. 12/271,690, Final Office Action dated Jul. 16, 2012, 12 pages. |
Lee et al., “Interactive Control of Avatars Animated with Human Motion Data,” http://graphics.cs.cmu.edu/projects/Avatar/avatar.pdf, downloaded 2008, 10 pages. |
Roeder, L., “Create a Personalized Yahoo Avatar,” Personal Web About.com, http://personalweb.about.com/od/hacksforprofiles/ss/yahooavatar_5.htm, Jun. 6, 2008, 2 pages. |
Alexa et al., “An Animation System for User Interface Agents,” http://wscg.zcu.cz/wscg2001/Papers_2001/R342.pdf, downloaded 2008, 7 pages. |
U.S. Appl. No. 12/271,690, Non-Final Office Action dated Feb. 16, 2012, 12 pages. |
U.S. Appl. No. 12/189,067, Non-Final Office Action dated Aug. 19, 2011, 8 pages. |
U.S. Appl. No. 12/178,535, Non-Final Office Action dated Aug. 31, 2011, 17 pages. |
U.S. Appl. No. 12/178,535, Final Office Action dated Dec. 29, 2011, 14 pages. |
Wikipedia Web printout: “Need for Speed: Underground”, printed on Aug. 25, 2011. |
“Changing Avatar Appearance with Material Scripting,” http://update.multiverse.net/wiki/index.php/Changing_Avatar_Appearance_With_Material_Scripting, downloaded 2008, 1-10. |
“How to Change Your Avatar's Appearance in Second Life,” http://www.ehow.com/how_2036822_change-avatars-appearance.html, downloaded 2008, 1-2. |
“The Adventure,” http://got.communicatenewmedia.com/track2.php?t=features, downloaded 2008, 1-2. |
Number | Date | Country | |
---|---|---|---|
20100026698 A1 | Feb 2010 | US |