Live toy system

Abstract
The live toy system as described affords a user an experience of interacting with a physical toy as if it were a live toy by communicating and interacting with the toy and its game version (avatar) via a game application. Unlike other products, the system provides a near identical or virtual representation of the physical toy in the online application. The online application provides a virtual world and/or game environment in which the game avatar exists and participates in part by input received from the user. Some of this input is provided to the physical toy from the user, who can affect the appearance or behavior of the game avatar. The user can play and interact with both their physical toy in the real world and its game avatar in an online world, which provides an experience of a single live toy for the user.
Description
BACKGROUND

Virtual world websites have been a popular gaming and activity outlet for all ages. With a simple user account, users can enter imaginary worlds online and interact with the site content as well as other users through their online characters.


Ganz, the assignee of the present application, has numerous patents that took the online virtual world a step further, when it first introduced a plush toy having a unique code and a related virtual online world in which the toy's code is entered in order to obtain a virtual version of the toy in the virtual world. Once the plush toy was registered online in this manner, the user was able to play with their virtual toy in the virtual world. Ganz's patents describe how the user could decorate a home for the toy, play games and earn items such as food, clothes and home furnishings for the toy.


SUMMARY

Described herein is a live toy system and method that affords a user an experience of interacting with a physical toy as if it were a live toy by communicating and interacting with the physical toy and its game avatar via a game application. The system provides a near identical representation of the physical toy in the game application. The game application provides a virtual world in a game-based environment in which the game avatar exists and participates in game activities in part by input received from the user. Some of this input comes from the user via the physical toy. The physical toy can receive input from the user and then transmit that input data or some responsive data to the game avatar in the game application using a Bluetooth® or similar connection between the physical toy and a computing device running the game application. Therefore, the user can affect the appearance or behavior of the game avatar by way of the physical toy. The user can play and interact with both their physical toy in the real world and its game avatar in an online world, which provides an experience of a single live toy for the user.


The live toy system can include a physical toy comprising a master control unit, an input component for receiving incoming data, a communications control component for managing, handling and transmitting data, and an output component comprising a speaker, wherein the master control unit controls the input component, the communications control component and the output component by processing data associated with each component, managing requests and processing and tracking queued requests; a server computer that is connected to at least one computing device and that is programmed to create signals which are communicated to cause a display to be generated on the at least one computing device as a game application, wherein the display shows a virtual world in which a replica image of the physical toy exists in the virtual world as a game avatar; wherein the physical toy is registered to a user account via a unique code in the game application on the at least one computing device, the physical toy and the game avatar are connected to each other via the computing device and the game application, and the physical toy and the game avatar each create requests and communicate the requests to each other, where upon the physical toy's completion of a request sent by the game avatar, the game avatar receives data corresponding to the completion from the physical toy and outputs a reaction on the display of the game application, where upon the game avatar's completion of a request sent by the physical toy, the physical toy receives data from the computing device corresponding to the completion which triggers an output event from the output component, wherein the physical toy and the game avatar collectively represent one toy that exists in the virtual world and in a real world at the same time and where physical interaction with the physical toy directly affects the game avatar in the virtual world and playing with the game avatar in the game application results in responsive reactions in the physical toy.


Also, a method is described that includes providing a physical toy having an input component for receiving incoming data, a communications control component for managing, handling and transmitting data, and an output component comprising a speaker; providing at least one computing device that is wirelessly connected to the physical toy and comprises a display; and using a server computer that is connected to the at least one computing device and that is programmed to create signals which are communicated to cause a display to be generated on the computing device as a game application, wherein the display shows a virtual world in which a replica image of the physical toy exists in the virtual world as a game avatar; wherein the physical toy is registered to a user account via a unique code in the game application on the at least one computing device, the physical toy and the game avatar are connected to each other via the computing device and the game application, and the physical toy and the game avatar each create requests and communicate the requests to each other for completion, where upon the physical toy's completion of a request sent by the game avatar, the game avatar receives data corresponding to the completion from the physical toy and outputs a reaction on the display of the game application, where upon the game avatar's completion of a request sent by the physical toy, the physical toy receives data from the computing device corresponding to the completion which triggers an output event from the output component, wherein the physical toy and the game avatar represent a character that exists in the virtual world and in a real world at the same time and where physical interaction with the physical toy affects the game avatar in the virtual world and playing with the game avatar generates responses in the physical toy.


Another method is described that includes using a server computer that is communicating with multiple computing devices and that is programmed to create signals which are communicated to cause a display to be generated on a computing device as a game application, wherein the display creates a virtual world in which a replica image of a physical toy that exists in a real world also exists in the virtual world as a game avatar; using the server computer to assign the physical toy to an account in the game application via a unique validation code for the physical toy, which causes the game avatar to be displayed in the game application; wherein the server computer sends data to and receives data from the physical toy via a wireless communication between the physical toy and the computing device and causes the game application to be updated according to the data received from the physical toy; and wherein the server computer causes the game avatar to be responsive to the physical toy in the virtual world by exchanging data communications with the physical toy in the real world and in real time when the computing device is paired with the physical toy and the game application is open, and further whereby the game avatar creates requests and communicates them to the physical toy, and the server computer receives and processes data corresponding to sensor input from the physical toy and the game application is updated as a result of the sensor input from the physical toy.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a live toy system as described herein that facilitates a user's interaction with a physical or real toy and an online application to effectuate a live-acting physical toy.



FIG. 2 is a block diagram of various features built into a physical toy which can be used in the live toy system as discussed in FIG. 1.



FIG. 3 is an exemplary schematic diagram of the live toy system of FIG. 1 as it would operate for a player/user.



FIG. 4 is an exemplary schematic diagram of the live toy system of FIG. 1 as it would operate for a player/user.



FIG. 5 is an exemplary schematic flow diagram that represents an operation of the physical toy in the live toy system of FIG. 1.





DETAILED DESCRIPTION

The live toy system as described herein affords a user an experience of interacting with a physical toy as if it were a live toy by communicating with the toy via an online software or game application. Unlike other products, the system provides a near identical or virtual representation of the physical toy in the online application. The online application provides a virtual world and/or game environment in which the virtual toy exists as a game avatar and participates in part by input received from the user. Some of this input is provided to the physical toy from the user, who can affect the appearance or behavior of the game avatar in the online application. The physical toy and game avatar come together as one to form a single character. As will be described in more detail below, the user is able to play and interact with both their physical toy and an identical looking electronic version of their physical toy in an online world—the game avatar, where the game avatar and the physical toy together create an experience of a single character or live toy for the user.


According to FIG. 1, a block diagram of a live toy system 100 is shown which facilitates two-way communication as part of a game world. The game world via an App or application interacts with a physical toy, and the physical toy can interact with the App as well, and in particular with a game version of itself.


The system 100 comprises a physical toy or physical toy object 120 and a game application (App) 130 that is located on a separate or remote computing device 140. Examples of the physical toy or physical toy object 120 include plush or plastic toy objects. Examples of a mobile or remote device 140 include a smart phone or tablet or any other type of computing device such as a laptop or computer.


The physical toy 120 exists in the game application as a game avatar having an appearance that is based on and recognizable as the physical toy; however, the game avatar can have more clothes and accessories to wear, can be fed, and can perform activities in the virtual game world. The App 130 operates on a computing device which is paired with the toy. When paired, the physical toy and the game avatar communicate and interact with each other via the App 130. This creates a more fulfilling play experience because the player/user can interact with the physical toy through the game avatar and, by doing so, the physical toy and the game avatar come together resulting in a live toy.


The physical toy 120 comprises a master control unit 125, an input component 150, a communications control component 160 and an output component 170. The master control unit 125 controls the input component, the communications control component and the output component. In one embodiment, the input component 150 can include a microphone to pick up sounds such as the user's voice. It can also include one or more sensors that are programmed to detect touch, sound, or external temperature such as room temperature. The output component can include a speaker and/or a display screen.


The communications control component 160 in the physical toy 120 manages and processes incoming and outgoing data from the physical toy. This includes incoming and outgoing data to and from a player/user and to and from the paired application 130. The data passed between the physical toy 120 and the paired application 130 may be characterized as requests. A request sent from one side may warrant some kind of task to be completed by the other side—for example, if the request was sent from the game avatar in the paired application to the physical toy, then the physical toy may need to complete the task in order to yield or elicit some other reaction or result in the application 130. The opposite would apply as well. If the request is sent from the physical toy to the game avatar in the paired application, then the game avatar in the paired application may need to complete a task (via the user playing the game).
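The bidirectional request exchange described above can be sketched in code. The following Python is a minimal illustrative model, not an implementation from the disclosure; the names `Request` and `complete` are assumptions chosen for clarity:

```python
from dataclasses import dataclass, field
import itertools

_ids = itertools.count(1)  # simple illustrative request-id source

@dataclass
class Request:
    """A task sent between the physical toy and the game avatar."""
    origin: str          # "toy" or "avatar" (the side that made the request)
    kind: str            # e.g. "hug", "feed", "play_game"
    request_id: int = field(default_factory=lambda: next(_ids))
    completed: bool = False

def complete(request: Request) -> str:
    """Mark a request done and return the side that produces the
    responsive output: the side opposite the request's origin."""
    request.completed = True
    return "avatar" if request.origin == "toy" else "toy"
```

For example, when the game avatar asks for a hug, `complete(Request("avatar", "hug"))` returns `"toy"`, reflecting that the physical toy's completion elicits a reaction in the App, and vice versa.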


The communications control component 160 includes a processor in order to analyze data and handle it properly. As alluded to above, data can be transmitted via wireless communication protocols such as Bluetooth®. The physical toy 120 and the game application 130 on the computing device 140 can be connected via Bluetooth® so that user input from the physical toy 120 can be readily transmitted in a proper format and received by the computing device 140 via a receiver 145. For example, a user may touch the physical toy 120 and the touch is detected by at least one sensor embedded in the physical toy 120. The touch data is processed and transmitted to the user's account in the paired application 130. The paired application 130 processes the data via a processor 190 and the result can be displayed on the computing device screen (display 180). The result can also be communicated back to the physical toy 120 and then provided as output from the physical toy such as in the form of sounds or words through a speaker (output component 170) built into the physical toy 120.


Referring now to FIG. 2, there is an exemplary image of a physical toy 200 that can be used in the live toy system 100 as discussed in FIG. 1. The physical toy 200 resembles a plush toy or stuffed animal; however, it should be appreciated that any type of toy, including figurines or other objects, can be used in the live toy system 100 and operate as described herein. The physical toy 200 can be made of any material such as textiles, ceramic, silicone and/or plastic or any combination thereof.


Also in FIG. 2 is a master control unit 210 that comprises one or more processors or microprocessors which manage or regulate at least one touch sensor 220 and optional LED light(s) 230, Bluetooth® LE 240, a sound chip or microprocessor 250, and a battery 260. The master control unit manages requests made bi-directionally between the physical toy and the App 130. The sound chip 250 is connected to a speaker 270 with optional volume control built into the toy 200, and the battery 260 is connected to a power switch 280 and a USB charging port 290. In addition, the one or more touch sensors 220 can be positioned anywhere on the toy such as the head, back and/or paws to allow direct communication between the physical toy and the App/game avatar.


The master control unit 210 also includes Flash RAM for storing physical toy data, a request queue and a unique identifier so that the App 130 syncs with the selected physical toy. The sound chip 250 includes sound RAM/ROM for storing multiple sampled sound effects that can be triggered by the physical toy 120 or by the App 130.


The Bluetooth® communication allows for bi-directional communication between the physical toy 120 and the App 130 running on the computing device 140. The physical toy also has a button to pair with Bluetooth® on the device 140. The pairing button can be the same or a different button as the power on/off button. Alternatively, the pairing can be carried out in software without a button.


The physical toy 120 can be paired with any Bluetooth-enabled device. Once paired, the toy 120 can automatically connect to the device whenever it is in range. The physical toy 120 has three primary modes: unpaired, paired but App closed, and paired with App open.
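The three primary modes can be derived from two booleans, the pairing state and the App state. The Python below is an illustrative sketch under that assumption; the enum and function names are not terms from the disclosure:

```python
from enum import Enum

class ToyMode(Enum):
    """The physical toy's three primary modes."""
    UNPAIRED = "unpaired"
    PAIRED_APP_CLOSED = "paired_app_closed"
    PAIRED_APP_OPEN = "paired_app_open"

def current_mode(paired: bool, app_open: bool) -> ToyMode:
    """Derive the toy's mode from pairing and App state.
    An unpaired toy is UNPAIRED regardless of the App."""
    if not paired:
        return ToyMode.UNPAIRED
    return ToyMode.PAIRED_APP_OPEN if app_open else ToyMode.PAIRED_APP_CLOSED
```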


When the App 130 is open and the physical toy 120 is paired to the device, the App 130 will recognize that the physical toy 120 is paired. In one embodiment, if the App 130 is open and the physical toy is paired to the device, but the player has not activated the physical toy—that is, no data is sent from the toy 120 to the App 130, then the App 130 can ask if the player wants to play with the physical toy 120.


If the player has multiple physical toys assigned to the user account, then the App 130 can ask if the player would like to switch to a different physical toy 120. If the App 130 is closed and then later opened, the App 130 can be programmed to open with the most recently played physical toy as the active toy in the App 130.


While the App 130 is open and the physical toy is paired, the system will send commands to the active physical toy to trigger an output event such as sounds in the physical toy—for example, when any of the following events occur:

    • A toy request is initiated
    • A toy request is completed
    • A game is won
    • A game is lost
    • The game avatar is fed
    • The game avatar's outfit is changed
    • The game avatar is put to bed (optional)
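One way to realize the event-to-sound triggering listed above is a lookup table from App events to sound commands. This Python sketch is illustrative only; the event keys and sound names are assumptions loosely based on the events and sound categories described herein:

```python
# Hypothetical mapping from App events to sound commands sent to the toy.
EVENT_SOUNDS = {
    "request_initiated": "awake",
    "request_completed": "happy",
    "game_won": "happy",
    "game_lost": "sad",
    "avatar_fed": "eating",
    "outfit_changed": "happy",
    "put_to_bed": "sleep",   # optional event
}

def command_for(event):
    """Return the sound command for an App event,
    or None if the event triggers no sound."""
    return EVENT_SOUNDS.get(event)
```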


Sounds on the physical toy include:

    • Awake—a neutral sound that can be used when the pet becomes active and/or for generic requests
    • Happy—3-5 variations
    • Sad—2-3 variations
    • Eating—2-3 variations
    • Hungry—used for feeding requests
    • Playful—used for game requests


Other events may also occur to trigger an output event in the physical toy in the form of a sound representing various emotions/reactions. Other sounds aside from those listed are contemplated as well. Additionally, while paired and the App 130 is open, more specific sounds, such as sleeping, can be streamed directly from the App 130 to the physical toy's speaker. Optionally, if the active physical toy has sensors, the App 130 will recognize when the sensors are being touched/used or otherwise activated. The App 130 and the game character can be responsive to the data received from the sensors.


The App 130 can also use sensors in the physical toy as inputs to a game. In one embodiment, the game avatar in the App 130 may want some attention by saying “I'm feeling down. Can I have a hug?”. The App 130 transmits this as a request to the physical toy 120 and the sensors embedded in the physical toy 120 detect touch data corresponding to a hug. That touch data is communicated back to the App 130, and the game avatar may respond such as by giving a reward, points and/or providing some other type of response (e.g., a visual or audible response). In another embodiment, the physical toy can say or express “I'm getting hungry. Can I have a snack?”. This request is communicated to the App 130, where the player can feed the game avatar. Once fed, the App 130 communicates that the request is completed to the physical toy and the physical toy may respond with an appropriate sound.
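The App-side handling of the hug interaction above can be sketched as a simple check against the toy's reported touch data. This is an illustrative Python fragment, assuming a hypothetical sensor level and reward values not specified in the disclosure:

```python
HUG_THRESHOLD = 0.5  # illustrative touch-sensor level that counts as a hug

def handle_hug_request(touch_level):
    """App-side handling of touch data returned after a hug request:
    if the sensors detected a hug, reward the player and pick a
    reaction for the game avatar; otherwise leave the request open."""
    if touch_level >= HUG_THRESHOLD:
        return {"reward_points": 10, "reaction": "happy"}
    return None  # no hug detected; the request remains pending
```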


The App 130 can send the game avatar's current Happiness/Hunger/Energy stats to the physical toy on the one minute “heartbeat” to be stored in the toy's Flash RAM.
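The one-minute heartbeat can be modeled as a periodic push of the avatar's stats into the toy's local storage. The Python below is a minimal sketch; the class and field names stand in for the Flash RAM storage and are not from the disclosure:

```python
import time

HEARTBEAT_SECONDS = 60  # stats are pushed once per minute

class ToyFlashRAM:
    """Illustrative store for avatar stats pushed on the heartbeat."""
    def __init__(self):
        self.stats = {}
        self.last_heartbeat = 0.0

    def store_stats(self, happiness, hunger, energy, now=None):
        """Record the latest Happiness/Hunger/Energy values and the
        time they were received."""
        self.stats = {"happiness": happiness, "hunger": hunger,
                      "energy": energy}
        self.last_heartbeat = now if now is not None else time.time()
```

Storing the stats locally lets the toy react plausibly (for example, playing a hungry sound) even when the App is later closed.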


When the player plays with the paired physical toy while the App is closed or otherwise not communicating, the master control unit 210 determines when it will make a request to the App. The request is selected from the options available in the Flash RAM's Request Queue. The physical toy 200 can play a request sound based on the type of request selected. The master control unit 210 stores the request in the request queue until the App is opened or begins to communicate with the physical toy 200.


As stated earlier, if the active (paired) physical toy is not the active game avatar in the App when the App is opened, the App will automatically switch to the correct game avatar in the App. The unique identifier in the active physical toy is communicated to the App and the corresponding game avatar in the App account is put into play. The corresponding game avatar in the App is a nearly, if not completely, identical image of the physical toy to create the effect of the physical toy 200 having an in-game or in-App existence. The physical toy can tell the App which request from the queue it wants to trigger. If the request is ignored, after a brief amount of time such as 3 minutes (the length of a request), that request will be removed from the request queue in the physical toy. If there are no requests available in the queue, no request will be made by the toy.
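The queue behavior above, where an ignored request expires after a brief period such as 3 minutes, can be sketched as a queue with a per-request lifetime. This Python model is illustrative; the class name and the use of explicit timestamps are assumptions:

```python
REQUEST_LIFETIME = 180  # seconds; ignored requests expire after ~3 minutes

class RequestQueue:
    """Illustrative toy-side queue in which requests expire if ignored."""
    def __init__(self):
        self._items = []  # list of (made_at, request) pairs

    def add(self, request, made_at):
        self._items.append((made_at, request))

    def pending(self, now):
        """Drop expired requests, then return those still live.
        If the queue is empty, no request will be made by the toy."""
        self._items = [(t, r) for t, r in self._items
                       if now - t < REQUEST_LIFETIME]
        return [r for _, r in self._items]
```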


The App 130 can also send one or more game avatar requests to the physical toy's Flash RAM to be used as a “queue” of requests that can be accessed while the App is not open. If the physical toy is not paired with the device, then interactive play between the physical toy and the game avatar in the App is not available. However, the physical toy can be played with and appropriate input data collected by its sensors can be processed and stored and then later transmitted to the App when the physical toy is paired with the App. In addition, or in the alternative, if the physical toy has sensors, the sensors can directly trigger sounds in the toy when petted, hugged, etc. The master control unit can also recognize that the physical toy is not being used and occasionally make either sad sounds or other alerting sounds that simply encourage kids to hug/pet the toy. Optionally, the same functionality for paired physical toy with App closed can be available while the physical toy is not paired. The physical toy will simply wait for both the Bluetooth connection to be made AND the App to be opened.


Regarding power use, to avoid battery drainage, the physical toy has multiple power saving features. For example, the physical toy can be manually shut off, such as by holding the pair/power button down for 5 seconds. The physical toy can also shut down automatically or go into "sleep" mode when it has not been actively played with, and/or has not been paired with the App while the App is open and available for pairing, for a designated amount of time, such as 5 minutes.
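The automatic sleep decision reduces to an idle-time comparison. This Python sketch is illustrative, using the example values mentioned above (5 seconds to power off, 5 minutes of inactivity); neither constant name appears in the disclosure:

```python
IDLE_TIMEOUT = 5 * 60      # seconds of inactivity before entering sleep mode
HOLD_TO_POWER_OFF = 5      # seconds the pair/power button is held to shut off

def should_sleep(last_activity, now, idle_timeout=IDLE_TIMEOUT):
    """Decide whether the toy should enter sleep mode to save battery,
    based on the elapsed time since the last play or pairing activity."""
    return (now - last_activity) >= idle_timeout
```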


Turning now to FIG. 3, there is a schematic flow diagram 300 illustrating a live toy system in operation such as the system 100 discussed in FIG. 1. According to the diagram 300, a physical toy 310 is connected via Bluetooth® technology to an App located on a mobile computing device. The App is a game where a player has set up an account and has registered a code associated with the physical toy 310 in the account. In the game App, the registered physical toy appears virtually identical in the game—referred to as the game avatar herein. The game avatar is a character in the game App that the player interacts with, in part, via playing the game or playing activities in the game and in part, via the physical toy. In the game App, the player can perform various actions such as, but not limited to, playing the game or various activities in the game, feeding the game avatar, putting the game avatar to bed or to sleep, winning prizes for the game avatar, completing quests or challenges to benefit the game avatar or overall position in the game, buying gifts or other items available for purchase in the game App and the like. Information or data concerning any one of these events can be communicated to the paired physical toy. The master control unit processes the data and chooses a sound effect that is pre-determined as an appropriate response to such data and then the sound chip plays a sound which can be heard by the player/user via the physical toy's built-in speaker. For example, there could be sound effects assigned as part of a table for various reactions or feelings providing separate sound effects for different emotions or states such as excited, hungry, tired, sad, needy, awake, or confused. The physical toy expresses or acts out the feeling or reaction that the game avatar elicits, bringing it to life for the player/user.


In FIG. 4, there is shown an exemplary schematic 400 of the live toy system 100 in operation. A physical toy 410 connects to an open App via Bluetooth® and sends a request to the App. A server supporting the App processes the request to determine its specific details and displays the request on the App's screen, such as "Karl, can you dress me in a new shirt?" This request is from the physical toy to the App and, specifically, to the game avatar in the App. The App displays two options for the player to choose from: Ok or Not now. The player/user completes the request by choosing "Ok" and the screen then displays "Do you like it? Is it me?" and an in-game reward is added to the account for completing the request. In response to the on-screen questions, the physical toy outputs a happy or excited sound from its speaker. Multiple sounds may be triggered to exhibit "happy" or "excited" or any other feeling/mood.


The App also updates the game avatar account to reflect the reward and a mood indicator as well as any other game avatar stats there may be—such as hunger status, bored status, energy level, or social level (whether the game avatar wants to engage in social play with other active players or with other physical toys that may be paired to the same account in the App). In addition, the request queue is also updated to show that a request was completed and/or any content associated with it. In practice, for instance, an account in the App may include multiple game avatars, with each of the game avatars corresponding to a different physical toy. A dashboard view, for instance, can show the current status of each game avatar registered to the user's account.


If a user has multiple physical toys and corresponding game avatars registered in the App, it is also possible for those multiple game avatars (registered to the same user account) to interact with each other. The physical toys would be triggered from the respective game avatars as described above in the figures.


The App may also support interaction or game play between more than one user. In one embodiment, multiple game avatars registered to different user accounts can participate in activities or play games in the App. Different users and their respective accounts can be linked or otherwise connected via the App and their respective game avatars can interact with each other in the App. The corresponding physical toys would be triggered by their respective game avatars from the App as described in the figures. In another embodiment, a first user can use his/her game avatar to send a request to a second user's game avatar in the App. The request is then communicated from the second user's game avatar to the second user's physical toy. The physical toy completes the request via the second user; and that completed request (or response data) is communicated back to the second user's game avatar. The response data is then communicated to the first user's game avatar and then to the first user's physical toy.


As a further example, there could be requirements for the different game avatars to satisfy before they can play with each other. For instance, if at least two physical toys are paired to the App and both corresponding game avatars have high enough social levels or other status indicators that indicate they want to play or be played with, at least one of them may express that the two game avatars want to interact with each other. This can be expressed on screen in the App and then also communicated into one or more requests made from the App to the physical toy and the physical toy may play some sounds in order to grab the attention of the player/user and obtain a response from the player/user.


Now referring to FIG. 5, another exemplary schematic flow diagram is shown to represent an operation of the physical toy 510. A user can play with the physical toy 510 via the App and the physical toy 510 can also include an internal timer. The internal timer can track if a response is received in response to a request made in the App. The request may be one from various categories that would be satisfied in the App such as play game (in App), feed me, need sleep, love me, dress me, bathe me, shop for me, and the like. With each request or each category of request, the physical toy can output one or more sound effects that are appropriate for the type of request made. The sound effects may be repeated such as every 30 seconds until the request is acknowledged or satisfied or until some other amount of time threshold is satisfied—such as when 3 minutes has elapsed since the request was made (logged) without a response to the request. A threshold time can be programmed to determine an “ignored” request. If the response is deemed to be ignored, then an ignore counter can increase by one and the internal timer can be reset. When the ignore counter reaches 5 ignores or some other determined number of ignores, the physical toy may be silenced or put to sleep until the physical toy is paired with the App again. By silencing the physical toy or putting the physical toy to sleep, battery power is conserved.
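The internal timer and ignore counter described for FIG. 5 can be sketched as a small state machine using the example values given above (30-second sound repeats, a 3-minute ignore threshold, 5 ignores before silencing). The Python below is an illustrative model; the class and method names are assumptions:

```python
REQUEST_TIMEOUT = 180   # seconds before a request counts as ignored
REPEAT_INTERVAL = 30    # seconds between repeated request sound effects
MAX_IGNORES = 5         # ignores before the toy silences itself

class RequestMonitor:
    """Illustrative ignore-tracking for the toy's internal timer."""
    def __init__(self):
        self.ignores = 0
        self.timer = 0
        self.silenced = False

    def tick(self, seconds, response_received):
        """Advance the timer; reset on a response, otherwise count an
        ignore once the timeout elapses, and silence the toy (to save
        battery) after too many ignores."""
        if self.silenced:
            return  # asleep until paired with the App again
        if response_received:
            self.ignores = 0
            self.timer = 0
            return
        self.timer += seconds
        if self.timer >= REQUEST_TIMEOUT:
            self.ignores += 1
            self.timer = 0  # reset the internal timer for the next request
            if self.ignores >= MAX_IGNORES:
                self.silenced = True
```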

Claims
  • 1. A live toy system comprising: a physical toy comprising a master control unit, an input component for receiving incoming data, a communications control component for managing, handling and transmitting data, and an output component comprising a speaker, wherein the master control unit controls the input component, the communications control component and the output component by processing data associated with each component, managing requests and processing and tracking queued requests; a server computer that is connected to at least one computing device and that is programmed to create signals which are communicated to cause the display to be generated on the at least one computing device as a game application, wherein the display shows a virtual world in which a replica image of the physical toy exists in the virtual world as a game avatar; wherein the physical toy is registered to a user account via a unique code in the game application on the at least one computing device, the physical toy and the game avatar are connected to each other via the computing device and the game application, and the physical toy and the game avatar each create requests and communicate the requests to each other, where upon the physical toy's completion of a request sent by the game avatar, the game avatar receives data corresponding to the completion from the physical toy and outputs a reaction on the display of the game application, where upon the game avatar's completion of a request sent by the physical toy, the physical toy receives data from the computing device corresponding to the completion which triggers an output event from the output component, wherein the physical toy and the game avatar collectively represent one toy that exists in the virtual world and in a real world at the same time and where physical interaction with the physical toy directly affects the game avatar in the virtual world and playing with the game avatar in the game application results in responsive reactions in the physical toy.
  • 2. The system of claim 1, wherein the physical toy is made of a plush or plastic material.
  • 3. The system of claim 1, wherein the master control unit manages battery power and usage of the physical toy.
  • 4. The system of claim 1, wherein the physical toy further comprises a wireless device, which transmits data between the physical toy and the game avatar when paired with the computing device and when the game application is open.
  • 5. The system of claim 4, wherein the physical toy automatically pairs with the computing device when the game application is open, and a wireless connection is enabled on both the physical toy and the computing device.
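Claim 5 makes automatic pairing conditional: it occurs only when the game application is open and a wireless connection is enabled on both the physical toy and the computing device. A sketch of that precondition, with illustrative parameter names:

```python
def should_auto_pair(game_app_open: bool,
                     toy_wireless_on: bool,
                     device_wireless_on: bool) -> bool:
    """Claim 5's precondition for automatic pairing: the game application
    is open AND wireless is enabled on both the toy and the device.
    All names here are illustrative, not taken from the patent."""
    return game_app_open and toy_wireless_on and device_wireless_on
```

If any one condition fails — the app is closed, or either radio is off — no pairing is attempted.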
  • 6. The system of claim 1, wherein the physical toy further comprises at least one of a touch sensor, a microphone, a sound sensor, storage memory, LED lights, a sound chip, a speaker, a battery, a charging port and a power switch.
  • 7. The system of claim 1, wherein the game application is associated with a user account and the physical toy is associated with the user account and the physical toy is specified in the user account as the game avatar.
  • 8. The system of claim 1, wherein the physical toy further comprises an internal timer that counts a number of requests sent to the physical toy from the game avatar and determines how many requests within a prescribed amount of time have been ignored, and, if a threshold number is satisfied, the physical toy is signaled to switch to a reduced power mode.
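The internal timer of claim 8 — count requests from the avatar, track which ones go ignored within a time window, and drop to reduced power past a threshold — could work along these lines. The window length, threshold value, and all names are illustrative assumptions; the claim does not specify them.

```python
class IgnoredRequestTimer:
    """Sketch of claim 8's ignored-request timer. Requests are recorded
    with a timestamp; responses mark the oldest unanswered request as
    answered; past a threshold of ignored requests within the window,
    the toy should switch to a reduced power mode."""
    def __init__(self, window_s=300, threshold=3):
        self.window_s = window_s       # prescribed amount of time (assumed)
        self.threshold = threshold     # threshold number (assumed)
        self.sent = []                 # [timestamp, answered?] per request

    def record_request(self, now):
        self.sent.append([now, False])

    def record_response(self, now):
        # mark the oldest still-unanswered request as answered
        for entry in self.sent:
            if not entry[1]:
                entry[1] = True
                break

    def should_reduce_power(self, now):
        recent = [e for e in self.sent if now - e[0] <= self.window_s]
        ignored = sum(1 for e in recent if not e[1])
        return ignored >= self.threshold
```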
  • 9. The system of claim 1, wherein multiple physical toys along with their corresponding game avatars, which are associated with different user accounts, are connected to the game application and are permitted to interact with each other by way of the game application.
  • 10. The system of claim 1, wherein there is a first user associated with the user account, and a second user is associated with a second user account, and wherein the server computer enables the first user, using their game avatar, to send a first request to the second user's game avatar, and the first request is communicated from the second user's game avatar to the second user's physical toy; and responsive to the second user's physical toy completing the first request, information indicative of the first request having been finished is communicated from the second user's game avatar to the first user's game avatar and then is transmitted to the first user's physical toy.
  • 11. A method comprising:
providing a physical toy having an input component for receiving incoming data, a communications control component for managing, handling and transmitting data, and an output component comprising a speaker;
providing at least one computing device that is wirelessly connected to the physical toy and comprises a display; and
using a server computer that is connected to the at least one computing device and that is programmed to create signals which are communicated to cause a display to be generated on the computing device as a game application, wherein the display shows a virtual world in which a replica image of the physical toy exists in the virtual world as a game avatar;
wherein the physical toy is registered to a user account via a unique code in the game application on the at least one computing device, the physical toy and the game avatar are connected to each other via the computing device and the game application, and the physical toy and the game avatar each create requests and communicate the requests to each other for completion;
wherein upon the physical toy's completion of a request sent by the game avatar, the game avatar receives data corresponding to the completion from the physical toy and outputs a reaction on the display of the game application;
wherein upon the game avatar's completion of a request sent by the physical toy, the physical toy receives data from the computing device corresponding to the completion, which triggers an output event from the output component; and
wherein the physical toy and the game avatar represent a character that exists in the virtual world and in a real world at the same time, and wherein physical interaction with the physical toy affects the game avatar in the virtual world and playing with the game avatar generates responses in the physical toy.
  • 12. The method of claim 11 further comprising pairing the physical toy and the computing device via a wireless connection, and when paired and the game application is open, the game avatar corresponding to a paired physical toy is chosen for play in the game application.
  • 13. The method of claim 12 further comprising sending a request from the game application to the physical toy when the game application has not detected activity or other data received from the physical toy within a prescribed amount of time.
  • 14. The method of claim 13, wherein the request comprises a task to be completed and when completed, data is communicated to a source of the request to generate at least one of a visual or audible response by the source of the request following completion of the task.
  • 15. The method of claim 11 further comprising detecting touch input from one or more touch sensors in the physical toy and transmitting data associated with the input to the game application via the computing device.
  • 16. The method of claim 11 further comprising storing requests made by the game application to the physical toy in a request queue when the physical toy is not paired with the computing device, and processing the requests in order when the physical toy is paired and the game application is open.
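The offline queue of claim 16 — hold requests while the toy is unpaired, then drain them in order once the toy is paired and the game application is open — can be sketched as below. Class and method names, and the immediate-delivery path, are illustrative assumptions.

```python
from collections import deque

class RequestQueue:
    """Sketch of claim 16: requests made while the physical toy is not
    paired are queued and processed in order once the toy is paired
    and the game application is open."""
    def __init__(self):
        self.queue = deque()
        self.paired = False
        self.app_open = False
        self.processed = []    # requests delivered to the toy, in order

    def make_request(self, task):
        if self.paired and self.app_open:
            self.processed.append(task)   # deliver immediately
        else:
            self.queue.append(task)       # hold until paired and open

    def on_paired(self):
        self.paired = True
        self._flush()

    def on_app_open(self):
        self.app_open = True
        self._flush()

    def _flush(self):
        # drain the backlog in FIFO order once both conditions hold
        if self.paired and self.app_open:
            while self.queue:
                self.processed.append(self.queue.popleft())
```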
  • 17. The method of claim 11, wherein there is a first user associated with the user account, and there is a second user associated with a second user account, and wherein the method further comprises:
using the game avatar of the first user, the first user sends a first request to the second user's game avatar;
communicating the first request from the second user's game avatar to the second user's physical toy; and
responsive to the second user's physical toy completing the first request, communicating information indicative of the first request having been finished from the second user's game avatar to the first user's game avatar and then to the first user's physical toy.
  • 18. The method of claim 11 further comprising linking the physical toy to the game application via a unique code assigned to the physical toy, wherein upon a successful link, the game avatar image which substantially replicates the physical toy is displayed in the game application.
  • 19. The method of claim 11, wherein multiple game avatars and corresponding physical toys associated with different user accounts are connected to the game application and are permitted to interact with each other by way of the game application.
  • 20. A method comprising:
using a server computer that is communicating with multiple computing devices and that is programmed to create signals which are communicated to cause a display to be generated on a computing device as a game application, wherein the display creates a virtual world in which a replica image of a physical toy that exists in a real world also exists in the virtual world as a game avatar;
using the server computer to assign the physical toy to an account in the game application via a unique validation code for the physical toy, which causes the game avatar to be displayed in the game application;
wherein the server computer sends data to and receives data from the physical toy via a wireless communication between the physical toy and the computing device, and causes the game application to be updated according to the data received from the physical toy;
wherein the server computer causes the game avatar to be responsive to the physical toy in the virtual world by exchanging data communications with the physical toy in the real world, in real time, when the computing device is paired with the physical toy and the game application is open, and further whereby the game avatar creates requests and communicates them to the physical toy, and the server computer receives and processes data corresponding to sensor input from the physical toy and the game application is updated as a result of the sensor input from the physical toy; and
wherein the server computer causes request reminders to be displayed in the game application and also pings the physical toy with request reminders when the physical toy is paired with the computing device and the game application is open.
  • 21. The method of claim 20, wherein each request comprises a task to be completed and, when completed, data is communicated to a source of the request to generate a visual or audible response from the source of the request following completion of the task.
  • 22. The method of claim 20, wherein the server computer tracks requests sent to the physical toy in a queue in the game application until data is received from the physical toy in response to each request.
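Claims 20 and 22 together describe server-side bookkeeping: each request sent to the physical toy is tracked until response data is received, and while the toy is paired and the game application is open, the server surfaces reminders for whatever is still outstanding. A sketch under those assumptions — the tracker name, request IDs, and the reminder format are all illustrative:

```python
class ServerRequestTracker:
    """Sketch of claims 20 and 22: track each request sent to the
    physical toy until data is received in response, and produce
    reminder pings only while the toy is paired and the game
    application is open."""
    def __init__(self):
        self.outstanding = {}    # request id -> task, in send order
        self._next_id = 0

    def send_request(self, task):
        rid = self._next_id
        self._next_id += 1
        self.outstanding[rid] = task   # tracked until a response arrives
        return rid

    def receive_response(self, rid):
        # data received from the toy closes out the tracked request
        self.outstanding.pop(rid, None)

    def reminders(self, paired, app_open):
        if not (paired and app_open):
            return []   # an unpaired toy cannot be pinged
        return [f"reminder: {t}" for t in self.outstanding.values()]
```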
Related Publications (1)
Number Date Country
20210370182 A1 Dec 2021 US