Metronome for competitive gaming headset

Information

  • Patent Grant
  • Patent Number
    10,561,942
  • Date Filed
    Monday, May 15, 2017
  • Date Issued
    Tuesday, February 18, 2020
Abstract
A system, method, and processor-readable storage medium for providing metronome signals are disclosed. An example method includes receiving at least one metronome timing profile, obtaining an audio stream of a computer game, receiving a player instruction or software instruction, and generating, based on the player or software instruction, metronome signals according to the at least one metronome timing profile. The metronome signals and the audio stream can be combined and output simultaneously. The players can also provide settings for the metronome signals such as a selection of a sound or a frequency. The players can also create and edit metronome timing profiles. The metronome timing profiles can also be shared among players.
Description
BACKGROUND
Technical Field

This disclosure generally relates to computer games, video games, and gaming equipment for playing computer or video games. More particularly, this disclosure relates to systems and methods for providing metronome signals to players based on predetermined settings.


Background

Computer and video games include a wide range of different categories or genres of games. Some examples include first person shooters, sports games, action games, puzzles, real-time strategies, simulations, role-playing games, educational games, virtual reality games, and so forth. Many games can be played by more than one player.


Player game skills can be improved with proper training and practice. For example, a player can engage in training to improve his or her skill in operating a trackball, keyboard, joystick, or game console so as to be more effective in performing certain game actions, operating virtual equipment (e.g., virtual guns), and completing game tasks. Moreover, training and practice can improve a player's ability to navigate and orient within a virtual game environment.


SUMMARY

This section is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description section. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


In one example embodiment of this disclosure, there is provided a method for generating metronome signals. An example method can include receiving or maintaining, by a computing device, at least one metronome timing profile; obtaining, by the computing device, an audio stream of a computer game; outputting, by the computing device or headphones, the audio stream of the computer game; receiving, by the computing device, a player instruction or a software instruction; and, based on the player instruction or the software instruction, generating, by the computing device or the headphones, metronome signals according to the at least one metronome timing profile, wherein the metronome signals and the audio stream are combined and output simultaneously.


In certain embodiments of the disclosure, the method may further include obtaining, by the computing device, one or more settings for the metronome signals, wherein the metronome signals are output through the headphones according to the at least one metronome timing profile and the settings. The settings can include a sound selection of the metronome signals, a frequency of generating the metronome signals, and/or an instruction concerning recurrence of the metronome signals.


In some embodiments, the metronome timing profile can be configured to cause generating the metronome signals repeatedly with a predetermined frequency. The predetermined frequency can be within a range from about 1 second to about 3,600 seconds. The predetermined frequency can also be automatically selected based on a game level or a player skill level. In some implementations, the predetermined frequency can include at least a first duty cycle and a second duty cycle, where a frequency of the first duty cycle differs from a frequency of the second duty cycle.


In certain embodiments of the disclosure, the method may further comprise sharing, by the computing device, the at least one metronome timing profile with another computing device based on a player input received by the computing device. In yet further embodiments of the disclosure, the method may include providing, by the computing device, a graphical user interface (GUI) to enable a player to create and update the at least one metronome timing profile, wherein the GUI is further configured to allow the player to activate and deactivate generating of the metronome signals.


The computing device can be a personal computer, mobile device, or gaming console. The computing device can be communicatively coupled to the headphones, either by a wired or wireless connection. In other implementations, the computing device can be integrated into the headphones. In some embodiments, the metronome signals can include audio signals only; however, in other embodiments, the metronome signals can include both audio signals and displayable indicators.


In another embodiment of this disclosure, there is provided a system for generating metronome signals. The system can be implemented as part of a gaming device, a game console, a mobile device, a smart phone, a headset, headphones, and the like. The system can include at least one processor and a memory storing processor-executable codes. The processor can be configured to implement the following operations upon executing the processor-executable codes: receiving or maintaining at least one metronome timing profile, obtaining an audio stream of a computer game, outputting the audio stream of the computer game, receiving a player instruction or a software instruction, and, based on the player instruction or the software instruction, generating metronome signals according to the at least one metronome timing profile, wherein the metronome signals and the audio stream are combined and output simultaneously.


In yet further embodiments of the disclosure, there is provided a non-transitory processor-readable medium having instructions stored thereon, which when executed by one or more processors, cause the one or more processors to implement the above-outlined method for providing metronome signals.


Additional novel features of the example embodiments are set forth in part in the detailed description, which follows, and in part will be apparent to those skilled in the art upon examination of the following description and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the concepts may be realized and attained by means of the methodologies, instrumentalities, and combinations particularly pointed out in the appended claims.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.



FIG. 1 shows an example system architecture for providing metronome signals, according to one embodiment.



FIG. 2 shows a process flow diagram illustrating a method for providing metronome signals, according to an example embodiment.



FIG. 3 shows an example computer system that can be used to implement at least some operations of the method for providing metronome signals, according to an example embodiment.





Like reference characters indicate similar components throughout the several views of the drawings. Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present disclosure. In addition, common but well-understood elements that are useful or common in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure.


DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

The present disclosure is generally directed to a virtual metronome for computer and video gaming environments. More specifically, the embodiments of this disclosure include methods and systems for providing metronome signals to assist players in training and practicing their gaming skills. These methods and systems can also be used to enhance performance at game tournaments (i.e., they are not limited to practice or training). The methods can be implemented by computing devices such as a mobile device, a smart phone, a personal computer, a gaming console, and the like. The metronome signals can be generated based on predetermined player settings and output via headphones, a headset, speakers, or any other output device. In providing the settings for metronome signals, players can select designated sounds, frequencies, and duty cycles for the metronome signals, and also provide instructions as to whether the metronome signals are one-time signals (i.e., reminders or alarms) or recurring metronome signals that are repeatedly generated with a predetermined frequency.


The players can also create one or more metronome timing profiles. Each metronome timing profile can include instructions for the computing device as to when and how the metronome signals are to be generated based on the player settings. Thus, each metronome timing profile can include policies for one or more virtual metronomes. For example, a player can create a first virtual metronome which, when activated, instructs the computing device to periodically generate first metronome signals with a first frequency. The player can also create a second virtual metronome which, when activated, instructs the computing device to generate a second, non-recurring metronome signal at predetermined times. The player can also create a third virtual metronome which, when activated, instructs the computing device to periodically generate third metronome signals with a frequency and sound different from the first metronome signals.
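By way of illustration only, such a profile can be represented in software as a small set of policy records. The Python sketch below is one possible representation, with the structure and field names (e.g., period_seconds, start_offset) invented for this example rather than taken from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMetronome:
    """One policy within a metronome timing profile."""
    name: str
    sound: str                 # identifier of the audible signal to play back
    period_seconds: float      # interval between recurring signals (0 for one-shot)
    recurring: bool            # True: repeat every period; False: fire once
    start_offset: float = 0.0  # delay from activation before the first signal

@dataclass
class MetronomeTimingProfile:
    """A named collection of virtual metronome policies."""
    profile_name: str
    metronomes: list = field(default_factory=list)

# A profile holding the three virtual metronomes described above.
profile = MetronomeTimingProfile(
    profile_name="example_profile",
    metronomes=[
        VirtualMetronome("first", "chime", period_seconds=15.0, recurring=True),
        VirtualMetronome("second", "bell", period_seconds=0.0, recurring=False,
                         start_offset=120.0),
        VirtualMetronome("third", "click", period_seconds=23.0, recurring=True),
    ],
)
```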


In some implementations, a player can share the metronome timing profiles with other players, or use (download) metronome timing profiles of other players. Moreover, in addition to audible metronome signals, there can be provided displayable metronome signals. The audible metronome signals can be mixed with, or output simultaneously with, an audio stream generated by a game.


Thus, competitive e-sport players can train, practice, and compete with specific timings in mind for their computer games. There is no need for the players to use cell phone alarms, “egg” timers, or other separate devices to practice game skills. Moreover, the metronome signals described in this disclosure can be generated such that they are integrated with the audio stream of computer games, so the players have no difficulty hearing the metronome signals. Training and competing with the metronome signals can help improve player gameplay by ingraining certain habits and keeping the player mindful of critical timing windows to give the player a competitive edge.


Furthermore, embodiments of this disclosure can enable players to set and configure metronome signals through a GUI, which allows the players to create one or more metronome timing profiles or update settings of metronome signals. For example, players can set up a recurring 15-second chime to remind them to check a virtual map, a one-time chime 2 minutes after the game starts to check for a sniper rifle spawn, and a recurring 43-second chime to remind the player to complete a map recon loop. All three timers can be saved as a single timing profile, which the player can name as desired. The GUI can also be configured to enable the player to start (activate), stop (deactivate), or pause one or more virtual metronomes. In other words, the player can start (activate), stop (deactivate), or pause generation of one or more metronome signals. In other embodiments, a computer game or other software can cause one or more virtual metronomes to start (activate), stop (deactivate), or pause.
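To make the profile concept concrete, the three timers from this example could be persisted together as one named file. The JSON schema below (keys, labels, file name) is purely hypothetical, shown only as a sketch of such a serialization:

```python
import json

# Hypothetical serialization of the three timers described above into one
# named timing profile; the schema is invented for this sketch.
profile = {
    "name": "my_training_profile",
    "timers": [
        {"label": "check map", "period_s": 15, "recurring": True},
        {"label": "sniper rifle spawn", "after_s": 120, "recurring": False},
        {"label": "map recon loop", "period_s": 43, "recurring": True},
    ],
}

with open("my_training_profile.json", "w") as f:
    json.dump(profile, f, indent=2)
```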


The metronome timing profiles can be transferable to other players. Accordingly, other players can download metronome timing profiles and take advantage of predetermined timers. This can help those players train and compete more efficiently by leveraging the timing profiles of better players (e.g., professional players). Moreover, metronome timing profiles can be linked to certain computer games and made available for download so that players can experience the regimen that professional players use to perfect their gameplay. In addition, some metronome timing profiles can be sold or purchased by players. In yet additional embodiments, the metronome timing profiles or settings of certain metronome signals can depend on, or be automatically adjusted based on, the current complexity level of a game or a player skill level achieved in the game.


The following detailed description of embodiments includes references to the accompanying drawings, which form a part of the detailed description. Approaches described in this section are not prior art to the claims and are not admitted to be prior art by inclusion in this section. Reference throughout this specification to “one embodiment,” “an embodiment,” “some embodiments,” “some implementations” or similar language means that a particular feature, structure, or characteristic described in connection with an example implementation is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” “in some embodiments,” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.


Furthermore, the described features, structures, or characteristics of embodiments may be combined in any suitable manner in one or more implementations. In the following description, numerous specific details are provided, such as examples of programming, software modules, user selections, network transactions, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments. One skilled in the relevant art will recognize, however, that the embodiments can be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.


Embodiments of this disclosure will now be presented with reference to accompanying drawings which show blocks, components, circuits, steps, operations, processes, algorithms, and the like, collectively referred to as “elements” for simplicity. These elements may be implemented using electronic hardware, computer software, or any combination thereof. Whether such elements are implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. By way of example, an element, or any portion of an element, or any combination of elements may be implemented with a “processing system” that includes one or more processors. Examples of processors include microprocessors, microcontrollers, Central Processing Units (CPUs), digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to perform various functions described throughout this disclosure. One or more processors in the processing system may execute software, firmware, or middleware (collectively referred to as “software”). The term “software” shall be construed broadly to mean processor-executable instructions, instruction sets, code segments, program code, programs, subprograms, software components, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.


Accordingly, in one or more embodiments, the functions described herein may be implemented in hardware, software, or any combination thereof. If implemented in software, the functions may be stored on or encoded as one or more instructions or code on a non-transitory computer-readable medium. Computer-readable media includes computer storage media. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can include a random-access memory (RAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM) or other optical disk storage, magnetic disk storage, solid state memory, or any other data storage devices, combinations of the aforementioned types of computer-readable media, or any other medium that can be used to store computer executable code in the form of instructions or data structures that can be accessed by a computer.


For purposes of this patent document, the terms “or” and “and” shall mean “and/or” unless stated otherwise or clearly intended otherwise by the context of their use. The term “a” shall mean “one or more” unless stated otherwise or where the use of “one or more” is clearly inappropriate. The terms “comprise,” “comprising,” “include,” and “including” are interchangeable and not intended to be limiting. For example, the term “including” shall be interpreted to mean “including, but not limited to.”


The term “computing device” shall be construed to mean any electronic device configured to process digital data and cause headphones or speakers to generate metronome signals. By way of example, and not limitation, some examples of computing devices include a cellular phone, smart phone, user equipment, terminal, mobile phone, Internet phone, tablet computer, laptop computer, personal computer, desktop computer, game computer, game console, virtual reality headset, and so forth. The term “player” shall be construed to mean a user of the computing device.


The term “metronome” shall be construed to mean a device, virtual feature or computer program that generates repeated or non-repeated audible and/or visual signals according to a predetermined timing, schedule, or timing profile. The audible signals can be configurable in a tone, duration, pitch, and so forth. The term “metronome timing schedule” shall be construed to mean an instruction for a computing device instructing the computing device to generate metronome signals based on a predetermined time schedule, player settings, or configurations. The metronome timing schedule is saved in a digital form such as a file.


The terms “computer game” and “video game” can be used interchangeably and shall be construed to mean any user-interactive computer- or microprocessor-controlled game. The term “headphones” shall be construed to mean one or more speakers or loudspeakers to be maintained close to a user's ears. The terms “headphones,” “headset,” “earbuds,” “speakers,” and “loudspeakers” shall mean the same and can be used interchangeably.


Referring now to the drawings, example embodiments are described. The drawings are schematic illustrations of idealized example embodiments. Thus, the example embodiments discussed herein should not be construed as limited to the particular illustrations presented herein; rather, these example embodiments can include deviations and differ from the illustrations presented herein.



FIG. 1 shows an example system architecture 100 for providing metronome signals, according to one example embodiment. System architecture 100 includes a computing device 105 such as a mobile device, game console, or personal computer. Computing device 105 can be connected to headphones 110 to output an audio stream of multimedia content such as a computer game. Headphones 110 are also used to output metronome signals in an audible form. In some embodiments, computing device 105 can be integrated into headphones 110. In yet additional embodiments, computing device 105 can be coupled to a game console or another device providing a computer game to a player.


Computing device 105 includes a processor 115 and a memory 120 for storing processor-executable instructions, metronome signals, metronome timing profiles, and settings associated with metronome timing profiles or metronome signals. The processor-executable instructions cause processor 115 to implement at least some operations of the methods for providing metronome signals as disclosed herein. Computing device 105 further includes a GUI 125 enabling the player to activate or deactivate metronome signals, adjust settings associated with metronome timing profiles and metronome signals, and create, adjust, delete, upload, or download metronome timing profiles. Computing device 105 further includes a timer 130 configured to count time or provide clock signals. Timer 130 can generate or cause generation of metronome signals.
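The role of timer 130 can be approximated in software by a polling loop that fires each virtual metronome when its next deadline passes. The sketch below is a minimal, assumed realization (the tuple layout and the 10 ms tick are illustrative choices, not the patented design):

```python
import time

def run_metronomes(policies, duration_s, play):
    """Drive metronome signals from a monotonic clock.

    `policies` is a list of hypothetical (name, sound, period_s, recurring,
    start_offset_s) tuples; `play` is a callback that outputs a sound.
    """
    start = time.monotonic()
    next_due = {name: start + (offset if offset else period)
                for name, _, period, _, offset in policies}
    finished = set()
    while time.monotonic() - start < duration_s:
        now = time.monotonic()
        for name, sound, period, recurring, _ in policies:
            if name in finished or now < next_due[name]:
                continue
            play(sound)  # emit the metronome signal
            if recurring:
                next_due[name] = now + period
            else:
                finished.add(name)
        time.sleep(0.01)  # coarse 10 ms tick

# Example: a recurring 15 s chime and a one-shot bell at t = 120 s.
# run_metronomes([("first", "chime", 15.0, True, 0.0),
#                 ("second", "bell", 0.0, False, 120.0)],
#                duration_s=300, play=print)
```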


In certain embodiments, the player can operate GUI 125 to create or update one or more metronome timing profiles. In additional embodiments, the player can also operate GUI 125 to send or upload one or more metronome timing profiles to server 135 via a communications network 140. In yet additional embodiments, the player can also operate GUI 125 to receive or download one or more metronome timing profiles from server 135 via communications network 140. Thus, server 135 can store and manage certain metronome timing profiles, which can belong to a plurality of users, including professional players. Communications network 140 can refer to any wired, wireless, or optical networks including, for example, the Internet, intranet, local area network (LAN), Personal Area Network (PAN), Wide Area Network (WAN), Virtual Private Network (VPN), cellular phone networks (e.g., packet switching communications network, circuit switching communications network), Bluetooth radio, Ethernet network, an IEEE 802.11-based radio frequency network, IP communications network, or any other data communication network utilizing physical layers, link layer capability, or network layer to carry data packets, or any combinations of the above-listed data networks.
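An upload of a timing profile to server 135 might look like the following sketch; the server URL and endpoint are placeholders invented for illustration, not an API defined by the disclosure:

```python
import json
import urllib.request

def upload_profile(profile, server_url="https://example.com/profiles"):
    """POST a timing profile to a profile server (hypothetical endpoint)."""
    data = json.dumps(profile).encode("utf-8")
    req = urllib.request.Request(
        server_url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # e.g., 200 or 201 on success
```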


In yet more implementations, the player can operate GUI 125 to share one or more player metronome timing profiles with other players using the same or similar computing device 105. For example, the player metronome timing profiles can be sent from one computing device 105 to another computing device 105 via communications network 140. In addition, the player can operate GUI 125 to receive one or more metronome timing profiles of other players that have been shared with the player.



FIG. 2 is a process flow diagram showing a method 200 for providing metronome signals according to an example embodiment. Method 200 may be performed by processing logic that may include hardware (e.g., decision-making logic, dedicated logic, programmable logic, application-specific integrated circuit (ASIC)), software (such as software run on a general-purpose computer system or a dedicated machine), or a combination of both. In one example embodiment, the processing logic refers to one or more elements of computing device 105 of FIG. 1. The below-recited operations of method 200 may be performed in an order different from that described and shown in the figure. Moreover, method 200 may have additional operations not shown herein, but which will be evident to those skilled in the art from the present disclosure. Method 200 may also have fewer operations than outlined below and shown in FIG. 2.


At operation 205, computing device 105 receives or maintains at least one metronome timing profile. In addition, computing device 105 receives or maintains one or more settings of metronome signals associated with the metronome timing profile. In one example implementation, a metronome timing profile can include instructions specifying how, when, and what metronome signals are to be generated. In other words, the metronome timing profile includes instructions on how one or more virtual metronomes are to be operated. The metronome timing profile can be created, updated, and uploaded by the player to server 135, downloaded from server 135, and shared with other players. For all of these purposes, the player can use GUI 125.


For example, the player can set up a first virtual metronome which, when activated, generates recurring audible metronome signals of a first sound with a first frequency (e.g., every 15 seconds). Further, the player can set up a second virtual metronome which, when activated, generates recurring audible metronome signals of a second sound with a second frequency (e.g., every 23 seconds). In addition, the player can set up a third virtual metronome which, when activated, generates a non-recurring audible metronome signal of a third sound a predetermined period (e.g., two minutes) after a computer game is started. In other words, the third virtual metronome acts as a reminder, alert, stopwatch, or countdown timer. In additional embodiments, the metronome signals can include displayable indicators, messages, or any other visible information that can be displayed on a screen of computing device 105. For example, the visible information can include a clock or a text message reminding the player to perform certain actions (e.g., “check map” to remind the player to review a virtual map of a computer game).


As discussed above, the player can set, adjust, or otherwise change any setting of any virtual metronome. For example, the player can select a sound, melody, or any other audible signal or message to be played back by the virtual metronome as a metronome signal. The player can also select a frequency, period, or repetition of metronome signals. For example, the player can set a predetermined frequency of metronome signals in a range from about 1 second to about 3,600 seconds. The player can also name metronome signals or virtual metronomes. The player can also associate the metronome signals or virtual metronome with a certain computer game such that the metronome signals are automatically generated when the computer game is activated. The player can also associate the metronome signals, virtual metronome, or certain settings of the virtual metronome with a computer game level or player level. For example, the frequency of metronome signals can decrease as the difficulty level of the computer game increases. Similarly, the frequency of metronome signals can increase as the player's skill level in a particular computer game increases. Thus, the predetermined frequency of metronome signals can be automatically selected based on the game level or player skill level.
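One way to realize the automatic selection described above is a simple rule that lengthens the period as game difficulty rises and shortens it as player skill rises. The scaling factors below are arbitrary assumptions made for this sketch:

```python
def select_period_seconds(base_period_s, game_level, skill_level):
    """Map game difficulty and player skill to a metronome period,
    following the trends described above: signals become less frequent
    as the game level rises and more frequent as player skill rises.
    The scaling factors are illustrative only.
    """
    period = base_period_s * (1.0 + 0.25 * game_level)  # fewer signals on harder levels
    period /= (1.0 + 0.10 * skill_level)                # more signals for skilled players
    # Clamp to the range disclosed above: about 1 s to about 3,600 s.
    return max(1.0, min(3600.0, period))

# Example: a 30 s base period at game level 4, skill level 5.
print(select_period_seconds(30.0, 4, 5))  # -> 40.0
```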


In yet additional embodiments, the metronome signals can be set and generated in two or more duty cycles, each differing from the others. For example, a first duty cycle can cause generation of metronome signals with first settings (e.g., a first sound or first frequency), while a second duty cycle can cause generation of metronome signals with second settings (e.g., a second sound or second frequency). The first duty cycle and the second duty cycle can alternate. In other embodiments, there can be more than two duty cycles. Parameters or settings of duty cycles can be set, updated, downloaded, or uploaded by the player via GUI 125.
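As an illustrative sketch, alternating duty cycles can be modeled as a schedule that cycles through per-duty-cycle settings; treating a duty cycle as a fixed number of signals with its own sound and period is an assumption made for this example:

```python
import itertools

def duty_cycle_schedule(cycles, total_signals):
    """Yield (sound, period_s) pairs by alternating between duty cycles.

    `cycles` is a list of hypothetical (sound, period_s, signals_per_cycle)
    tuples; the interpretation is invented for this sketch.
    """
    alternator = itertools.cycle(cycles)
    emitted = 0
    while emitted < total_signals:
        sound, period_s, count = next(alternator)
        for _ in range(count):
            if emitted >= total_signals:
                return
            yield (sound, period_s)
            emitted += 1

# First duty cycle: four chimes every 15 s; second: two bells every 30 s.
for signal in duty_cycle_schedule([("chime", 15, 4), ("bell", 30, 2)], 12):
    print(signal)
```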


Still referring to FIG. 2, at operation 210, computing device 105 obtains an audio stream of a computer game and outputs the audio stream via headphones 110 or other speakers. At optional operation 215, computing device 105 receives a player instruction or a software instruction (e.g., a computer game instruction) indicating that certain metronome signals are to be generated based on a selected metronome timing profile. The player instruction or the software instruction can be generated by the player through GUI 125. In additional embodiments, the player instruction or the software instruction can be automatically generated by computing device 105 upon identifying or determining a qualifying event such as a start of the computer game.
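A minimal sketch of such instruction handling follows; the event names ("game_started", etc.) and the controller shape are assumptions made for illustration:

```python
# Hypothetical software-instruction handling: a qualifying event such as
# the start of the computer game activates a selected timing profile.
class MetronomeController:
    def __init__(self):
        self.active_profile = None

    def on_event(self, event, profile=None):
        if event == "game_started" and profile is not None:
            self.active_profile = profile   # qualifying event activates profile
        elif event in ("game_paused", "game_ended"):
            self.active_profile = None      # deactivate signal generation

controller = MetronomeController()
controller.on_event("game_started", profile="my_training_profile")
```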


At operation 220, computing device 105 generates and outputs the metronome signals via headphones 110. The metronome signals are generated according to the at least one metronome timing profile, the settings, and the player instruction or the software instruction. Notably, computing device 105 combines the metronome signals and the audio stream such that they are output simultaneously to the player. Thus, the player can conveniently listen to the audio stream and the metronome signals using the same headphones.
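The combining step can be pictured as additive mixing of click samples into the game's audio buffer. The sketch below assumes mono float32 audio at 48 kHz; all parameter choices are illustrative, not the patented implementation:

```python
import numpy as np

def mix_metronome(game_audio, click, click_offsets_s, rate=48_000, gain=0.5):
    """Additively mix click samples into a game audio buffer so that both
    are heard at once on the same output. Assumes mono float32 buffers
    sampled at `rate` Hz.
    """
    out = game_audio.copy()
    for t in click_offsets_s:
        start = int(t * rate)
        if start >= len(out):
            continue  # click scheduled past the end of this buffer
        end = min(start + len(click), len(out))
        out[start:end] += gain * click[: end - start]
    return np.clip(out, -1.0, 1.0)  # keep the summed signal in range

# One second of silence with a 10 ms, 1 kHz click mixed in at t = 0.5 s.
silence = np.zeros(48_000, dtype=np.float32)
click = np.sin(2 * np.pi * 1000 * np.arange(480) / 48_000).astype(np.float32)
mixed = mix_metronome(silence, click, [0.5])
```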



FIG. 3 is a high-level block diagram illustrating a computing device 300 suitable for implementing the methods described herein. In particular, computing device 300 may be used for implementing the methods for providing metronome signals as described above. Computing device 300 may include, be, or be an integral part of one or more of a variety of types of devices, such as a mobile device, among others. In some embodiments, computing device 300 can be regarded as an instance of computing device 105.


As shown in FIG. 3, computing device 300 includes one or more processors 310, memory 320, one or more mass storage devices 330, one or more output devices 350, one or more input devices 360, one or more network interfaces 370, one or more optional peripheral devices 380, and a communication bus 390 for operatively interconnecting the above-listed elements. Processors 310 can be configured to implement functionality and/or process instructions for execution within computing device 300. For example, processors 310 may process instructions stored in memory 320 or instructions stored on storage devices 330. Such instructions may include components of an operating system or software applications.


Memory 320, according to one example, is configured to store information within computing device 300 during operation. For example, memory 320 can store settings of metronome signals or metronome timing profiles. Memory 320, in some example embodiments, may refer to a non-transitory computer-readable storage medium or a computer-readable storage device. In some examples, memory 320 is a temporary memory, meaning that a primary purpose of memory 320 may not be long-term storage. Memory 320 may also refer to a volatile memory, meaning that memory 320 does not maintain stored contents when memory 320 is not receiving power. Examples of volatile memories include RAM, dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art. In some examples, memory 320 is used to store program instructions for execution by processors 310. Memory 320, in one example, is used by software applications or mobile applications. Generally, software or mobile applications refer to software applications suitable for implementing at least some operations of the methods as described herein.


Mass storage devices 330 can also include one or more transitory or non-transitory computer-readable storage media or computer-readable storage devices. For example, mass storage devices 330 can store instructions for processor 310, metronome signals, settings of metronome signals, and metronome timing profiles. In some embodiments, mass storage devices 330 may be configured to store greater amounts of information than memory 320. Mass storage devices 330 may also be configured for long-term storage of information. In some examples, mass storage devices 330 include non-volatile storage elements. Examples of such non-volatile storage elements include magnetic hard discs, optical discs, solid-state discs, flash memories, forms of electrically programmable memories (EPROM) or electrically erasable and programmable memories, and other forms of non-volatile memories known in the art.


Computing device 300 may also include one or more optional input devices 360. Input devices 360 may be configured to receive input from a player through tactile, audio, video, or biometric channels. Examples of input devices 360 may include a keyboard, keypad, mouse, trackball, touchscreen, touchpad, microphone, video camera, image sensor, fingerprint sensor, or any other device capable of detecting an input from the player or other source, and relaying the input to computing device 300 or components thereof.


Optional output devices 350 may be configured to provide output to the player through visual or auditory channels. Output devices 350 may include a video graphics adapter card, display, such as liquid crystal display (LCD) monitor, light emitting diode (LED) monitor, or organic LED monitor, sound card, speaker, headphones, headset, virtual reality headset, projector, or any other device capable of generating output that may be intelligible to a player. Output devices 350 may also include a touchscreen, presence-sensitive display, or other input/output capable displays known in the art.


Computing device 300 can also include network interface 370. Network interface 370 can be utilized to communicate with external devices via one or more communications networks such as communications network 140 or any other wired, wireless, or optical networks. Network interface 370 may be a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information.


An operating system of computing device 300 may control one or more functionalities of computing device 300 or components thereof. For example, the operating system may interact with the software applications or mobile applications and may facilitate one or more interactions between the software/mobile applications and processors 310, memory 320, storage devices 330, input devices 360, output devices 350, and network interface 370. The operating system may interact with or be otherwise coupled to software applications or components thereof. In some embodiments, software or mobile applications may be included in the operating system.


Thus, methods and systems for providing metronome signals have been described. Although embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes can be made to these example embodiments without departing from the broader spirit and scope of the present application. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A method for providing metronome signals during game play of a computer game, the method comprising: receiving or maintaining, by a computing device, at least one metronome timing profile; obtaining, by the computing device, an audio stream of a computer game; causing, by the computing device, output of the audio stream of the computer game to at least one loudspeaker of the computing device or headphones; receiving, by the computing device, a player instruction or a software instruction; and based on the player instruction or the software instruction, generating, by the computing device, metronome signals according to the at least one metronome timing profile, wherein the metronome signals are audio signals integrated with the audio stream and caused to be output simultaneously to a player such that the player is aided in hearing both the metronome signals and the audio stream using the at least one loudspeaker; wherein at least two of the metronome signals are repeated while the player continues to play the computer game.
  • 2. The method of claim 1, further comprising: obtaining, by the computing device, one or more settings for the metronome signals, wherein the metronome signals are caused, by the computing device, to be output through the headphones according to the at least one metronome timing profile and the one or more settings.
  • 3. The method of claim 2, wherein the one or more settings include a sound selection of the metronome signals.
  • 4. The method of claim 2, wherein the one or more settings include a frequency of generating the metronome signals.
  • 5. The method of claim 2, wherein the one or more settings include an instruction concerning recurrence of metronome signals.
  • 6. The method of claim 1, wherein the at least one metronome timing profile is configured to cause generating the metronome signals repeatedly with a predetermined frequency.
  • 7. The method of claim 6, wherein the predetermined frequency is within a range of about 1 second to about 3,600 seconds.
  • 8. The method of claim 6, wherein the predetermined frequency is automatically selected based on a game level or a player skill level.
  • 9. The method of claim 6, wherein the predetermined frequency includes at least a first duty cycle and a second duty cycle, wherein a frequency of the first duty cycle differs from a frequency of the second duty cycle.
  • 10. The method of claim 1, further comprising: sharing, by the computing device, the at least one metronome timing profile with another computing device based on a player input by the computing device.
  • 11. The method of claim 1, further comprising: providing, by the computing device, a graphical user interface to enable the player to create and update the at least one metronome timing profile, wherein the graphical user interface is further configured to enable the player to activate and deactivate the generating of the metronome signals.
  • 12. The method of claim 1, wherein the computing device is integrated into the headphones.
  • 13. The method of claim 1, wherein the computing device includes a game console or mobile device.
  • 14. A system for providing metronome signals during game play of a computer game, the system comprising: at least one processor and a memory storing processor-executable codes, wherein the at least one processor is configured to implement the following operations upon executing the processor-executable codes: receiving or maintaining at least one metronome timing profile; obtaining an audio stream of a computer game; causing output of the audio stream of the computer game to at least one loudspeaker of a computing device or headphones; receiving a player instruction or a software instruction; and based on the player instruction or the software instruction, generating metronome signals according to the at least one metronome timing profile, wherein the metronome signals are audio signals integrated with the audio stream and caused to be output simultaneously to a player such that the player is aided in hearing both the metronome signals and the audio stream using the at least one loudspeaker; wherein at least two of the metronome signals are repeated while the player continues to play the computer game.
  • 15. The system of claim 14, wherein the at least one processor is further configured to implement the following operation upon executing the processor-executable codes: obtaining one or more settings for the metronome signals, wherein the metronome signals are caused, by the computing device, to be output through headphones according to the at least one metronome timing profile and the one or more settings, and wherein the one or more settings include a sound selection of the metronome signals and a frequency of generating the metronome signals.
  • 16. The system of claim 14, wherein the at least one processor is further configured to implement the following operation upon executing the processor-executable codes: sharing the at least one metronome timing profile with another computing device based on a player input by the computing device.
  • 17. The system of claim 14, wherein the at least one processor is further configured to implement the following operation upon executing the processor-executable codes: providing a graphical user interface to enable the player to create and update the at least one metronome timing profile, wherein the graphical user interface is further configured to enable the player to activate and deactivate the generating of the metronome signals.
  • 18. A non-transitory processor-readable medium having instructions stored thereon, which when executed by one or more processors, cause the one or more processors to implement a method for providing metronome signals during game play of a computer game, the method comprising: receiving or maintaining, by a computing device, at least one metronome timing profile; obtaining, by the computing device, an audio stream of a computer game; causing, by the computing device, output of the audio stream of the computer game to at least one loudspeaker of the computing device or headphones; receiving, by the computing device, a player instruction or a software instruction; and based on the player instruction or the software instruction, generating, by the computing device, metronome signals according to the at least one metronome timing profile, wherein the metronome signals are audio signals integrated with the audio stream and caused to be output simultaneously to a player such that the player is aided in hearing both the metronome signals and the audio stream using the at least one loudspeaker; wherein at least two of the metronome signals are repeated while the player continues to play the computer game.
US Referenced Citations (194)
Number Name Date Kind
3147341 Gibson, Jr. Sep 1964 A
3200193 Biggs et al. Aug 1965 A
4016540 Hyatt Apr 1977 A
4090216 Constable May 1978 A
4104625 Bristow et al. Aug 1978 A
4355334 Fitzgibbon et al. Oct 1982 A
4445187 Best Apr 1984 A
4475132 Rodesch Oct 1984 A
4514727 Van Antwerp Apr 1985 A
4569026 Best Feb 1986 A
4677569 Nakano et al. Jun 1987 A
4704696 Reimer et al. Nov 1987 A
4752069 Okada Jun 1988 A
4757525 Matthews et al. Jul 1988 A
4952917 Yabuuchi Aug 1990 A
5057744 Barbier et al. Oct 1991 A
5167010 Elm et al. Nov 1992 A
5241671 Reed et al. Aug 1993 A
5274560 LaRue Dec 1993 A
5321833 Chang et al. Jun 1994 A
5358259 Best Oct 1994 A
5377997 Wilden et al. Jan 1995 A
5405152 Katanics et al. Apr 1995 A
5446714 Yoshio et al. Aug 1995 A
5498002 Gechter Mar 1996 A
RE35314 Logg Aug 1996 E
5598297 Yamanaka et al. Jan 1997 A
5617407 Bareis Apr 1997 A
5649861 Okano et al. Jul 1997 A
5659732 Kirsch Aug 1997 A
5704837 Iwasaki et al. Jan 1998 A
5724567 Rose et al. Mar 1998 A
5732232 Brush, II et al. Mar 1998 A
5751825 Myers May 1998 A
5765150 Burrows Jun 1998 A
5786801 Ichise Jul 1998 A
5802361 Wang et al. Sep 1998 A
5818553 Koenck et al. Oct 1998 A
5823879 Goldberg et al. Oct 1998 A
5870740 Rose et al. Feb 1999 A
5890122 Van Kleeck et al. Mar 1999 A
5947823 Nimura Sep 1999 A
5948040 DeLorme et al. Sep 1999 A
5974412 Hazlehurst et al. Oct 1999 A
5977968 Le Blanc Nov 1999 A
6001013 Ota Dec 1999 A
6012053 Pant et al. Jan 2000 A
6017272 Rieder Jan 2000 A
6064978 Gardner et al. May 2000 A
6067539 Cohen May 2000 A
6098061 Gotoh et al. Aug 2000 A
6155924 Nakagawa et al. Dec 2000 A
6168524 Aoki et al. Jan 2001 B1
6183366 Goldberg et al. Feb 2001 B1
6202058 Rose et al. Mar 2001 B1
6210273 Matsuno Apr 2001 B1
6241524 Aoshima et al. Jun 2001 B1
6264560 Goldberg et al. Jul 2001 B1
6273818 Komoto Aug 2001 B1
6283861 Kawai et al. Sep 2001 B1
6296570 Miyamoto et al. Oct 2001 B1
6319121 Yamada et al. Nov 2001 B1
6327590 Chidlovskii et al. Dec 2001 B1
6363378 Conklin et al. Mar 2002 B1
6366272 Rosenberg et al. Apr 2002 B1
6375571 Ohnuma et al. Apr 2002 B1
6409604 Matsuno Jun 2002 B1
6413163 Yamauchi et al. Jul 2002 B1
6419580 Ito Jul 2002 B1
6428411 Togami Aug 2002 B1
6434556 Levin et al. Aug 2002 B1
6456977 Wang Sep 2002 B1
6508706 Sitrick et al. Jan 2003 B2
6529875 Nakajima et al. Mar 2003 B1
6533663 Iwao et al. Mar 2003 B1
6538666 Ozawa et al. Mar 2003 B1
6554707 Sinclair et al. Apr 2003 B1
6556983 Altschuler et al. Apr 2003 B1
6571208 Kuhn et al. May 2003 B1
6572478 Miyamoto et al. Jun 2003 B2
6582230 Aoshima et al. Jun 2003 B1
6582309 Higurashi et al. Jun 2003 B2
6585599 Horigami et al. Jul 2003 B1
6652384 Kondo et al. Nov 2003 B2
6684127 Fujita et al. Jan 2004 B2
6705945 Gavin et al. Mar 2004 B2
6729954 Atsumi et al. May 2004 B2
6826552 Grosser et al. Nov 2004 B1
6899628 Leen et al. May 2005 B2
6920426 Takechi Jul 2005 B2
6928433 Goodman et al. Aug 2005 B2
6935954 Sterchi et al. Aug 2005 B2
6966832 Leen et al. Nov 2005 B2
6979267 Leen et al. Dec 2005 B2
7029394 Leen et al. Apr 2006 B2
7062561 Reisman Jun 2006 B1
7085722 Luisi Aug 2006 B2
7137891 Neveu et al. Nov 2006 B2
7155157 Kaplan Dec 2006 B2
7172118 Urken Feb 2007 B2
7180529 Covannon et al. Feb 2007 B2
7202613 Morgan et al. Apr 2007 B2
7233904 Luisi Jun 2007 B2
7438642 Walker et al. Oct 2008 B2
7452273 Amaitis et al. Nov 2008 B2
7455589 Neveu et al. Nov 2008 B2
7572187 Van Luchene Aug 2009 B2
7613616 Luisi Nov 2009 B2
7717782 Van Luchene May 2010 B2
7731589 Kataoka et al. Jun 2010 B2
7764026 Dowling et al. Jul 2010 B2
7880746 Marks et al. Feb 2011 B2
7946909 Neveu et al. May 2011 B2
7965859 Marks Jun 2011 B2
8295549 Marks et al. Oct 2012 B2
8442403 Weaver May 2013 B2
8714983 Kil May 2014 B2
8799250 Smith et al. Aug 2014 B1
8964298 Haddick et al. Feb 2015 B2
9108108 Zalewski et al. Aug 2015 B2
9126116 Turner et al. Sep 2015 B2
9155960 Argiro Oct 2015 B2
9626689 Bethke Apr 2017 B1
9833707 Watson Dec 2017 B2
9950259 Watson Apr 2018 B2
10128914 Calabrese Nov 2018 B1
20010009867 Sakaguchi et al. Jul 2001 A1
20020068626 Takeda et al. Jun 2002 A1
20020082065 Fogel et al. Jun 2002 A1
20020103031 Neveu et al. Aug 2002 A1
20020169617 Luisi Nov 2002 A1
20030065636 Peyrelevade Apr 2003 A1
20030109305 Gavin et al. Jun 2003 A1
20030177347 Schneier et al. Sep 2003 A1
20040029625 Annunziata Feb 2004 A1
20040166935 Gavin et al. Aug 2004 A1
20050054290 Logan et al. Mar 2005 A1
20050170828 Nakamura et al. Aug 2005 A1
20050174889 Marcantonio Aug 2005 A1
20050191969 Mousseau Sep 2005 A1
20050275508 Orr et al. Dec 2005 A1
20060039017 Park et al. Feb 2006 A1
20060178179 Neveu et al. Aug 2006 A1
20060190270 Luisi Aug 2006 A1
20070037605 Logan Feb 2007 A1
20070060231 Neveu et al. Mar 2007 A1
20070087797 Van Luchene Apr 2007 A1
20070099709 Okada May 2007 A1
20070244704 Luisi Oct 2007 A1
20070257928 Marks et al. Nov 2007 A1
20070273848 Fan Nov 2007 A1
20070279427 Marks Dec 2007 A1
20080064019 Kaufman et al. Mar 2008 A1
20080109491 Gupta May 2008 A1
20080167106 Lutnick et al. Jul 2008 A1
20080220869 Midgley et al. Sep 2008 A1
20080294782 Patterson Nov 2008 A1
20090054814 Schnapp et al. Feb 2009 A1
20090063463 Turner et al. Mar 2009 A1
20090119234 Pinckney et al. May 2009 A1
20100041475 Zalewski et al. Feb 2010 A1
20100111374 Stoica May 2010 A1
20100138764 Hatambeiki et al. Jun 2010 A1
20100171430 Seydoux Jul 2010 A1
20100174593 Cao Jul 2010 A1
20100194578 Zhang Aug 2010 A1
20100213873 Picard et al. Aug 2010 A1
20100241496 Gupta et al. Sep 2010 A1
20100302033 Devenyi Dec 2010 A1
20100312366 Madonna et al. Dec 2010 A1
20110033830 Cherian Feb 2011 A1
20120021388 Arbuckle Jan 2012 A1
20120088213 Soltanoff Apr 2012 A1
20130344960 Perry Dec 2013 A1
20140068755 King Mar 2014 A1
20140121009 Watson May 2014 A1
20140132628 Hoff May 2014 A1
20140135631 Brumback et al. May 2014 A1
20140142403 Brumback et al. May 2014 A1
20140143424 Rostaing May 2014 A1
20140191848 Imes et al. Jul 2014 A1
20140329613 Savarese et al. Nov 2014 A1
20140361872 Garcia et al. Dec 2014 A1
20150005911 Lake, II Jan 2015 A1
20150087369 McIntyre Mar 2015 A1
20150141005 Suryavanshi et al. May 2015 A1
20150304804 Lotito Oct 2015 A1
20150347738 Ulrich et al. Dec 2015 A1
20160018934 Turner et al. Jan 2016 A1
20160057565 Gold Feb 2016 A1
20160282899 Inagaki et al. Sep 2016 A1
20170368459 Watson Dec 2017 A1
20180091193 Hagedorn Mar 2018 A1
20190074868 Calabrese Mar 2019 A1
Foreign Referenced Citations (87)
Number Date Country
1201180 Dec 1998 CN
1385783 Dec 2002 CN
1848742 Oct 2006 CN
101836362 Sep 2010 CN
101849436 Sep 2010 CN
101968827 Feb 2011 CN
101968827 May 2014 CN
104797311 Jul 2015 CN
104797311 Sep 2018 CN
109107149 Jan 2019 CN
19905076 Jun 2002 DE
0789296 Aug 1997 EP
0850673 Jul 1998 EP
0898237 Feb 1999 EP
0901803 Mar 1999 EP
0913175 May 1999 EP
1029569 Aug 2000 EP
1078661 Feb 2001 EP
1262955 Dec 2002 EP
1355707 Oct 2003 EP
1388357 Feb 2004 EP
1434627 Jul 2004 EP
1630754 Mar 2006 EP
1650706 Apr 2006 EP
1793588 Jun 2007 EP
1262955 Mar 2010 EP
2322257 May 2011 EP
2322257 Apr 2018 EP
2355627 Sep 1998 GB
2351637 Jan 2001 GB
2356785 May 2001 GB
2411065 Aug 2005 GB
S59202779 Nov 1984 JP
H07178246 Jul 1995 JP
H08155140 Jun 1996 JP
H09265379 Oct 1997 JP
H10272258 Oct 1998 JP
H10295935 Nov 1998 JP
H11000467 Jan 1999 JP
H11070273 Mar 1999 JP
H11119791 Apr 1999 JP
H11197359 Jul 1999 JP
2000024322 Jan 2000 JP
2000116946 Apr 2000 JP
2000176154 Jun 2000 JP
2000334168 Dec 2000 JP
2001009156 Jan 2001 JP
2001029649 Feb 2001 JP
2001079265 Mar 2001 JP
2001157779 Jun 2001 JP
2001198350 Jul 2001 JP
2002052256 Feb 2002 JP
2002085835 Mar 2002 JP
2002092474 Mar 2002 JP
2002159740 Jun 2002 JP
2002166048 Jun 2002 JP
2002191868 Jul 2002 JP
2003047768 Feb 2003 JP
2003228585 Aug 2003 JP
2004529678 Sep 2004 JP
2005505357 Feb 2005 JP
3741687 Feb 2006 JP
2006031670 Feb 2006 JP
2006087459 Apr 2006 JP
2006099125 Apr 2006 JP
3865721 Jan 2007 JP
2007249899 Sep 2007 JP
2011025044 Feb 2011 JP
5580131 Aug 2014 JP
1020000072753 Dec 2000 KR
100464877 Dec 2004 KR
100469822 Jan 2005 KR
1020020044919 Jun 2005 KR
1020070052493 Sep 2008 KR
101226305 Jan 2013 KR
WO1994018790 Aug 1994 WO
WO9714102 Apr 1997 WO
WO2001082626 Nov 2001 WO
WO2002060548 Aug 2002 WO
WO2003031003 Apr 2003 WO
WO2005040900 May 2005 WO
WO2006033360 Mar 2006 WO
WO2007130641 Nov 2007 WO
WO2009052659 Apr 2009 WO
WO2009060376 May 2009 WO
WO2014070677 May 2014 WO
WO2019050692 Mar 2019 WO
Non-Patent Literature Citations (135)
Entry
“Notice of Allowance”, European Patent Application No. 10007803.9, dated Oct. 23, 2017, 7 pages.
“Office Action”, Chinese Patent Application No. 201380056819.3, dated Dec. 18, 2017, 3 pages [7 pages including translation].
Non-Final Office Action, dated Dec. 17, 2003, U.S. Appl. No. 09/859,034, filed May 14, 2001.
Final Office Action, dated Jun. 4, 2004, U.S. Appl. No. 09/859,034, filed May 14, 2001.
Advisory Action, dated Aug. 25, 2004, U.S. Appl. No. 09/859,034, filed May 14, 2001.
Non-Final Office Action, dated Jan. 19, 2005, U.S. Appl. No. 09/859,034, filed May 14, 2001.
Final Office Action, dated Jun. 24, 2005, U.S. Appl. No. 09/859,034, filed May 14, 2001.
Advisory Action, dated Sep. 2, 2005, U.S. Appl. No. 09/859,034, filed May 14, 2001.
Notice of Allowance, dated Jan. 13, 2006, U.S. Appl. No. 09/859,034, filed May 14, 2001.
Non-Final Office Action, dated Nov. 17, 2005, U.S. Appl. No. 10/364,951, filed Feb. 11, 2003.
Final Office Action, dated May 16, 2006, U.S. Appl. No. 10/364,951, filed Feb. 11, 2003.
Advisory Action, dated Aug. 18, 2006, U.S. Appl. No. 10/364,951, filed Feb. 11, 2003.
Non-Final Office Action, dated Feb. 5, 2007, U.S. Appl. No. 10/364,951, filed Feb. 11, 2003.
Non-Final Office Action, dated Jul. 9, 2003, U.S. Appl. No. 10/268,278, filed Oct. 9, 2002.
Notice of Allowance, dated Dec. 2, 2003, U.S. Appl. No. 10/268,278, filed Oct. 9, 2002.
Non-Final Office Action, dated Dec. 30, 2004, U.S. Appl. No. 10/791,476, filed Mar. 1, 2004.
Non-Final Office Action, dated Apr. 5, 2006, U.S. Appl. No. 10/791,476, filed Mar. 1, 2004.
Final Office Action, dated Oct. 24, 2006, U.S. Appl. No. 10/791,476, filed Mar. 1, 2004.
Notice of Allowance, dated Feb. 16, 2007, U.S. Appl. No. 11/403,716, filed Apr. 13, 2006.
Non-Final Office Action, dated Feb. 25, 2003, U.S. Appl. No. 09/773,452, filed Jan. 31, 2001.
Non-Final Office Action, dated Jun. 5, 2003, U.S. Appl. No. 09/773,452, filed Jan. 31, 2001.
Final Office Action, dated Jun. 1, 2004, U.S. Appl. No. 09/773,452, filed Jan. 31, 2001.
Final Office Action, dated Sep. 24, 2004, U.S. Appl. No. 09/773,452, filed Jan. 31, 2001.
Advisory Action, dated May 4, 2005, U.S. Appl. No. 09/773,452, filed Jan. 31, 2001.
Non-Final Office Action, dated Sep. 13, 2005, U.S. Appl. No. 09/773,452, filed Jan. 31, 2001.
Final Office Action, dated Mar. 16, 2006, U.S. Appl. No. 09/773,452, filed Jan. 31, 2001.
Notice of Allowance, dated Jul. 11, 2006, U.S. Appl. No. 09/773,452, filed Jan. 31, 2001.
Non-Final Office Action, dated Jun. 2, 2008, U.S. Appl. No. 11/375,296, filed Mar. 13, 2006.
Notice of Allowance, dated Sep. 25, 2008, U.S. Appl. No. 11/375,296, filed Mar. 13, 2006.
Non-Final Office Action, dated Mar. 25, 2010, U.S. Appl. No. 11/624,886, filed Jan. 19, 2007.
Final Office Action, dated Aug. 24, 2010, U.S. Appl. No. 11/624,886, filed Jan. 19, 2007.
Notice of Allowance, dated Feb. 18, 2011, U.S. Appl. No. 11/624,886, filed Jan. 19, 2007.
Non-Final Office Action, dated May 2, 2008, U.S. Appl. No. 11/591,314, filed Oct. 31, 2006.
Non-Final Office Action, dated Aug. 2, 2010, U.S. Appl. No. 11/591,314, filed Oct. 31, 2006.
Notice of Allowance, dated Jan. 13, 2011, U.S. Appl. No. 11/591,314, filed Oct. 31, 2006.
Notice of Allowance, dated Sep. 18, 2009, U.S. Appl. No. 11/764,795, filed Jun. 18, 2007.
Non-Final Office Action, dated Apr. 1, 2011, U.S. Appl. No. 11/850,516, filed Sep. 5, 2007.
Final Office Action, dated Sep. 15, 2011, U.S. Appl. No. 11/850,516, filed Sep. 5, 2007.
Non-Final Office Action, dated Dec. 3, 2013, U.S. Appl. No. 11/850,516, filed Sep. 5, 2007.
Notice of Allowance, dated Jun. 19, 2014, U.S. Appl. No. 11/850,516, filed Sep. 5, 2007.
Non-Final Office Action, dated Aug. 11, 2014, U.S. Appl. No. 11/850,516, filed Sep. 5, 2007.
Notice of Allowance, dated Jan. 16, 2015, U.S. Appl. No. 11/850,516, filed Sep. 5, 2007.
Non-Final Office Action, dated Dec. 3, 2013, U.S. Appl. No. 12/509,848, filed Jul. 27, 2009.
Non-Final Office Action, dated May 29, 2014, U.S. Appl. No. 12/509,848, filed Jul. 27, 2009.
Non-Final Office Action, dated Sep. 19, 2014, U.S. Appl. No. 12/509,848, filed Jul. 27, 2009.
Final Office Action, dated Feb. 25, 2015, U.S. Appl. No. 12/509,848, filed Jul. 27, 2009.
Notice of Allowance, dated Jun. 19, 2015, U.S. Appl. No. 12/509,848, filed Jul. 27, 2009.
Non-Final Office Action, dated Sep. 12, 2016, U.S. Appl. No. 13/663,262, filed Oct. 29, 2012.
Non-Final Office Action, dated May 9, 2017, U.S. Appl. No. 13/663,262, filed Oct. 29, 2012.
Notice of Allowance, dated Sep. 19, 2017, U.S. Appl. No. 13/663,262, filed Oct. 29, 2012.
“International Search Report” Patent Cooperation Treaty Application No. PCT/US02/02710, dated Sep. 12, 2002, 3 pages.
“Office Action”, European Patent Application No. 02704295.1, dated Apr. 23, 2004, 3 pages.
“Office Action”, European Patent Application No. 02704295.1, dated Dec. 15, 2004, 4 pages.
“Office Action”, European Patent Application No. 02704295.1, dated Apr. 12, 2006, 10 pages.
“Office Action”, China Patent Application No. 2010102454118, dated Sep. 7, 2012, 3 pages [11 pages with translation].
“Office Action”, European Patent Application No. 10007803.9, dated Aug. 8, 2013, 6 pages.
“Office Action”, Japan Patent Application No. 2010-167803, dated Mar. 26, 2013, 3 pages [6 pages with translation].
Rejection dated Mar. 16, 2012 in KR Application No. 10-2010-0072613.
“International Search Report & Written Opinion”, Patent Cooperation Treaty Application No. PCT/US2013/067135, dated May 1, 2014, 18 pages.
Rejection dated Mar. 2, 2004 in KR Application No. 10-2002-00265621.
Decision to Grant dated Oct. 5, 2005 in JP Application 2002-5607373.
Rejection dated Nov. 16, 2003 in JP Application 2002-5607373.
“Office Action”, China Patent Application No. 201010245413.8, dated Nov. 5, 2013, 4 pages [12 pages with translation].
“European Search Report”, European Patent Application No. 03254168.2, dated Apr. 23, 2004, 3 pages.
“Office Action”, European Patent Application No. 03254168.2, dated Sep. 29, 2006, 4 pages.
Stern, Andrew. Virtual Babyz: Believable agents with Narrative Intelligence, Narrative Intelligence AAAI Symposium, Nov. 1999. Online. Viewed Apr. 28, 2006. http://www.cs.cmu.edu/afs/cs/user/michaelm/www/nidocs/Stern.html, 7 pages.
“Babyz Features Page.” Online. Viewed May 3, 2006. www.babyz.net/features.html, 1 page.
“Babyz”. Wikipedia online reference. Viewed May 1, 2006. http://en.wikipedia.org/wiki/babyz, 2 pages.
Northwestern University CS395, Game Design Course “Simulation and Modeling: Under the Hood of the Sims”, Spring 2002. http://www.cs.northwestern.edu/%7Eforbus/c95-gd/lectures/The_Sims_Under_the_Hood_files/frame.htm, 32 pages.
Simpson, Dan, “The Complete Sims Guide”, Feb. 6, 2005, pertinent sections printed from the Internet; also available in its entirety at: http://www.neoseeker.com/resourcelink.html?rlid=16238&rid=15516, 18 pages.
“Sequence Paradium 2—Laughter in the Dark—Tactical Guidebook”, First Edition, Keibunsha Inc., Feb. 10, 2005, pp. 5-32.
Sprigg, Sean M., Patent Examiner, Examiner's Affidavit, Nov. 9, 2005, 5 pages.
Stern, Andrew. “Andrew Stern”. Online. Viewed Apr. 28, 2006. http://quvu.net/andrew/resume.html, 6 pages.
Stewart, Nick. “The Adrenaline Vault Review of the Sims”, Mar. 9, 2000. Printed from the Internet, 5 pages.
“Decision to Grant/Notice of Allowance”, Japan Patent Application No. 2010-167803, filed Jul. 27, 2010, dated Jun. 3, 2014.
“Office Action”, Japan Patent Application No. 2003-288128, dated Mar. 15, 2005.
“Office Action”, Japan Patent Application No. 2003-288128, dated Dec. 13, 2005.
“Notice of Allowance”, Korea Patent Application No. 10-2010-0072613, dated Oct. 31, 2012.
“Office Action”, European Patent Application No. 10007803.9, dated Sep. 29, 2014, 4 pages.
“Office Action”, China Patent Application No. 201380056819.3, dated Nov. 15, 2016, 6 pages [16 pages including translation].
“Office Action”, China Patent Application No. 201380056819.3, dated Jun. 23, 2017, 3 pages [7 pages including translation].
Arcadia, vol. 2, No. 12, Enterbrain, Inc., Dec. 1, 2001, pp. 56-63.
Konami Corporation, Konami Official Guide Perfect Series, Tokimeki Memorial—Forever with You: Official Guide, First Edition, Jun. 29, 1997, 19 pages [37 pages with translation].
Login, vol. 21, No. 4, Enterbrain, Inc., Apr. 1, 2002, pp. 70-77.
Reynolds, Craig, “Flocks, Herds, and Schools: A Distributed Behavioral Model,” Proceedings of SIGGRAPH '87, Computer Graphics 21(4), Jul. 1987, 13 pages.
Reynolds, Craig, “Interaction with Groups of Autonomous Characters,” Proceedings of Game Developer Conference 2000, San Francisco, CA 2000, 12 pages.
Reynolds, Craig, “Steering Behaviors for Autonomous Characters,” Proceedings of Game Developers Conference 1999, 21 pages.
Super Mario Brothers: Complete Cheat Book, Tokuma Publishing Co., Ltd., Nov. 20, 1985, p. 9.
Yu, Bin et al., “A Social Mechanism of Reputation Management in Electronic Communities,” Proceedings of 4th International Workshop on Cooperative Information Agents, 2000, 12 pages.
Aguilera, S. et al., “Impaired Persons Facilities Based on a Multi-Modality Speech Processing System,” Proc. on Speech & Language Tech., 1993, 4 pages.
Arons, B., “Authoring and Transcription Tools for Speech-Based Hypermedia,” Proc. of American Voice I/O Society, 1991, 6 pages.
Arons, B., “Hyperspeech: Navigating in Speech-Only Hypermedia,” Proc. of Hypertext, Dec. 1991, pp. 133-146.
Bennacef, S.K., “A Spoken Language System for Information Retrieval,” Proc. of ICSLP, Sep. 1994, 4 pages.
Gauvain, J.L. et al., “Speech Recognition for an Information Kiosk,” Proc. of ICSLP, 1996, 4 pages.
Gauvain, J.L. et al., “Spoken Language Component of the MASK Kiosk,” Human Comfort and Security of Information Systems, Oct. 26, 1995, 11 pages.
Gauvain, J.L. et al., “The LIMSI Continuous Speech Dictation System,” Proc. ARPA Human Lang. & Technology, Apr. 1994, 6 pages.
Gauvain, J.L. et al., “The LIMSI Continuous Speech Dictation System: Evaluation on the ARPA Wall Street Journal Task,” Proc. of the IEEE-ICASSP, 1994, 4 pages.
Goddeau, D. et al., “Galaxy: A Human-Language Interface to On-Line Travel Information,” Proc. of ICSLP, 1994, 4 pages.
House, D., “Spoken-Language Access to Multimedia (SLAM): Master's Thesis,” Oregon Graduate Inst., Dept. of CS and Eng., 1995, 59 pages.
Mostow, Jack et al., “Towards a Reading Coach That Listens: Automated Detection of Oral Reading Errors,” Proc. of the 11th Ntl. Conf. on A.I., 1993, 6 pages.
Russell, M. et al., “Applications of Automatic Speech Recognition to Speech and Language Development in Young Children,” Proc. of ICSLP, 1996, 4 pages.
Lamel, L.F. et al., “Recent Developments in Spoken Language Systems for Information Retrieval,” ESCA ETRW Spoken Dialog Systems, 1995, 4 pages.
Language Industry Monitor, “Janet Baker's Optimism,” 1992, 2 pages.
Dorsey et al., “Design and Simulation of Opera Lighting and Projection Effects,” Program of Computer Graphics, Computer Graphics, Jul. 1991, vol. 25, No. 4, New York, pp. 41-50.
Calvert, Justin, “SCEE Announces EyeToy: Chat” (SCEE's latest plans for its EyeToy peripheral will effectively turn the PlayStation 2 into a videophone; first screens inside), GameSpot, http://www.gamespot.com/news/6095429.html, May 5, 2004, 1 page.
Nayar et al., “Lighting Sensitive Display,” ACM Transactions on Graphics, Oct. 2004, vol. 23, No. 4, New York, pp. 963-979.
Spagnoletti, “Philips Ambilight TV,” Home Entertainment, Engadget, Jul. 8, 2004, 1 page.
Wikipedia Article on Diablo II, http://en.wikipedia.org/wiki/Diablo_II, 2010, 8 pages.
Diablo II Frost Nova Description, http://diablo2.diablowiki.net/Frost_Nova, Oct. 30, 2009, 5 pages.
Diefendorff, “Sony's Emotionally Charged Chip”, Microprocessor Report, vol. 13, No. 5, Apr. 19, 1999, 8 pages.
Sony Computer Entertainment, Inc., “Fantavision Game Manual”, 2000, 18 pages.
Wikipedia, “Aimbot”, http://en.wikipedia.org/wiki/Aimbot (last updated Jun. 3, 2005; last accessed Jul. 5, 2005), 1 page.
Agarwal et al., “Ranking Database Queries Using User Feedback: A Neural Network Approach”, CS511 Project, Advanced Database Management Systems, Fall 2006, 9 pages.
Agichtein et al., “Improving Web Search Ranking by Incorporating User Behavior Information”, SIGIR 2006, Aug. 6-11, ACM, 8 pages.
Bhattacharjee et al., “Incentive-Based Ranking Mechanisms”, Position Paper, Department of Computer Science, Stanford University, 2006, 7 pages.
Chaudhuri et al., “Probabilistic Information Retrieval Approach for Ranking of Database Query Results,” 2006, 43 pages.
Chidlovskii et al., “Collaborative Re-Ranking of Search Results”, Xerox Research Centre Europe, AAAI-2000, Workshop on AI for Web Search, 2001, 5 pages.
Kang et al., Establishing Value Mappings Using Statistical Models and User Feedback, CIKM '05, Oct. 31-Nov. 5, 2005, ACM, 8 pages.
W3C Working Draft Jun. 18, 2007, The XMLHttpRequest Object, W3C, http://www.w3.org/TR/2007/WD-XMLHttpRequest-20070618/, 12 pages.
European Search Report, dated Jan. 19, 2004, European Patent Application No. 02009339.9, 2 pages.
“Office Action”, European Patent Application No. 02009339.9, dated Jan. 19, 2006, 5 pages.
“Office Action”, European Patent Application No. 02009339.9, dated Dec. 11, 2006, 3 pages.
“Office Action”, European Patent Application No. 02009339.9, dated Jul. 4, 2007, 4 pages.
“Office Action”, European Patent Application No. 02009339.9, dated Sep. 17, 2008, 4 pages.
“Notice of Allowance”, European Patent Application No. 02009339.9, dated Nov. 16, 2009, 33 pages.
“International Search Report”, Patent Cooperation Treaty Application No. PCT/US02/32438, dated Feb. 4, 2003, 1 page.
“International Search Report”, Patent Cooperation Treaty Application No. PCT/US2007/010944, dated Feb. 18, 2008, 5 pages.
“Search Report”, European Application No. 02769043.7, dated Dec. 21, 2004, 4 pages.
“Office Action”, European Patent Application No. 02769043.7, dated Apr. 28, 2005, 6 pages.
“Office Action”, European Patent Application No. 02769043.7, dated Oct. 24, 2006, 5 pages.
“Office Action”, European Patent Application No. 02769043.7, dated Jan. 31, 2007, 3 pages.
“Notice of Allowance”, China Patent Application No. 201380056819.3, dated Jun. 8, 2018, 2 pages [4 pages including translation].
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2018/047694, dated Sep. 21, 2018, 12 pages.
Ohashi et al., “A Gesture Recognition Method for a Stick Input System,” Transactions of the Information Processing Society of Japan 40, No. 2 (1999) [retrieved on Mar. 19, 2014]. Retrieved from the Internet: <URL: http://ci.nii.ac.jp/naid/110002764810>. 12 pages.
Ricadela, A., “Texts Share Tips for Enhancing Play in Popular PC and Console Titles—Books Present Natural Add-On Sales for Games,” Computer Retail Week, 8(22), 30 [online], 1998 [retrieved on Jul. 19, 2019], Retrieved from the Internet: <URL:https://dialog.proquest.com/professional/docview/667110627?accountid=142257>, 2 pages.
Related Publications (1)
  Number: 20180326304 A1
  Date: Nov. 2018
  Country: US