Video game technologies have greatly advanced since the early games of the 1980s. Faster processors (CPUs), 3D rendering technologies, shaders, and high-powered video cards not available two decades ago have all advanced gaming. The concept of “sprites” was introduced to allow low-powered early model personal computers and arcade games to deliver fast-paced interactive gaming. A sprite is a small graphic image that can be moved quickly around a display screen with very little hardware processing. The game “Frogger”, for example, is a classic interactive video game that utilizes sprites to animate the frogs, cars, and the other moving items of the game.
Sprite animation was implemented in hardware chips. A sprite image had a fixed size and could be moved around the display screen quickly by simply changing the hardware register defining the sprite's (x,y) position on the screen. Sprite-enabled hardware chips also supported auto-collision detection of sprite images, such as detecting a rocket sprite image “hitting” or intersecting a ship sprite image in a game, by reading a collision mask register instead of having to perform complex intersection tests with a low-powered processor. By the 1990s, sprites were rapidly disappearing as faster processors were developed for advanced video game modeling and rendering.
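The hardware collision technique described above can be contrasted with software intersection testing in a brief sketch (shown here in Python for illustration; the register name and bit layout are hypothetical, not those of any particular sprite chip):

```python
# Hypothetical hardware collision register: bit n is set when sprite n
# intersects another sprite. The hardware maintains this mask as sprites
# move, so the game performs one register read and a bit test instead of
# computing rectangle intersections on a low-powered processor.

COLLISION_MASK = 0b00000101  # stand-in for the hardware collision register


def sprite_collided(n, mask=COLLISION_MASK):
    # A single bit test replaces per-pair intersection math in software.
    return bool(mask & (1 << n))
```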
This summary is provided to introduce simplified concepts of Sprite interface and code-based functions which are further described below in the Detailed Description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter.
In an embodiment of Sprite interface and code-based functions, a sprite interface is implemented in managed code to provide an interface to sprite animation functions and tiling functions, such as for a gaming application. A sprite application and/or a tiler application are implemented in native code to provide the sprite animation and tiling functions via the sprite interface when initiated by the gaming application. The gaming application, sprite interface, sprite application, and the tiler application can be implemented in a low-end computing-based device, such as a television-based client device.
The same numbers are used throughout the drawings to reference like features and components.
Sprite interface and code-based functions provides a sprite interface via which a gaming application can access the sprite animation functions of a sprite application and/or a tiler application in a low-end computing device, such as a television-based client device. A low-end television-based client device is also commonly referred to as a “thin client” due to the limited processing and graphics capabilities of the device. The variety of video games available for use on a thin client has typically been limited by these processing and graphics constraints. Sprite interface and code-based functions provides that even the constrained platform of a thin client can deliver action-oriented interactive games written in managed code which, previously, could not be implemented in a thin client with limited processing power.
The sprite interface is implemented in managed code and abstracts the more processor-intensive aspects of sprite-based animation, which are implemented in native code to provide rendering and collision detection that can be driven from simple C# (C-Sharp) game code. By implementing a sprite application (also referred to as a sprite “engine”) in native code and exposing it to the C# managed layer in a low-end computing device runtime, older classic arcade games can be quickly ported and adapted to a television-based environment. Sprite interface and code-based functions also provides for tile-based rendering of video game background sprite images, which allows for fast scrolling as a video character moves through the video game. A tiler application (also referred to as a tiler “engine”) can also be implemented in native code and can be interfaced independently of the sprite animation functions.
While aspects of the described systems and methods for Sprite interface and code-based functions can be implemented in any number of different computing systems, gaming systems, environments, and/or configurations, embodiments of Sprite interface and code-based functions are described in the context of the following exemplary system architectures.
Client device 102 can be implemented in any number of embodiments, such as a set-top box, a digital video recorder (DVR) and playback system, an appliance device, a gaming device, and as any other type of client device or low-end client device that may be implemented in an entertainment and/or information system. Alternatively, embodiments of Sprite interface and code-based functions may be implemented in other low-end computing-based devices such as a cellular phone, PDA (personal digital assistant), portable gaming device, and the like.
In this example, client device 102 includes one or more processor(s) 110 as well as a gaming application 112, managed code 114, and native code 116, all of which can be implemented as computer executable instructions and executed by the processor(s) 110. Additionally, client device 102 may be implemented with any number and combination of differing components as further described below with reference to the exemplary client device 500 shown in
The managed code 114 is an example of code that is managed by the .NET Framework Common Language Runtime (CLR) to interact between natively executing code (e.g., native code 116) and the runtime on device 102. The native code 116 is an example of computer executable instructions that are written directly in a low level language and compiled to execute on the specific processor(s) 110.
The gaming application 112 can be any type of user-interactive and/or video-based game that provides an interactive display 118 on the display device 104. A user can initiate the game for entertainment and interact with the game according to the interactive display 118 with the remote control device 106 and/or the gaming controller 108 via wired or wireless inputs 120. The remote control device 106 and/or the gaming controller 108 can include various configuration and television-specific input keys, an input keypad, and/or various user-selectable input controls to interact with the gaming application 112.
In the exemplary gaming system 100, a managed sprite interface 122 is implemented in managed code 114, and a sprite application 124 and/or tiler application 126 are implemented in the native code 116. The sprite application 124 may also be referred to as a sprite “engine” to implement sprite animation functionality for the gaming application 112. Similarly, the tiler application 126 may also be referred to as a tiler “engine” to implement tiling functionality, such as background sprite images for the interactive display 118 on display device 104. Sprite interface and code-based functions abstracts the conventional hardware graphics chip previously used for sprite animation into the more efficient native code of the underlying processor(s) 110 and provides the managed sprite interface 122 for function calls to the native code.
Although the sprite interface 122, sprite application 124, and tiler application 126 are each illustrated and described as single application programs, each of the sprite interface 122, sprite application 124, and tiler application 126 can be implemented as several component applications distributed to each perform one or more functions in television-based client device 102. Further, although the sprite application 124 and the tiler application 126 are illustrated and described as separate application programs, the sprite application 124 and the tiler application 126 can be implemented together as a single application program in the native code 116.
The sprite interface 122 can be implemented with application program interface(s) (APIs) 128 via which the gaming application 112 can request or initiate sprite animation functions from the sprite application 124. The sprite interface 122 (via the APIs 128) provides an interface to the sprite animation functions available via the sprite application 124. The sprite application 124 can receive a request for a sprite animation function and provide the sprite animation function to the gaming application 112 via the sprite interface 122. A developer of the gaming application 112 can include function calls to the sprite interface APIs 128 to incorporate the sprite animation functions of the sprite application 124.
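The interface pattern described above can be sketched briefly. The following is an illustrative Python sketch, not the actual implementation; names such as `NativeSpriteEngine`, `SpriteInterface`, and `create_sprite` are hypothetical stand-ins for the native sprite application 124, the managed sprite interface 122, and its APIs 128:

```python
class NativeSpriteEngine:
    """Stand-in for the sprite application implemented in native code."""

    def __init__(self):
        self.sprites = []

    def add_sprite(self, x, y):
        # The native layer does the actual sprite bookkeeping and returns
        # a handle the managed layer can use in later calls.
        handle = len(self.sprites)
        self.sprites.append({"x": x, "y": y})
        return handle


class SpriteInterface:
    """Managed-code facade exposing the native engine's functions as APIs."""

    def __init__(self, engine):
        self._engine = engine

    def create_sprite(self, x, y):
        # The gaming application calls only this managed API; the
        # processor-intensive work is delegated to the native layer.
        return self._engine.add_sprite(x, y)


# A gaming application initiates sprite functions via the interface alone:
interface = SpriteInterface(NativeSpriteEngine())
hero = interface.create_sprite(10, 20)
```

The facade keeps game code free of any direct dependence on the native layer, which is what lets the same managed game code be ported across devices.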
The sprite application 124 can be implemented to provide collision detection of sprite images, animation of a sprite image, a tiled sprite image to include in a background image, and/or any other sprite animation functions to the gaming application 112 via the sprite interface 122. Alternatively, or in addition, the tiler application 126 is implemented to provide tiling functions for background images of the gaming application 112 via the sprite interface 122 when initiated or requested by the gaming application 112.
The sprite interface 122, sprite application 124, and the tiler application 126 can each be implemented as a class which is a reference type that encapsulates data (constants and fields) and behavior (methods, properties, indexers, events, operators, instance constructors, static constructors, and destructors), and can contain nested types. An example of each is included below.
The Sprite Engine Class (e.g., the sprite application 124) provides solid flicker-free sprite animation in a graphics mode, such as a three-hundred fifty two by two-hundred forty (352×240) graphics mode. This provides for a custom background layer and a list of at least thirty-two (32) sprites with (x,y) motion vectors for each. The Sprite Engine breaks the memory allocated for a main seven-hundred and four by four-hundred eighty (704×480) display screen into four sub-screens, three of which are used. Two of the sub-screens are used ping-pong style where one is onscreen while the other is being updated.
As each frame of an animation is displayed, the background is blitted to the current off-screen buffer and the sprites are composited on top of the background. Then sprite (x,y) positions are updated by their motion vectors and rectangular collision detection is applied to sprites which are flagged as collidable. The distinction between managed and native functionality is simple yet flexible: the managed layer can individually guide sprites on each frame while the fast rendering and processing work of updating positions is done in the native layer.
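The per-frame work described above can be sketched as follows. This is an illustrative Python sketch under simplifying assumptions (buffers are modeled as plain lists, sprites as dictionaries); all names are hypothetical, not the Sprite Engine's actual members:

```python
def rects_overlap(a, b):
    # a and b are (x, y, w, h); standard axis-aligned rectangle intersection.
    return (a[0] < b[0] + b[2] and b[0] < a[0] + a[2] and
            a[1] < b[1] + b[3] and b[1] < a[1] + a[3])


def render_frame(background, sprites, buffers, onscreen_index):
    off = 1 - onscreen_index          # ping-pong: draw to the hidden buffer
    buffers[off] = list(background)   # "blit" the background to it
    for s in sprites:
        # Composite each sprite on top of the background, then advance its
        # (x, y) position by its motion vector.
        buffers[off].append((s["x"], s["y"], s["image"]))
        s["x"] += s["dx"]
        s["y"] += s["dy"]
    # Rectangular collision detection over sprites flagged as collidable.
    collisions = []
    collidable = [s for s in sprites if s["collidable"]]
    for i, a in enumerate(collidable):
        for b in collidable[i + 1:]:
            if rects_overlap((a["x"], a["y"], a["w"], a["h"]),
                             (b["x"], b["y"], b["w"], b["h"])):
                collisions.append((a["image"], b["image"]))
    return off, collisions            # the off-screen buffer goes onscreen next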
The Sprite Engine Class includes:
The Tiler Class (e.g., the tiler application 126) can be implemented independent of the sprite application (Sprite Engine). The Tiler Class maintains a single short array that describes the “tile” to be displayed at each location on the screen. The value at each location in the array is used to index into an array of images. The Tiler can then quickly render an entire screen of tiles and perform functions such as side and vertical scrolling of the entire screen in an efficient manner. The Tiler Class can also be implemented to generate other types of backgrounds out of spliced images and to generate complex backgrounds for dialog boxes and non-gaming user interfaces.
The Tiler Class includes:
The Tiler can also be “attached” to an optional SpriteEngine via the public variable SpriteEngine. Then when the Tiler scrolls in a given direction it can quickly update the coordinates of the sprites. This saves the managed layer considerable processing work on each scroll.
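The tile map and the sprite-coordinate update described above can be sketched together. The following is an illustrative Python sketch, not the actual Tiler Class; the class layout, the single-column scroll, and the `sprite_engine` attribute are assumptions for illustration:

```python
class Tiler:
    def __init__(self, cols, rows, images, tile_w=16):
        self.cols, self.rows = cols, rows
        self.images = images              # array of tile images to index into
        self.tiles = [0] * (cols * rows)  # one short value per screen location
        self.tile_w = tile_w              # tile width in pixels
        self.sprite_engine = None         # optional attached SpriteEngine

    def tile_at(self, col, row):
        # The value stored at each location indexes the image array.
        return self.images[self.tiles[row * self.cols + col]]

    def scroll_left(self):
        # Side-scroll by one tile column: rotate each row of the tile map.
        for r in range(self.rows):
            row = self.tiles[r * self.cols:(r + 1) * self.cols]
            self.tiles[r * self.cols:(r + 1) * self.cols] = row[1:] + row[:1]
        # If a SpriteEngine is attached, update sprite coordinates in step,
        # saving the managed layer that work on each scroll.
        if self.sprite_engine is not None:
            for s in self.sprite_engine.sprites:
                s["x"] -= self.tile_w
```

Rendering a whole screen then reduces to one indexed image lookup per location, which is what makes full-screen side and vertical scrolling efficient.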
The Sprite Class (e.g., the managed sprite interface 122) represents a sprite by an array of images that are cycled through to create an animation effect, a size, an (x,y) location, and a motion vector that includes (dx,dy) and a step count. An attribute mask can be utilized to describe a sprite as a “hero” (an item to test for collisions against) or as “collidable”. This moves a portion of the sprite maintenance to the native layer without overly complicating the native drawing engine.
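The sprite representation described above can be sketched briefly. This is an illustrative Python sketch, not the actual Sprite Class; the flag values and member names are hypothetical:

```python
HERO = 0x01        # attribute flag: item to test for collisions against
COLLIDABLE = 0x02  # attribute flag: participates in collision detection


class Sprite:
    def __init__(self, images, w, h, x, y, dx, dy, steps, attributes=0):
        self.images = images          # cycled through for the animation effect
        self.frame = 0
        self.w, self.h = w, h         # size
        self.x, self.y = x, y         # (x, y) location
        self.dx, self.dy = dx, dy     # motion vector
        self.steps = steps            # step count remaining on the vector
        self.attributes = attributes  # attribute mask (HERO, COLLIDABLE, ...)

    def advance(self):
        # Move along the motion vector while steps remain; cycle the image.
        if self.steps > 0:
            self.x += self.dx
            self.y += self.dy
            self.steps -= 1
        self.frame = (self.frame + 1) % len(self.images)

    def is_hero(self):
        return bool(self.attributes & HERO)
```

Because the motion vector and attribute mask travel with the sprite, the native layer can advance and collision-test sprites each frame without managed-layer involvement.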
The Sprite Class includes:
[ ]array of the highest zero ordered sprite involved in the collision.
The gaming application 112 incorporates the animated sprite character 206 and the various tiled sprite images via the managed sprite interface 122, which interfaces to the native code 116 where the sprite application 124 and/or the tiler application 126 provides the sprite engine functions of tiling, sprite animation, and collision detection. In the example 200, the animated sprite character moves from one location to the next over the blocks (e.g., block 216) and to different block levels while superimposed over the ladders (e.g., ladder 214) to obtain the game pieces (e.g., game pieces 210 and 212), each of which is determined as a “collision” between sprite images.
Methods for Sprite interface and code-based functions, such as exemplary method 400 described with reference to
At block 402, a gaming application is executed in a computing-based device, such as a low-end television-based client device. For example, the processor(s) 110 in client device 102 execute and process gaming application 112 from which an interactive gaming display 118 is displayed on display device 104 for user interaction via the remote control device 106 or game controller 108.
At block 404, a sprite interface is executed in managed code. At block 406, a sprite application is executed in native code, and at block 408, a tiler application is executed in the native code. For example, the processor(s) 110 in client device 102 execute the managed code 114 which includes the sprite interface 122, and execute the native code 116 which includes the sprite application 124 and the tiler application 126.
At block 410, a request is received for a sprite animation function from the gaming application. At block 412, the request for the sprite animation function is initiated via the sprite interface. For example, the gaming application 112 can initiate or request a sprite animation function via the APIs 128 of the managed sprite interface 122.
At block 414, the sprite animation function is provided from the sprite application, where the sprite animation function is provided to the gaming application via the sprite interface. For example, the sprite application 124 in native code 116 in client device 102 provides any one or more of collision detection between sprite images, animation of a sprite image, and various tiled sprite images for a background image of the gaming application 112 via the sprite interface 122.
At block 416, a tiling function is provided with the tiler application. For example, and as an alternative to the sprite application 124, the tiler application 126 in native code 116 in client device 102 provides tiling functions for the gaming application 112 via the sprite interface 122.
Client device 500 includes one or more media content inputs 502 which may include Internet Protocol (IP) inputs over which streams of media content are received via an IP-based network. Device 500 further includes communication interface(s) 504 which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. A wireless interface enables client device 500 to receive control input commands 506 and other information from an input device, such as from remote control device 508, PDA (personal digital assistant) 510, cellular phone 512, or from another infrared (IR), 802.11, Bluetooth, or similar RF input device.
A network interface provides a connection between the client device 500 and a communication network by which other electronic and computing devices can communicate data with device 500. Similarly, a serial and/or parallel interface provides for data communication directly between client device 500 and the other electronic or computing devices. A modem facilitates client device 500 communication with other electronic and computing devices via a conventional telephone line, a DSL connection, cable, and/or other type of connection.
Client device 500 also includes one or more processors 514 (e.g., any of microprocessors, controllers, and the like) which process various computer executable instructions to control the operation of device 500, to communicate with other electronic and computing devices, and to implement embodiments of Sprite interface and code-based functions. Client device 500 can be implemented with computer readable media 516, such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device can include any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), a DVD, a DVD+RW, and the like.
Computer readable media 516 provides data storage mechanisms to store various information and/or data such as software applications and any other types of information and data related to operational aspects of client device 500. For example, an operating system 518 and/or other application programs 520 can be maintained as software applications with the computer readable media 516 and executed on processor(s) 514 to implement embodiments of Sprite interface and code-based functions.
For example, client device 500 can be implemented to include a program guide application 522 that is implemented to process program guide data 524 and generate program guides for display which enable a viewer to navigate through an onscreen display and locate broadcast programs, recorded programs, video on-demand programs and movies, interactive game selections, network-based applications, and other media access information or content of interest to the viewer. The application programs 520 can include programmed application(s) to implement features and embodiments of Sprite interface and code-based functions as described herein, such as any one or more of the gaming application 112, sprite interface 122, sprite application 124, and tiler application 126 shown in
The client device 500 also includes an audio and/or video output 534 that provides audio and video to an audio rendering and/or display system 536, or to other devices that process, display, and/or otherwise render audio, video, and image data. Video signals and audio signals can be communicated from device 500 to a television 538 (or to other types of display devices) via an RF (radio frequency) link, S-video link, composite video link, component video link, analog audio connection, or other similar communication link.
Although embodiments of Sprite interface and code-based functions have been described in language specific to structural features and/or methods, it is to be understood that the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations of Sprite interface and code-based functions.