A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever. Copyright 2013, WMS Gaming, Inc.
Embodiments of the inventive subject matter relate generally to wagering game systems, and more particularly to wagering game systems including multiplayer games that utilize head tracking technologies.
Wagering game machines, such as slot machines, video poker machines and the like, have been a cornerstone of the gaming industry for several years. Generally, the popularity of such machines depends on the likelihood (or perceived likelihood) of winning money at the machine and the intrinsic entertainment value of the machine relative to other available gaming options. Where the available gaming options include a number of competing wagering game machines and the expectation of winning at each machine is roughly the same (or believed to be the same), players are likely to be attracted to the most entertaining and exciting machines. Shrewd operators consequently strive to employ the most entertaining and exciting machines, features, and enhancements available because such machines attract frequent play and hence increase profitability to the operator. Therefore, there is a continuing need for wagering game machine manufacturers to continuously develop new games and gaming enhancements that will attract frequent play.
Embodiments of the invention are illustrated in the Figures of the accompanying drawings in which:
This section provides an introduction to some embodiments of the invention.
Some embodiments of the inventive subject matter conduct multiplayer games, where the multiplayer games are presented on multiple wagering game machines. In such multiplayer games, a virtual object (e.g., a globe, space ship, etc.) is presented to each player. Each player may have a different view of the virtual object, depending on factors such as where the player's wagering game machine resides on a casino floor. For example, the virtual object may be a globe (i.e., a spherical rendition of Earth). On a wagering game machine on the casino's south side, a player may see a view of the globe that includes Australia. On a machine on the casino's north side, a player may see North America on the globe.
Some embodiments use head tracking technology to change a player's view of the virtual object based on the player's viewing perspective. This effect simulates looking through a window, where head movements reveal different fields of view. For example, if a player changes viewing perspectives (e.g., leans leftward and peers rightward at a display device), the player may see a portion of the globe that was not visible from the player's original viewing perspective.
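The window effect described above can be sketched as a simple horizontal parallax calculation. This is a hypothetical illustration only; the function name, parameters, and gain factor are assumptions, not taken from the specification:

```python
def visible_window(head_x, scene_width, window_width, gain=1.5):
    """Return the (left, right) extent of the scene visible through a
    fixed 'window', given the viewer's lateral head offset.

    head_x       -- viewer's head offset from center, in scene units
                    (negative = leaning left)
    scene_width  -- total width of the virtual scene
    window_width -- width of the visible field at the centered position
    gain         -- how strongly head motion shifts the view (assumed)
    """
    # Leaning left (negative head_x) reveals more of the scene's right
    # side, mimicking peering rightward through a window.
    center = scene_width / 2 - gain * head_x
    left = max(0.0, center - window_width / 2)
    right = min(scene_width, center + window_width / 2)
    return (left, right)

# A centered viewer sees the middle of the scene.
centered = visible_window(0.0, 100.0, 40.0)      # (30.0, 70.0)
# Leaning left shifts the visible field toward the scene's right side.
leaning_left = visible_window(-10.0, 100.0, 40.0)  # (45.0, 85.0)
```

The clamping to `[0, scene_width]` models the edge of the virtual object: once the viewing field reaches the scene boundary, further head movement reveals nothing new.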
In some embodiments, one or more virtual objects are game elements in the multiplayer game. For example, the virtual objects can be playing cards in a video Texas Hold 'Em poker game. Alternatively, the virtual objects can be shared picking elements, shared slots reel symbols, or any other suitable game elements. The players can interact with the virtual objects to affect game results.
As shown, player-1 can see the virtual object 110 (i.e., the globe 110) through a viewing field 108. Player-2 can see the globe 110 through a viewing field 112. Because the viewing fields are different for each player, each player sees a different part of the virtual object 110.
Referring back to
This section describes an example operating environment and presents structural aspects of some embodiments. This section includes discussion about wagering game machine architectures and wagering game networks.
Each casino 212 includes a local area network 216, which includes an access point 204, a wagering game server 206, and wagering game machines 202. The access point 204 provides wireless communication links 210 and wired communication links 208. The wired and wireless communication links can employ any suitable connection technology, such as Bluetooth, 802.11, Ethernet, public switched telephone networks, SONET, etc. In some embodiments, the wagering game server 206 can serve wagering games and distribute content to devices located in other casinos 212 or at other locations on the communications network 214.
The wagering game machines 202 described herein can take any suitable form, such as floor standing models, handheld mobile units, bartop models, workstation-type console models, etc. Further, the wagering game machines 202 can be primarily dedicated for use in conducting wagering games, or can include non-dedicated devices, such as mobile phones, personal digital assistants, personal computers, etc.
The wagering game machines 202 can include head tracking cameras (not shown in
As noted above, the wagering game machines 202 can present wagering games that include virtual objects viewable by players at a plurality of the machines 202. The discussion of
The wagering game server 206 includes a wagering game engine 205 and a virtual objects engine 203. The wagering game engine 205 can determine results for wagering games presented on the machines 202. In some instances, the wagering game engine 205 also determines and streams content (e.g., graphics, audio, etc.) for wagering games presented on the machines 202. The virtual objects engine 203 can create and process data representing virtual objects. In some instances, the virtual objects engine 203 creates graphical content for transmission to the machines 202. In other instances, the virtual objects engine 203 transmits, to the machines 202, data representing virtual objects. In turn, the machines 202 can process the data to graphically render virtual objects.
In some embodiments, wagering game machines 202 and wagering game servers 206 work together such that a wagering game machine 202 can be operated as a thin, thick, or intermediate client. For example, one or more elements of game play may be controlled by the wagering game machine 202 (client) or the wagering game server 206 (server). Game play elements can include executable game code, lookup tables, configuration files, game outcome, audio or visual representations of the game, game assets or the like. In a thin-client example, the wagering game server 206 can perform functions such as determining game outcome or managing assets, while the wagering game machine 202 can present a graphical representation of such outcome or asset modification to the user (e.g., player). In a thick-client example, the wagering game machines 202 can determine game outcomes and communicate the outcomes to the wagering game server 206 for recording or managing a player's account.
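The thin- versus thick-client split described above can be sketched as a mode-driven dispatch. This is a hypothetical sketch; the class and method names are assumptions, and the coin-flip outcome stands in for the actual game-outcome logic:

```python
import random

class WagerGameSession:
    """Sketch of thin- vs. thick-client outcome determination.

    In 'thin' mode the server determines the outcome and the machine
    only presents it; in 'thick' mode the machine determines the
    outcome and reports it to the server for accounting.
    """
    def __init__(self, mode, rng=None):
        assert mode in ("thin", "thick")
        self.mode = mode
        self.rng = rng or random.Random()
        self.server_log = []  # stands in for the server's records

    def server_determine_outcome(self):
        # Thin client: outcome is determined server-side.
        outcome = self.rng.choice(["win", "lose"])
        self.server_log.append(outcome)
        return outcome

    def machine_determine_outcome(self):
        # Thick client: outcome is determined locally, then reported
        # upstream for recording and account management.
        outcome = self.rng.choice(["win", "lose"])
        self.server_log.append(outcome)
        return outcome

    def play(self):
        if self.mode == "thin":
            outcome = self.server_determine_outcome()
        else:
            outcome = self.machine_determine_outcome()
        return outcome  # the machine then presents this graphically
```

Either way the server's log ends up with the authoritative record, which matches the accounting role the paragraph assigns to the server in both configurations.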
In some embodiments, either the wagering game machines 202 (client) or the wagering game server 206 can provide functionality that is not directly related to game play. For example, account transactions and account rules may be managed centrally (e.g., by the wagering game server 206) or locally (e.g., by the wagering game machine 202). Other functionality not directly related to game play may include power management, presentation of advertising, software or firmware updates, system quality or security checks, etc.
In some embodiments, the wagering game network 200 can include other network devices, such as accounting servers, wide area progressive servers, player tracking servers, and/or other devices suitable for use in connection with embodiments of the invention.
Any of the wagering game network components (e.g., the wagering game machines 202) can include hardware and computer readable media including instructions for performing the operations described herein. Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
The main memory 328 includes a wagering game unit 332, graphics engine 336, and head tracking unit 338. In one embodiment, the wagering game unit 332 can present wagering games, such as video poker, video blackjack, video slots, video lottery, etc., in whole or part. The graphics engine 336 can process data representing virtual objects, and present the virtual objects on a primary display 310. The head tracking unit 338 can operate in concert with the head tracking camera 340 to track player head movements. In some embodiments, the head tracking unit 338 transmits head tracking data to a remote wagering game server. In other embodiments, the head tracking unit 338 processes head tracking data to determine whether to modify a player's viewing field with respect to a virtual object. Operations and functionality of the graphics engine 336 and head tracking unit 338 are described in greater detail in the following sections.
The CPU 326 is also connected to an input/output (I/O) bus 322, which can include any suitable bus technologies, such as an AGTL+ frontside bus and a PCI backside bus. The I/O bus 322 is connected to a payout mechanism 308, primary display 310, secondary display 312, value input device 314, player input device 316, information reader 318, and storage unit 330. The player input device 316 can include the value input device 314 to the extent the player input device 316 is used to place wagers. The I/O bus 322 is also connected to an external system interface 324, which is connected to external systems 304 (e.g., wagering game networks).
In one embodiment, the wagering game machine 306 can include additional peripheral devices and/or more than one of each component shown in
Any component of the architecture 300 can include one or more of hardware, firmware, and machine-readable storage media including instructions for performing the operations described herein.
This discussion continues with more details about head tracking technologies employed by some embodiments of the inventive subject matter.
In some implementations, the video capture device 425 can capture video of the player's head movements, facial gestures, and facial features. The wagering game machine 460 can then generate player input data (e.g., a plurality of variables) that represents the x, y, and z coordinates of the player's head at various instances in time. This data can be used to determine the player's head movements. The wagering game machine 460 can also generate player input data that represents the x, y, and z coordinates of various data points of the player's facial features. This data can be used to determine the player's facial movements (e.g., where the player 400 is looking, the player's facial gestures, etc.). In one example, the wagering game machine 460 may generate data representing x, y, and z coordinates for multiple data points in a player's eyes, nose, mouth, forehead, chin, etc. The wagering game machine 460 can process data associated with the head movements and facial gestures to determine the player's viewing perspective. In some embodiments, the wagering game machine 460 can compare the x, y, and z coordinates of the player's head at various instances in time to one or more reference points (e.g., (x,y,z)=(0,0,0)) to quantify the player's head movement with respect to the reference points. In turn, the wagering game machine 460 can modify display content (e.g., a view of a virtual object) to be consistent with the player's viewing perspective.
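The comparison against a reference point described above can be sketched as computing the offset of a facial-landmark centroid from a calibration point. This is a hypothetical sketch; the function name, the centroid convention, and the sample coordinates are assumptions:

```python
def head_displacement(landmarks, reference=(0.0, 0.0, 0.0)):
    """Quantify head movement relative to a reference point.

    landmarks -- list of (x, y, z) coordinates for tracked facial
                 features (eyes, nose, mouth, forehead, chin, etc.)
    reference -- calibration point, e.g. the head position recorded
                 when the player first sat down (assumed convention)

    Returns the (dx, dy, dz) offset of the landmark centroid from
    the reference point.
    """
    n = len(landmarks)
    cx = sum(p[0] for p in landmarks) / n
    cy = sum(p[1] for p in landmarks) / n
    cz = sum(p[2] for p in landmarks) / n
    return (cx - reference[0], cy - reference[1], cz - reference[2])

# One sampled frame of three landmarks: the player has moved left
# (negative dx) and toward the camera (negative dz) relative to the
# calibration point.
frame = [(-2.0, 1.0, 48.0), (2.0, 1.0, 48.0), (0.0, -1.0, 52.0)]
dx, dy, dz = head_displacement(frame, reference=(5.0, 0.0, 60.0))
```

Repeating this per frame yields a time series of offsets that the machine can feed into view-field updates such as the window effect described earlier.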
This discussion will continue with an explanation of how some embodiments process virtual objects.
After creating one or more virtual objects, the virtual objects engine 203 (see
The virtual objects engine 203 can determine initial positions and settings for the virtual camera (i.e., initial viewing fields) based on information including gaming machine position in a casino, gaming machine position in a bank, number of players in community game, player status (e.g., high roller, etc.), and any other suitable information. After initially positioning and configuring each virtual camera, the engine 203 enables each player to see virtual images captured by their virtual camera (i.e., images captured in the viewing fields 508 and 510).
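One way to picture the initial camera placement is to space each machine's virtual camera evenly around the shared object. This is a hypothetical sketch of a single placement policy; the specification lists several other inputs (bank position, player count, player status), and all names here are assumptions:

```python
import math

def initial_camera_angle(machine_index, machine_count):
    """Assign each machine's virtual camera an initial azimuth around
    the shared virtual object, spacing cameras evenly (one assumed
    policy among those the specification permits)."""
    return 2 * math.pi * machine_index / machine_count

def camera_position(machine_index, machine_count, radius=10.0):
    """Place the camera on a circle of the given radius around the
    object's origin, looking inward toward the object."""
    angle = initial_camera_angle(machine_index, machine_count)
    return (radius * math.cos(angle), 0.0, radius * math.sin(angle))

# Four machines spaced 90 degrees apart around the globe: machines on
# opposite sides of the floor see opposite hemispheres.
positions = [camera_position(i, 4) for i in range(4)]
```

A machine at index 0 sees one face of the object, while the machine at index 2 (directly opposite) sees the far side, matching the Australia/North America example earlier in the description.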
Some embodiments of the inventive subject matter work with three-dimensional (3D) autostereoscopic display devices.
This section describes operations associated with some embodiments of the invention. In the discussion below, the flow diagrams will be described with reference to the block diagrams presented above. However, in some embodiments, the operations can be performed by logic not described in the block diagrams.
In certain embodiments, the operations can be performed by executing instructions residing on machine-readable storage media, while in other embodiments, the operations can be performed by hardware and/or other components (e.g., firmware). In some embodiments, the operations can be performed in series, while in other embodiments, one or more of the operations can be performed in parallel. Moreover, some embodiments can perform less than all the operations shown in any flow diagram.
At block 704, the virtual objects engine determines a head position for each of the plurality of players. In some embodiments, the virtual objects engine receives head tracking information from wagering game machines equipped with head tracking equipment. The flow continues at block 706.
At block 706, the virtual objects engine determines a viewing field of the virtual object for each of the players. Each viewing field may be based on a player's head position and other factors (e.g., wagering game machine position in a casino, the player's role in a game, number of players, etc.). In some embodiments, the virtual objects engine employs virtual cameras to capture images (e.g., portions of the virtual object) in the viewing fields. For embodiments using autostereoscopic 3D displays, the virtual objects engine can employ two virtual cameras for each player, as described above. The flow continues at block 708.
At block 708, the virtual objects engine presents to each player a view of the virtual object, where the view corresponds to each player's viewing field (e.g., see
At block 710, the wagering game server's wagering game engine receives player inputs associated with the virtual object. For example, the virtual object may be associated with game elements, such as items to be selected in a community picking bonus game. Each player may see and select different items (see discussion of
At block 712, the wagering game engine determines and presents game results based on the player inputs. In some embodiments, the wagering game engine transmits game results to wagering game machines, which in turn present the results using locally stored content. Alternatively, the wagering game engine can transmit both game results and content for presentation on the wagering game machines.
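The flow of blocks 704 through 712 can be summarized as a single server-side round. This is a hypothetical sketch for a community picking game; the function names, data shapes, and the one-unit-per-degree field mapping are assumptions, not taken from the specification:

```python
def viewing_field_for(head_position, span=60.0):
    """Derive a simple azimuthal viewing field (in degrees) from a
    player's lateral head offset (assumed 1 unit -> 1 degree)."""
    center = head_position[0]
    return (center - span / 2, center + span / 2)

def run_community_game_round(players, pick_items):
    """Sketch of blocks 704-712 for a community picking game.

    players    -- {player_id: {"head": (x, y, z), "pick": item}}
    pick_items -- {item: award} table shared by all players
    """
    # Block 704: determine a head position for each player (here,
    # received as part of each player's record).
    heads = {pid: p["head"] for pid, p in players.items()}
    # Block 706: determine each player's viewing field of the object.
    fields = {pid: viewing_field_for(h) for pid, h in heads.items()}
    # Block 708: present each player's view of the virtual object
    # (stubbed: the field itself stands in for the rendered view).
    # Block 710: receive player inputs associated with the object.
    picks = {pid: p["pick"] for pid, p in players.items()}
    # Block 712: determine results based on the inputs, for
    # transmission back to the machines.
    results = {pid: pick_items.get(pick, 0) for pid, pick in picks.items()}
    return fields, results

fields, results = run_community_game_round(
    {"p1": {"head": (-10.0, 0.0, 50.0), "pick": "chest"},
     "p2": {"head": (5.0, 0.0, 50.0), "pick": "coin"}},
    {"chest": 100, "coin": 25})
```

Each player's viewing field differs because the head positions differ, so in a picking game the two players can see and select different items on the same shared object.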
This section provides examples showing how some embodiments utilize virtual objects and head tracking in community wagering games.
Because embodiments are equipped with head tracking technologies, the players 810 and 814 can modify their viewing fields with head movements. For example, in
Embodiments of the inventive subject matter are not limited to the gaming concepts described above. The following is a non-exhaustive list of how various embodiments can utilize head-tracking capabilities and virtual objects to enhance community games.
Referring to
The wagering game machine 1110 illustrated in
Input devices, such as the touch screen 1118, buttons 1120, a mouse, a joystick, a gesture-sensing device, a voice-recognition device, and a virtual input device, accept player input(s) and transform the player input(s) to electronic data signals indicative of the player input(s), which correspond to an enabled feature for such input(s) at a time of activation (e.g., pressing a “Max Bet” button or soft key to indicate a player's desire to place a maximum wager to play the wagering game). The input(s), once transformed into electronic data signals, are output to a CPU for processing. The electronic data signals are selected from a group consisting essentially of an electrical current, an electrical voltage, an electrical charge, an optical signal, an optical element, a magnetic signal, and a magnetic element.
For purposes of the present detailed description, the singular includes the plural and vice versa (unless specifically disclaimed); the words “and” and “or” shall be both conjunctive and disjunctive; the word “all” means “any and all”; the word “any” means “any and all”; and the word “including” means “including without limitation.”
This detailed description refers to specific examples in the drawings and illustrations. These examples are described in sufficient detail to enable those skilled in the art to practice the inventive subject matter. These examples also serve to illustrate how the inventive subject matter can be applied to various purposes or embodiments. Other embodiments are included within the inventive subject matter, as logical, mechanical, electrical, and other changes can be made to the example embodiments described herein. Features of various embodiments described herein, however essential to the example embodiments in which they are incorporated, do not limit the inventive subject matter as a whole, and any reference to the invention, its elements, operation, and application is not limiting as a whole, but serves only to define these example embodiments. This detailed description does not, therefore, limit embodiments of the invention, which are defined only by the appended claims. Each of the embodiments described herein is contemplated as falling within the inventive subject matter, which is set forth in the following claims.
Number | Name | Date | Kind |
---|---|---|---|
6822643 | Matsui et al. | Nov 2004 | B2 |
7654899 | Durham et al. | Feb 2010 | B2 |
7841944 | Wells | Nov 2010 | B2 |
7883415 | Larsen et al. | Feb 2011 | B2 |
8165347 | Heinzmann et al. | Apr 2012 | B2 |
8540569 | Orlinsky et al. | Sep 2013 | B2 |
8696458 | Foxlin | Apr 2014 | B2 |
20010040572 | Bradski et al. | Nov 2001 | A1 |
20020037770 | Paul et al. | Mar 2002 | A1 |
20050040601 | Yoseloff et al. | Feb 2005 | A1 |
20050282605 | Englman et al. | Dec 2005 | A1 |
20060009283 | Englman et al. | Jan 2006 | A1 |
20060258425 | Edidin et al. | Nov 2006 | A1 |
20070060390 | Wells | Mar 2007 | A1 |
20090233769 | Pryor | Sep 2009 | A1 |
20100041464 | Arezina et al. | Feb 2010 | A1 |
Number | Date | Country |
---|---|---|
WO 2011063197 | May 2011 | WO |
Number | Date | Country
---|---|---
20140073386 A1 | Mar 2014 | US |
Number | Date | Country
---|---|---
61699977 | Sep 2012 | US |