A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever. Copyright 2022 LNW Gaming, Inc.
The present invention relates generally to gaming systems, apparatus, and methods and, more particularly, to image analysis and tracking of physical objects in a gaming environment.
Casino gaming environments are dynamic environments in which people, such as players, casino patrons, casino staff, etc., take actions that affect the state of the gaming environment, the state of players, etc. For example, a player may use one or more physical tokens to place wagers on a wagering game. A player may perform hand gestures to perform gaming actions and/or to communicate instructions during a game, such as making gestures to hit, stand, fold, etc. Further, a player may move physical cards, dice, gaming props, etc. A multitude of other actions and events may occur at any given time. To effectively manage such a dynamic environment, casino operators may employ one or more tracking systems or techniques to monitor aspects of the casino gaming environment, such as credit balance, player account information, player movements, game play events, and the like.
Some gaming systems can perform object tracking in a gaming environment. For example, a gaming system with a camera at a gaming table can capture an image feed of a gaming area to identify certain physical objects or to detect certain activities, such as betting actions, payouts, player actions, etc. However, one challenge for such a gaming system is the complexity of tracking the system's elements, particularly money. For example, a camera near a gaming table may take pictures of casino tokens (e.g., gaming chips) at the gaming table. However, gaming chips are relatively thin (e.g., ~3 mm thick) and can be stacked close together in a chip stack within a chip tray. Because of the relatively small size of chips, the detail in the images (taken from the camera near the gaming table) may be insufficient for an automated-analysis model (e.g., a machine learning model) to distinguish chip-edge features, or other finely detailed chip characteristics, that indicate chip values. As a result, contemporary computer vision systems fail to identify some chips, resulting in lost revenue for a casino when, for example, a dealer misplaces a high-value chip into the wrong stack of chips in a chip tray and accidentally delivers that misplaced, high-value chip to a player when a lower-value chip was intended.
Accordingly, a new tracking system that is adaptable to the challenges of dynamic casino gaming environments is desired.
According to one aspect of the present disclosure, a gaming system is provided for tracking chips in a gaming environment. In some embodiments, an apparatus comprises a chip tray having one or more image sensors. The apparatus further comprises a tracking controller. The chip tray includes columns. The image sensor(s) are positioned between at least two of the columns and aligned substantially parallel to a vertical height of the columns. The tracking controller is configured, in some embodiments, to perform operations that cause the apparatus to capture, via the one or more image sensors, image data of a chip stack in at least one of the two columns. The tracking controller is further configured to segment the chip stack from the image data and to detect, via a machine learning model, a value of each chip in the chip stack in response to analysis of chip-edge features of each gaming chip in the chip stack. The tracking controller is further configured to electronically present information associated with detection of the value of each gaming chip in the chip stack.
Additional aspects of the invention will be apparent to those of ordinary skill in the art in view of the detailed description of various embodiments, which is made with reference to the drawings, a brief description of which is provided below.
While the invention is susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. It should be understood, however, that the invention is not intended to be limited to the particular forms disclosed. Rather, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
While this invention is susceptible of embodiment in many different forms, there is shown in the drawings, and will herein be described in detail, preferred embodiments of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the broad aspect of the invention to the embodiments illustrated. For purposes of the present detailed description, the singular includes the plural and vice versa (unless specifically disclaimed); the words “and” and “or” shall be both conjunctive and disjunctive; the word “all” means “any and all”; the word “any” means “any and all”; and the word “including” means “including without limitation.”
For purposes of the present detailed description, the terms “wagering game,” “casino wagering game,” “gambling,” “slot game,” “casino game,” and the like include games in which a player places at risk a sum of money or other representation of value, whether or not redeemable for cash, on an event with an uncertain outcome, including without limitation those having some element of skill. In some embodiments, the wagering game involves wagers of real money, as found with typical land-based or online casino games. In other embodiments, the wagering game additionally, or alternatively, involves wagers of non-cash values, such as virtual currency, and therefore may be considered a social or casual game, such as would be typically available on a social networking web site, other web sites, across computer networks, or applications on mobile devices (e.g., phones, tablets, etc.). When provided in a social or casual game format, the wagering game may closely resemble a traditional casino game, or it may take another form that more closely resembles other types of social/casual games.
In some embodiments, the chip tray 130 includes image sensors (e.g., image sensors 157, image sensors 159, image sensors 144, etc.). The image sensors are positioned between chip-tray columns (columns 132) and are aligned substantially parallel to a vertical height of the columns 132. In some instances, the image sensors are embedded into the structure (e.g., the material) of the chip tray 130 to ensure that they remain set to a viewing perspective of at least a portion of the edges of the chips 131 in chip stacks on the chip tray 130. For example, image sensors 157 are positioned in an array at the bottom of the chip tray 130 between specific ones of the columns 132 (e.g., each one of the image sensors 157 is positioned between a specific pair of columns). The image sensors 157 face upward from a bottom part of the chip tray 130. For example, the chip tray 130 is sloped such that when a stack of the chips 131 rests within any one of the given columns 132, gravity pulls the chip stack downward to rest upon a bottom resting plate 136 near a bottom edge 137 of the chip tray 130. The image sensors 157 capture images of the chips 131 in any two columns 132 that are adjacent to an individual one of the image sensors 157. In some embodiments, image sensors 159 are positioned near a top edge 139 of the chip tray 130.
The image sensors 157 and image sensors 159 have opposite or opposing viewing perspectives relative to each other. In some embodiments, image sensors 144 are positioned at left and right sides (i.e., side 133 and side 135) of the chip tray 130. In some embodiments, the image sensors 144 have opposite or opposing viewing perspectives relative to each other. The term “viewing perspective” is referred to more succinctly herein as a “viewpoint.” Thus, opposite or opposing viewing perspectives are referred to herein as “opposing viewpoints.” Image sensors with opposing viewpoints are referred to herein as “opposing-viewpoint” image sensors. Data captured or collected by image sensors that have opposing viewpoints is referred to as “opposing-viewpoint data.” The opposing-viewpoint data (taken from opposing-viewpoint image sensors) has an inverse symmetrical relationship regarding where each pixel in the respective images relates to a given chip position in the chip tray 130. Further, in some embodiments the image sensors 157, image sensors 159, and/or image sensors 144 are fixed, or can be automatically locked into known fixed positions with known viewing perspectives relative to each other. Thus, in some embodiments, a processor of the gaming system 100 (e.g., tracking controller 204 shown in
In some embodiments, the image sensors 157 are spaced between every other pair of columns 132. For example, as shown in
In some embodiments, as mentioned, the tracking controller 204 utilizes opposing-viewpoint image sensors (e.g., image sensors 157 and image sensors 159 or image sensors 144). For instance, the opposing-viewpoint image sensors 157 and image sensors 159 are respectively positioned on opposite ends of any one of the columns 132.
In some embodiments, the bottom image sensor (i.e., image sensor 357) can capture image data for the bottom halves of adjacent chip stacks while a top image sensor (e.g., image sensor 359) can capture image data for the top halves of adjacent chip stacks. The system 100 (e.g., tracking controller 204) can further segment the chip stacks into two different images representing the top halves and bottom halves of the chip stacks, where each segmented portion captures the chips in its half at a greater resolution than if only one image had been taken of the entire chip stack.
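As a nonlimiting illustration, a minimal Python sketch of this two-viewpoint split follows; the 50% crop point and the frame orientations are illustrative assumptions rather than the deployed segmentation logic:

    import numpy as np

    def segment_stack_halves(bottom_view, top_view):
        # Keep the half of each frame nearest its own sensor, where chip-edge
        # detail is sharpest. The 50% crop point, and the assumption that the
        # near half occupies a fixed region of each frame, are illustrative;
        # a deployed system would derive crop boundaries from calibration.
        lower_half = bottom_view[bottom_view.shape[0] // 2 :, :]  # lower chips
        upper_half = top_view[: top_view.shape[0] // 2, :]        # upper chips
        return lower_half, upper_half

    # Example with placeholder frames standing in for captured image data.
    lower, upper = segment_stack_halves(np.zeros((480, 640)), np.zeros((480, 640)))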
In some embodiments, the tracking controller 204 is also configured to automatically detect physical objects in a gaming environment as points of interest based on electronic analysis of an image performed by one or more additional neural network models. In some embodiments, the tracking controller 204 can access, control, or otherwise use one or more environmental image sensors (e.g., camera(s) 102) and a projector 103. In some embodiments, the camera(s) 102 are separate from the chip tray 130. The camera(s) 102 can capture one or more streams of images of a gaming area. For example, first image sensor(s) 142 capture first streams of images of first portions of the gaming area, such as portions of the top surface 104 relevant to game play. The first image sensor(s) 142 also capture image data related to game participants (e.g., players, back-betting patrons, dealer, etc.) positioned around the gaming table 101 (at the different player areas 105, 106, 107, 108, 109, and 110 and/or at the dealer area 111). Second image sensor(s) 146 capture image data related to the dealer area 111 and/or the chip tray 130. The camera(s) 102 are positioned above the surface 104 and to the left and right sides of the dealer area 111. In some embodiments, the viewing perspectives of the image sensor(s) 144 are substantially orthogonal to the vertical height of the columns 132. Furthermore, in some embodiments, the image sensor(s) 146 have opposing viewpoints from each other. Thus, each can capture opposing-viewpoint image data from either side of the chip tray 130. The opposing-viewpoint image data can be used to more accurately detect the identity of the chips 131. For example, one of the image sensor(s) 146 captures image data of one of the columns 132 from a left side while the other of the image sensor(s) 146 captures image data of the same one of the columns 132 from a right side. The tracking controller 204 can determine, from these opposing viewpoints, the left and right sides of any chip stacks. Thus, a first instance of a machine learning model can (using the stream of images from the first one of the image sensor(s) 146) detect a chip-edge pattern of one or more chips from one side of the chip stack. Simultaneously, a second instance of the machine learning model can (using the stream of images from the second one of the image sensor(s) 146) detect a chip-edge pattern of the same one or more chips from the opposite side of the chip stack. The tracking controller 204 can further compare and/or combine the left-side and right-side images to independently verify chip values. Images from fixed, opposing viewpoints are symmetrical inverses of each other; thus, the relative geometric positions of each chip (from each opposing viewpoint) can be aligned to the other by an inverse transformation function. The coordinates of each chip (from the opposing viewpoints) can thus be pinpointed by the two instances of the machine learning model using the symmetrical inverse relationship. The independent instances of the machine learning model thus obtain independent verification (via independent analysis of the opposing-viewpoint image data) of any given chip value at the pinpointed locations. In some embodiments, the image sensor(s) 146 are used in place of, or in addition to, the image sensors 144.
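As a nonlimiting illustration, the symmetric-inverse alignment and independent verification might be sketched as follows in Python; the simple horizontal mirror stands in for a calibrated inverse transformation, and the detection tuples are hypothetical:

    def verify_chip_values(left_dets, right_dets, image_width, tol=10):
        # Each detection is (x, y, value) from one viewpoint's model instance.
        # For fixed, opposing sensors a horizontal mirror is the simplest
        # inverse transformation; a real rig would use a calibrated mapping.
        right_aligned = [(image_width - x, y, v) for (x, y, v) in right_dets]
        verified = []
        for (lx, ly, lv) in left_dets:
            for (rx, ry, rv) in right_aligned:
                # A chip value counts as independently verified only when both
                # viewpoints report the same value at geometrically aligned
                # (pinpointed) positions.
                if abs(lx - rx) <= tol and abs(ly - ry) <= tol and lv == rv:
                    verified.append((lx, ly, lv))
                    break
        return verified

    # Example: the same two chips detected from opposing viewpoints.
    left = [(100, 40, 25), (100, 80, 5)]
    right = [(540, 42, 25), (540, 81, 5)]
    print(verify_chip_values(left, right, image_width=640))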
In some embodiments, the tracking controller 204 activates and deactivates the image sensor(s) 146, image sensors 157, image sensors 159, and/or image sensors 144 automatically based on a position of participants, such as the dealer. For example, the tracking controller 204 detects when a dealer (or any portion of the dealer, such as a hand or arm) is blocking a view of the chips 131. The tracking controller 204 deactivates the blocked image sensor until the dealer no longer blocks the view. Then the tracking controller 204 reactivates the image sensor to continue capturing an image stream of the chip tray 130.
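As a nonlimiting illustration, this occlusion gating might be sketched as follows; the Sensor class and the is_blocked callback are assumed interfaces, not an actual sensor API:

    class Sensor:
        # Minimal stand-in for an image-sensor handle (assumed interface).
        def __init__(self):
            self.active = True
        def stop(self):
            self.active = False
        def start(self):
            self.active = True

    def update_sensor_states(sensors, is_blocked):
        # Pause any sensor whose view of the tray is blocked (e.g., by the
        # dealer's hand) and resume it once the view clears.
        for sensor_id, sensor in sensors.items():
            if is_blocked(sensor_id) and sensor.active:
                sensor.stop()    # avoid feeding occluded frames to the models
            elif not is_blocked(sensor_id) and not sensor.active:
                sensor.start()   # resume the image stream of the chip tray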
In some embodiments, the chips 131 are further marked with a hidden or invisible marking that the image sensors 142, image sensors 146, image sensors 157, image sensors 159, and/or image sensors 144 are configured to detect using light and/or electromagnetic frequencies outside the average human visible range. For example, in some embodiments the image sensors described herein are equipped to detect infrared signals and/or ultraviolet light reflected off the chips 131. Thus, the tracking controller 204 (and/or any machine learning models it uses) can more accurately detect an identity of a chip and/or determine a chip value.
The projector 103 is likewise positioned above the gaming table 101, to the left of the first player area 105. The projector 103 can project images of gaming content toward the surface 104 relative to objects in the gaming area. In some instances, the projector 103 is configured to project images of gaming content relevant to elements of a wagering game that are common, or related, to any or all participants (e.g., the projector 103 projects gaming content at a communal presentation area 114).
In some embodiments, the gaming system 100 can detect one or more points of interest by detecting, via a specific type of computer visioning model (e.g., a machine learning model or neural network model), physical features of the image that appear at the surface 104. For example, the tracking controller 204 is configured to monitor the gaming area (e.g., physical objects within the gaming area) and to determine a relationship between one or more of the objects. The tracking controller 204 can further receive and analyze collected sensor data to detect and monitor physical objects (e.g., the tracking controller 204 receives and analyzes the captured image data from image sensors 157, image sensors 159, and/or from one or more image sensors either on or external to the chip tray 130). The tracking controller 204 can establish data structures relating to various physical objects detected in the image data. For example, the tracking controller 204 can apply one or more image neural network models during image analysis that are trained to detect aspects of physical objects. In at least some embodiments, each model applied by the tracking controller 204 may be configured to identify a particular aspect of the image data and provide different outputs for any physical object identified, such that the tracking controller 204 may aggregate the outputs of the neural network models together to identify physical objects as described herein. The tracking controller 204 may generate data objects for each physical object identified within the captured image data. The data objects may include identifiers that uniquely identify the physical objects such that the data stored within the data objects is tied to the physical objects. The tracking controller 204 can further store data in a database, such as database system 208 in
In some embodiments, the tracking controller 204 is configured to detect bank-change events, or in other words, events that occur in the gaming environment that would affect a change to the overall value of the bank of chips 131 within the chip tray 130, such as buy-ins, won bets, and pay-outs. For example, the tracking controller 204 can identify betting circles (e.g., main betting circles 105A, 106A, 107A, 108A, 109A, and 110A (“105A-110A”) and secondary betting circles 105B, 106B, 107B, 108B, 109B, and 110B (“105B-110B”)). The betting circles are positioned relative to player stations 105, 106, 107, 108, 109, and 110. The tracking controller 204 can also detect placement of gaming chips (e.g., as stacks) within the betting circles during betting on a wagering game conducted at the gaming table 101. The tracking controller 204 can further determine the values of chip stacks within the betting circles. The tracking controller 204 determines, based on the values of the chip stacks, amounts by which the bank is expected to change based on collection of losing bets and/or payouts required for winning bets. The tracking controller 204 can compare the expected amounts to actual changes to the chips 131 in the chip tray 130. Based on the comparison, the tracking controller 204, for instance, determines whether any chips of one denomination value were placed into one of the columns 132 designated for a different denomination value. The tracking controller 204 can further generate warnings (e.g., of chip-placement errors) and/or generate reports that track the accuracy of a dealer's handling of the chips into and out of the bank.
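As a nonlimiting illustration, the reconciliation described above might be sketched as follows; the data shapes are hypothetical, and the model ignores complications such as pushes and side bets:

    def expected_bank_change(resolved_bets):
        # resolved_bets: list of (amount, outcome) tuples, where outcome is
        # "lost" (house collects the bet) or a payout multiplier for a win.
        delta = 0
        for amount, outcome in resolved_bets:
            if outcome == "lost":
                delta += amount               # losing bet flows into the tray
            else:
                delta -= amount * outcome     # payout leaves the tray
        return delta

    def reconcile(tray_value_before, tray_value_after, resolved_bets):
        # Compare the observed tray change with the expected change and flag
        # a discrepancy (e.g., a chip racked in the wrong column).
        observed = tray_value_after - tray_value_before
        expected = expected_bank_change(resolved_bets)
        if observed != expected:
            return f"WARNING: tray changed by {observed}, expected {expected}"
        return "OK"

    print(reconcile(10000, 10080, [(100, "lost"), (10, 2)]))  # -> OK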
Some objects may be included at the gaming table 101, such as gaming tokens, cards, a card shoe, dice, etc., but are not shown in
The gaming area 201 is an environment in which one or more casino wagering games are provided. In the example embodiment, the gaming area 201 is a casino gaming table and the area surrounding the table 101 (e.g., as in
The game controller 202 is configured to facilitate, monitor, manage, and/or control gameplay of the one or more games at the gaming area 201. More specifically, the game controller 202 is communicatively coupled to at least one or more of the tracking controller 204, the imaging system 206, the tracking database system 208, a gaming device 210, an external interface 212, and/or a server system 214 to receive, generate, and transmit data relating to the games, the players, and/or the gaming area 201. The game controller 202 may include one or more processors, memory devices, and communication devices to perform the functionality described herein. More specifically, the memory devices store computer-readable instructions that, when executed by the processors, cause the game controller 202 to function as described herein, including communicating with the devices of the gaming system 200 via the communication device(s).
The game controller 202 may be physically located at the gaming area 201 as shown in
The gaming device 210 is configured to facilitate one or more aspects of a game. For example, for card-based games, the gaming device 210 may be a card shuffler, shoe, or other card-handling device. The external interface 212 is a device that presents information to a player, dealer, or other user and may accept user input to be provided to the game controller 202. In some embodiments, the external interface 212 may be a remote computing device in communication with the game controller 202, such as a player's mobile device. In other examples, the gaming device 210 and/or external interface 212 includes one or more projectors. The server system 214 is configured to provide one or more backend services and/or gameplay services to the game controller 202. For example, the server system 214 may include accounting services to monitor wagers, payouts, and jackpots for the gaming area 201. In another example, the server system 214 is configured to control gameplay by sending gameplay instructions or outcomes to the game controller 202. It is to be understood that the devices described above in communication with the game controller 202 are for exemplary purposes only, and that additional, fewer, or alternative devices may communicate with the game controller 202, including those described elsewhere herein.
In the example embodiment, the tracking controller 204 is in communication with the game controller 202. In other embodiments, the tracking controller 204 is integrated with the game controller 202 such that the game controller 202 provides the functionality of the tracking controller 204 as described herein. Like the game controller 202, the tracking controller 204 may be a single device or a distributed computing system. In one example, the tracking controller 204 may be at least partially located remotely from the gaming area 201. That is, the tracking controller 204 may receive data from one or more devices located at the gaming area 201 (e.g., the game controller 202 and/or the imaging system 206), analyze the received data, and/or transmit data back based on the analysis.
In the example embodiment, the tracking controller 204, similar to the example game controller 202, includes one or more processors, a memory device, and at least one communication device. The memory device is configured to store computer-executable instructions that, when executed by the processor(s), cause the tracking controller 204 to perform the functionality of the tracking controller 204 described herein. The communication device is configured to communicate with external devices and systems using any suitable communication protocols to enable the tracking controller 204 to interact with the external devices and to integrate the functionality of the tracking controller 204 with the functionality of the external devices. The tracking controller 204 may include several communication devices to facilitate communication with a variety of external devices using different communication protocols.
The tracking controller 204 is configured to monitor at least one or more aspects of the gaming area 201. In the example embodiment, the tracking controller 204 is configured to monitor physical objects within the area 201, and determine a relationship between one or more of the objects. Some objects may include gaming tokens. The tokens may be any physical object (or set of physical objects) used to place wagers. As used herein, the term “stack” refers to one or more gaming tokens physically grouped together. For circular tokens typically found in casino gaming environments (e.g., gaming chips 131), these may be grouped together into a vertical stack. In another example in which the tokens are monetary bills and coins, a group of bills and coins may be considered a “stack” based on the physical contact of the group with each other and other factors as described herein. Thus, while examples herein illustrate circular tokens, other embodiments can utilize different shapes, such as rectangular shaped columns of a chip tray in which rectangular-shaped tokens can be placed.
In the example embodiment, the tracking controller 204 is communicatively coupled to the imaging system 206 to monitor the gaming area 201. More specifically, the imaging system 206 includes one or more sensors configured to collect sensor data associated with the gaming area 201, and the tracking controller 204 receives and analyzes the collected sensor data to detect and monitor physical objects. The imaging system 206 may include any suitable number, type, and/or configuration of sensors to provide sensor data to the game controller 202, the tracking controller 204, and/or another device that may benefit from the sensor data.
In the example embodiment, the imaging system 206 includes at least one image sensor that is oriented to capture image data of physical objects in the gaming area 201. In one example, the imaging system 206 may include a single image sensor that monitors tokens in the chip tray 130 and/or additional objects in the gaming area 201. In another example, the imaging system 206 includes a plurality of image sensors that monitor subdivisions of the chip tray 130 and/or the gaming area 201. The image sensor may be part of a camera unit of the imaging system 206 or a three-dimensional (3D) camera unit in which the image sensor, in combination with other image sensors and/or other types of sensors, may collect depth data related to the image data, which may be used to distinguish between objects within the image data. The image data is transmitted to the tracking controller 204 for analysis as described herein. In some embodiments, the image sensor is configured to transmit the image data with limited image processing or analysis such that the tracking controller 204 and/or another device receiving the image data performs the image processing and analysis. In other embodiments, the image sensor may perform at least some preliminary image processing and/or analysis prior to transmitting the image data. In such embodiments, the image sensor may be considered an extension of the tracking controller 204, and as such, functionality described herein related to image processing and analysis that is performed by the tracking controller 204 may be performed by the image sensor (or a dedicated computing device of the image sensor). In certain embodiments, the imaging system 206 may include, in addition to or instead of the image sensor, one or more sensors configured to detect objects, such as time-of-flight sensors, radar sensors, LIDAR sensors, thermographic sensors, and the like.
The tracking controller 204 is configured to establish data structures relating to various physical objects detected in the image data from the image sensor. For example, the tracking controller 204 applies one or more image neural network models during image analysis that are trained to detect aspects of physical objects. Neural network models are analysis tools that classify “raw” or unclassified input data without requiring user input. That is, in the case of the raw image data captured by the image sensor, the neural network models may be used to translate patterns within the image data to data object representations of, for example, tokens, token edges, colors, patterns, faces, hands, etc., thereby facilitating data storage and analysis of objects detected in the image data as described herein.
At a simplified level, neural network models are a set of node functions that have a respective weight applied to each function. The node functions and the respective weights are configured to receive some form of raw input data (e.g., image data), establish patterns within the raw input data, and generate outputs based on the established patterns. The weights are applied to the node functions to facilitate refinement of the model to recognize certain patterns (i.e., increased weight is given to node functions resulting in correct outputs), and/or to adapt to new patterns. For example, a neural network model may be configured to receive input data, detect patterns in the image data representing gaming chip parts, human body parts, etc., perform image segmentation, and generate an output that classifies one or more portions of the image data as representative of segments of a chip, a player's body parts (e.g., a box having coordinates relative to the image data that encapsulates a face, an arm, a hand, etc. and classifies the encapsulated area as a “human,” “face,” “arm,” “hand,” etc.), and so forth.
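As a toy, nonlimiting illustration of weighted node functions (not the disclosed models themselves), a two-layer network in Python/NumPy might look like this; the layer sizes and the five chip-denomination classes are arbitrary assumptions:

    import numpy as np

    def relu(x):
        return np.maximum(0.0, x)

    def forward(pixels, W1, b1, W2, b2):
        # Each layer applies weighted node functions to its input and passes
        # the patterns it finds to the next layer.
        hidden = relu(pixels @ W1 + b1)   # detect low-level edge-like patterns
        scores = hidden @ W2 + b2         # score each chip-denomination class
        return int(np.argmax(scores))     # index of the predicted class

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(64, 16)), np.zeros(16)
    W2, b2 = rng.normal(size=(16, 5)), np.zeros(5)
    print(forward(rng.normal(size=64), W1, b1, W2, b2))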
For instance, to train a neural network to identify a chip part or a human body part, a predetermined dataset of raw image data including image data of chips and/or human body parts, and with known outputs, is provided to the neural network. As each node function is applied to the raw input of a known output, an error correction analysis is performed such that node functions that result in outputs near or matching the known output may be given an increased weight while node functions having a significant error may be given a decreased weight. In the example of identifying a gaming chip, node functions that consistently recognize image patterns of chip-edge features (e.g., colors, shapes, patterns, etc.) may be given additional weight. In the example of identifying a human face, node functions that consistently recognize image patterns of facial features (e.g., nose, eyes, mouth, etc.) may be given additional weight. Similarly, in the example of identifying a human hand, node functions that consistently recognize image patterns of hand features (e.g., wrist, fingers, palm, etc.) may be given additional weight. The outputs of the node functions (including the respective weights) are then evaluated in combination to provide an output such as a data structure representing a human face. Training may be repeated to further refine the pattern-recognition of the model, and the model may still be refined during deployment (i.e., on raw input without a known data output).
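As a nonlimiting illustration of that error-correction step, the sketch below applies gradient descent to a single linear layer; this is a stand-in for whatever training procedure a deployed model would use, and the data is a placeholder:

    import numpy as np

    def train_step(W, x, label, lr=0.1):
        # One supervised update: node functions whose outputs match the known
        # label are reinforced, and those that err are weakened (gradient
        # descent on a softmax cross-entropy loss).
        scores = x @ W
        exp = np.exp(scores - scores.max())
        probs = exp / exp.sum()
        grad = probs.copy()
        grad[label] -= 1.0                  # error relative to the known output
        W -= lr * np.outer(x, grad)         # reweight the node functions
        return W

    rng = np.random.default_rng(1)
    W = rng.normal(size=(64, 5))
    for _ in range(100):                    # repeated training refines the model
        x = rng.normal(size=64)             # placeholder input with a known
        W = train_step(W, x, label=2)       # output (class index 2)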
At least some of the neural network models applied by the tracking controller 204 may be deep neural network (DNN) models. DNN models include at least three layers of node functions linked together to break the complexity of image analysis into a series of steps of increasing abstraction from the original image data. For example, for a DNN model trained to detect human faces from an image, a first layer may be trained to identify groups of pixels that represent the boundary of facial features, a second layer may be trained to identify the facial features as a whole based on the identified boundaries, and a third layer may be trained to determine whether or not the identified facial features form a face and distinguish the face from other faces. The multi-layered nature of the DNN models may facilitate more targeted weights, a reduced number of node functions, and/or pipeline processing of the image data (e.g., for a three-layered DNN model, each stage of the model may process three frames of image data in parallel).
In at least some embodiments, each model applied by the tracking controller 204 may be configured to identify a particular aspect of the image data and provide different outputs such that the tracking controller 204 may aggregate the outputs of the neural network models together to identify physical objects as described herein. For example, one model may be trained to identify chip-edge features. Different models may be trained to identify chips from different viewing perspectives (e.g., from top or bottom viewing perspectives). Another model may be trained to identify writing on chips (e.g., written values that indicate a chip denomination). Furthermore, in some embodiments, one model may be trained to detect human faces while another model may be trained to identify the bodies of players. In such an example, the tracking controller 204 may link together the different outcomes of the models (e.g., the tracking controller 204 links the detection of a player's face to the detection of the player's body by analyzing the outputs of the two models). In other embodiments, a single DNN model may be applied to perform the functionality of several models.
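As a nonlimiting illustration of linking outputs of separate models, the following sketch pairs face boxes with body boxes by containment; the box format and the 0.8 threshold are illustrative assumptions:

    def overlap_ratio(inner, outer):
        # Fraction of the inner box's area lying inside the outer box; boxes
        # are (x1, y1, x2, y2) in image coordinates.
        x1, y1 = max(inner[0], outer[0]), max(inner[1], outer[1])
        x2, y2 = min(inner[2], outer[2]), min(inner[3], outer[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        area = max((inner[2] - inner[0]) * (inner[3] - inner[1]), 1)
        return inter / area

    def link_faces_to_bodies(faces, bodies, threshold=0.8):
        # Aggregate two models' outputs: tie each detected face to the body
        # box that contains most of it.
        links = []
        for face in faces:
            best = max(bodies, key=lambda b: overlap_ratio(face, b), default=None)
            if best is not None and overlap_ratio(face, best) >= threshold:
                links.append({"face": face, "body": best})
        return links

    print(link_faces_to_bodies([(40, 10, 60, 30)], [(20, 0, 90, 200)]))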
As described in further detail below, the tracking controller 204 may generate data objects for each physical object identified within the captured image data by the DNN models. The data objects are data structures that are generated to link together data associated with corresponding physical objects. For example, the outputs of several DNN models associated with a player may be linked together as part of a player data object.
It is to be understood that the underlying data storage of the data objects may vary in accordance with the computing environment of the memory device or devices that store the data object. That is, factors such as programming language and file system may affect where and/or how the data object is stored (e.g., via a single block allocation of data storage, via distributed storage with pointers linking the data together, etc.). In addition, some data objects may be stored across several different memory devices or databases.
In some embodiments, the player data objects include a player identifier, and data objects of other physical objects include other identifiers. The identifiers uniquely identify the physical objects such that the data stored within the data objects is tied to the physical objects. In some embodiments, the identifiers may be incorporated into other systems or subsystems. For example, a player account system may store player identifiers as part of player accounts, which may be used to provide benefits, rewards, and the like to players. In certain embodiments, the identifiers may be provided to the tracking controller 204 by other systems that may have already generated the identifiers.
In at least some embodiments, the data objects and identifiers may be stored by the tracking database system 208. The tracking database system 208 includes one or more data storage devices (e.g., one or more databases) that store data from at least the tracking controller 204 in a structured, addressable manner. That is, the tracking database system 208 stores data according to one or more linked metadata fields that identify the type of data stored and can be used to group stored data together across several metadata fields. The stored data is addressable such that stored data within the tracking database system 208 may be tracked after initial storage for retrieval, deletion, and/or subsequent data manipulation (e.g., editing or moving the data). The tracking database system 208 may be formatted according to one or more suitable file system structures (e.g., FAT, exFAT, ext4, NTFS, etc.).
The tracking database system 208 may be a distributed system (i.e., the data storage devices are distributed to a plurality of computing devices) or a single device system. In certain embodiments, the tracking database system 208 may be integrated with one or more computing devices configured to provide other functionality to the gaming system 200 and/or other gaming systems. For example, the tracking database system 208 may be integrated with the tracking controller 204 or the server system 214.
In the example embodiment, the tracking database system 208 is configured to facilitate a lookup function on the stored data for the tracking controller 204. The lookup function compares input data provided by the tracking controller 204 to the data stored within the tracking database system 208 to identify any “matching” data. It is to be understood that “matching” within the context of the lookup function may refer to the input data being the same, substantially similar, or linked to stored data in the tracking database system 208. For example, if the input data is an image of a player's face, the lookup function may be performed to compare the input data to a set of stored images of historical players to determine whether or not the player captured in the input data is a returning player. In this example, one or more image comparison techniques may be used to identify any “matching” image stored by the tracking database system 208. For example, key visual markers for distinguishing the player may be extracted from the input data and compared to similar key visual markers of the stored data. If the same or substantially similar visual markers are found within the tracking database system 208, the matching stored image may be retrieved. In addition to or instead of the matching image, other data linked to the matching stored image may be retrieved during the lookup function, such as a player account number, the player's name, etc. In at least some embodiments, the tracking database system 208 includes at least one computing device that is configured to perform the lookup function. In other embodiments, the lookup function is performed by a device in communication with the tracking database system 208 (e.g., the tracking controller 204) or a device within which the tracking database system 208 is integrated.
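As a nonlimiting illustration, the lookup might be sketched as a nearest-match search over stored feature vectors; the cosine-similarity measure, the 0.90 threshold, and the feature extractor that would produce the vectors are all assumptions:

    import numpy as np

    def lookup(query_vec, stored, threshold=0.90):
        # stored: list of (record, feature_vector) pairs previously written
        # to the tracking database; returns the best "matching" record, or
        # None when no stored entry clears the threshold.
        q = query_vec / np.linalg.norm(query_vec)
        best_record, best_score = None, threshold
        for record, vec in stored:
            score = float(q @ (vec / np.linalg.norm(vec)))
            if score > best_score:
                best_record, best_score = record, score
        return best_record

    stored = [("player-123", np.array([1.0, 0.0])),
              ("player-456", np.array([0.0, 1.0]))]
    print(lookup(np.array([0.97, 0.05]), stored))   # -> player-123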
In some embodiments, an image sensor is affixed within the material of the chip tray (e.g., at least a portion of the image sensor is embedded into the structure of the chip tray) yet still retains some degree of movement. For example, in some embodiments, an image sensor may have a range of automated movement, such as to shift laterally, rotate, etc., between the positions shown in
Furthermore, in some embodiments, the tracking controller 204 can transmit the information to other devices connected to a network (e.g., gaming table network, gaming machine network, computer network, communication network, etc.) for presentation via other devices, displays, etc. In some instances, the tracking controller 204 can communicate the image data, as well as the determined information, wirelessly to a mobile device (e.g., to present via a headset, a mobile device, a tablet computer, etc. that is worn or carried by a dealer, a pit boss, etc.).
In some embodiments, the gaming table 1200 may include a display 1210 separate from the gaming surface 1202. The display 1210 may be configured to face players, prospective players, and spectators and may display, for example, information randomly selected by a shuffler device and also displayed on a display of the shuffler device; rules; pay tables; real-time game status, such as wagers accepted and cards dealt; historical game information, such as amounts won, amounts wagered, percentage of hands won, and notable hands achieved; the commercial game name, the casino name, advertising and other instructions and information related to the wagering game. The display 1210 may be a physically fixed display, such as an edge lit sign, in some embodiments. In other embodiments, the display 1210 may change automatically in response to a stimulus (e.g., may be an electronic video monitor).
The gaming table 1200 may include particular machines and apparatuses configured to facilitate the administration of the wagering game. For example, the gaming table 1200 may include one or more card-handling devices 1204A, 1204B. The card-handling device 1204A may be, for example, a shoe from which physical cards 1206 from one or more decks of intermixed playing cards may be withdrawn, one at a time. Such a card-handling device 1204A may include, for example, a housing in which cards 1206 are located, an opening from which cards 1206 are removed, and a card-presenting mechanism (e.g., a moving weight on a ramp configured to push a stack of cards down the ramp) configured to continually present new cards 1206 for withdrawal from the shoe.
In some embodiments in which the card-handling device 1204A is used, the card-handling device 1204A may include a random number generator and a display, in addition to or rather than such features being included in a shuffler device. In addition to the card-handling device 1204A, the card-handling device 1204B may be included. The card-handling device 1204B may be, for example, a shuffler configured to select information (using a random number generator), to display the selected information on a display of the shuffler, to reorder (either randomly or pseudo-randomly) physical playing cards 1206 from one or more decks of playing cards, and to present randomized cards 1206 for use in the wagering game. Such a card-handling device 1204B may include, for example, a housing, a shuffling mechanism configured to shuffle cards, and card inputs and outputs (e.g., trays). Shufflers may include card recognition capability that can form a randomly ordered set of cards within the shuffler. The card-handling device 1204 may also be, for example, a combination shuffler and shoe in which the output for the shuffler is a shoe.
In some embodiments, the card-handling device 1204 may be configured and programmed to administer at least a portion of a wagering game being played utilizing the card-handling device 1204. For example, the card-handling device 1204 may be programmed and configured to randomize a set of cards and deliver cards individually for use according to game rules and player and/or dealer game play elections. More specifically, the card-handling device 1204 may be programmed and configured to, for example, randomize a set of six complete decks of cards including one or more standard 52-card decks of playing cards and, optionally, any specialty cards (e.g., a cut card, bonus cards, wild cards, or other specialty cards). In some embodiments, the card-handling device 1204 may present individual cards, one at a time, for withdrawal from the card-handling device 1204. In other embodiments, the card-handling device 1204 may present an entire shuffled block of cards that are transferred manually or automatically into a card-dispensing shoe 1204. In some such embodiments, the card-handling device 1204 may accept dealer input, such as, for example, a number of replacement cards for discarded cards, a number of hit cards to add, or a number of partial hands to be completed. In other embodiments, the device may accept a dealer input from a menu of game options indicating a game selection, which will select programming to cause the card-handling device 1204 to deliver the requisite number of cards to the game according to game rules, player decisions, and dealer decisions. In still other embodiments, the card-handling device 1204 may present the complete set of randomized cards for manual or automatic withdrawal from a shuffler and then insertion into a shoe. As specific, nonlimiting examples, the card-handling device 1204 may present a complete set of cards to be manually or automatically transferred into a card-dispensing shoe, or may provide a continuous supply of individual cards.
In another embodiment, the card-handling device may be a batch shuffler that randomizes a set of cards using, for example, a gripping, lifting, and insertion sequence.
In some embodiments, the card-handling device 1204 may employ a random number generator device to determine card order, such as, for example, a final card order or an order of insertion of cards into a compartment configured to form a packet of cards. The compartments may be sequentially numbered, and a random number assigned to each compartment number prior to delivery of the first card. In other embodiments, the random number generator may select a location in the stack of cards to separate the stack into two sub-stacks, creating an insertion point within the stack at a random location. The next card may be inserted into the insertion point. In yet other embodiments, the random number generator may randomly select a location in a stack to randomly remove cards by activating an ejector.
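As a nonlimiting illustration of the insertion-point method, the following Python sketch builds a randomized stack one card at a time; Python's system RNG stands in for whatever random number generator device the shuffler employs:

    import random

    def insertion_shuffle(deck, rng=None):
        rng = rng or random.SystemRandom()     # stand-in hardware-backed RNG
        stack = []
        for card in deck:
            point = rng.randint(0, len(stack)) # random split of the stack
            stack.insert(point, card)          # insert the card at that point
        return stack

    print(insertion_shuffle(list(range(10))))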
Regardless of whether the random number generator (or generators) is hardware or software, it may be used to implement specific game administration methods of the present disclosure.
The card-handling device 1204 may simply be supported on the gaming surface 1202 in some embodiments. In other embodiments, the card-handling device 1204 may be mounted into the gaming table 1200 such that the card-handling device 1204 is not manually removable from the gaming table 1200 without the use of tools. In some embodiments, the deck or decks of playing cards used may be standard, 52-card decks. In other embodiments, the deck or decks used may include additional cards, such as, for example, jokers, wild cards, bonus cards, etc. The shuffler may also be configured to handle and dispense security cards, such as cut cards.
In some embodiments, the card-handling device 1204 may include an electronic display 1207 for displaying information related to the wagering game being administered. The electronic display 1207 may display a menu of game options, the name of the game selected, the number of cards per hand to be dispensed, acceptable amounts for other wagers (e.g., maximums and minimums), numbers of cards to be dealt to recipients, locations of particular recipients for particular cards, winning and losing wagers, pay tables, winning hands, losing hands, and payout amounts. In other embodiments, information related to the wagering game may be displayed on another electronic display, such as, for example, the display 1210 described previously.
The type of card-handling device 1204 employed to administer embodiments of the disclosed wagering game, as well as the type of card deck employed and the number of decks, may be specific to the game to be implemented. Cards used in games of this disclosure may be, for example, standard playing cards from one or more decks, each deck having cards of four suits (clubs, hearts, diamonds, and spades) and of rankings ace, king, queen, jack, and ten through two in descending order. As a more specific example, six, seven, or eight standard decks of such cards may be intermixed. Typically, six or eight decks of 52 standard playing cards each may be intermixed and formed into a set to administer a blackjack or blackjack variant game. After shuffling, the randomized set may be transferred into another portion of the card-handling device 1204B or another card-handling device 1204A altogether, such as a mechanized shoe capable of reading card rank and suit.
The gaming table 1200 may include one or more chip racks 1208 configured to facilitate accepting wagers, transferring lost wagers to the house, and exchanging monetary value for wagering elements 1212 (e.g., chips). For example, the chip rack 1208 (also referred to as a chip tray herein, e.g., chip tray 130) may include a series of token support columns, each of which may support tokens of a different type (e.g., color and denomination). In some embodiments, the chip rack 1208 may be configured to automatically present a selected number of chips using a chip-cutting-and-delivery mechanism. In the example shown in
When administering a wagering game in accordance with embodiments of this disclosure, a dealer 1216 may receive money (e.g., cash) from a player in exchange for wagering elements 1212. The dealer 1216 may deposit the money in the drop box 1214 and transfer physical wagering elements 1212 to the player. As part of the method of administering the game, the dealer 1216 may accept one or more initial wagers from the player, which may be reflected by the dealer 1216 permitting the player to place one or more wagering elements 1212 or other wagering tokens (e.g., cash) within designated areas on the gaming surface 1202 associated with the various wagers of the wagering game. Once initial wagers have been accepted, the dealer 1216 may remove physical cards 1206 from the card-handling device 1204 (e.g., individual cards, packets of cards, or the complete set of cards) in some embodiments. In other embodiments, the physical cards 1206 may be hand-pitched (i.e., the dealer 1216 may optionally shuffle the cards 1206 to randomize the set and may hand-deal cards 1206 from the randomized set of cards). The dealer 1216 may position cards 1206 within designated areas on the gaming surface 1202, which may designate the cards 1206 for use as individual player cards, community cards, or dealer cards in accordance with game rules. House rules may require the dealer to accept both main and secondary wagers before card distribution. House rules may alternatively allow the player to place only one wager (i.e., the second wager) during card distribution and after the initial wagers have been placed, or after card distribution but before all cards available for play are revealed.
In some embodiments, after dealing the cards 1206, and during play, according to the game rules, any additional wagers (e.g., the play wager) may be accepted, which may be reflected by the dealer 1216 permitting the player to place one or more wagering elements 1212 within the designated area (i.e., area 124) on the gaming surface 1202 associated with the play wager of the wagering game. The dealer 1216 may perform any additional card dealing according to the game rules. Finally, the dealer 1216 may resolve the wagers, award winning wagers to the players, which may be accomplished by giving wagering elements 1212 from the chip rack 1208 to the players, and transferring losing wagers to the house, which may be accomplished by moving wagering elements 1212 from the player designated wagering areas to the chip rack 1208.
The processors 1642 may be configured to execute a wide variety of operating systems and applications including the computing instructions for administering wagering games of the present disclosure.
The processors 1642 may be configured as general-purpose processors such as microprocessors, but in the alternative, a general-purpose processor may be any processor, controller, microcontroller, or state machine suitable for carrying out processes of the present disclosure. A processor 1642 may also be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
A general-purpose processor may be part of a general-purpose computer. However, when configured to execute instructions (e.g., software code) for carrying out embodiments of the present disclosure the general-purpose computer should be considered a special-purpose computer. Moreover, when configured according to embodiments of the present disclosure, such a special-purpose computer improves the function of a general-purpose computer because, absent the present disclosure, the general-purpose computer would not be able to carry out the processes of the present disclosure. The processes of the present disclosure, when carried out by the special-purpose computer, are processes that a human would not be able to perform in a reasonable amount of time due to the complexities of the data processing, decision making, communication, interactive nature, or combinations thereof for the present disclosure. The present disclosure also provides meaningful limitations in one or more particular technical environments that go beyond an abstract idea. For example, embodiments of the present disclosure provide improvements in the technical field related to the present disclosure.
The memory 1646 may be used to hold computing instructions, data, and other information for performing a wide variety of tasks including administering wagering games of the present disclosure. By way of example, and not limitation, the memory 1646 may include Static Random Access Memory (SRAM), Dynamic RAM (DRAM), Read-Only Memory (ROM), Flash memory, and the like.
The display 1658 may be a wide variety of displays such as, for example, light-emitting diode displays, liquid crystal displays, cathode ray tubes, and the like. In addition, the display 1658 may be configured with a touch-screen feature for accepting user input as a user interface element 1644.
As nonlimiting examples, the user interface elements 1644 may include elements such as displays, keyboards, push-buttons, mice, joysticks, haptic devices, microphones, speakers, cameras, and touchscreens.
As nonlimiting examples, the communication elements 1656 may be configured for communicating with other devices or communication networks. As nonlimiting examples, the communication elements 1656 may include elements for communicating on wired and wireless communication media, such as for example, serial ports, parallel ports, Ethernet connections, universal serial bus (USB) connections, IEEE 1394 (“firewire”) connections, THUNDERBOLT™ connections, BLUETOOTH® wireless networks, ZigBee wireless networks, 802.11 type wireless networks, cellular telephone/data networks, fiber optic networks and other suitable communication interfaces and protocols.
The storage 1648 may be used for storing relatively large amounts of nonvolatile information for use in the computing system 1640 and may be configured as one or more storage devices. By way of example and not limitation, these storage devices may include computer-readable media (CRM). This CRM may include, but is not limited to, magnetic and optical storage devices such as disk drives, magnetic tape, CDs (compact discs), DVDs (digital versatile discs or digital video discs), and semiconductor devices such as RAM, DRAM, ROM, EPROM, Flash memory, and other equivalent storage devices.
A person of ordinary skill in the art will recognize that the computing system 1640 may be configured in many different ways with different types of interconnecting buses between the various elements. Moreover, the various elements may be subdivided physically, functionally, or a combination thereof. As one nonlimiting example, the memory 1646 may be divided into cache memory, graphics memory, and main memory. Each of these memories may communicate directly or indirectly with the one or more processors 1642 on separate buses, partially combined buses, or a common bus.
As a specific, nonlimiting example, various methods and features of the present disclosure may be implemented in a mobile, remote, or mobile and remote environment over one or more of the Internet, cellular communication (e.g., Broadband), near field communication networks, and other communication networks, referred to collectively herein as an iGaming environment. The iGaming environment may be accessed through social media environments such as FACEBOOK® and the like. DragonPlay Ltd., acquired by Bally Technologies Inc., provides an example of a platform to provide games to user devices, such as cellular telephones and other devices utilizing ANDROID®, iPHONE®, and FACEBOOK® platforms. Where permitted by jurisdiction, the iGaming environment can include pay-to-play (P2P) gaming where a player, from their device, can make value-based wagers and receive value-based awards. Where P2P is not permitted, the features can be expressed as entertainment-only gaming where players wager virtual credits having no value or risk no wager whatsoever, such as playing a promotional game or feature.
In some embodiments, wagering games may be administered in an at least partially player-pooled format, with payouts on pooled wagers being paid from a pot to players and losses on wagers being collected into the pot and eventually distributed to one or more players. Such player-pooled embodiments may include a player-pooled progressive embodiment, in which a pot is eventually distributed when a predetermined progressive-winning hand combination or composition is dealt. Player-pooled embodiments may also include a dividend refund embodiment, in which at least a portion of the pot is eventually distributed in the form of a refund distributed, e.g., pro-rata, to the players who contributed to the pot.
In some player-pooled embodiments, the game administrator may not obtain profits from chance-based events occurring in the wagering games that result in lost wagers. Instead, lost wagers may be redistributed back to the players. To profit from the wagering game, the game administrator may retain a commission, such as, for example, a player entrance fee or a rake taken on wagers, such that the amount obtained by the game administrator in exchange for hosting the wagering game is limited to the commission and is not based on the chance events occurring in the wagering game itself. The game administrator may also charge a rent or flat fee to participate.
It is noted that the methods described herein can be played with any number of standard decks of 52 cards (e.g., 1 deck to 10 decks). A standard deck is a collection of cards comprising an ace, two, three, four, five, six, seven, eight, nine, ten, jack, queen, and king for each of four suits (spades, diamonds, clubs, and hearts), totaling 52 cards. Cards can be shuffled, or a continuous shuffling machine (CSM) can be used. A standard deck of 52 cards can be used, as can other kinds of decks, such as Spanish decks, decks with wild cards, etc. The operations described herein can be performed in any sensible order. Furthermore, numerous different variants of house rules can be applied.
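For example, a multi-deck shoe can be assembled from the standard 52-card composition described above. The following minimal sketch is illustrative only; the deck count and the one-time shuffle (as opposed to a CSM's continuous reshuffling) are assumptions:

```python
import itertools
import random

RANKS = ["ace", "2", "3", "4", "5", "6", "7", "8", "9", "10",
         "jack", "queen", "king"]
SUITS = ["spades", "diamonds", "clubs", "hearts"]

def build_shoe(num_decks: int = 6) -> list[tuple[str, str]]:
    """Assemble `num_decks` standard 52-card decks into one shuffled shoe."""
    one_deck = list(itertools.product(RANKS, SUITS))  # 13 ranks x 4 suits = 52
    shoe = one_deck * num_decks
    random.shuffle(shoe)  # a CSM would instead reshuffle continuously
    return shoe

shoe = build_shoe(6)
assert len(shoe) == 312  # 6 decks x 52 cards
```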
Note that in the embodiments played using computers (a processor/processing unit), "virtual deck(s)" of cards are used instead of physical decks. A virtual deck is an electronic data structure that represents a physical deck of cards, using an electronic representation for each respective card in the deck. In some embodiments, a virtual card is presented (e.g., displayed on an electronic output device using computer graphics, projected onto a surface of a physical table using a video projector, etc.) so as to mimic a real-life image of that card.
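One possible, purely illustrative realization of such a virtual card and its electronic representation is sketched below; the class name and the display-label mapping are hypothetical, and a real renderer would draw full card artwork rather than a text label:

```python
from dataclasses import dataclass

SUIT_SYMBOLS = {"spades": "♠", "hearts": "♥", "diamonds": "♦", "clubs": "♣"}

@dataclass(frozen=True)
class VirtualCard:
    """Electronic representation of one physical card in a virtual deck."""
    rank: str   # e.g., "ace", "7", "queen"
    suit: str   # e.g., "hearts"

    def display_label(self) -> str:
        """Label a renderer could draw to mimic the real card's face."""
        return f"{self.rank.title()} {SUIT_SYMBOLS[self.suit]}"

card = VirtualCard("queen", "hearts")
print(card.display_label())  # "Queen ♥"
```

A virtual deck is then simply an ordered collection of such card objects, which the game logic can shuffle, deal, and present.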
Methods described herein can also be played on a physical table using physical cards and physical chips used to place wagers. Such physical chips can be directly redeemable for cash. When a player wins (dealer loses) the player's wager, the dealer will pay that player a respective payout amount. When a player loses (dealer wins) the player's wager, the dealer will take (collect) that wager from the player and typically place those chips in the dealer's chip rack. All rules, embodiments, features, etc. of a game being played can be communicated to the player (e.g., verbally or on a written rule card) before the game begins.
Initial cash deposits can be made into the electronic gaming machine, which converts cash into electronic credits. Wagers can be placed in the form of electronic credits, which can be cashed out for real coins or for a ticket (e.g., ticket-in-ticket-out) that can be redeemed at a casino cashier or kiosk for real cash and/or coins.
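A minimal sketch of this credit-accounting flow follows, assuming a simple credit meter and a generated ticket identifier standing in for a printed barcode; every name and the one-credit-per-dollar denomination are hypothetical:

```python
import uuid

class CreditMeter:
    """Tracks electronic credits converted from cash deposits (illustrative)."""

    def __init__(self, denomination_cents: int = 100):
        self.denomination_cents = denomination_cents  # 1 credit == $1 here
        self.credits = 0

    def deposit_cash(self, cents: int) -> None:
        """Convert a cash deposit into electronic credits."""
        self.credits += cents // self.denomination_cents

    def cash_out_ticket(self) -> tuple[str, int]:
        """Issue a ticket-in-ticket-out voucher for the remaining balance."""
        ticket_id = uuid.uuid4().hex  # stand-in for a printed barcode
        value_cents = self.credits * self.denomination_cents
        self.credits = 0
        return ticket_id, value_cents

meter = CreditMeter()
meter.deposit_cash(2000)                 # $20 -> 20 credits
ticket, value = meter.cash_out_ticket()  # voucher redeemable at cashier/kiosk
```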
Any component of any embodiment described herein may include hardware, software, or any combination thereof.
Further, the operations described herein can be performed in any sensible order. Any operations not required for proper operation can be optional. Further, all methods described herein can also be stored as instructions on a computer-readable storage medium, which instructions are executable by a computer processor. All variations and features described herein can be combined with any other features described herein without limitation. All features in all documents incorporated by reference herein can be combined with any feature(s) described herein, and also with all other features in all other documents incorporated by reference, without limitation.
Features of various embodiments of the inventive subject matter described herein, however essential to the example embodiments in which they are incorporated, do not limit the inventive subject matter as a whole, and any reference to the invention, its elements, operation, and application is not limiting as a whole but serves only to define these example embodiments. This detailed description does not, therefore, limit embodiments, which are defined only by the appended claims. Further, since numerous modifications and changes may readily occur to those skilled in the art, it is not desired to limit the inventive subject matter to the exact construction and operation illustrated and described, and accordingly all suitable modifications and equivalents may be resorted to as falling within the scope of the inventive subject matter.
This application claims the priority benefit of U.S. Provisional Patent Application No. 63/240,171, filed Sep. 2, 2021, which application is incorporated by reference herein in its entirety.
Publication Number | Date | Country
---|---|---
20230075651 A1 | Mar 2023 | US
Priority Application Number | Date | Country
---|---|---
63/240,171 | Sep 2021 | US