1. Field of the Invention
The present description generally relates to monitoring various aspects of casinos and gaming, and more specifically relates to automated game and wager tracking and analysis.
2. Description of the Related Art
Casinos and other forms of gaming are a multi-billion dollar, world-wide industry. Typically, a customer exchanges currency or some form of credit for a casino's chips. The customer places the chips as wagers at various games, such as blackjack, craps, roulette, and baccarat. A game operator, such as a dealer, pays out winning wagers with additional chips based on the set of odds for the particular game. The dealer collects the customer's chips for losing wagers. The odds of each game slightly favor the casino, so on average the casino wins and is profitable.
Like many businesses, casinos wish to understand the habits of their customers. Some casinos have employees visually observe customers' game play, manually tracking the gaming and wagering habits of the particular customers. For example, “pit managers” often visually monitor and record the live play of a game at the gaming table. Based on this visual monitoring, the pit managers estimate what each customer is betting, and based on such betting the casino provides rewards to the customer in the form of complimentary benefits, or “comps.”
The inventors have empirically determined that having human pit managers manually monitor and estimate customers' wagering habits is very inaccurate. For instance, in one recent study the inventors found the accuracy of human pit managers to vary widely, from 30% to 90%. In addition, the current method of using human pit managers to monitor customers' gaming activities is extremely labor-intensive for the casinos.
Like many businesses, casinos wish to prevent their customers from cheating. The fast pace and large sums of money make casinos likely targets for cheating and stealing. In one commonly known method, players count cards in games of blackjack (which the casinos view as cheating) and increase their wagers in lockstep with the increasing probability of a winning hand indicated by the count.
Casinos employ a variety of security measures to discourage such cheating. One measure is to track both the hands played and the wagers of a blackjack player to determine whether the pattern of wagers plus hands played gives rise to an inference that the player is counting cards. For example, surveillance cameras covering a gaming area or a particular gaming table provide a live or taped video signal that security personnel closely examine. However, as with the pit managers, the accuracy of such counter-cheating measures suffers due to the inability to track the often rapidly changing wagers made during a game.
It is therefore apparent that a need exists in the art for a method and system that can accurately track wagers during gaming.
Disclosed herein is a method for determining wagers by applying a chip denomination representation, having at least one angle associated with at least one color transition, against a working chip template. In one illustrated embodiment, a method includes: acquiring an image of a gaming table having a bet circle; selecting an area of the image proximate to the bet circle; detecting color transitions at least partially in the area; conforming the color transitions to the area to create area-conformed color transitions; constructing a working chip template from the area-conformed color transitions; recalling a first chip denomination representation from a chip denomination representation library, the first chip denomination representation having at least one angle associated with at least one color transition; applying the first chip denomination representation against the working chip template; and calculating a first chip score responsive to said applying the first chip denomination representation against the working chip template. Other method embodiments are disclosed herein.
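For purposes of illustration only, a minimal sketch of the summarized method is set out below; every function and parameter name in the sketch (e.g., acquire_image, select_bet_circle_area, chip_library) is hypothetical and is not drawn from the embodiments described herein.

def determine_wager(acquire_image, select_bet_circle_area, detect_color_transitions,
                    conform_to_area, build_working_template, score_representation,
                    chip_library):
    # The helpers are supplied by the caller; the body mirrors the summarized steps.
    image = acquire_image()                        # acquire an image of the gaming table
    area = select_bet_circle_area(image)           # select an area proximate to the bet circle
    transitions = detect_color_transitions(area)   # detect color transitions in that area
    conformed = conform_to_area(transitions, area) # conform the transitions to the area
    template = build_working_template(conformed)   # construct the working chip template
    # apply each chip denomination representation (angles associated with color
    # transitions) against the working chip template and score the result
    scores = {denom: score_representation(rep, template)
              for denom, rep in chip_library.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]                      # highest-scoring candidate denomination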
Various illustrated system embodiments include, but are not limited to, circuitry and/or programming for effecting the foregoing-referenced method; the circuitry and/or programming can be virtually any combination of hardware, software, and/or firmware configured to effect the foregoing-referenced method embodiments depending upon the design choices of the system designer.
The foregoing is a summary and thus contains, by necessity, simplifications, generalizations and omissions of detail; consequently, those skilled in the art will appreciate that the summary is illustrative only and is not intended to be in any way limiting. Other aspects, inventive features, and advantages of the devices and/or processes described herein, as defined solely by the claims, will become apparent in the non-limiting detailed description set forth herein.
The use of the same symbols in different drawings typically indicates similar or identical items.
In the following description, certain specific details are set forth in order to provide a thorough understanding of various embodiments of the invention. However, one skilled in the art will understand that the invention may be practiced without these details. In other instances, well-known structures associated with computers, computer networks, readers and machine-vision have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments of the invention.
The headings provided herein are for convenience only and do not interpret the scope or meaning of the claimed invention.
This description initially presents a general explanation of gaming and gaming table monitoring components in the environment of a blackjack table. A more specific description of each of the individual hardware components and the interaction of the hardware components follows. A description of the overall operation of the system follows the hardware discussion. A more specific discussion of the operation of the system follows, presented in terms of discrete software modules. The presentation concludes with a discussion of a network of gaming tables.
Blackjack Gaming
Blackjack Gaming Environment
During a game, each player 14, 16 places her respective wager by selecting a number of chips from her respective chip reserve 28, 29 and subsequently placing the selected number of chips in her respective bet circle 22, 23. The number of chips in each bet circle 22, 23 constitute the respective bet stacks 24, 25 of each player 14, 16.
The chips typically come in a variety of denominations. Players 14, 16 are issued chips in exchange for currency or credit by the casino's tellers. Casinos typically require the use of chips for wagering, rather than actual currency.
After players 14, 16 have placed an initial wager of chips in their respective bet circles 22, 23, dealer 12 deals each player 14, 16 two cards 30 face down, and deals herself one card 32 face down (“hole card”) and one card 34 face up (“show card”) from deck 18. Players 14, 16 can accept additional cards (“hits”) from deck 18 as each player 14, 16 attempts to reach a total card value of “21” without going over, where face cards count as ten points, and Aces can count as either one or eleven points, at the cardholder's option. Dealer 12 also attempts to reach “21” without going over, although the rules typically require dealer 12 to take a hit when holding a “soft 17.” Players 14, 16 can vary their respective wagers (e.g., the number and/or denomination of chips in bet stacks 24, 25) after the initial cards 30-34 are dealt based on their knowledge of their own hand and the dealer's face up card 34. For example, each player 14, 16 can “hit” or “stand” and may “double down” or “buy insurance.”
At the end of a “hand” or game, dealer 12 collects the wager chips from any losing players and pays out winnings in chips to any winning players. The winnings are calculated as a multiple of a set of odds for the game and the amount of wager chips in the respective bet stacks 24, 25. The losses are typically the amount of wager chips in the respective bet stacks 24, 25. Dealer 12 places the collected wager chips or “take” from any losing players into a gaming table bank that takes the form of chip tray 36. Dealer 12 pays out the winnings using the required number of chips from chip tray 36. Changes to the contents of bet stacks 24, 25 occur quickly throughout the game and can affect the winnings and losses of the casino (“house”) at gaming table 10. Thus, maintaining an accurate count of the number and value of the chips in bet stacks 24, 25 can assist the casino in managing its operations.
Chips
Gaming Recognition
Pixilated image 112 has colors shown by the legend of
In step 504, the recognition unit 304 superimposes rectangular box 118 over a portion of the pixilated color image 112. The position of rectangular box 118 defines, or is coordinated with, the location of the bet circle 22 in the image. The position of rectangular box 118 relative to pixilated color image 112 is based on pre-existing knowledge of images of gaming table 10 captured by image capture device 302. The rectangular box 118 is of height and width such that the process can determine from the position of the bottom of a chip's vertical lines of color transition whether or not the chip is substantially within bet circle 22. For example, an image of a chip spaced farther away from image capture device 302 would not have the bottom of its vertical color transitions in rectangular box 118. Likewise, an image of a chip spaced closer to image capture device 302 would not have the bottom of its vertical color transitions in the rectangular box 118.
In step 506, the recognition unit 304 strobes the portion of pixilated color image 112 interior to rectangular box 118, and logs all “vertical lines of color transition” whose bottoms are internal to the rectangular box 118. A “vertical line of color transition” is defined to be a vertical pixel boundary where to the left of the vertical boundary a number (e.g., 8) of substantially identical first-colored pixels in vertical alignment exist and where to the right of the vertical boundary a number (e.g., 8) of substantially identical second-colored (where the second color is different from the first color) pixels in vertical alignment exist. For example, 8 blue pixels in vertical alignment to the left of the vertical boundary, and 8 white pixels in vertical alignment to the right of the vertical boundary would constitute one vertical line of color transition, such as might be encompassed by the color bands of the top two chips 114, 116 in bet stack 24.
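By way of illustration only, a simplified sketch of this detection follows; it assumes the image region is available as a grid of (R, G, B) tuples, uses an 8-pixel run on each side of the boundary as in the example above, and the color-equality tolerance is an assumption of the sketch rather than a value recited herein.

def colors_match(a, b, tol=30):
    # treat two pixels as "substantially identical" if every channel differs by no more than tol
    return all(abs(x - y) <= tol for x, y in zip(a, b))

def vertical_color_transitions(pixels, run=8):
    # pixels[row][col] is an (R, G, B) tuple; a boundary lies between col and col + 1
    lines = []
    rows, cols = len(pixels), len(pixels[0])
    for col in range(cols - 1):
        row = 0
        while row + run <= rows:
            left = [pixels[r][col] for r in range(row, row + run)]
            right = [pixels[r][col + 1] for r in range(row, row + run)]
            left_uniform = all(colors_match(p, left[0]) for p in left)
            right_uniform = all(colors_match(p, right[0]) for p in right)
            if left_uniform and right_uniform and not colors_match(left[0], right[0]):
                bottom = row + run - 1
                # extend the line downward while the two columns keep differing
                while bottom + 1 < rows and not colors_match(pixels[bottom + 1][col],
                                                             pixels[bottom + 1][col + 1]):
                    bottom += 1
                lines.append({"col": col, "top": row, "bottom": bottom})
                row = bottom + 1
            else:
                row += 1
    return lines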
In step 508, the recognition unit 304 determines a median bottom-of-lines value, based on the bottoms of the detected vertical lines of color transition whose bottoms are interior to the rectangular box 118, for example by calculation. In step 510, the recognition unit 304 disregards the detected vertical lines of color transitions whose bottoms are not within a defined threshold of the calculated median bottom-of-lines value.
In step 512, the recognition unit 304 determines a median height-of-lines value based on the heights of each non-disregarded detected vertical line of color transition. The height of each vertical line of color transition may be determined by calculation based on the actual top and actual bottom values of the individual lines of vertical color transition.
In step 514, the recognition unit 304 compares the tops of the remaining (i.e., non-disregarded) detected lines of vertical color transition—whose bottoms are internal to rectangular box 118—relative to the computed median height-of-lines value. If the tops of such lines are further from the determined median height-of-lines value than some pre-defined threshold, the process truncates such lines and saves the uppermost portions beyond the threshold value for subsequent processing.
In step 516, for the remaining (i.e., non-disregarded) lines of vertical color transition, the recognition unit 304 adjusts the top and the bottom of such lines so as to all be within some pre-defined distance from the determined median height-of-lines (step 512) and median bottom-of-lines (step 508) values.
In step 518, the recognition unit 304 determines a mean horizontal position for the remaining (i.e., non-disregarded) lines of color transition, for example, by calculation. The process disregards those remaining lines of vertical color transition that are internal to the rectangular box 118, but which fall outside a pre-defined threshold distance from the determined mean horizontal position. In one implementation, the predefined threshold distance is the expected width of 1 chip.
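A condensed sketch of the filtering in steps 508 through 518 follows, operating on line records like those produced in the sketch above; the threshold values and field names are illustrative assumptions.

import statistics

def filter_lines(lines, bottom_tol=4, height_tol=6, width_tol=20):
    # steps 508-510: keep lines whose bottoms sit near the median bottom-of-lines value
    median_bottom = statistics.median(l["bottom"] for l in lines)
    lines = [l for l in lines if abs(l["bottom"] - median_bottom) <= bottom_tol]
    # step 512: median height of the surviving lines
    median_height = statistics.median(l["bottom"] - l["top"] + 1 for l in lines)
    # steps 514-516: truncate tops that extend too far beyond the median height,
    # saving the truncated upper portions for subsequent processing, then clamp
    saved_tops = []
    for l in lines:
        if (l["bottom"] - l["top"] + 1) - median_height > height_tol:
            cut = l["bottom"] - int(median_height) + 1
            saved_tops.append({"col": l["col"], "top": l["top"], "bottom": cut - 1})
            l["top"] = cut
    # step 518: keep lines near the mean horizontal position (e.g., within one chip width)
    mean_col = statistics.mean(l["col"] for l in lines)
    lines = [l for l in lines if abs(l["col"] - mean_col) <= width_tol]
    return lines, saved_tops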
In step 520, the recognition unit 304 recalls a previously unrecalled chip denomination representation consisting of lines of vertical color transition and associated angles (see, e.g.,
In step 522, the recognition unit 304 creates a “working chip” template. The recognition unit 304 initially populates the working chip template, in a left-right alternating fashion about the previously calculated mean horizontal position of the vertical color transitions, with a set of the detected vertical lines of color transition. For example, with respect to chip 102 of
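The left-right alternating population of step 522 can be pictured with the ordering helper below; it is a sketch only, and the field names match the line records used in the earlier sketches.

def alternating_about_center(lines, mean_col):
    # order the detected lines innermost-first, alternating left and right of center
    left = sorted((l for l in lines if l["col"] < mean_col), key=lambda l: mean_col - l["col"])
    right = sorted((l for l in lines if l["col"] >= mean_col), key=lambda l: l["col"] - mean_col)
    ordered = []
    for i in range(max(len(left), len(right))):
        if i < len(left):
            ordered.append(left[i])    # next-innermost line to the left of center
        if i < len(right):
            ordered.append(right[i])   # next-innermost line to the right of center
    return ordered

The working chip template can then be seeded with the innermost lines from this ordering and grown one line at a time when, as in step 548, no match is found.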
In step 524, the recognition unit 304 determines a listing of possible side-plan view color transition sequences of the recalled representation of the chip denomination under consideration for the chosen set of color transitions currently populating the working chip template, for example, by calculation. That is, knowing that the recalled chip denomination under consideration includes a series of color transitions arising from markings 46 that appear, in sequence, along perimeter 46 of a chip having the recalled denomination, the process knows that only a portion of the perimeter 46 of the chip can be seen by image capture device 302 at any particular time (e.g.,
In step 525, the recognition unit 304 determines whether the listing determined in step 524 is empty. If the list is empty, it means that the recalled representation of the chip denomination under consideration cannot contain color transitions such as those in the working chip template. Consequently, the recognition unit 304 proceeds to step 520 and recalls a previously unrecalled representation of the chip denomination under consideration. If the list is not empty, the recognition unit 304 performs the actions of step 526 and subsequent steps.
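The listing of step 524 can be illustrated as a search over contiguous windows of the chip's full perimeter sequence of color transitions; the representation format used below (a cyclic list of (color_pair, angle) entries) is an assumption of this sketch, not the stored format of the chip denomination representation library.

def possible_side_views(perimeter, template_colors):
    # perimeter: cyclic list of (color_pair, angle) entries around the chip edge
    # template_colors: color pairs currently populating the working chip template
    n, k = len(perimeter), len(template_colors)
    views = []
    if k == 0 or k > n:
        return views
    for start in range(n):
        window = [perimeter[(start + i) % n] for i in range(k)]
        if all(entry[0] == color for entry, color in zip(window, template_colors)):
            views.append(window)   # each retained window carries its associated angles
    return views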
As noted, in step 524 the recognition unit 304 creates a listing of possible side-plan views of color transition sequences of the recalled representation of the chip denomination under consideration. In step 526, the recognition unit 304 selects, from the current listing (e.g., created in step 524 or 550) one of the possible side-plan views of color transition sequences for the representation of the chip denomination under consideration and notes the angles associated with such color transition sequences. In step 528, for each pair of lines defining color transitions in the selected possible side-plan view of step 526, the recognition unit 304 determines distances between the lines defining the color transitions of the working chip template (e.g., as in method step 522, or in
In step 530, the recognition unit 304 determines a group of hypothetical radiuses (e.g., radius1, radius2, radius3 of
In step 532, the recognition unit 304 determines a mean radius, a mean circle-center, and a variance of the determined radius values based on the group of hypothetical radius values of step 530. In step 534, the recognition unit 304 stores the mean radius, the mean circle-center, and the variance of the determined radius values in logical association with the current selected one of the possible side-plan view of color transition sequences for the representation of the chip denomination under consideration (e.g., that of step 526 or 538).
In step 536, the recognition unit 304 determines whether all possible side-plan views of color transition sequences of the recalled chip denomination representation under consideration in the listing created in step 524 have been considered. If not all of the possible side-plan views in the listing created in step 524 have been considered, in step 538 the recognition unit 304 selects a previously unselected possible side-plan view from the listing (created in step 524) of possible side-plan views of color transition sequences for the representation of the chip denomination under consideration, and notes the angles associated with such color transitions. Thereafter, the recognition unit 304 engages in step 528 (e.g., determining distances between lines) and subsequent steps as indicated in the flow chart.
If all of the possible side-plan views in the listing created in step 524 have been considered, at this point the recognition unit 304 has a determined mean circle-center, a determined variance of the radius values, and a determined mean radius associated with each possible side-plan view in the listing of step 524. As illustrated in
In step 540, the recognition unit 304 determines which particular side-plan views of color transitions in the list have determined mean radius and radius variance values within defined tolerances. The defined tolerances are tolerances relative to an expected radius based on prior knowledge of the width of the chip image if the chip is within the confines of the bet circle, and tolerances relative to the determined radius variances. In step 542, the recognition unit 304 logs as possible matches those particular side-plan views of color transitions in the list whose values are within defined tolerances.
In step 543, the recognition unit 304 checks whether at least one side-plan view has values within the defined tolerances. If at least one side-plan view has values within the defined tolerances, the recognition unit 304 takes action as shown in step 545. If no side-plan view has values within the defined tolerances, the recognition unit 304 takes action as shown in step 546.
In step 545, if only one side-plan view for the current chip denomination representation was found to have values within tolerances, that side-plan view is used. However, if more than one side plan view for the current chip denomination representation was found, the recognition unit 304 logs as the best possible side-plan view for the denomination that view which has the lowest determined radius variance value for the chip denomination representation under consideration. Subsequently, the mean circle-center value, determined radius variance value, the number of transitions used, and the determined mean radius value for the side plan view ultimately deemed the best for the current chip denomination representation under consideration are logged by the recognition unit 304. In other words, the recognition unit 304 rank orders the possible side-plan views for the current chip denomination representation under consideration and logs the view that the recognition unit 304 considers to be the best potential match.
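The selection in step 545 reduces to choosing, among the in-tolerance views, the one with the lowest radius variance; the record fields below are assumptions of this sketch.

def best_side_view(in_tolerance_views):
    # each view record is assumed to carry mean_radius, mean_circle_center,
    # radius_variance, and the number of transitions used
    return min(in_tolerance_views, key=lambda view: view["radius_variance"])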
If none of the determined radiuses and determined radius variances is within defined tolerances, the set of chosen innermost color transitions of the working chip (step 522) has failed to identify the current chip denomination representation as a potential match for the chip. Accordingly, in step 546 the recognition unit 304 determines whether, for the area of pixilated image 112 under consideration, all color transitions have been utilized. If all color transitions have not been utilized, in step 548 the recognition unit 304 sequentially adds an additional innermost color transition, in left-right alternating fashion, to the working chip template (e.g., if the previous color transition was added to the left of center of the working chip template, the current color transition will be added to the right of center of the current working chip template).
In step 550, the recognition unit 304 determines a listing of possible side-plan view color transition sequences of the recalled representation of the chip denomination under consideration for the chosen set of color transitions currently populating the working chip template. In step 552, the recognition unit 304 determines whether the listing of step 550 is empty. If the list is empty, it means that the recalled representation of the chip denomination under consideration does not contain color transitions such as those currently in the working chip template. Consequently, it is known that the representation of the chip denomination currently under consideration does not match the detected chip. If the list is not empty, the recognition unit 304 takes actions as indicated in step 526 and selects one of the possible side plan views of the chip denomination representation under consideration.
In step 554, the recognition unit 304 determines whether all representations of chips in the chip denomination representation library have been recalled and examined. If all representations of chips in the library have not been recalled, the recognition unit 304 takes action as recited in step 520 (e.g., recalls a previously unrecalled chip denomination representation consisting of lines of vertical color transition and associated angles (see, e.g.,
In step 556, the recognition unit 304 assigns an overall chip score to each candidate denomination. In one embodiment, the overall chip score is based upon (a) how closely each candidate denomination's determined mean radius matches an expected radius, (b) a determined color score, and (c) the number of vertical color transitions used to determine the candidate solution denomination. In step 558, the recognition unit 304 selects, as the candidate denomination for the chip, the denomination with the highest overall chip score, and logs the identified denomination.
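A hedged sketch of the overall chip score of step 556 follows; the weights and the normalizations are illustrative assumptions rather than values recited herein.

def overall_chip_score(mean_radius, expected_radius, color_score, transitions_used,
                       w_radius=0.5, w_color=0.3, w_transitions=0.2):
    # (a) closeness of the determined mean radius to the expected radius
    radius_term = max(0.0, 1.0 - abs(mean_radius - expected_radius) / expected_radius)
    # (b) determined color score, assumed already normalized to the range 0..1
    color_term = color_score
    # (c) number of vertical color transitions used to reach the candidate denomination
    transition_term = min(1.0, transitions_used / 10.0)
    return w_radius * radius_term + w_color * color_term + w_transitions * transition_term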
At this point, the recognition unit 304 has determined a denomination of the chip. Consequently, in step 560, the chip denomination is saved to an array defining the bet stack. In one embodiment, the number of elements in the bet stack array equates to the number of chips in the stack; since the chips are detected individually, they are counted individually.
There may be more chips in the bet stack. Thus, in step 562, the recognition unit 304 defines a horizontal width, which defines the horizontal boundaries of where the next potential chip in the stack can possibly be expected to be found. In one embodiment, this horizontal width is centered on the determined mean circle-center value of the just-previously found chip, and is twice (2×) the expected width of a chip as it would appear within the bet circle. This 2× width allows the process to catch skewed chips in the stack, such as shown above in
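Step 562 can be pictured as centering a window of twice the expected chip width on the circle-center of the chip just found; the names below are assumptions of this sketch.

def next_chip_window(circle_center_x, expected_chip_width):
    # the window spans 2x the expected chip width, centered on the previous chip's center
    half_window = expected_chip_width
    return circle_center_x - half_window, circle_center_x + half_window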
In step 564, the recognition unit 304 strobes a portion of the pixilated color image 112 interior to the newly defined horizontal width, and logs all vertical lines of color transition (e.g., lines of at least 8 pixel height), whose bottoms are within a defined tolerance from the previously-determined median height-of-lines value of the just-lower vertical transitions used to find the just-previously identified chip in the bet stack.
In step 566, if there are no vertical lines of color transition within the defined tolerance, the recognition unit 304 assumes that all chips in the bet stack have been logged, and the process 500 terminates. Otherwise, the process 500 continues on as described below.
In step 568, the recognition unit 304 calculates a new median bottom-of-lines value, based on the bottoms of the vertical lines of color transitions whose bottoms are within a defined tolerance from the previously-calculated median height-of-lines value of the just-lower vertical transitions used to find the just-previously identified chip in the bet stack.
In step 570, the recognition unit 304 disregards those detected vertical lines of color transitions whose bottoms are not within a defined threshold of the newly calculated median bottom-of-lines value.
In step 572, the recognition unit 304 newly determines a median height-of-lines value based on the heights of each non-disregarded detected vertical lines of color transition—whose bottoms are within a defined tolerance from the previously-determined median height-of-lines value of the just-lower vertical transitions used to find the just-previously identified chip in the bet stack (e.g., a median height-of-lines value relative to a median bottom-of-lines value)—where the height of each vertical line of color transition is calculated based on the actual top and actual bottom values of the individual lines of vertical color transition.
In step 574, the recognition unit 304 compares the tops of the remaining (i.e., non-disregarded) detected lines of vertical color transition—whose bottoms are within a defined tolerance from the previously-determined median height-of-lines value of the just-lower vertical transitions used to find the previously identified chip in the bet stack (e.g., a median height-of-lines value relative to a median bottom-of-lines value)—relative to the newly-calculated median height-of-lines value. If the tops of such lines are further from the newly-determined median height-of-lines value than some pre-defined threshold, the uppermost portions of such lines of color transition beyond such threshold value are truncated by the process.
In step 576, for the remaining (i.e., non-disregarded) lines of vertical color transition, the recognition unit 304 adjusts the top and the bottom of such lines so as to be all within some pre-defined distance from the newly-calculated median height-of-lines and median bottom-of-lines values.
In step 578, the recognition unit 304 determines a new mean horizontal position for the remaining (i.e., non-disregarded) lines of color transition, for example by calculation. The recognition unit 304 disregards those remaining lines of vertical color transition internal to the horizontal width, but which fall outside a pre-defined threshold distance (e.g., the expected width of 1 chip) from the newly-calculated mean horizontal position.
Thereafter, the foregoing steps, beginning with step 520, where the recognition unit 304 recalls a previously unrecalled chip denomination representation consisting of lines of vertical color transition and associated angles from a chip denomination representation library, are repeated.
As can be seen from the above, the process 500 of
p1 = c1 − x1 (C)
∴ p1 = r1 cos θ0 (B)
p2 = c2 − x2
∴ p2 = r2 cos(θ0 + θ1)
p3 = c3 − x3
∴ p3 = r3 cos(θ0 + θ1 + θ2)
Note that, for m and n being any legitimate indices of color transitions and projections:
xm − xn = pm − pn
Substituting formula A into the foregoing relation yields
xm − xn = rm cos(θ0 + θ1 + . . . + θm−1) − rn cos(θ0 + θ1 + . . . + θn−1)
Now assume that VCTs xm, xn come from the same chip (it is possible that the VCTs are not from the same chip, such as a VCT in the background), and that the denomination currently under consideration is correct. This would imply that there is a common radius, which is designated herein by the notation Rm,n, meant to indicate that the radius value should be the same, or nearly so, no matter what (legitimate) values m and n take on. Since there is a common radius, we know
∴ rm = Rm,n, rn = Rm,n.
Substituting this common radius into the general relation set forth above yields
xm − xn = Rm,n [cos(θ0 + θ1 + . . . + θm−1) − cos(θ0 + θ1 + . . . + θn−1)]
Rearranging terms yields
Rm,n = (xm − xn)/[cos(θ0 + θ1 + . . . + θm−1) − cos(θ0 + θ1 + . . . + θn−1)]
Now, if the assumptions that the VCTs come from the same chip and that the denomination under consideration is correct are both true, then all Rm,n from all possible VCT pairs from the VCT pair set (x1, x2, x3, . . . ) must be substantially the same, i.e., R1,2 = R1,3 = R2,3 = . . .
To verify that all Rm,n from all possible VCT pairs from the VCT pair set (x1, x2, x3, . . . ) are substantially the same, the process 500 computes all such Rm,n for various values of θ0 and validates that they are similar. In one implementation, as noted above, the process 500 computes Rm,n by an iterative numerical approach, since θ0 is unknown. In the iterative approach, the process 500 (a) selects a particular θ0 angle, (b) creates a group of calculated radius values by using the selected θ0 angle and the Rm,n formula to compute Rm,n for substantially every possible m,n pair of VCTs, (c) computes the variance of the group of radiuses created in (b), and (d) if the variance of (c) is the lowest computed variance so far, stores that lowest computed variance in association with the θ0 which produced it. In one implementation, this iteration is done over all possible values of θ0 from 0° to 90°.
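A minimal sketch of this iterative search is set forth below for illustration only; it assumes the Rm,n relation derived above, represents the VCT horizontal positions and their cumulative angle offsets (0, θ1, θ1+θ2, . . . ) as simple Python lists, and uses names that are hypothetical rather than taken from any embodiment described herein.

import math
import statistics

def best_theta0(vct_positions, cumulative_angles, step_degrees=1.0):
    # vct_positions: horizontal positions x1, x2, ... of the vertical color transitions
    # cumulative_angles: angle offsets 0, θ1, θ1+θ2, ... in degrees, relative to θ0
    best = None  # (variance, theta0, radii)
    theta0 = 0.0
    while theta0 <= 90.0:
        radii = []
        for m in range(len(vct_positions)):
            for n in range(m + 1, len(vct_positions)):
                denom = (math.cos(math.radians(theta0 + cumulative_angles[m]))
                         - math.cos(math.radians(theta0 + cumulative_angles[n])))
                if abs(denom) > 1e-9:
                    # Rm,n per the relation above; the absolute value is taken so that
                    # sign conventions do not disturb the variance test
                    radii.append(abs((vct_positions[m] - vct_positions[n]) / denom))
        if len(radii) >= 2:
            variance = statistics.pvariance(radii)
            if best is None or variance < best[0]:
                best = (variance, theta0, radii)
        theta0 += step_degrees
    return best  # lowest variance, the theta0 that produced it, and its radii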
Subsequent to the process 500 iterating over all possible values of θ0 from 0° to 90°, the process has the lowest calculated variance for the chip denomination representation under consideration for an iteratively chosen value of θ0. This variance of all Rm,n will be less than a predefined threshold (chosen by the system designer as indicative of an acceptable match) if both of the following conditions are satisfied:
(1) the stored value of θ0 correctly indicates the orientation of the chip;
(2) the current denomination under consideration is a potential correct match for the chip.
Once the process 500 has done the foregoing, the process 500 has a single radius R calculated as the average radius based on the group of radiuses that has the best variance. From this average radius, the process 500 can compute the circle center C as follows:
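Consistent with formula (C) above, one form of this computation is set out below; it is offered only as an illustration of how the circle center follows from the projection relation and the best-fit radius R. For each VCT xi, formula (C) gives a center estimate
ci = xi + R cos(θ0 + θ1 + . . . + θi−1)
and the circle center C is taken as the average of these estimates over all VCTs used to form the radius group.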
Thus, for the current denomination under consideration, the process 500 now has an average radius, circle center, and variance such as discussed above in relation to the flowchart of
The foregoing described subject matter is uniquely valuable in that it works well with pre-existing chips. However, in another embodiment a monochromatic light source 354 (e.g., 880 nm infrared) is positioned proximate to image capture device 302 and aimed such that its light reflects directly off bet stack 24 and into the lens of image capture device 302. In this embodiment, the chips are encoded with infrared reflective and/or absorbent material that either fluoresces or absorbs light at the monochromatic wavelength of the source illumination. This allows image capture device 302, fitted with an optical band-pass filter selected at the wavelength of the fluoresced material, to see only the code. The background ambient light, as well as the source light striking the absorbing coded areas, is substantially eliminated.
The fact that the monochromatic light source is infrared means that its presence is not generally detectable by players 14-16.
In the embodiment using monochromatic light source 354, the algorithm described herein is used in a substantially unchanged fashion. The reason is that, because monochromatic light is used, the red, green, and blue color values end up being substantially the same.
Those having ordinary skill in the art will recognize that the state of the art has progressed to the point where there is little distinction left between hardware and software implementations of aspects of systems; the use of hardware or software is generally (but not always, in that in certain contexts the choice between hardware and software can become significant) a design choice representing cost vs. efficiency tradeoffs. Those having ordinary skill in the art will appreciate that there are various vehicles by which aspects of processes and/or systems described herein can be effected (e.g., hardware, software, and/or firmware), and that the preferred vehicle will vary with the context in which the processes and/or systems are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a solely software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware. Hence, there are several possible vehicles by which aspects of the processes described herein may be effected, none of which is inherently superior to the other in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary.
The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and examples. Insofar as such block diagrams, flowcharts, and examples contain one or more functions and/or operations, it will be understood as notorious by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, the present invention may be implemented via Application Specific Integrated Circuits (ASICs). However, those skilled in the art will recognize that the embodiments disclosed herein, in whole or in part, can be equivalently implemented in standard Integrated Circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more controllers (e.g., microcontrollers), as one or more programs running on one or more processors (e.g., microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one of ordinary skill in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the present invention are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the present invention applies equally regardless of the particular type of signal bearing media used to actually carry out the distribution. Examples of signal bearing media include, but are not limited to, the following: recordable type media such as floppy disks, hard disk drives, CD ROMs, digital tape, and computer memory; and transmission type media such as digital and analogue communication links using TDM or IP based communication links (e.g., packet links).
In a general sense, those skilled in the art will recognize that the various embodiments described herein which can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or any combination thereof can be viewed as being composed of various types of “electrical circuitry.” Consequently, as used herein “electrical circuitry” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment).
Those skilled in the art will recognize that it is common within the art to describe devices and/or processes in the fashion set forth herein, and thereafter use standard engineering practices to integrate such described devices and/or processes into systems. That is, the devices and/or processes described herein can be integrated into a system via a reasonable amount of experimentation.
The foregoing described embodiments depict different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality.
While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from this invention and its broader aspects and, therefore, the appended claims are to encompass within their scope all such changes and modifications as are within the true spirit and scope of this invention. Furthermore, it is to be understood that the invention is solely defined by the appended claims. It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g. “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations).
All of the above U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet, are incorporated herein by reference, in their entirety.
This application also incorporates by reference in their entireties any and all materials incorporated by reference into the foregoing referenced application; such materials include at least the subject matter of the currently co-pending U.S. patent application Ser. No. 09/474,858, filed Dec. 30, 1999, entitled, METHOD AND APPARATUS FOR MONITORING CASINOS AND GAMING, naming Richard Soltys and Richard Huizinga as inventors, and the subject matter of the U.S. Provisional Patent Application No. 60/130,368, filed Apr. 21, 1999, entitled, TRACKING SYSTEM FOR GAMES OF CHANCE, naming Richard Soltys and Richard Huizinga as inventors which was previously incorporated by reference into the currently co-pending U.S. patent application Ser. No. 09/474,858.
This application claims the benefit of U.S. Provisional Patent Application No. 60/354,730 filed 5 Feb. 2002, where this provisional application is incorporated herein by reference in its entirety.