Methods for automatically generating a card deck library and master images for a deck of cards, and a related card processing apparatus

Information

  • Patent Grant
  • Patent Number
    10,398,966
  • Date Filed
    Monday, December 5, 2016
  • Date Issued
    Tuesday, September 3, 2019
Abstract
A method of automatically generating a calibration file for a card handling device comprises automatically generating a calibration file stored in memory of a main control system for a card handling device. Automatically generating the calibration file comprises identifying at least one parameter associated with a rank area around a rank of at least a portion of the card, identifying at least one parameter associated with a suit area around a suit of the at least a portion of the card, and storing the at least one parameter associated with the rank area and the at least one parameter associated with the suit area in the calibration file. Additionally, a method of automatically generating deck libraries for one or more decks of cards comprises automatically generating a plurality of master images for the cards of the first deck type using the parameters from the calibration file.
Description
FIELD

The disclosure relates generally to card recognition in card handling devices. More specifically, disclosed embodiments relate to automatic generation of calibration files and other improvements to card recognition systems of card handling devices.


BACKGROUND

Card handling devices (e.g., card shufflers) are used in the gaming industry for increasing the efficiency, security, and game speed in live table games, such as blackjack, baccarat, and various forms of poker. Card handling devices may perform a variety of functions including randomly shuffling one or more decks of cards in an efficient and thorough manner. In a live table game, shuffling the cards in an efficient and thorough manner may assist in preventing players from having an advantage by knowing the position of specific cards or groups of cards in the final arrangement of cards delivered in the play of the game. Additionally, it may be desirable to shuffle the cards in a very short period of time in order to reduce delay in the play of the game.


Card shufflers may include a card recognition system, which may be used to verify the contents of the card set (e.g., one or more decks), ensure that the card set contains all the appropriate cards, and detect any cards that do not belong therein. The card recognition system may also enable a card shuffler to verify the contents of the deck throughout game play. Some known card shufflers may comprise a card recognition system that employs sensors and a hardware component that may sense the rank (2-10, Jack-Ace) and suit (Spade, Club, Heart, Diamond) from the face of a card and thereafter convert signals from the sensed data into data array sets. The data array sets may be compared to known data array sets of a verified deck of cards. Other known card shufflers may comprise a camera that captures an unknown image of each card entered into the card shuffler and then extracts the card rank and suit from the unknown image. The unknown image may be compared to master images of a verified deck of cards to identify the cards.


There are several different playing card manufacturers (e.g., Angel, Gemaco, U.S. Playing Card Company, Cartamundi, Ace, Copag, etc.), each having different types of card designs. For example, the card images (e.g., graphics) printed on the card faces may vary from one deck to the next. In addition, the size and location of the rank and suit may also vary from one deck design to the next.


In order to support each of the various possible card images, the card recognition system of the card shuffler may be loaded with a set of master images containing the rank and suit symbols of a particular deck design. The master images may be stored in memory within the card shuffler in a particular sub-directory for that particular deck design. For example, a sub-directory may exist for each deck type supported by the card shuffler. The process of creating these master images conventionally requires a substantial amount of manual measurement and analysis by a technician to create and load the master images for each deck type. For example, the technician may manually enter parameters into a calibration file listing different measurements and locations related to the rank and suit symbols. This process involves trial and error, and is time consuming as the technician attempts to find the right combination of parameters to use in generating the master images.


Another obstacle associated with conventional card detection devices is that card manufacturers may create new deck designs or make changes to existing deck designs. The conventional method of manually creating deck libraries becomes burdensome to technicians who not only have to create the deck libraries, but also need to update the deck libraries of card shufflers in use in the field. In addition, each individual card shuffler may be configured differently, which may require the technician to create a new calibration file for a particular machine. As a result, when the same deck library is created for one card shuffler and then simply reproduced and stored on each additional card shuffler, there may be variations during card recognition from one card shuffler to the next, even within the same model of shuffler.


Once the deck libraries are loaded onto a card shuffler, the dealer may select the specific deck design that will be used during game play. Selecting the deck in the card shuffler determines which deck library (e.g., master images and other related files) is used for comparison with the card images captured during use. The dealer may select the incorrect deck type, often because of a lack of training or simple input error. As a result, the deck library from one deck type may be used for comparison with images from another deck type. Using the wrong deck library may result in errors in card identification.


SUMMARY

In an embodiment, a method of automatically generating a calibration file for a card handling device is disclosed. The method comprises capturing a raw image from at least a portion of a card passing through a card handling device, and using a processor, automatically generating a calibration file stored in memory of a main control system of the card handling device. Automatically generating the calibration file comprises identifying at least one parameter associated with a rank area around a rank of the at least a portion of the card, identifying at least one parameter associated with a suit area around a suit of the at least a portion of the card, and storing the at least one parameter associated with the rank area and the at least one parameter associated with the suit area in the calibration file.


In another embodiment, a method of automatically generating one or more deck libraries for one or more decks of cards is disclosed. The method comprises using a processor to automatically generate a first calibration file without user input in identifying at least one parameter associated with a rank area and at least one parameter associated with a suit area for a first deck type of cards, the calibration file including the parameters associated with the rank area and the suit area, storing the first calibration file in a first deck library for the first deck type, using the processor to automatically generate a plurality of master images for the cards of the first deck type using the parameters from the calibration file, and storing the plurality of master images for the cards of the first deck type in the first deck library.


In another embodiment, a card processing apparatus is disclosed. The card processing apparatus comprises a memory device, an imaging device operably coupled with the memory device such that raw images from the imaging device are stored in the memory device, and a main control system coupled with the imaging device. The main control system is configured to run an operating system having a file directory system configured to store a plurality of deck libraries for a plurality of different deck types. The main control system is configured to receive the raw images from the memory device and automatically generate a calibration file having parameters related to a rank area and a suit area for a deck type.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a card handling device according to an embodiment of the present disclosure.



FIG. 2 is a perspective view of a card handling device according to another embodiment of the present disclosure.



FIG. 3 is a partial perspective view of a card handling device according to another embodiment of the present disclosure.



FIG. 4 is a schematic block diagram of a card processing system for a card handling device according to an embodiment of the present disclosure.



FIG. 5 is an illustration of an image captured by an imaging device of a card handling device, according to an embodiment of the present disclosure.



FIG. 6 is a flowchart illustrating a method for automatically generating a calibration file for a card detection system according to an embodiment of the present disclosure.



FIG. 7 is a flowchart illustrating a method for generating master images according to an embodiment of the present disclosure.



FIGS. 8A through 8C illustrate a process of generating a master rank image and a master suit image from a raw image according to the parameters stored in the calibration file.



FIGS. 8D and 8E show an example of the master images being normalized to form normalized master images.



FIGS. 9A through 9C are a series of card images that illustrate a method for generating master images by finding and filling contours according to another embodiment of the disclosure.



FIGS. 10 and 11 show histograms that may result from an OCR analysis of the master suit images and the master rank images generated by the contour analysis illustrated in FIGS. 9A through 9C.



FIG. 12 is a flowchart illustrating a method for determining the identity of unknown images according to an embodiment of the present disclosure.



FIGS. 13A, 13B, and 13C show a processed image of a card, in which the imaging device had experienced dust build-up on the lens.



FIGS. 14A and 14B illustrate a problem of incorrectly splitting an image that may arise during card recognition mode.



FIGS. 15A and 15B illustrate an issue that may arise when capturing an image using uneven illumination.



FIGS. 16A, 16B, and 16C are raw images from the imaging device of a card handling device showing fisheye distortion caused by the imaging device.



FIGS. 17A, 17B, and 17C are images for which the fisheye distortion has been reduced through mathematical stretching of the distorted image.





DETAILED DESCRIPTION

In the following description, reference is made to the accompanying drawings in which are shown, by way of illustration, specific embodiments of the present disclosure. Other embodiments may be utilized and changes may be made without departing from the scope of the disclosure. The following detailed description is not to be taken in a limiting sense, and the scope of the claimed invention is defined only by the appended claims and their legal equivalents. Furthermore, specific implementations shown and described are only examples and should not be construed as the only way to implement or partition the present disclosure into functional elements unless specified otherwise herein. It will be readily apparent to one of ordinary skill in the art that the various embodiments of the present disclosure may be practiced by numerous other partitioning solutions.


In the following description, elements, circuits, and functions may be shown in block diagram form in order not to obscure the present disclosure in unnecessary detail. Additionally, block definitions and partitioning of logic between various blocks is exemplary of a specific implementation. It will be readily apparent to one of ordinary skill in the art that the present disclosure may be practiced by numerous other partitioning solutions. Those of ordinary skill in the art would understand that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof. Some drawings may illustrate signals as a single signal for clarity of presentation and description. It will be understood by a person of ordinary skill in the art that the signal may represent a bus of signals, wherein the bus may have a variety of bit widths and the present disclosure may be implemented on any number of data signals including a single data signal.


The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general-purpose processor, a special-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a controller, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A general-purpose processor may be considered a special-purpose processor while the general-purpose processor executes instructions (e.g., software code) stored on a computer-readable medium. A processor may also be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.


Also, it is noted that the embodiments may be described in terms of a process that may be depicted as a flowchart, a flow diagram, a structure diagram, or a block diagram. Although a process may describe operational acts as a sequential process, many of these acts can be performed in another sequence, in parallel, or substantially concurrently. In addition, the order of the acts may be re-arranged. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. Furthermore, the methods disclosed herein may be implemented in hardware, software, or both. If implemented in software, the functions may be stored or transmitted as one or more instructions or code on computer-readable media. Computer-readable media includes both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another.


It should be understood that any reference to an element herein using a designation such as “first,” “second,” and so forth does not limit the quantity or order of those elements, unless such limitation is explicitly stated. Rather, these designations may be used herein as a convenient method of distinguishing between two or more elements or instances of an element. Thus, a reference to first and second elements does not mean that only two elements may be employed or that the first element must precede the second element in some manner. In addition, unless stated otherwise, a set of elements may comprise one or more elements.


As used herein, the term “master image” refers to an image generated by a card recognition system during calibration mode that may be stored for future comparison with unknown images to identify a card during card recognition mode. The master images may include separate master images for each rank and suit of a deck. There may also be master images for other symbols, such as a joker, Wagner symbols, deck set symbols, casino symbols, and other known symbols. In some embodiments, a master image may include both the rank and the suit of an individual card, such that each individual card has its own master image. A “raw image” is an image generated by a card recognition system during calibration mode that may be used to generate the master image. An example of a “raw image” is an image generated by a two-dimensional (2D) CMOS image sensor. As discussed below, the master images may be generated according to parameters stored in an automatically generated calibration file. In the context of card recognition, a “raw image” may also be generated and used to generate an unknown image. The term “unknown image” refers to an image that is generated by the card recognition system for comparison with a master image to identify a card rank and suit during card recognition mode.


Embodiments of the present disclosure include card handling devices, card recognition systems, and related methods. It is contemplated that there are various configurations of card handling devices that may include a card recognition system according to an embodiment of the present disclosure. FIGS. 1 through 3, described below, are non-limiting examples of such card handling devices that may employ card recognition systems and methods of the present disclosure. Of course, other configurations of card handling devices are also contemplated.



FIG. 1 is a perspective view of a card handling device 100 according to an embodiment of the present disclosure. The card handling device 100 may be configured to randomize sets of cards, such as decks and sets of multiple decks of cards. The card handling device 100 may include a top surface 112 that comprises a flip-up cover 114 that, when opened, may expose a card insertion area 116 and an elevator platform 118. The card insertion area 116 may be configured to receive an input set of cards to be shuffled, counted, and/or sorted. The card handling device 100 may be configured to receive, read rank and suit, sort, and/or shuffle one or more decks of cards (e.g., standard decks of 52 cards each, 52 cards plus one or two jokers, etc.). The card handling device 100 may be particularly well suited for providing randomized decks of cards for card games, such as blackjack, poker, etc. In some embodiments, the card handling device 100 may be located adjacent to, or flush mounted into, a gaming table surface in a casino where a live card game may be played. In some embodiments, the card handling device 100 may be located at a remote location off the casino floor, which may be inaccessible to the public.


The elevator platform 118 may be configured to raise a set of shuffled cards to a level where the cards may be removed by an operator after the shuffling, reading, and/or sorting processes are completed. Within a protective exterior 124 of the card handling device 100 is a card processing system 400 (FIG. 4). The card processing system 400 may be configured to recognize the identity of the cards as the cards pass through the card handling device 100. The elevator platform 118 may include a card present sensor 120 configured to detect the presence of a card located on the elevator platform 118. Other card present sensors 420 (FIG. 4) in the card processing system 400 may trigger the card recognition system to capture the image of the card or data from the card image.


The card handling device 100 may also be configured to display operational data relating to the device on a display panel 122 located on the top surface 112. An operator using the card handling device 100 may monitor the display panel 122 and view the displayed information in order to know the status of operation of the card handling device 100. Such information displayed on the display panel 122 may include the number of cards present in the card handling device 100, the status of any shuffling, reading, or sorting operations, security information relating to the card handling device 100, status relating to a card verification process, or any other information about errors or the operation of the card handling device 100 that would be useful to the operator. In one embodiment, the display panel 122 is an LED display. In another embodiment, the display panel 122 is an LCD display or other electronic display capable of at least displaying alphanumeric information. The display panel 122 may include a user interface for the user to interact with the card handling device 100. For example, buttons 113, 115 may control operations, such as power on/off, special functions (e.g., raise elevator to the card delivery position, reshuffle command, security check, card count command, etc.), and the like. In other embodiments, touchscreen controls are provided on a surface of the display panel 122.


Additional details regarding such a card handling device are described in U.S. Pat. No. 7,764,836, issued Jul. 27, 2010, and entitled “Card Shuffler with Card Rank and Value Reading Capability Using CMOS Sensor,” and U.S. Patent Application Publication No. 2008/0113700, filed Nov. 10, 2006, now U.S. Pat. No. 8,616,552, issued Dec. 31, 2013, and entitled “Methods and Apparatuses for an Automatic Card Handling Device and Communication Networks Including Same,” the disclosure of each of which is incorporated herein in its entirety by this reference.



FIG. 2 is a perspective view of another card handling device 200 according to another embodiment of the present disclosure. The card handling device 200 may include a recessed card infeed tray 222, an adjacent recessed card output tray 224, and a plurality of card shuffling compartments (not shown) arranged into a carousel structure 223 that are configured to shuffle a deck of cards inserted into the card infeed tray 222 and output the cards in smaller groups, such as hands and/or partial hands, to the card output tray 224 during use. The shuffling compartments of the carousel structure 223 may be enclosed within a cover 228. A card present sensor (not shown) in the card output tray 224 may generate a signal that causes the processor to instruct mechanical elements to dispense another group of cards after the last group of cards is removed. The card handling device 200 includes a flange member 202 that may further include a dealer display 242 with touch screen controls for the dealer to input commands for the card handling device 200. The card handling device 200 may be flush-mounted on a gaming table. Additional details regarding such a card handling device are described in U.S. Pat. No. 8,342,525, issued Jan. 1, 2013, and entitled “Card Shuffler with Adjacent Card Infeed and Card Output Compartments,” the disclosure of which is incorporated herein in its entirety by this reference. The card handling device 200 may further include a card recognition system (FIG. 4) that may be housed within the cover 228, and which will be described in further detail below.



FIG. 3 is a partial perspective view of a card handling device 300 according to yet another embodiment of the present disclosure. The card handling device 300 includes a card receiving area 306 that may be provided with a stationary lower support surface 307 that slopes downwardly from an outer side 309 of the card handling device 300. The outer side 309 may include a depression 311 configured to facilitate an operator's ability to place or remove cards into the card receiving area 306. A top surface 304 of the card handling device 300 may include a user interface 302 that may include a visual display 312 (e.g., LED, liquid crystal, micro monitor, semiconductor display, etc.), and one or more user inputs 324, 326. The user inputs 324, 326 may include one or more buttons, touch screens, etc. The user interface 302 may further include additional lights and/or displays 328, 330, which may be configured to indicate power availability (on/off), a shuffler state (e.g., active shuffling, completed shuffling cycle, insufficient numbers of cards, missing cards, sufficient numbers of cards, complete deck(s), damaged or marked cards, entry functions for the dealer to identify the number of players, the number of cards per hand, access to fixed programming for various games, the number of decks being shuffled, card calibration information, etc.), or other information useful to the operator.


The card handling device 300 may further include a shuffled card return area 332. The shuffled card return area 332 may include an elevator surface 314 and card supporting sides 334 that surround at least a portion of the elevator surface 314. In some embodiments, the card supporting sides 334 remain fixed to the elevator surface 314 during operation. In other embodiments, the card supporting sides 334 may be fixed to the frame and do not move. In some embodiments, the card supporting sides 334 may be removable. Removal of the card supporting sides 334 may enable the operator to lift shuffled groups of cards onto a gaming table surface for use in a card game. Additional details regarding such a card handling device are described in U.S. Pat. No. 7,764,836, issued Jul. 27, 2010, and entitled “Card Shuffler with Card Rank and Value Reading Capability Using CMOS Sensor,” the disclosure of which is incorporated herein in its entirety by this reference. The card handling device 300 may further include a card recognition system (not shown), which will be described in further detail below.


Depending on the configuration of the card handling device employed, the physical configuration of the card recognition system may also vary from one card handling device to the next. For example, the placement of the imaging device may differ (e.g., different angles) from one card handling device to the next, which may result in the need to generate and maintain different deck libraries for the various types of card handling devices. Under conventional methods for generating deck libraries and master images, in which each step is performed manually, the burden of maintaining deck libraries for the various types of card handling devices grows with the number of different shuffler structures, which further underscores the benefits and advantages of embodiments of the present disclosure.


Embodiments of the present disclosure include apparatuses and related methods for automatically generating a calibration file for a card handling device. Thus, rather than using substantial human interaction and trial and error to arrive at certain parameters used in the identification of an unknown card, embodiments of the present disclosure use a processor programmed to identify the location and dimensions on a card for a rank, suit, region of interest, and/or other measurements regardless of the deck type and without user interaction, and to generate a calibration file that is later used during a card recognition mode of the processor.



FIG. 4 is a schematic block diagram of a card processing system 400 for a card handling device according to an embodiment of the present disclosure. Examples of card handling devices 100, 200, 300 that may include the card processing system 400 include those described above with respect to FIGS. 1 through 3. Of course, it is contemplated that the card processing system 400 may be adapted for use within any card handling device that is configured to shuffle, sort, deal, process or otherwise handle a deck of cards.


The card processing system 400 may be configured to be automatically tuned to obtain card rank and suit information from one or more decks of cards of different designs and manufacturers, after which the card processing system 400 may be used to determine the identity (i.e., rank and suit) of an unknown card passing through the card handling device. The ability to determine the identity of an unknown card may be desirable for fraud detection, verifying that the proper cards are in a deck, or for other reasons. In some embodiments, the card processing system 400 may also be configured to control the shuffling of the cards, as well as the motors, rollers, etc., that move the cards through the card handling device. In some embodiments, a separate shuffler processor (not shown) may be configured to control the mechanical operation of the card handling device.


The card processing system 400 may include a main control system 412, a card recognition processor 414, an imaging device 416, a memory device 418, a latch 419, and a card present sensor 420. Each of the main control system 412, card recognition processor 414, imaging device 416, the memory device 418, and the card present sensor 420 may be coupled with each other for communication therebetween. The latch 419 may be coupled between the card recognition processor 414 and the memory device 418.


The card recognition processor 414 may be coupled with the imaging device 416 to receive captured images. Capturing an image of all or a portion of the card is also referred to herein as “reading the card.” Cards may be read when stationary or in motion within the card handling device. The imaging device 416 may be positioned and oriented within the card handling device such that at least a portion of the card may be placed within a field of view of the imaging device 416 when capturing an image of the card. As shown in FIG. 5, the portion of the card 506 within the field of view 502 of the imaging device 416 for a resulting raw image 417 may be the upper left-hand corner of the card 506 (when the card is face-up with its long side along the x-axis). Additional detail regarding analysis of the raw image 417, and the related information derived therefrom, is discussed below with respect to FIG. 5.


Referring again specifically to FIG. 4, the main control system 412 may include a processor 430 and memory 432. The processor 430 may be configured to perform operations, such as executing instructions (e.g., software code) that perform methods described herein. The instructions may include a program that will run on the main control system 412 at the time the card handling device is tuned (i.e., calibrated) for a specific deck of cards of a specific design. The memory 432 may be configured to store information therein. For example, the memory 432 may store the executables and other files that enable the processor 430 to run the operating system on the main control system 412. The memory 432 may include volatile and/or non-volatile memory. The main control system 412 may run an operating system (e.g., Linux, WINDOWS®, etc.). The main control system 412 may be configured to instruct the imaging device 416 to capture the image (e.g., responsive to a trigger signal from the card present sensor 420) or to extract data from a symbol printed in the field of view 502. The main control system 412 may also be configured to communicate information to input and output devices (not shown), such as a display, operator inputs, etc.


The operating system may enable a data organization structure that includes a file system 431 for storing files (e.g., image files, calibration files, etc.) within the memory 432 that may be used to determine the identity of unknown cards during card recognition mode. The main control system 412 may be configured to organize the file system 431 into sub-directories 434, 436. Each sub-directory 434, 436 may be for a deck type or design to which the card processing system 400 has been tuned. Thus, a sub-directory may also be referred to as a “deck library.” A first deck library 434 may include files stored therein that are for a first particular deck type. For example, files stored within the first deck library 434 may include a calibration file 433, a deck name file 435, and a plurality of master images 413, 415. The calibration file 433 may include parameters that identify certain measurements (e.g., rank and suit areas, region of interest, etc.) that may be used by the main control system 412 and/or the card recognition processor 414 to generate the master images 413, 415 and/or process other images to be compared with the master images 413, 415.
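
For concreteness, one possible on-disk layout of such a deck library is sketched below in Python. Every directory name, file name, and naming convention in this sketch (including the use of "T" for the 10 rank) is an illustrative assumption, not a structure required by the disclosure.

```python
from pathlib import Path

# Hypothetical layout of a first deck library (sub-directory 434). All names
# below are assumptions made purely for illustration.
first_deck_library = Path("/opt/shuffler/decks/deck_001")

expected_files = (
    [first_deck_library / "calibration.cfg",   # calibration file 433
     first_deck_library / "deck_name.txt"]     # deck name file 435
    # master rank images 413 ("T" standing in for the 10 rank)
    + [first_deck_library / f"rank_{r}.png" for r in "23456789TJQKA"]
    # master suit images 415 (Diamonds, Hearts, Spades, Clubs)
    + [first_deck_library / f"suit_{s}.png" for s in "DHSC"]
)
```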


The processor 430 of the main control system 412 may be configured (e.g., programmed) to control the card processing system 400 for operation in one of a plurality of modes. For example, the card processing system 400 may be capable of operating in a calibration mode (e.g., automatically generating a calibration file, master images, and other files of a deck library) and a card recognition mode (e.g., determining the identity of an unknown card passing through the card handling device).


During the calibration mode, the card processing system 400 may be “tuned” to recognize a particular deck of cards or deck type. Therefore, the calibration mode may also be referred to herein as the “tuning mode.” During calibration mode, the main control system 412 may be configured to automatically generate the calibration file 433 and the master images 413, 415 that may be employed by the card processing system 400 when subsequently operated in the card recognition mode. The master images 413, 415 include master rank images 413 and master suit images 415 for the specific card deck type or deck design to which the card processing system 400 is being tuned. Thus, for example, the master rank images 413 may include thirteen images, one for each rank (2, 3, . . . 10, Jack (J), Queen (Q), King (K), and Ace (A)), and the master suit images 415 may include four images, one for each suit (Diamonds (D), Hearts (H), Spades (S), and Clubs (C)). The result from calibration mode includes the calibration file 433, the master images 413, 415, and the deck name file 435 being stored in a deck library 434 for the specific deck type.


The master images 413, 415 may be generated by the card processing system 400 by reading a pre-sorted deck into the card processing system 400. As used herein, “pre-sorted” means that cards of the deck to be tuned are placed and read into the card processing system 400 in an order that is known or expected by the main control system 412 during calibration mode. The term “pre-sorted” is not intended to require any particular order of cards, but rather an order that is expected by the main control system 412. In other words, the card processing system 400 knows the rank and suit of each respective card as it is read into the card processing system 400 and used to generate the master images 413, 415. In some embodiments, the master images 413, 415 may be generated by the card processing system 400 by reading an unsorted (e.g., randomly ordered) deck of cards into the card processing system 400. “Unsorted,” therefore, means that the cards of a deck to be tuned are placed in an order that is unknown or unexpected by the main control system 412. From the perspective of the main control system 412, an unsorted deck is a randomly ordered deck. Additional details regarding the generation of master images 413, 415 of a pre-sorted and an unsorted deck are described below with reference to FIGS. 7 and 8A-8C.


As discussed above, the file system 431 may include additional deck libraries 436 that are unique to additional deck types to which the card processing system 400 has been tuned. For example, a deck library may be stored for each style or brand of cards used by a casino. As a result, each time the card processing system 400 is tuned for a new deck type, the card processing system 400 may automatically generate a new deck library having a calibration file 433, a deck name file 435, and a plurality of master images 413, 415 for the new deck type stored therein. Any number of deck libraries may be generated and included within the file system 431, according to the number of desired deck types to which the card processing system 400 is tuned. Because card styles sometimes change over time, it may be desirable to generate a new deck library 434 each time a casino receives a new shipment of cards.


The card processing system 400 may also operate in a card recognition mode. During the card recognition mode, an unknown image 411 may be compared with one or more master images 413, 415 to determine the identity (e.g., the rank and suit) of the unknown card passing through the card handling device. The card recognition mode may occur during real-time play of a wagering game, with the processing being primarily performed by the card recognition processor 414 rather than the main control system 412. Additional details regarding the recognition of an unknown card passing through the card handling device during the card recognition mode are described below with reference to FIG. 12.


The card recognition processor 414 may be configured as an FPGA or comparable hardware component having control logic configured to process one or more images according to embodiments described herein. During the calibration mode, the card recognition processor 414 may be configured to process raw image data captured by the imaging device 416 and transmit the processed raw image data to the memory device 418 as raw images 417. During the calibration mode, and after the calibration file 433 is automatically generated, the card recognition processor 414 may also be configured to generate the master images 413, 415 according to the parameters stored in the calibration file 433. During card recognition mode, the card recognition processor 414 may be configured to determine the identity of an unknown card. For example, the card recognition processor 414 may be configured to generate the unknown images 411 from the raw captured image data so that the unknown images 411 may be compared with one or more of the master images 413, 415 to determine the identity of the unknown card. In other words, during the card recognition mode, the card recognition processor 414 may be configured to compare the generated unknown rank and suit images (i.e., the unknown image 411) with the master rank images 413 and master suit images 415 to determine the identity of the card. The card recognition processor 414 may also include memory that may store master images 413, 415 (as is shown in FIG. 4), which may be used for comparison with the unknown image 411 during the card recognition mode. Memory of the card recognition processor 414 may also store raw images 417 in some embodiments.


The imaging device 416 may include a camera (e.g., 2D CMOS imager) configured to acquire a two-dimensional image of its field of view. The imaging device 416 may include an analog camera or a digital camera with a decoder or receiver that converts received radiation into signals that can be analyzed with respect to image content. The signals may reflect either color or black-and-white information, or merely measure shifts in color density and pattern. The imaging device 416 may include one or more lenses to focus light, mirrors to direct light, and radiation emitters to assure sufficient radiation intensity for imaging by the imaging device 416. For example, the radiation emitters may include LED light sources (not shown) to illuminate areas of the card being imaged. Although a white light source may sometimes be adequate to capture grayscale data from red and black print on cards, a green light source may be an efficient source of illumination for capturing black and white data from both red and black print on cards.


The card present sensor 420 may be configured to generate a signal when a card is present for the imaging device 416 to read. In some embodiments, the card present sensor 420 may be coupled to the imaging device 416 directly for sending a trigger signal to the imaging device 416 indicating that the card is present. In response to the trigger signal, the imaging device 416 may capture the image of the card. In some embodiments, the card present sensor 420 may be coupled to the imaging device 416 indirectly such that the trigger signal is sent to the imaging device 416 through other components, such as the main control system 412 or the card recognition processor 414. Although the card present sensor 420 is shown in FIG. 4 as being directly coupled with each of the main control system 412, the card recognition processor 414 and the imaging device 416, this is done to show various optional configurations for the card present sensor 420.


The memory device 418 may be configured to store the captured raw images 417 for each of the cards of the deck. The raw images 417 may be read in by the imaging device 416 and used by the main control system 412 during calibration mode to generate the calibration file 433. The raw images 417 may further be provided to the main control system 412 during calibration mode to generate a set of master images 413, 415, which master images 413, 415 may ultimately be stored in the corresponding deck library 434, 436 for that respective deck type. The memory device 418 may have N locations available for storing the raw images for each card, wherein N is any positive integer. For most standard decks, N may be equal to 52 or, alternatively, to the number of distinct symbols to be captured (e.g., 13 rank images and 4 suit images). In some embodiments, the N locations may include additional locations for jokers, special cards, blank cards, or other symbols. Decks of cards having more or fewer than 52 cards (e.g., decks with certain cards added or removed) are also contemplated. In addition, the calibration file 433 and master images 413, 415 may be generated using a sub-set of all cards of the deck. In other words, a raw image 417 need not be captured for each and every card in the deck so long as at least one raw image 417 is available for each rank and each suit in the deck. As a result, N may be fewer than the total number of cards in the deck. Although the memory device 418 is shown in FIG. 4 as a discrete memory device, it is contemplated that the memory device 418 may be integrated with the card recognition processor 414 or the memory 432 of the main control system 412, such that the raw images 417 may be stored in the card recognition processor 414 or the main control system 412.


In some embodiments, the memory 432 may also include one or more combined deck sub-directories 440. Each combined deck sub-directory 440 may include normalized images for a corresponding rank and suit from a plurality of different deck types. For example, a first combined deck sub-directory 440 may have normalized images D1, D2, . . . DN previously taken from a plurality of different deck types for the “2 rank,” a second combined deck sub-directory 440 may have normalized “3 rank” images D1, D2, . . . DN previously taken from a plurality of different deck types for the “3 rank,” and so on. Thus, there may be thirteen different combined deck rank sub-directories, each having a relatively large number of normalized rank images D1, D2, . . . DN from different deck types. Likewise, four different combined deck suit sub-directories 440 may be created that have a relatively large number of normalized suit images D1, D2, . . . DN from different deck types stored therein.


Each of the normalized images D1, D2, . . . DN may be a common size. In addition, the normalized images may be stretched so that a pixel from each of the rank and suit is located on an edge of the normalized image. The normalized images may be used in comparison with the master rank and suit images 413, 415 for linking the master images 413, 415 to the appropriate ranks and suits for the particular deck being tuned. Thus, the master rank and suit images 413, 415 (or copies thereof) may also be normalized while tuning the deck by giving them a common size with the normalized images and stretching the master rank and suit images 413, 415 so that a pixel from each of the rank and suit is located on an edge of the normalized image.
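
By way of a non-limiting illustration, the following minimal sketch crops a binary symbol image to the tight bounding box of its dark pixels, so that a symbol pixel lies on every edge, and then stretches the result to a common size. The function name, the 32×32 target size, and the 0/255 pixel convention are assumptions made for illustration only.

```python
import cv2
import numpy as np

def normalize_symbol(binary_img, size=(32, 32)):
    """Crop a binary symbol image (0 = ink, 255 = background) to the tight
    bounding box of its ink pixels, so that the symbol touches every edge,
    and then stretch it to a common size."""
    ys, xs = np.where(binary_img == 0)              # coordinates of ink pixels
    if xs.size == 0:                                # blank image: nothing to crop
        return cv2.resize(binary_img, size, interpolation=cv2.INTER_NEAREST)
    cropped = binary_img[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    return cv2.resize(cropped, size, interpolation=cv2.INTER_NEAREST)
```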


The latch 419 may be configured to select a location in the memory device 418 in which a particular raw image 417 should be stored. For example, as the raw image 417 for a card is successfully stored, the latch 419 may include a flip flop and/or a counter that increments as each raw image 417 is stored so that the raw image 417 for the next card may be stored in the next location in memory device 418. If the raw image for a card is not successfully stored, the latch 419 may not increment.


Raw Image Analysis and Parameters of the Calibration File



FIG. 5 is an illustration of a raw image 417 acquired by the imaging device 416 (FIG. 4) of a card handling device, according to an embodiment of the present disclosure. The discussion of the parameters in FIG. 5 also refers to the hardware environment for acquiring the raw image 417 described with respect to FIG. 4. The lines and measurements shown in FIG. 5 illustrate certain parameters that may be determined and included within the calibration file 433 for future use. Future use of the calibration file 433 may include generating the master images 413, 415 during the calibration mode, as well as generating the unknown image 411 used for comparison with the master images 413, 415 during the card recognition mode.


The raw image 417 may be acquired from the imaging device 416. Thus, the raw image 417 of FIG. 5 may be one of the raw images 417 that may be stored in the memory device 418 as the cards are read by the card handling device. The raw image 417 may be a grayscale image having a resolution determined by the imaging device 416 (e.g., 320 pixel×240 pixel resolution). A grayscale pixel may have a relatively large number of different values, whereas a black and white pixel may have the value of either a 1 (black) or a 0 (white). When processing the raw image 417, the raw image 417 may be converted from a grayscale image to a black and white image. For example, the card recognition processor 414 (FIG. 4) may employ a method to assign grayscale pixels below a certain threshold value to a 0 (white) and grayscale pixels above a certain value to a 1 (black). Of course, different resolutions and color schemes may be employed, as desired. For example, a full color image may be captured by the imaging device 416; however, it should be appreciated that lower resolution and black and white conversion may result in smaller file sizes and reduced processing time. It is contemplated, however, that the color (e.g., red or black) of the rank or suit may be an additional distinguishing factor to assist in card recognition, even though the color is treated as irrelevant in many of the examples described herein. Adding color analysis may increase the accuracy of identifying ranks and suits; however, it may come at the expense of additional complexity and/or processing time.
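
A minimal OpenCV sketch of such a threshold-based conversion is shown below. The file name and the threshold value of 128 are assumptions, and the polarity of the result (which side of the threshold is treated as ink) may be chosen to match the imaging setup.

```python
import cv2

# Minimal sketch of the grayscale-to-binary conversion discussed above, using
# OpenCV's fixed threshold. File name and threshold value are assumed.
raw = cv2.imread("raw_card_image.png", cv2.IMREAD_GRAYSCALE)   # e.g., 320x240 grayscale
_, black_and_white = cv2.threshold(raw, 128, 255, cv2.THRESH_BINARY)
```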


The field of view 502 of the imaging device 416 defines what data is able to be captured by the imaging device 416 for the raw image 417. For example, the imaging device 416 may be located within the card handling device to capture at least a portion of a card 506 passing through the card handling device. As discussed above, a portion of the card 506 may be positioned within the field of view 502 such that the raw image 417 may include the rank and the suit of the card 506. In the example shown in FIG. 5, the rank is a Queen (Q) and the suit is a spade located in the upper left-hand corner of the card 506. Of course, the rank and suit may be located in other positions on the face of the card 506.


A rank area 508 and a suit area 510 may define an area around the rank and suit, respectively. The rank area 508 and the suit area 510 may completely encompass the rank and the suit of the card 506. Having a good fit for the rank and suit may reduce the amount of white space in the rank area 508 and the suit area 510, which may provide a better separation and a more accurate match when comparing master images and unknown images. The rank area 508 and/or the suit area 510 may be a box, a rectangle, or another shape. The rank area 508 has a rank width 516 and a rank depth 518, which may be measured in pixels. The suit area 510 has a suit width 520 and a suit depth 522, which may be measured in pixels. The rank area 508 and the suit area 510 may be separated at a split region 509. The split region 509 is a region (e.g., point, line, etc.) between the rank and the suit of the card 506, which may be used as a starting point for measuring the rank area 508 and the suit area 510. In some embodiments, the split region 509 may be ignored by finding the rank and suit symbols, such as by blob analysis (described below), and then applying the parameters from the calibration file 433 if the calibration file 433 exists.


Within the field of view 502, the main control system 412 may also define a region of interest 504 to be stored in the calibration file 433 so that subsequent analysis may focus on a smaller portion (the region of interest 504) of an image rather than a full image (the field of view 502). The region of interest 504 is a portion of the field of view 502 that includes the rank area 508 and the suit area 510 of the card 506. Focusing the analysis of the card to the region of interest 504 may reduce the processing needed for generating master images 413, 415.


As discussed above, various parameters may be stored in the calibration file 433 in order to assist the generation of the master images 413, 415 during calibration mode and the generation of the unknown images 411 during card recognition mode. These parameters may be determined by the main control system 412 and stored in the calibration file 433. The calibration file 433 may include the parameters for the specific deck type. Such parameters may include V_lines 512, H_lines 514, rank width 516, rank depth 518, suit width 520, suit depth 522, H_start 524, V_start 526, and V_offset 528. These parameters may include various locations and measurements within the raw image 417.


V_start 526 is the shift along the X-axis for finding the region of interest 504. V_start 526 may be based on changes in the camera mount position relative to the calibration target. V_start 526 may be set internally by the card recognition processor 414 or the main control system 412. V_start 526 may be approximately the same for all shufflers of the same model, but may account for small changes in camera mount position between devices.


V_offset 528 is the pixel offset that is added along the X-axis to V_start 526 to find the edge of the region of interest 504. The region of interest 504 may be defined just beyond the edge of the card 506 (e.g., by a few pixels) into the dark background. The V_offset 528 is a relative offset used to shift the card image further leftward into the region of interest 504. The V_offset 528 may be determined by checking across all card images that the edge of the region of interest 504 is just a few pixels away from the card next to each rank/suit image, as the main control system 412 uses a black-to-white transition algorithm to find the edge of the card. In order to compensate for some rotation-caused shifting of the cards, the V_offset 528 may be reduced by a number (e.g., 4 pixels) from the minimal value found across all cards.


H_start 524 is a relative offset along the Y-axis that is used to shift the card image to define the upper portion of the region of interest 504. The higher the value of H_start 524, the greater the shift. H_start 524 corresponds to a shift of the region of interest 504 downward from the top of the card 506. H_start 524 may be determined by finding the distance to the black-to-white transition at the top edge of the card 506 and reducing by a number (e.g., 4 pixels) to compensate for some shifting in the cards.
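
The black-to-white transition search mentioned in the two preceding paragraphs might be sketched as follows: walk down one column of the raw grayscale image until the bright card face is reached, then back off by a few pixels to leave a margin. The column index, threshold value, and file name are assumptions; the 4-pixel margin follows the example given above.

```python
import cv2

def find_top_edge(raw_gray, column, threshold=128):
    """Walk down one column of a grayscale image from the dark background and
    return the first row at which a black-to-white transition occurs, or None
    if no transition is found. Column index and threshold are assumptions."""
    col = raw_gray[:, column]
    for y in range(1, col.shape[0]):
        if col[y - 1] < threshold <= col[y]:   # dark background -> bright card face
            return y
    return None

# Example usage (file name assumed): reduce the found edge by 4 pixels, as in
# the example above, to compensate for some shifting of the cards.
raw = cv2.imread("raw_card_image.png", cv2.IMREAD_GRAYSCALE)
edge = find_top_edge(raw, column=100)
h_start = max(edge - 4, 0) if edge is not None else 0
```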


V_lines 512 is the number of pixels in the region of interest 504 along the X-axis. In other words, the V_lines 512 is the width of the region of interest 504. V_lines 512 may be determined by taking the maximum of the center-most edge coordinate for the rank and suit across all cards, and then subtracting V_start 526 and V_offset 528.


H_lines 514 is the number of pixels in the region of interest 504 along the Y-axis. In other words, H_lines 514 is the depth of the region of interest 504. H_lines 514 may be calculated by determining the maximum coordinate across all card images for the edge closest to the bottom of the suit.


The point having the coordinates (V_start+V_offset, H_start) may be used to define the upper left-hand corner of the region of interest 504. The size of the region of interest 504 may be defined by the V_lines 512 and H_lines 514. As a result, a smaller window (i.e., the region of interest 504) may be output in order to look at a selected region within the field of view 502 during operation.
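
Putting these parameters together, a minimal sketch of carving the region of interest out of a raw image might look like the following. The numeric parameter values and the file name are assumptions chosen purely for illustration.

```python
import cv2

# Illustrative calibration parameters (assumed values, not taken from the disclosure).
v_start, v_offset, h_start = 40, 12, 8   # offsets in pixels
v_lines, h_lines = 96, 120               # ROI width (X-axis) and depth (Y-axis)

raw = cv2.imread("raw_card_image.png", cv2.IMREAD_GRAYSCALE)

# The upper left-hand corner of the ROI is at (V_start + V_offset, H_start);
# the ROI spans V_lines pixels along the X-axis and H_lines along the Y-axis.
x0, y0 = v_start + v_offset, h_start
region_of_interest = raw[y0:y0 + h_lines, x0:x0 + v_lines]
```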


Additional parameters may be stored in the calibration file 433 that relate to the operation of the imaging device. Such additional parameters may include exposure, camera gain, brightness, camera speed, camera resolution, etc., which may be read from registers of the imaging device 416.


Additional parameters may be stored in the calibration file 433 that relate to the deck or the operation of the card recognition mode. Such additional parameters may include preload sets, split_algorithm_select, err_min_rank, err_min_suit, a deck number, and a library number.


Split_algorithm_select may be used to indicate the direction that the card recognition processor 414 begins its scan for finding the split region 509 in the unknown image. For example, if there is a non-rank blob (e.g., Wagner symbol or artwork) between the rank and the top edge of the card 506, the split_algorithm_select may be set to 1 to instruct the card recognition processor 414 to scan the region of interest 504 from bottom to top when finding the split region 509. The split_algorithm_select may be set to 0 to instruct the card recognition processor 414 to scan the region of interest 504 from top to bottom when finding the split region 509.
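
A rough sketch of such a direction-selectable scan is given below, with a Boolean argument standing in for split_algorithm_select. This is an assumed approach for illustration only, not the exact algorithm of the card recognition processor 414.

```python
import numpy as np

def find_split_row(roi_binary, scan_bottom_up=False):
    """Scan the rows of a binary region of interest (0 = ink, 255 = background)
    for a blank row that follows the first symbol encountered; scan_bottom_up
    plays the role of split_algorithm_select = 1. Illustrative sketch only."""
    rows = range(roi_binary.shape[0])
    if scan_bottom_up:
        rows = reversed(rows)              # scan from the bottom of the ROI upward
    seen_symbol = False
    for r in rows:
        if np.any(roi_binary[r] == 0):
            seen_symbol = True             # currently inside the first symbol found
        elif seen_symbol:
            return r                       # first blank row past that symbol
    return None                            # no split region found
```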


Err_min_rank is a parameter used to identify unknown rank images. The number of black pixels in the unknown image is compared to the number of black pixels in reference images, and the reference image with the highest number of pixel matches is determined to be the match. For example, if the score is less than err_min_rank, the master rank image is reported as being a non-match to the unknown rank image. Err_min_suit is a parameter used to identify unknown suit images. For example, if the score is less than err_min_suit, the suit image is reported as being a non-match. During a summing determination, a perfect match would have a 100% match rate between the unknown image and a master image. Because of some variations, this may not be the case. The err_min_rank and err_min_suit may be set to values equating to a desired error threshold (e.g., 75%) of the potential total matches in the summing determination. For example, if a rank area 508 or a suit area 510 has 32 pixels, the err_min_rank and the err_min_suit may be set to 24 (e.g., 24/32=0.75). As a result, if the percentage of pixel matches falls below this percentage (e.g., 75%), the rank and/or suit may be considered a non-match against the master image. If more than one master image provides a score that exceeds the match threshold (75%) when compared to the unknown image, then the master image having the highest score may be considered the matching symbol. If none of the master images provides a score that exceeds the match threshold (75%) when compared to the unknown image, then the unknown image may remain unknown and an error alert may be provided to the dealer. Situations in which the score may be below the match threshold include an error in the tuning process, an error in the image capture of the unknown image, a card being turned over, a card being from another deck, a damaged or dirty card, the card handling device being dirty, etc.
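
A minimal sketch of this summing determination and threshold test follows. The helper names and the 0/255 binary convention are assumptions; the same routine would be applied to ranks with err_min_rank and to suits with err_min_suit.

```python
import numpy as np

def match_score(unknown, master):
    """Count the ink pixels (assumed to be 0 in a 0/255 binary image) that an
    unknown symbol image and an equally sized master image have in common."""
    return int(np.count_nonzero((unknown == 0) & (master == 0)))

def identify_symbol(unknown, masters, err_min):
    """Return the label of the best-scoring master image whose score meets the
    err_min threshold (e.g., 24 matching pixels out of a 32-pixel symbol, i.e.,
    75%), or None if every master image falls below the threshold."""
    best_label, best_score = None, -1
    for label, master in masters.items():
        score = match_score(unknown, master)
        if score >= err_min and score > best_score:
            best_label, best_score = label, score
    return best_label
```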


The deck number may be a unique number for a particular calibration file 433 in order for the dealer to select to be used in the future. The library number may represent the number of times that the calibration file 433 has been updated. The preload sets parameter may represent a number of image shifts that are done during correlation for an image pair (e.g., an unknown image and a master image). In some embodiments, the deck number and library number may be stored in a separate deck name file that is different than the calibration file 433.


Calibration Mode Operation: Automatic Generation of Calibration File



FIG. 6 is a flowchart 600 illustrating a method for automatically generating a calibration file for a card detection system according to an embodiment of the present disclosure. The method of FIG. 6 is described with reference to the card processing system 400 of FIG. 4 and the raw image 417 of FIG. 5. The main control system 412 may operate in the calibration mode to tune the card processing system 400 to a particular deck of cards such that the card recognition processor 414 may subsequently identify unknown cards of that particular deck type while operating in card recognition mode.


At operation 610, raw images 417 for the cards in the deck may be captured. For example, the deck of cards may be inserted into the card handling device and read into the card processing system 400. At operation 620, the raw images 417 may be stored in the memory device 418 in the order in which they are received. At operation 630, it is determined if there is another card to be read and stored. If so, the next card is read, and the next raw image 417 is stored in the memory device 418. In other words, cards may be sequentially passed through the field of view 502 of the imaging device 416, and the raw image 417 of at least an area including the rank and suit symbols of each card is captured by the imaging device 416 and moved by the card processing system 400 into the memory device 418 to be used for tuning the card processing system 400. In some embodiments, the cards may be read in a predetermined order (i.e., pre-sorted), while in other embodiments, the cards may be read in any order (i.e., unsorted).


In some embodiments, the upper left-hand corner of the card may be captured by the imaging device 416, whereas in other embodiments the imaging device 416 may capture a larger portion of the card face (e.g., the entire card face). At the close of reading all or a portion of each card in the deck, a raw image 417 may be stored in the memory device 418 for each card of the deck. At this point, the raw images 417 stored in the memory device 418 may not be processed (e.g., cropped or otherwise altered), but represent full images for the entire field of view 502 of the imaging device 416, including each rank and suit symbol.


At operation 640, one or more raw images 417 may be loaded from the memory device 418 to the main control system 412 for image processing and for automatically generating the calibration file 433. In some embodiments all raw images 417 captured for the card deck may be loaded from the memory device 418 to the main control system 412 for subsequent image processing. In other embodiments, each raw image 417 may be loaded to the main control system 412 and subsequently processed one at a time.


At operation 650, the location of the rank and suit symbols may be identified within the raw images 417 along with parameters associated with their areas. For example, the main control system 412 may be configured to perform an image processing analysis of each raw image 417. The image processing analysis may include identification of measurement data (e.g., parameters representative of length, width, areas, coordinates, dimensions, etc.) relating to at least one of a rank area 508 around a rank of the card, and a suit area 510 around a suit of the card.


As an example of a method that may be used by the main control system 412 to identify the areas within the raw images 417 that include the rank and suit symbols of a card, the main control system 412 may perform a “blob” analysis or other similar analysis on one or more of the raw images 417 stored in the memory device 418. A blob is a connected region in a binary digital image. In other words, a blob may include a point and/or a region in the raw image 417 in which the pixel data differs in some property from the pixel data in surrounding areas. For example, the blob analysis may locate black regions of a black-white image or may analyze intensity of a grayscale image.


The blob analysis may result in extracting features from an identified blob, such as the orientation of a blob, the centroid of the blob, the height and width of the blob, and other similar features. An example of such an image processing program capable of being programmed to perform the analysis described herein is OpenCV (Open Source Computer Vision Library) developed by Intel Corporation of Santa Clara, Calif. OpenCV is an open source library that includes a plurality of additional algorithms and sub-features for algorithms that may be used in image analysis. The function and usage of the algorithms that may be performed using OpenCV are described in one or more of the following textbooks: “OpenCV 2 Computer Vision Application Programming Cookbook,” published May 23, 2011, author Robert Laganiere; “Learning OpenCV: Computer Vision in C++ with the OpenCV Library,” published Dec. 25, 2012, authors Gary Bradski et al.; and “Mastering OpenCV with Practical Computer Vision Projects,” published Dec. 3, 2012, authors Baggio et al., the disclosure of each of which is hereby incorporated herein by this reference in its entirety. For example, libraries from OpenCV may be employed to convert images from a grayscale image into a black and white image, find contours in an image, and fill a contour to isolate a rank and suit within the images. In addition, libraries from OpenCV may be employed to perform a blob analysis, optical character recognition (OCR), or other methods that may be used to link the appropriate rank and suit images with the card images, which may enable tuning a deck of cards out of order (as discussed in further detail below). Other software capable of blob analysis and other image processing features may also be employed, such as MATLAB® developed by Mathworks of Natick, Mass.


The blob analysis may be configured to locate blobs within a selected raw image 417. Although the term “raw image” is used during the description of the blob analysis, the image that is analyzed may be processed. For example, the main control system 412 may convert the raw image 417 to black and white, may crop (or ignore) the raw image 417 to the region of interest 504 around the card 506 that is of a smaller size than the field of view 502 of the imaging device 416, etc., prior to performing the blob analysis to identify the areas of interest that include the rank and suit of the card 506. In this context, the term “raw image” is intended to mean that the image is not a master image 413, 415, and not necessarily that no processing or alteration of the raw image 417 has been performed.


In particular, the blob analysis may be employed to locate rank and suit symbols, and distinguish between ranks, suits, and other markings on the card. As an example, the blob analysis may determine an initial number of blobs within the raw image 417. The initial number of blobs may be relatively large (e.g., 3000 blobs) based on the artwork on the card that is present within the field of view of the imaging device 416. The blob analysis may also return a location (e.g., centroid) of each blob within the raw image, as well as the measurements (e.g., height, width, etc.) of each blob.


The blob analysis may ignore other blobs based on size, shape, or location in order to arrive at the rank and suit symbols. For example, because the rank and suit of most cards may be expected to have at least a minimum size, blobs smaller than the minimum size may be ignored by the main control system 412 for purposes of finding the areas of interest in the raw image 417 that include the rank and suit. In addition, because the rank and suit of most cards may be expected to be located near the corners of the cards, blobs that are outside of that expected region may also be ignored. As a result, the remaining blobs left to be analyzed should include the one or more blobs corresponding to the rank symbol and the one or more blobs corresponding to the suit symbol on the card.
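For illustration only, the following Python sketch shows one way the blob analysis described above could be performed with OpenCV and NumPy (one of the toolsets the disclosure mentions): the image is converted to black and white, connected regions are located as contours, and blobs that are too small or too far from the expected corner region are ignored. The size and corner limits are hypothetical values chosen purely for the example.

```python
# A minimal sketch of the blob analysis described above, using OpenCV/NumPy.
# The area and corner limits are hypothetical values for illustration only.
import cv2
import numpy as np

MIN_BLOB_AREA = 40        # ignore blobs smaller than the expected rank/suit size
CORNER_LIMIT_X = 120      # only keep blobs near the upper-left corner of the card
CORNER_LIMIT_Y = 200

def find_candidate_blobs(gray_image: np.ndarray):
    """Return bounding boxes and centroids of blobs that could be the rank or suit."""
    # Convert the grayscale image to black and white (symbols dark on a light card).
    _, binary = cv2.threshold(gray_image, 128, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x return signature: (contours, hierarchy).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        area = cv2.contourArea(contour)
        # Ignore blobs that are too small or too far from the expected corner region.
        if area < MIN_BLOB_AREA or x > CORNER_LIMIT_X or y > CORNER_LIMIT_Y:
            continue
        m = cv2.moments(contour)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        candidates.append({"box": (x, y, w, h), "centroid": (cx, cy)})
    return candidates
```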


In some embodiments, the main control system 412 may identify the “10” rank symbol first (i.e., excluding other cards, such as Jack, King, etc.). Unique characteristics of the 10 rank symbol may be particularly helpful for the main control system 412 to identify the location and dimensions of the 10 rank and also distinguish the 10 rank from the other ranks. For example, the 10 rank may be recognizable because the 10 rank has two relatively large blobs that are side by side. Each of the two blobs may have approximately the same location on the Y-axis for the centroid, along with approximately the same height. In addition, the blob analysis may recognize that one blob (the “1”) is narrow whereas the other blob (the “0”) is wide. The 10 rank may also be useful as a basis in defining the measurements for the rank area 508 because the 10 rank symbol is usually the largest of the various rank symbols for many deck types.
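A minimal sketch of how the 10 rank heuristic described above might be expressed in code follows, assuming the candidate-blob dictionaries from the previous sketch. The tolerances are hypothetical and would be chosen for a particular imaging setup.

```python
# Illustrative sketch: detecting the "10" rank from two side-by-side blobs with
# similar centroid height and similar blob height, one narrow ("1") and one
# wide ("0"). Tolerances are hypothetical.
def looks_like_ten(blob_a: dict, blob_b: dict, y_tol: float = 4.0, h_tol: int = 3) -> bool:
    _, _, wa, ha = blob_a["box"]
    _, _, wb, hb = blob_b["box"]
    _, cya = blob_a["centroid"]
    _, cyb = blob_b["centroid"]
    same_row = abs(cya - cyb) <= y_tol            # centroids at about the same Y
    same_height = abs(ha - hb) <= h_tol           # blobs of about the same height
    narrow, wide = sorted((wa, wb))
    return same_row and same_height and wide >= 2 * narrow   # "1" narrow, "0" wide

def find_ten_rank(candidates: list):
    """Return the pair of blobs forming the 10 rank, if present."""
    for i, a in enumerate(candidates):
        for b in candidates[i + 1:]:
            if looks_like_ten(a, b):
                return a, b
    return None
```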


The suit may be identified by locating the relatively large blob near the 10 rank. In other words, once the blob's location and measurements have been determined for the rank, the location and measurements of the suit may be based on the blob identified as the rank. For example, the suit may be expected to be the blob located below the rank (in the Y direction), which is the case for most deck types.


In some embodiments, the main control system 412 may determine the location and measurements of the suit symbol prior to the rank symbol. As a result, once the blob's location and measurements have been determined for the suit, the location and measurements of the rank may be based on the blob identified as the suit. For example, the rank may also be expected to be the blob located above the suit (in the Y direction), which is the case for most deck types.


As another example, the rank and/or suit symbol may be identified by comparing the portions of the raw image 417 closest to the corner of the card 506 against a generic file describing rank and/or suit curvatures in a manner that is immune to scale or rotation.


At operation 660, the rank area 508 and the suit area 510 may be defined. For example, once the blobs are identified for the rank and suit of the card in the raw images 417, the results from the blob analysis (i.e., the locations and measurements, e.g., in pixels) may be further analyzed to define the rank area 508 and the suit area 510 as well as other related parameters that will be part of the calibration file 433. The rank area 508 and the suit area 510 are areas that are defined based on the location and measurements of the respective ranks and suits for the specific deck type. In other words, there may be one rank area 508 that is defined to be large enough that all ranks from all raw images 417 fit within the rank area 508. Similarly, there may be one suit area 510 that is defined to be large enough that all suits from all raw images 417 of the deck type fit within the suit area 510.


In some embodiments, the main control system 412 may define the rank area 508 to be based on the measurements and location of the 10 rank because the 10 rank is often the largest rank symbol for most decks. In some embodiments, the main control system 412 may analyze the measurements for the ranks from all raw images 417. In other words, the largest blob identified as a rank may be used as the basis of the rank area 508. Similarly, in some embodiments, the main control system 412 may analyze the measurements for the suits from all raw images 417. The largest blob identified as a suit may be used as the basis of the suit area 510. In other words, the final parameters for the rank area 508 and the suit area 510 that are stored in the calibration file 433 may be determined by taking the maximum measurements of the rank symbol and suit symbol, and then expanding the values slightly to allow for some white space around the ranks and suits. Expanding the maximum value of the ranks and suits in this manner may allow for slight variations in print size, as well as for rotation of cards that may exist when reading in the cards.


At operation 670, a region of interest 504 may be defined. The region of interest 504 should be large enough to completely contain each rank and suit symbol of each card of the specific deck type with some additional area added to account for variability in the acquired signal and variability in card position. Thus, the region of interest 504 may be defined by measuring a size (e.g., in pixels) and location of each rank area 508 and suit area 510 defined by the system, and identifying an area in which every rank area 508 and suit area 510 appears during the training and calibration phase. Because there may exist some variation in the location of the rank and suit symbols throughout the deck, the rank area 508 and suit area 510 for the entire deck may be analyzed to determine a minimum number of pixels (in both X and Y directions) to consider to ensure that the rank and suit symbols fall within the region of interest 504. In some embodiments, the minimum number of pixels may be used, whereas in other embodiments the region of interest 504 may include excess pixels to allow for variations (e.g., card orientation) during actual use.


In some embodiments, the region of interest 504 may be defined by adding a fixed dimension in the X-direction from the right side of the image, while the dimension in the Y-direction may be determined by defining the furthest location from the top of the card 506 in the Y-direction to the bottom of the suit symbol. Because the rank and suit symbols may vary in location, the furthest location may be determined after analyzing the locations of all suits from the image processing analysis. The final dimensions of the region of interest 504 may include some padding on these measurements in each direction to compensate for slight variations in card orientation during use. Other methods for defining the boundaries for the region of interest 504 are also contemplated such that the region of interest 504 has a suitable size and location to ensure that rank and suit symbols within the deck will be within the defined region of interest 504 even if the locations of the rank and suit symbols may vary somewhat.
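As an illustration of the area and region-of-interest calculations described above, the short Python sketch below derives a rank/suit area from the largest bounding box found across the deck and a region of interest that bounds every box, each with a few pixels of padding. The padding values are hypothetical.

```python
# Illustrative sketch: deriving the rank area, suit area, and region of
# interest from (x, y, w, h) bounding boxes collected across all raw images.
PAD = 2  # extra white space, in pixels, around the largest rank/suit found

def symbol_area(boxes):
    """Width/height large enough to hold every box, plus padding."""
    width = max(w for _, _, w, _ in boxes) + 2 * PAD
    height = max(h for _, _, _, h in boxes) + 2 * PAD
    return width, height

def region_of_interest(boxes, pad: int = 4):
    """Smallest rectangle containing every rank/suit box across the deck, padded."""
    x0 = max(min(x for x, _, _, _ in boxes) - pad, 0)
    y0 = max(min(y for _, y, _, _ in boxes) - pad, 0)
    x1 = max(x + w for x, _, w, _ in boxes) + pad
    y1 = max(y + h for _, y, _, h in boxes) + pad
    return x0, y0, x1 - x0, y1 - y0
```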


At operation 680, the calibration file 433 may be automatically generated and the parameters may be stored therein. The calibration file 433 may include, among other information, this measurement data, which can be subsequently used by the card processing system 400 in the calibration mode to generate master images 413, 415, or during the card recognition mode to identify unknown cards of that particular deck type, as discussed in more detail below.


The parameters stored in the automatically generated calibration file 433 may include measurements for the rank area 508 (rank height and rank width) and the suit area 510 (suit height and suit width), and the measurements and coordinates for the region of interest 504, as well as additional parameters previously discussed with respect to FIG. 5. The calibration file 433 may be stored in memory 432 within the main control system 412. Storing the calibration file 433 in memory 432 of the main control system 412 may include storing the calibration file 433 in a subdirectory for the particular deck or deck type that is used to tune the card processing system 400. The calibration file 433 may further include other parameters that may be used by the card recognition processor 414 in generating master images, cropping unknown images, or for other reasons as discussed above.
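For illustration, the sketch below shows one way the automatically derived parameters could be persisted as a simple text file in a per-deck subdirectory. The file layout and parameter names are hypothetical; the disclosure does not limit the calibration file 433 to this format.

```python
# Illustrative sketch: persisting calibration parameters as key=value lines.
from pathlib import Path

def write_calibration_file(deck_dir: str, params: dict) -> Path:
    path = Path(deck_dir)
    path.mkdir(parents=True, exist_ok=True)
    cal_file = path / "calibration.txt"
    with cal_file.open("w") as f:
        for name, value in params.items():
            f.write(f"{name}={value}\n")
    return cal_file

# Example usage with hypothetical parameter names:
write_calibration_file("decks/deck_type_A", {
    "rank_width": 28, "rank_height": 34,
    "suit_width": 26, "suit_height": 28,
    "roi_x": 8, "roi_y": 12, "roi_width": 60, "roi_height": 110,
    "err_min_rank": 714, "err_min_suit": 546,
})
```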


Calibration Mode Operation: Automatic Generation of Master Images



FIG. 7 is a flowchart 700 illustrating a method for generating master images according to an embodiment of the present disclosure. The method of FIG. 7 is described with reference to the card processing system 400 of FIG. 4 and the raw image 417 of FIG. 5. The calibration mode may be employed to tune the card processing system 400 for use with a particular deck of cards. Using the calibration file 433, the card recognition processor 414 (or, alternatively, the processor 430 of the main control system 412) may process the raw images 417 stored in the memory device 418 to generate the master images 413, 415 for that specific deck type to be used in the card recognition mode of the card processing system 400.


For the flowchart 700, it is presumed that a calibration file 433 has already been created, such as by the method of FIG. 6, or other methods described herein. For example, the calibration file 433 may be automatically created using image processing methods to locate rank and suit symbols in the raw images 417 and determine parameters (e.g., rank area 508, suit area 510, region of interest 504, etc.) regarding measurements of the rank and suit symbols. These parameters, along with other parameters discussed above, may be included with the calibration file 433 that may be stored in the deck library 434 of the file system 431 maintained by the main control system 412.


At operation 710, the raw images 417 may be loaded into the main control system 412 and/or the card recognition processor 414. If the raw images 417 are already stored in the memory device 418, the raw images 417 may be retrieved from the memory device 418. Thus, the raw images 417 used to generate the master images 413, 415 may be the same raw images 417 that were used in the analysis for automatically generating the calibration file 433. If the raw images 417 are not stored in the memory device 418, a deck of the same deck type may be read by the card processing system 400 so that new raw images 417 may be generated and loaded.


At operation 720, the raw images 417 may be converted to the master images 413, 415. For example, the card recognition processor 414 may convert the raw images 417 to the master images 413, 415. In particular, the card recognition processor 414 may receive and crop the raw images 417 according to the parameters previously stored in the calibration file 433 to automatically generate a master rank image 413 and a master suit image 415 for the card 506.


For example, FIGS. 8A through 8C illustrate a process of generating the master rank image 413 and the master suit image 415 from the raw image 417 according to the parameters stored in the calibration file 433. As shown in FIG. 8A, the first cropping of the raw image 417 may be to limit a processed image 800 of the card 506 (seven of diamonds shown in FIG. 8A) to have an area that is determined by the region of interest 504 stored in the calibration file 433. The processed image 800 shown in FIG. 8A may have been generated from the raw image 417. For example, the processed image 800 may be a black and white image, whereas the original raw image 417 may have been a grayscale image or a color image.


As shown in FIGS. 8B and 8C, the secondary cropping includes cropping the processed image 800 of FIG. 8A to generate the master images 413, 415. For example, the master rank image 413 may be the result of cropping the processed image 800 according to the parameters of the calibration file 433 that correspond to the rank area 508. The master suit image 415 may be the result of cropping the processed image 800 according to the parameters of the calibration file 433 that correspond to the suit area 510. In other words, the card recognition processor 414 may crop the raw images 417 according to the stored region of interest 504, rank area 508, and suit area 510 to generate separate master images 413, 415 for the rank and the suit of the card. The other parameters (e.g., offset values) stored in the calibration file 433 may assist in locating the rank and suits so that the application of the rank area 508 and suit area 510 parameters may result in master images 413, 415 that contain the entirety of the rank and suit, along with the desired alignment. Although FIGS. 8B and 8C show the rank and suit to be shifted to the left and upper edges of the rank area 508 and the suit area 510, there may be some padding (i.e., white space) on each edge. In order to maintain a consistent location of the rank and suit in the master images 413, 415, the rank and suit may be shifted toward one of the corners as shown.
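A minimal sketch of this two-stage cropping, assuming NumPy array slicing and the hypothetical calibration keys used earlier, is shown below. The rank_xy and suit_xy arguments stand in for the symbol locations found by the blob analysis.

```python
# Illustrative sketch: first crop to the region of interest, then crop the
# rank and suit areas out of one card image. Calibration keys are hypothetical.
import numpy as np

def crop_master_images(binary_image: np.ndarray, cal: dict,
                       rank_xy: tuple, suit_xy: tuple):
    """Return (master_rank_image, master_suit_image) cropped from one card image."""
    # First crop: limit the image to the region of interest.
    x, y, w, h = cal["roi_x"], cal["roi_y"], cal["roi_width"], cal["roi_height"]
    roi = binary_image[y:y + h, x:x + w]
    # Second crop: cut out the rank area and the suit area at the locations
    # (within the ROI) reported by the blob analysis.
    rx, ry = rank_xy
    sx, sy = suit_xy
    rank = roi[ry:ry + cal["rank_height"], rx:rx + cal["rank_width"]]
    suit = roi[sy:sy + cal["suit_height"], sx:sx + cal["suit_width"]]
    return rank, suit
```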


In some embodiments, a single master image may be generated for each card that includes both the rank and the suit (essentially the region of interest 504). Doing so, however, may require 52 master images (one for each card) rather than having as few as 17 master images (one for each rank and one for each suit), which may change the processing time during the card recognition mode.


Referring again to FIG. 7, at operation 730, the master images 413, 415 may be stored in the appropriate deck library 434 together with the corresponding calibration file 433, deck name file 435, and other related files. The master images 413, 415 may be stored as image files (e.g., bitmap files) in the deck library 434 of the file system 431 of the main control system 412. The master images 413, 415 may further be loaded into the memory of the card recognition processor 414 for comparison with the unknown images 411 during card recognition mode. The master images 413, 415 may be separate files stored in the same subdirectory of the file system as the calibration file 433 (e.g., a text file). In some embodiments, the calibration file 433 may be combined in a file with the master images 413, 415 such that a single file may include image data and the calibration parameters.


In a standard deck of cards, 18 master images 413, 415 may be stored: one master rank image 413 for each rank (2-10, J, Q, K, A) and one master suit image 415 for each suit (Heart, Diamond, Club, Spade), as well as a master image for a joker. The master image for a joker may be stored as a rank. Other symbols may also be printed on the face of some of the cards. For example, a “Wagner” symbol is a special symbol printed on some of the cards for a deck that is used in certain card games in order to assist in game play. In particular, a Wagner symbol is typically useful during blackjack by being printed on the face of each card having a value of ten (i.e., each 10 rank, Jack rank, Queen rank, King rank) and eleven (i.e., each Ace rank) to assist with the determination that the dealer has a “21.” The Wagner symbol is often a polygon shape that is located between the rank and the top of the card. In some embodiments, a master image for a Wagner symbol may be created and stored, while in other embodiments the Wagner symbol may simply be ignored by the main control system 412. Other special symbols may also be treated similarly.


At operation 740, the master images 413, 415 may be linked with the correct rank and suit such that the card processing system 400 may know which master image 413, 415 corresponds to the correct rank and suit. When the link between the master images 413, 415 and the correct rank and suit are established, the link may be known by the card processing system 400, such as by creating a list in the calibration file 433, by storing the information in another file in the deck library 434, or by some other suitable method. The files for the master images 413, 415 may be named (or re-named) to have an identifier (e.g., name, number, etc.) that indicates what the rank or suit is for each master image 413, 415.


In some embodiments, the order of each card in the deck may be known when master images 413, 415 are generated because the deck may be pre-sorted when the raw images 417 are captured. As a result, each master image 413, 415 may be linked to the correct rank or suit based on the expected order of the pre-sorted deck as the raw images 417 were saved in the memory device 418. Such an embodiment, however, may rely on the card manufacturer or technician (or dealer, pit boss, etc.) who inserts the deck into the card handling device to insert a pre-sorted deck. Inserting an unsorted deck into such an embodiment may result in improper links being established between the master images 413, 415 and incorrect ranks and suits.


In other embodiments, the deck may not be required to be in any particular order when generating the master images 413, 415. The card processing system 400 may be configured to make the proper links between the master images 413, 415 and the correct ranks and suits even with an unsorted deck. The main control system 412 may perform additional image processing on the master images 413, 415 in order to determine which rank or suit should be linked to each master image 413, 415. Of course, at this point the master images 413, 415 are not linked to any particular rank or suit, and the actual identity of the master image 413, 415 may not yet be known to the main control system 412.


As discussed above, there may be a plurality of combined deck sub-directories 440 (FIG. 4) that include a plurality of normalized images for a corresponding rank and suit from a plurality of different deck types. While tuning the master images 413, 415 for a particular deck, the identity may be determined by comparing normalized versions of the master images 413′, 415′ with the normalized images D1, D2, . . . DN stored in the combined deck sub-directories 440.



FIGS. 8D and 8E show an example of how the master images 413, 415 may be normalized to form normalized master images 413′, 415′. Referring to FIG. 8D, the 7 rank may be normalized by cropping the master rank image 413 so that there is at least one black pixel from the 7 rank along the outer border of the cropped image (represented by box 802). The cropped rank image may be expanded to a common image size (e.g., rank area 508) as indicated by the arrows shown in FIG. 8D. The other master rank images 413 may be normalized in a similar manner. Referring to FIG. 8E, the diamond suit may be normalized by cropping the master suit image 415 so that there is at least one black pixel from the diamond suit along the outer border of the cropped image (represented by box 804). The cropped suit image may be expanded to a common image size (e.g., suit area 510) as indicated by the arrows shown in FIG. 8E. The other master suit images 415 may be normalized in a similar manner. The normalized master rank and suit images 413′, 415′ may appear to be somewhat “bloated” in comparison to the master rank and suit images 413, 415. It should be understood that the normalized images D1, D2, . . . DN of the combined deck sub-directories 440 may be normalized in a similar manner having a common image size as the normalized master rank and suit images 413′, 415′.
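The normalization described above might be sketched as follows, assuming a binary image with black symbol pixels (value 0) on a white background (value 255). The crop-and-stretch step uses OpenCV's resize; the names are illustrative only.

```python
# Illustrative sketch: crop a master image to the tightest box containing the
# symbol, then expand ("bloat") it to a common comparison size (FIGS. 8D/8E).
import cv2
import numpy as np

def normalize_symbol(master: np.ndarray, common_size: tuple) -> np.ndarray:
    ys, xs = np.where(master == 0)                 # coordinates of symbol pixels
    if len(xs) == 0:
        return master                              # nothing to normalize
    tight = master[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    width, height = common_size                    # e.g., the rank or suit area
    # Stretch the tight crop back out to the common comparison size.
    return cv2.resize(tight, (width, height), interpolation=cv2.INTER_NEAREST)
```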


To link the master images 413, 415 to the appropriate rank and suit during tuning, the normalized master images 413′, 415′ may be compared with the normalized images D1, D2, . . . DN of the combined deck sub-directories 440. The comparison of the normalized master images 413′, 415′ with the normalized images D1, D2, . . . DN of the combined deck sub-directories 440 may be performed by a pixel to pixel comparison (e.g., X-OR comparison) to arrive at a score that is used to determine whether a match occurs. For example, referring again to FIG. 4, an unknown normalized master rank image 413′ may be compared to the normalized images D1, D2, . . . DN of a first combined deck sub-directory 440 (e.g., all 2 ranks stored therein) to produce a first score. The first score may be a combined score for all 2 ranks of the sub-directory 440 or a plurality of individual scores for each normalized images D1, D2, . . . DN. The unknown normalized master rank image 413′ may then be compared with the normalized images D1, D2, . . . DN for each of the remaining combined deck sub-directories 440 (e.g., as a loop) to produce the scores for each of them. The highest score produced may indicate the identity of the unknown normalized master rank image 413′. An unknown master suit image 415′ may likewise be compared with the appropriate combined deck sub-directories 440 to produce a score that indicates the identity of the master suit image 415′.


The comparisons may include a pixel by pixel comparison (e.g., X-OR) for each of the normalized images. In some embodiments, the comparison may further include a comparison of a pixel as well as its neighboring pixels to count in the score. For example, pixels that are not located on the edges of the image have eight bordering pixels. A valid count may be added to the score if the bordering pixels are also the same as the middle pixel being used in the comparison.
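A minimal sketch of the neighborhood-aware pixel comparison described above follows; it counts a pixel toward the score only when the pixel and all eight of its neighbors agree between the two images. Both images are assumed to be binary and the same size; the function name is hypothetical.

```python
# Illustrative sketch: pixel comparison in which a pixel contributes to the
# score only when its eight neighbors also agree.
import numpy as np

def neighborhood_score(img_a: np.ndarray, img_b: np.ndarray) -> int:
    equal = (img_a == img_b)
    score = 0
    rows, cols = equal.shape
    for r in range(1, rows - 1):          # skip border pixels (fewer than 8 neighbors)
        for c in range(1, cols - 1):
            if equal[r - 1:r + 2, c - 1:c + 2].all():   # pixel and all 8 neighbors match
                score += 1
    return score
```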


In some embodiments using normalized images, the spades and clubs may appear somewhat similar such that false identifications may occur. In some embodiments, the normalized images D1, D2, . . . DN of the combined deck sub-directories 440 may be altered somewhat to further distinguish the two types of suits. For example, the normalized images D1, D2, . . . DN of the combined deck sub-directories 440 that correspond to the clubs may have a circle (or other shape) drawn within each of the leaves. During tuning, the unknown master suit images 415 that correspond to clubs may be identified in an alternative method, such as counting pixels across a different path (e.g., a 45-degree line from the middle left side of the image to the middle top side of the image) that will identify the different shapes of the club versus the spade.


In some embodiments, the master images 413, 415 may be compared with each other to determine secondary relationships between master images 413, 415 and identify the correct rank and suit for each master image 413, 415. The method of comparison includes determining a score, which may be represented by a number (or percentage) of matched pixels or a number (or percentage) of unmatched pixels. In some embodiments, the method of comparing may include comparing a shape of an edge of a symbol, or other methods of comparison. By comparing the master images 413, 415 with each other, the correct rank and suit for each master image 413, 415 may be determined indirectly from its relationships with the other master images 413, 415. For example, the master images 413, 415 may be compared with each other to obtain a score. The score may be an indication of how similar or dissimilar each master image 413, 415 is to the others.


As an example, a first master rank image 413 is compared with a second master rank image 413, the first master rank image 413 is then compared with a third master rank image 413, the first master rank image 413 is then compared with a fourth master rank image 413, and so on. As a result, a comparison is performed for each pairing of the master rank images 413, and a score may be recorded for each individual pairing among the group of master rank images 413. A similar comparison may be performed to obtain a score for each pairing of the master suit images 415.


Of course, comparisons between different master images 413, 415 would generally yield relatively low (dissimilar) scores. However, the A rank and the J rank may be more similar to each other (yielding a higher score) than the A rank and the 5 rank. In fact, each rank may have a different rank that, when compared, yields a closest secondary match relative to its comparisons with the other ranks. For example, for some decks, the following table may represent the next best match for various ranks. The score represented in Table 1 below is a percentage of pixel matching as described in further detail below.











TABLE 1

Rank    Next Best Match Rank    Score (%)
A       J                       20
2       4                       20
3       5                       15
4       J                       17
5       6                        9
6       5                       12
7       4                       19
8       6                       17
9       J                       20
10      Q                       18
J       4                       17
Q       4                       17
K       J                       19

The “Next Best Match Rank” shown in Table 1 is to be understood as an example for one style of card deck. Similarly, the scores shown in Table 1 are approximate. Of course, as decks may vary from one design to the next, the next best match and the score may also differ from one deck to the next. In addition, the analysis regarding the secondary comparisons of master images 413, 415 may also consider the expected matches beyond the next best match. In other words, the accuracy may be improved if the third best match, the fourth best match, etc., are also considered.


In some embodiments, a specific master image 413, 415 may be used as a baseline master image to determine the links for the other master images. For example, the blob analysis may be used to identify a 10 rank by searching for the unique characteristics of the 10 rank as discussed above. The master rank image 413 for the 10 rank may then be used to compare with the unknown master rank images 413. In other words, the 10 rank is compared with an unknown master rank image 413 to obtain a first score, the 10 rank is compared with another unknown master rank image 413 to obtain a second score, and so on. Each rank may have a score that is different with respect to a comparison with the 10 rank. These different scores may correspond to a rank based on the expected relative match to the 10 rank. The highest score resulting from using the 10 rank as a baseline master rank image 413 may be linked to the Queen rank (the closest next best match in this example). Once the Queen rank has been identified, its master rank image 413 may be compared to the other master rank images 413, from which the 4 rank may be identified as the image having the highest score resulting from the comparisons with the Queen rank. Analyzing these secondary relationships in view of expected secondary relationships may result in the various master images 413, 415 being linked to the correct ranks and suits.
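By way of a hedged sketch only, the chaining of expected next-best-match relationships could be expressed as follows. The expected chain is a truncated, hypothetical example following Table 1 for one deck style, and match_fraction is a simple stand-in scoring function.

```python
# Illustrative sketch: labeling unknown master rank images by walking a chain
# of expected "next best match" relationships starting from the 10 rank.
import numpy as np

def match_fraction(a: np.ndarray, b: np.ndarray) -> float:
    return float((a == b).mean())                 # fraction of matching pixels

# Per Table 1 (illustrative, truncated): 10 -> Q, Q -> 4, 4 -> J.
EXPECTED_CHAIN = ["10", "Q", "4", "J"]

def identify_by_chain(baseline_image: np.ndarray, unknown_images: list):
    """Label some unknown master rank images by walking the expected chain."""
    labels = {}                                   # index into unknown_images -> rank
    current = baseline_image
    remaining = dict(enumerate(unknown_images))
    for rank in EXPECTED_CHAIN[1:]:
        if not remaining:
            break
        # The unknown image scoring highest against the current image is
        # expected to be that image's "next best match" rank.
        best = max(remaining, key=lambda i: match_fraction(current, remaining[i]))
        labels[best] = rank
        current = remaining.pop(best)
    return labels
```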


Other ranks may be used as the baseline master rank image 413. In some embodiments, the deck may be required to be only partially sorted, such as requiring the technician to have a specific card as the first card read from the deck. For example, the Queen of Hearts may be required to be the first card read from the deck. The master images 413, 415 for the Queen rank and the Hearts suit may then be used as the baseline master rank image 413 similar to the process described above for using the 10 rank.


In other embodiments, secondary analysis may be used to determine the identity of the suit for the master suit image 415, such as by analyzing the curvature of the suit shapes or by comparing the other master suit images 415 to determine the identity from secondary relationships of non-matching master images. Such secondary analysis may be beneficial for situations when the deck may not be sorted in any particular order. Such secondary analysis may also be performed for other reasons, such as to verify the order of a sorted deck (e.g., the system may still require a sorted deck, but these secondary relationships may provide a way to alert the operator that the deck was not sorted properly), verify a correct deck (e.g., 52 unique cards exist), and verify quality of the total scan (e.g., identify dirty cards). For example, even in the situation of having a pre-sorted deck, secondary verification may be desired to determine whether the tuning process was correct. One example may include comparing master images 413, 415 with each other to determine secondary relationships. For example, such secondary relationships may reveal incorrect relationships because a card was out of order. Another secondary verification for a pre-sorted deck may be a simple check to see if the 10 ranks are in the correct location rather than verifying each and every card. Another secondary verification may be to display the master image along with the identification determined by the card recognition system. The operator may be allowed to select whether the identification is correct and to make any changes if incorrect.


Once the master images 413, 415 have been created and properly linked with the correct ranks and suits, the card processing system 400 may be said to be “calibrated” or “tuned” as to the particular deck type.


In some embodiments, only a selected portion of the raw images 417 may be fed back from the memory device 418 into the card recognition processor 414 for generating the master images 413, 415. In some embodiments, only one image for each rank from one of the suits may be used to generate the master rank images 413. For example, the Ace of Diamonds may be used to obtain the master image linked with the Ace rank, while the other Ace ranks (e.g., Ace of Spades, Ace of Hearts, Ace of Clubs) may be ignored. The master rank images 413 for the other ranks may be generated in a similar manner. While generating the master images 413, 415 for the ranks, certain card images may be selected to generate the master suit image 415 for each suit as well.


In other embodiments, each raw image 417 may be used to obtain a master image 413, 415 for each rank and suit. As a result, a plurality of master images 413, 415 may be generated for each rank and suit. For example, four separate master rank images 413 may be created for the Ace rank (i.e., one Ace image from a card for each suit). The main control system 412 may then analyze each of the master rank images 413 for that rank to determine which of the master rank images 413 is desired to be used as the ultimate master rank image 413 during the card recognition mode. In other words, the main control system 412 may choose what it considers “the best” image from among a plurality of images to select the master image 413, 415 for a particular rank or suit. The best image may be determined by comparing against each of the other master images 413, 415 to obtain a score that is most dissimilar to the other master images 413, 415 of a different type so that the separation between master images 413, 415 is the greatest. For example, each of the master rank images 413 from a single rank (e.g., four different Ace images) may be compared to the master rank images 413 of the other ranks (e.g., 2-10, J, Q, K). The Ace image (from among all Ace images) that provides the most dissimilar score when compared with the images from the other ranks may be the Ace image selected as the master rank image 413 for the Ace rank. Other factors or determinations may be used for, or contribute to, the determination of which master image 413, 415 is to be used as the master image 413, 415 for a particular rank or suit.


In some embodiments, the main control system 412 may employ OCR techniques to recognize the correct rank or suit of each master image 413, 415 from contours identified in the master images 413, 415, so that each master image 413, 415 may be linked with its correct rank or suit.


For example, FIGS. 9A, 9B, and 9C are a series of card images 900A, 900B, and 900C that illustrate a method for generating master images by finding and filling contours according to another embodiment of the disclosure. A contour is a line that identifies an edge of a white area within the card image that is not connected to any other white area within the card image.


As discussed above, the grayscale image from a raw image 417 may be converted to a black and white image. In some embodiments (e.g., as is shown in FIGS. 8A-8C) the card area may be white and the ranks and suits may be black. In some embodiments, these white and black regions may be inverted, as is the case for the images in FIGS. 9A-9C, 10A and 10B. Referring again to FIG. 9A, contours may be identified by an image processing analysis program (e.g., OpenCV), such that the white area from the Jack rank and the diamond suit stands out from the black area for the card background. Other white space is filled, which may occur in stages, as shown in FIGS. 9B and 9C. The resulting image that includes both the rank and the suit depicted in white has the remainder of the image depicted in black, which may provide further contrast and improve subsequent analysis on the master rank images and master suit images that are generated.



FIGS. 10 and 11 show histogram groups 1000, 1100 that may result from an OCR analysis of the master suit images and the master rank images generated by the contour analysis illustrated in FIGS. 9A-9C. OCR may be employed during the tuning mode, particularly for determining the rank and the suit of the unknown master images when the cards are not sorted and are not read in any particular order that is expected by the main control system 412. OCR may employ an artificial neural network machine-learning algorithm for analyzing the contours of the master images 413, 415.


The image in the lower left corner of each of the histogram groups 1000, 1100 is the master image 413, 415 that is unknown and to be determined. An image 1001 in the upper left corner of each of the histogram groups 1000, 1100 is a sum (a numerical count) of the white pixels along each column of the master image 413, 415 (i.e., a vertical histogram). In other words, the white pixels in the first column of pixels of the master image 413, 415 may be counted, with the sum being represented by the number of pixels in the light region of the first column of the image 1001. The corresponding dark region of the first column of the image 1001 may represent the sum of black pixels in the first column of pixels of the master image 413, 415.


An image 1002 in the lower right corner of each of the histogram groups 1000, 1100 is a sum of the white pixels along each row of the master image 413, 415 (i.e., a horizontal histogram). In other words, the white pixels in the first row of pixels of the master image 413, 415 may be counted, with the sum being represented by the number of pixels in the light region of the first row of the image 1002. The corresponding dark region of the first row of the image 1002 may represent the sum of black pixels in the first row of pixels of the master image 413, 415.


An image 1003 in the upper right corner of each of the histogram groups 1000, 1100 is a low resolution image (e.g., 5×5 pixels) of the master image 413, 415. The images 1001, 1002, 1003 may be compared with previous OCR results of other normalized rank and suit images for one or more deck types to determine the correct rank or suit of the master images 413, 415.
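The three feature images described above could be computed roughly as sketched below, assuming a binary master image with a white symbol on a black background. The 5×5 thumbnail size mirrors the example given; the function name is illustrative.

```python
# Illustrative sketch: per-column sums (vertical histogram, image 1001),
# per-row sums (horizontal histogram, image 1002), and a low-resolution
# thumbnail (image 1003) for OCR-style comparison.
import cv2
import numpy as np

def ocr_features(master: np.ndarray, thumb_size: int = 5):
    white = (master > 0).astype(np.uint32)
    column_sums = white.sum(axis=0)               # vertical histogram
    row_sums = white.sum(axis=1)                  # horizontal histogram
    thumbnail = cv2.resize(master, (thumb_size, thumb_size),
                           interpolation=cv2.INTER_AREA)   # low-res image
    return column_sums, row_sums, thumbnail
```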


Card Recognition Mode Operation



FIG. 12 is a flowchart 1200 illustrating a method for determining the identity of unknown images according to an embodiment of the present disclosure. The method of FIG. 12 is described with reference to the card processing system 400 of FIG. 4 and the raw image 417 of FIG. 5. The card recognition mode may be employed to operate the card processing system 400 that has been tuned to a deck of cards to recognize the identity of unknown cards as the cards pass through the card handling device. Using the calibration file 433, the card recognition processor 414 (or, alternatively, the processor 430 of the main control system 412) may process the unknown images 411 to be compared with the master images 413, 415 for determining a match.


At operation 1210, the card recognition processor 414 may be initialized by loading the calibration file 433 and the master images 413, 415 into the card recognition processor 414 from the appropriate deck library 434 for the deck type being used. The operator (e.g., dealer) may select the appropriate deck type (corresponding to the deck name file 435) from an interface of the card handling device. Loading the calibration file 433 may include loading the actual file itself, or loading at least some of the information contained therein.


At operation 1220, an unknown image 411 may be captured. For example, the dealer may place the deck into the card handling device for shuffling, for dealing, etc. Each time a card is moved past the card present sensor 420, the imaging device 416 may capture an unknown image 411 of the card responsive to the trigger signal from the card present sensor 420. Unknown images 411 may be captured during game play to verify hands, at the beginning of game play to verify a deck, etc.


At operation 1230, the unknown image 411 may be converted from a raw image to generate unknown rank and suit images. Using the parameters (e.g., the region of interest 504, rank area 508, suit area 510, etc.) from the calibration file 433, the card recognition processor 414 may generate an unknown rank image and an unknown suit image from the unknown image 411. Generating an unknown rank image and an unknown suit image may mean generating separate files from the unknown image, but also encompasses embodiments in which boundaries for the rank and suit areas are simply defined within the unknown image 411 so that any further comparison is limited to within the defined boundaries. Because the parameters in the calibration file 433 were also used to generate the master images 413, 415, the unknown rank image and the master rank images 413 may be the same size (i.e., dimensions), and the unknown suit image and the master suit images 415 may be the same size. When the unknown image 411 is captured, the card recognition processor 414 may also convert the unknown image 411 from a grayscale image to a black and white image.


At operation 1240, the unknown image 411 (e.g., the unknown rank and suit images) may be compared to the master images 413, 415. The comparison may be based on comparing pixel to pixel of each file to generate a correlation score. The score may be represented as a number of pixels, a percentage of pixels, or other suitable form. In another embodiment, each master image may be scanned across a larger unknown image 411 to determine a plurality of scores that may be used to determine if a match exists. In some embodiments, contours may be determined and analyzed using an OCR method.


The card recognition processor 414 may compare the unknown image 411 against the set of master images 413, 415 to determine the identity of the card. For example, the card recognition processor 414 may compare the unknown rank image (e.g., a separate image or a defined boundary around the rank) against each of the thirteen master images for the ranks (2-10, J, Q, K, A). The card recognition processor 414 may also compare the unknown suit image (e.g., a separate image or a defined boundary around the suit) against each of the four master images for the suits (Diamond, Heart, Spade, Club). Based on the results of these comparisons, the card recognition processor 414 may determine the identity of the card by selecting the symbols with the highest degree of correlation. The card recognition processor 414 may perform a plurality of processes that at least partially overlap in time. As a result, the unknown image 411 may be compared with a plurality of different master images 413, 415 at the same time.


In one embodiment, the comparison may include comparing the unknown image 411 and the master images 413, 415 pixel by pixel across each row of each image array. For example, the card recognition processor 414 may perform an inverted XOR operation of the corresponding pixels of the unknown image 411 and one of the master images 413, 415. In another embodiment, the comparison may include comparing cross-correlation values of matrices from the master image 413, 415 and the unknown image 411. Such a cross-correlation function may, however, require a larger amount of computational resources than embodiments that include a simpler summation.


At operation 1250, a match is determined based on the score. For example, the result of the inverted XOR operation may be summed to obtain a score (e.g., a numeric sum). For example, a black pixel (1) compared with a black pixel (1) may add to the score, a white pixel (0) compared with a white pixel (0) may add to the score, while a black pixel (1) compared with a white pixel (0) may not add to the score. A larger score may indicate a greater number of matching pixels. The score may be represented as a raw number of matching pixels or as a percentage of the total number of pixels of the images. For example, a score of 100% may be a perfect pixel to pixel match of the unknown image and a master image. Of course, a perfect match may not be reasonable to expect, and a match may still be found for a score having a percentage less than 100%. As a result, a minimum threshold may be set for what the card processing system 400 may consider a match. In some embodiments, the inverted XOR operation may be implemented as another logic operation. In some embodiments, like pixels may be added and dissimilar pixels may not be counted, while in other embodiments, dissimilar pixels may be added and then subtracted from the total number of pixels to determine the score. In addition, some embodiments may include counting the number of pixels that do not match and using the lowest score below some threshold for non-matching pixels.


The card processing system 400 may generate either a valid card ID (if a match is determined) or an indication of an error (if a match is not determined). For example, the card recognition processor 414 may return a score resulting from the comparison of the unknown image 411 with each master image 413, 415. To obtain a “match,” the score may be required to be above a minimum error threshold. In some situations, the scores for more than one master image may be above the minimum error threshold. The highest score, however, will be selected as the valid card ID. In other words, the master image 413, 415 used in the comparison that resulted in the highest score may be selected as the card ID (assuming the score is above a minimum error threshold). A deck may also be flagged as being invalid if a pre-determined number (e.g., 6) of errors (e.g., misread cards) occur during card recognition. In some embodiments, the score may be based on the dissimilar pixels. Thus, a match would occur for the lowest dissimilarity score (0% for a perfect match) that is below a maximum error threshold, rather than being based on the score of similar pixels.
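Purely as a sketch of the scoring and decision logic described above, the following Python snippet computes an inverted-XOR match fraction for an unknown image against a set of master images and returns the best match above a threshold, or None to signal an error. The 75% threshold and the dictionary-based master set are illustrative assumptions.

```python
# Illustrative sketch: inverted-XOR scoring and best-match selection.
import numpy as np

def score(unknown: np.ndarray, master: np.ndarray) -> float:
    # Inverted XOR: 1-1 and 0-0 comparisons count toward the score, 1-0 do not.
    return float(np.logical_not(np.logical_xor(unknown > 0, master > 0)).mean())

def identify(unknown: np.ndarray, masters: dict, threshold: float = 0.75):
    """Return the best-matching rank/suit name, or None to signal an error."""
    scores = {name: score(unknown, image) for name, image in masters.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```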


In some embodiments, the card handling device may display an image of the card acquired by the imaging device 416 on a display device of the card handling device. Such an image displayed on the display device may be viewed by the operator for verification that the valid card ID is the correct determination, or as a way to see an image of the card that produced an error indication.


At operation 1260, it is determined if another card is present to determine its identity. If so, the unknown image 411 for the new card is captured and the process continues.



FIGS. 13A, 13B, and 13C show a processed image 1300A, 1300B, 1300C of a card 1306A, 1306B, 1306C, in which the imaging device 416 (FIG. 4) had experienced dust build-up on the lens. Dust may accumulate on the lens with use of the card handling device. For example, FIG. 13A is an example of dust build-up after a first number of cycles, FIG. 13B is an example of dust build-up after a greater number of cycles, and FIG. 13C is an example of dust build-up after many additional cycles.


When the imaging device 416 accumulates dust, the raw image 417 may become a different shade of gray in terms of the grayscale image. The white area may become a little more gray and the black area may become a little less black. As discussed above, each pixel in a grayscale image has a value between white and black. When converting a grayscale image to a black and white image, a threshold value may be used as the cutoff point between black and white. As a result, the white area of the processed black and white image may become smaller as the camera accumulates dust.


The card recognition processor 414 (FIG. 4) may be configured to correct for dust build-up. In one embodiment, rather than setting a fixed threshold used to convert a grayscale image to a black and white image, the threshold value may be dynamically changed over time to compensate for dust build-up. As an example, the threshold may change to have different levels over time. The threshold may be used to convert the grayscale image to a black and white image during the calibration mode. The threshold may be set to a first level during calibration mode, and to a second level during card recognition mode. As an example, the dynamic changes (from the first level to the second level) may be performed to compensate for changes in lighting conditions. It is also contemplated that the dynamic changes may be based on other conditions. In some embodiments, the number of cycles may be counted and the threshold may dynamically change with the number of cycles. In some embodiments, one or more data filters may be employed to further improve the processed image during conversion from grayscale to a black and white image.
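One possible way to express a cycle-based dynamic threshold is sketched below; the base threshold, step size, and cycle interval are hypothetical values, and in practice the adjustment could equally be driven by lighting conditions or other inputs as noted above.

```python
# Illustrative sketch: adjusting the grayscale-to-black-and-white threshold as
# the device accumulates cycles (e.g., dust on the lens). Values are hypothetical.
import cv2

def binarize(gray, cycles, base_threshold=128, step=4, cycles_per_step=50000):
    # Lower the cutoff slightly over time so that grayed-out white areas are
    # still classified as white.
    threshold = max(base_threshold - step * (cycles // cycles_per_step), 64)
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    return binary
```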



FIGS. 14A and 14B illustrate a problem of incorrectly splitting an image that may arise during calibration mode and/or card recognition mode. In particular, incorrectly splitting an image may often occur when the imaging device 416 (FIG. 4) is not clean or if the actual card itself has a mark. As a result, a blemish 1401 may appear on an image 1400 that is used to generate master images 413, 415 (in calibration mode) or the unknown images 411 (in card recognition mode) for a card 1406; however, it may be more likely for a card to have a blemish 1401 during card recognition mode (because of the repeated use of the card over time).


The blemish 1401 may be mistaken for either the rank or suit, often because the card recognition processor 414 may first look for the split between the rank and suit as a starting point in the analysis. As shown in FIG. 14A, the card recognition processor 414 may mistake a space between the blemish 1401 and the rank as the split point. As a result, the card recognition processor 414 may generate the localized image as the master rank image 413 during the calibration mode (or as the unknown rank image during the card recognition mode) based on the rank area 508 in the calibration file 433. The localized image, however, does not include the proper portion of the image 1400 that includes the entire rank.


In some embodiments, the card recognition processor 414 may not create the unknown images 411 based on finding a split between the rank and suit symbols. Rather, during the card recognition mode, the card recognition processor 414 may compare the master images 413, 415 to the entire unknown image 411 rather than first generating a smaller localized image of just the rank or the suit. In other words, the entire field of view (or a portion thereof) that includes both the rank and the suit for the unknown image may be used for comparison with the master images 413, 415. As a result, the unknown image 411 may be larger than the master images 413, 415. In such an embodiment, the card recognition processor 414 may perform a plurality of comparisons of each master image 413, 415 and the unknown image 411 by scanning the master image 413, 415 across the unknown image 411 in an overlapping manner. For example, the master image 413, 415 may begin this comparison in the top left corner of the unknown image 411 and scan across pixel by pixel to the top right corner. The master image 413, 415 may then be moved down a row of pixels moving along the pixels of that row, and so on. If at some point during this scanning, a score results in a match, the card may be identified.
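For illustration only, a brute-force version of this overlapping scan is sketched below; a library template-matching routine could serve the same purpose, and the 75% threshold is an assumed value.

```python
# Illustrative sketch: sliding a master image across a larger unknown image in
# an overlapping manner and stopping at the first position that scores above
# the match threshold.
import numpy as np

def scan_for_match(unknown: np.ndarray, master: np.ndarray, threshold: float = 0.75):
    mh, mw = master.shape
    uh, uw = unknown.shape
    for y in range(uh - mh + 1):                  # move down one row at a time
        for x in range(uw - mw + 1):              # move across one pixel at a time
            window = unknown[y:y + mh, x:x + mw]
            match = float((window == master).mean())
            if match >= threshold:
                return (x, y, match)              # matching position and score
    return None
```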



FIGS. 15A and 15B illustrate an issue that may arise when capturing an image under uneven illumination with a lens that exhibits fisheye distortion. FIG. 15A is a raw image 1500A using a grid for illustration. FIG. 15B is a processed image 1500B showing the grid after conversion from grayscale to black and white. As shown in FIGS. 15A and 15B, uneven lighting may cause some portions of the raw image 1500A to appear dark when they are actually white. As a result, recognition may not be as accurate due to a lower score when comparing images, or more serious problems such as incorrect splitting (FIGS. 14A and 14B) may occur.


Uneven illumination may be corrected in a similar manner to the correction for dust build-up on the imaging device 416 (FIG. 4). For example, the card recognition processor 414 may be configured to dynamically change the threshold value used in the conversion from the grayscale image to the black and white image. The dynamic change may be responsive to a number of cycles of the imaging device 416, a light sensor detecting illumination changes, or other changes in the environment. In some embodiments, one or more data filters may be employed to further improve the processed image 1500B during conversion from grayscale to a black and white image.



FIGS. 16A, 16B, and 16C are raw images 1600A, 1600B, 1600C from the imaging device 416 (FIG. 4) of a card handling device showing fisheye distortion caused by the imaging device 416. In some embodiments, short distances between the lens and the card caused by very small spaces within the card handling device and proximity of the imaging device 416 to the card may require a fisheye lens to be used with the imaging device 416. The fisheye lens may provide a wide field of view for the imaging device 416 that is sufficient to view the rank and suit for different types of cards. For example, some decks may have relatively large ranks and suits that take up a large area of the corner of the card. In addition, the location of the rank and suit may vary from one deck to another.


The fisheye lens may introduce fisheye distortion in the raw images 1600A, 1600B, 1600C taken by the imaging device 416. For example, FIG. 16A shows a raw image 1600A of a grid whose squares are, in reality, of equal size. However, as shown in FIG. 16A, the fisheye distortion causes the squares of the grid to appear to be different sizes throughout the raw image 1600A. Near the center of the raw image 1600A, the fisheye distortion may not be as pronounced; however, near the edges and corners of the raw image 1600A, the fisheye distortion becomes more pronounced.



FIGS. 16B and 16C are raw images 1600B, 1600C taken with the imaging device 416 with a lens having fisheye distortion. When comparing the diamond suits in FIGS. 16B and 16C, it can be seen that the shapes of the diamond suits vary because of the different placement of the diamond suits within the field of view of the imaging device 416. For example, the diamond suit in FIG. 16B is smaller than the diamond suit in FIG. 16C because it is located further from the center of the field of view. In addition, the Ace (A) rank in FIG. 16B is mostly centered within the field of view. The King (K) rank in FIG. 16C, however, is off-center and is smaller near the top of the rank compared with the bottom of the rank. As a result, the more distortion experienced by the ranks and suits in an image, the greater the effect the distortion may have on the score from the comparison with the master images 413, 415 for the ranks and suits when determining the card ID. In some situations, fisheye distortion has caused a misidentification of the card ID (e.g., the suit is identified as a spade when the suit was really a diamond).


The card processing system 400 (FIG. 4) may be configured to correct for such fisheye distortion. In other words, the fisheye distortion may be reduced, such as by stretching the distorted image. In some embodiments, the image pixels may be translated from the raw image 1600A, 1600B, 1600C to a corrected raw image according to a correction table (i.e., look-up table). In some embodiments, the image pixels may be mathematically translated from the raw image to a corrected raw image.
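By way of illustration only, the following Python sketch (using NumPy, and with the hypothetical name undistort_with_table) shows how a precomputed correction (look-up) table might be applied to translate pixels from a distorted raw image to a corrected raw image; how the table itself is generated (e.g., from a lens model or a calibration grid) is outside the scope of this sketch, and this is not the actual implementation of the card processing system 400.

import numpy as np

def undistort_with_table(raw, table):
    # raw:   2-D array holding the distorted raw image.
    # table: integer array of shape (H, W, 2), where table[y, x] gives the
    #        (row, column) in the raw image that supplies corrected pixel (y, x).
    src_rows = table[..., 0]
    src_cols = table[..., 1]
    return raw[src_rows, src_cols]           # NumPy integer indexing performs the remap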



FIGS. 17A, 17B, and 17C are images 1700A, 1700B, 1700C for which the fisheye distortion has been reduced through mathematical stretching of the distorted image. As shown in FIG. 17A, the grid (which was distorted in FIG. 16A) has squares that are now substantially the same size. In FIG. 17B, the diamond suit (which was distorted in FIG. 16B) is now substantially symmetrical even though the diamond suit is off-center and proximate the edge of the image. In FIG. 17C, the King rank (K) and the diamond suit (which were distorted in FIG. 16C) are each no longer distorted.


Additional embodiments include:


Embodiment 1

A method of automatically generating a calibration file for a card handling device, the method comprising: capturing a raw image from at least a portion of a card passing through a card handling device; and using a processor, automatically generating a calibration file stored in memory of a main control system linked with the card handling device, wherein automatically generating the calibration file comprises: identifying at least one parameter associated with a rank area around a rank of the at least a portion of the card; identifying at least one parameter associated with a suit area around a suit of the at least a portion of the card; and storing the at least one parameter associated with the rank area and the at least one parameter associated with the suit area in the calibration file.


Embodiment 2

The method of Embodiment 1, wherein automatically generating the calibration file comprises identifying a location and at least one parameter associated with a region of interest that is relatively larger than the rank area and the suit area, the method further comprising storing the location and the at least one parameter associated with the region of interest in the calibration file.


Embodiment 3

The method of Embodiment 1 or Embodiment 2, wherein capturing a raw image includes capturing a plurality of raw images from a plurality of different cards passing through the card handling device.


Embodiment 4

The method of Embodiment 3, wherein identifying at least one parameter associated with the rank area includes: identifying at least one parameter associated with a plurality of rank areas from the plurality of different cards; and selecting the at least one parameter associated with the rank area to include a rank width parameter having a number of pixels representative of a width that is a widest dimension from the plurality of rank areas, and to include a rank depth parameter having a number of pixels representative of a depth that is a longest dimension from the plurality of rank areas.


Embodiment 5

The method of Embodiment 3, wherein identifying at least one parameter associated with the suit area includes: identifying at least one parameter associated with a plurality of suit areas from the plurality of different cards; and selecting the at least one parameter associated with the suit area to include a suit width parameter having a number of pixels representative of a width that is a widest dimension from the plurality of suit areas, and to include a suit depth parameter having a number of pixels representative of a depth that is a longest dimension from the plurality of suit areas.


Embodiment 6

The method of any of Embodiments 1 through 5, further comprising storing the calibration file in a file system of an operating system running on the processor.


Embodiment 7

A method of automatically generating one or more deck libraries for one or more decks of cards, the method comprising: using a processor to automatically generate, without user input, a first calibration file identifying at least one parameter associated with a rank area and at least one parameter associated with a suit area for a first deck type of cards, the calibration file including the parameters associated with the rank area and the suit area; storing the first calibration file in a first deck library for the first deck type; using the processor to automatically generate a plurality of master images for the cards of the first deck type using the parameters from the calibration file; and storing the plurality of master images for the cards of the first deck type in the first deck library.


Embodiment 8

The method of Embodiment 7, further comprising: using the processor to automatically generate a second calibration file for a second deck type of cards; storing the second calibration file in a second deck library for the second deck type; using the processor to automatically generate a second plurality of master images for the cards of the second deck type using the parameters from the second calibration file; and storing the second plurality of master images for the cards of the second deck type in the second deck library.


Embodiment 9

The method of Embodiment 7 or Embodiment 8, further comprising linking each master image of the plurality of master images with an appropriate rank or suit.


Embodiment 10

The method of Embodiment 9, wherein linking each master image of the plurality of master images includes linking each master image according to an expected order that the cards were read into a card handling device.


Embodiment 11

The method of Embodiment 9, wherein linking each master image of the plurality of master images includes linking each master image from a deck that is unsorted.


Embodiment 12

The method of Embodiment 11, wherein linking each master image of the plurality of master images includes performing optical character recognition on each master image.


Embodiment 13

The method of Embodiment 11, wherein linking each master image of the plurality of master images includes comparing each master image of the plurality of master images with a set of images from a plurality of different deck types.


Embodiment 14

The method of Embodiment 11, wherein linking each master image of the plurality of master images includes comparing a normalized version of each master image of the plurality of master images with a set of normalized images from a plurality of different deck types.


Embodiment 15

The method of Embodiment 14, wherein comparing a normalized version of each master image of the plurality of master images with a set of normalized images from a plurality of different deck types includes performing a pixel by pixel comparison.


Embodiment 16

The method of Embodiment 15, wherein performing a pixel by pixel comparison further includes comparing a middle pixel with at least one additional neighboring pixel.


Embodiment 17

The method of any of Embodiments 7 through 16, wherein identifying at least one parameter associated with a rank area and at least one parameter associated with a suit area for a first deck type of cards includes performing a blob analysis to locate a rank and a suit for a card of the first deck type within a region of interest.


Embodiment 18

The method of Embodiment 17, wherein performing a blob analysis includes first locating a 10 rank to determine a width for the at least one parameter associated with the rank area.


Embodiment 19

A card processing apparatus, comprising: a memory device; an imaging device operably coupled with the memory device, such that raw images from the imaging device are stored in the memory device; and a main control system coupled with the imaging device, wherein the main control system is configured to run an operating system having a file directory system configured to store a plurality of deck libraries for a plurality of different deck types, and wherein the main control system is configured to receive the raw images from the memory device and automatically generate a calibration file having parameters related to a rank area and a suit area for a deck type.


Embodiment 20

The card processing apparatus of Embodiment 19, including a card shuffler housing the memory device, the imaging device, and the main control system.


Embodiment 21

The card processing apparatus of Embodiment 19 or Embodiment 20, wherein the main control system is further configured to automatically generate a plurality of master images from the raw images according to the parameters of the calibration file.


Embodiment 22

The card processing apparatus of Embodiment 21, further comprising a card recognition processor configured to load the plurality of master images and the calibration file from the main control system, and compare an unknown image from the imaging device with the plurality of master images.


Embodiment 23

The card processing apparatus of Embodiment 22, wherein the card recognition processor is configured to compare the unknown image with the plurality of master images by comparing the unknown image with each master image pixel by pixel and summing the results of the comparison.


Embodiment 24

The card processing apparatus of Embodiment 22, wherein the card recognition processor includes a field-programmable gate array.


Embodiment 25

The card processing apparatus of any of Embodiments 21 through 24, wherein the main control system is configured to link the plurality of master images from an unsorted deck of cards with an appropriate rank and suit.


Embodiment 26

The card processing apparatus of Embodiment 25, wherein the main control system is configured to link the plurality of master images from an unsorted deck of cards with an appropriate rank and suit by: generating a normalized version of each master image of the plurality of master images; and comparing the normalized version of each master image of the plurality of master images with a plurality of normalized images corresponding to a plurality of different deck types.


While certain illustrative embodiments have been described in connection with the figures, those of ordinary skill in the art will recognize and appreciate that embodiments of the disclosure are not limited to those embodiments explicitly shown and described herein. Rather, many additions, deletions, and modifications to the embodiments described herein may be made without departing from the scope of embodiments of the disclosure as hereinafter claimed, including legal equivalents. In addition, features from one embodiment may be combined with features of another embodiment while still being encompassed within the scope of the disclosure as contemplated by the inventor.

Claims
  • 1. A method of automatically generating a calibration file for a card handling device, the method comprising: capturing, with an imaging device, a raw image from at least a portion of a card passing from a card input to a card output of the card handling device; and automatically generating and storing a calibration file during a calibration mode without user input, the calibration file being associable with the card handling device and a deck of cards that includes the card such that the card handling device is trained for the associated deck during a subsequent card recognition mode, wherein automatically generating the calibration file includes: identifying at least one parameter associated with a rank area of the at least a portion of the card; identifying at least one parameter associated with a suit area of the at least a portion of the card; and storing the at least one parameter associated with the rank area and the at least one parameter associated with the suit area in the calibration file.
  • 2. The method of claim 1, wherein automatically generating and storing the calibration file comprises identifying a location and at least one parameter associated with a region of interest that is relatively larger than the rank area and the suit area, the method further comprising storing the location and the at least one parameter associated with the region of interest in the calibration file.
  • 3. The method of claim 1, wherein automatically generating and storing the calibration file include using a processor associated with the card handling device.
  • 4. The method of claim 3, wherein storing the calibration file includes storing the calibration file in a file system of an operating system running on the processor incorporated within the card handling device.
  • 5. The method of claim 1, wherein capturing the raw image includes capturing a plurality of raw images from a plurality of different cards passing through the card handling device.
  • 6. The method of claim 5, wherein: identifying the at least one parameter associated with the rank area includes: identifying at least one parameter associated with a plurality of rank areas from the plurality of different cards; and selecting the at least one parameter associated with the rank area to include a rank width parameter having a number of pixels representative of a width that is a widest dimension from the plurality of rank areas, and to include a rank depth parameter having a number of pixels representative of a depth that is a longest dimension from the plurality of rank areas; and identifying the at least one parameter associated with the suit area includes: identifying at least one parameter associated with a plurality of suit areas from the plurality of different cards; and selecting at least one parameter associated with the suit area to include a suit width parameter having a number of pixels representative of a width that is a widest dimension from the plurality of suit areas, and to include a suit depth parameter having a number of pixels representative of a depth that is a longest dimension from the plurality of suit areas.
  • 7. A method of automatically generating a deck library for tuning a card handling device to recognize cards from a deck of cards, the method comprising: using an imaging device of a card handling device to capture raw images of playing cards passing from a card input to a card output; using a processor to automatically generate, without user input, a first calibration file for identifying at least one parameter associated with a rank area and at least one parameter associated with a suit area from the raw images for a first deck of cards, the calibration file including the parameters associated with the rank area and the suit area; storing, in memory, the first calibration file in a first deck library for the first deck; using the processor to automatically generate a plurality of master images for the cards of the first deck using the parameters from the calibration file; and storing, in memory, the plurality of master images for the cards of the first deck in the first deck library such that the card handling device is trained to the first deck for use in a subsequent card recognition mode of the card handling device.
  • 8. The method of claim 7, further comprising: using the processor to automatically generate a second calibration file for a second deck of cards; storing, in memory, the second calibration file in a second deck library for the second deck; using the processor to automatically generate a second plurality of master images for the cards of the second deck using the parameters from the second calibration file; and storing, in memory, the second plurality of master images for the cards of the second deck in the second deck library such that the card handling device is trained to the second deck for use in a subsequent card recognition mode of the card handling device.
  • 9. The method of claim 7, wherein the processor and memory are incorporated within the card handling device.
  • 10. The method of claim 7, wherein automatically generating the plurality of master images includes receiving and cropping the raw images according to the parameters of the first calibration file to generate at least one master image for each rank and each suit of the first deck.
  • 11. The method of claim 7, further comprising linking each master image of the plurality of master images according to an expected order by the processor as the cards were read into a card handling device.
  • 12. The method of claim 7, further comprising linking each master image of the plurality of master images according to an order that is previously unknown by the processor as the cards were read into the card handling device.
  • 13. The method of claim 11, wherein linking each master image of the plurality of master images includes performing at least one of an optical character recognition or a blob analysis on each master image.
  • 14. The method of claim 7, further comprising linking each master image of the plurality of master images for the processor to assign an identity to each master image by comparing each master image of the plurality of master images with a set of images from a plurality of different decks.
  • 15. The method of claim 7, further comprising linking each master image of the plurality of master images for the processor to assign an identity to each master image by comparing a normalized version of each master image of the plurality of master images with a set of normalized images from a plurality of different decks.
  • 16. A card processing system, comprising: a card input; a card output; a memory device; an imaging device operably coupled with the memory device, the imaging device configured to capture raw images of playing cards passing from the card input to the card output for storage of the raw images in the memory device; and a control system coupled with the imaging device, wherein the control system is configured to: run an operating system having a file directory system configured to store a plurality of deck libraries for a plurality of different deck types; receive the raw images from the memory device; and automatically generate a calibration file having parameters related to a rank area and a suit area for each deck type without user input during a calibration mode.
  • 17. The card processing system of claim 16, including a card shuffler having a common housing for each of the memory device, the imaging device, and the control system.
  • 18. The card processing system of claim 16, wherein the control system is further configured to automatically generate a plurality of master images from the raw images according to the parameters of the calibration file.
  • 19. The card processing system of claim 18, wherein the control system is configured to link the plurality of master images from an unsorted deck of cards with an appropriate rank and suit during the calibration mode for the control system to identify which rank and which suit corresponds to each master image.
  • 20. The card processing system of claim 19, wherein the control system is configured to link the plurality of master images from an unsorted deck of cards with an appropriate rank and suit by: generating a normalized version of each master image of the plurality of master images; and comparing the normalized version of each master image of the plurality of master images with a plurality of normalized images corresponding to a plurality of different deck types.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/022,160, filed Sep. 9, 2013, now U.S. Pat. No. 9,511,274, issued Dec. 6, 2016, which is a continuation-in-part of U.S. patent application Ser. No. 13/631,658, filed Sep. 28, 2012, now U.S. Pat. No. 9,378,766, issued Jun. 28, 2016, and entitled “Card Recognition System, Card Handling Device, and Method for Tuning a Card Handling Device,” the disclosure of which is hereby incorporated herein by this reference in its entirety.

US Referenced Citations (947)
Number Name Date Kind
130281 Coughlin Aug 1872 A
205030 Ash Jun 1878 A
609730 Booth Aug 1898 A
673154 Bellows Apr 1901 A
793489 Williams Jun 1905 A
892389 Bellows Jul 1908 A
1014219 Hall Jan 1912 A
1043109 Hurm Nov 1912 A
1157898 Perret Oct 1915 A
1256509 Belknap Feb 1918 A
1992085 McKay Feb 1925 A
1556856 Lipps Oct 1925 A
1850114 McCaddin Jun 1929 A
1757553 Gustav May 1930 A
1885276 McKay Nov 1932 A
1889729 Hammond Nov 1932 A
1955926 Matthaey Apr 1934 A
1998690 Shepherd et al. Apr 1935 A
2001220 Smith May 1935 A
2001918 Nevius May 1935 A
2016030 Woodruff et al. Oct 1935 A
2043343 Warner Jun 1936 A
2060096 McCoy Nov 1936 A
2065824 Plass Dec 1936 A
2159958 Sachs May 1939 A
2185474 Nott Jan 1940 A
2254484 Hutchins Sep 1941 A
D132360 Gardner May 1942 S
2328153 Laing Aug 1943 A
2328879 Isaacson Sep 1943 A
D139530 Schindler Nov 1944 S
2364413 Wittel Dec 1944 A
2525305 Lombard Oct 1950 A
2543522 Cohen Feb 1951 A
2588582 Sivertson Mar 1952 A
2615719 Fonken Oct 1952 A
2659607 Skillman et al. Nov 1953 A
2661215 Stevens Dec 1953 A
2676020 Ogden Apr 1954 A
2692777 Miller Oct 1954 A
2701720 Ogden Feb 1955 A
2705638 Newcomb Apr 1955 A
2711319 Morgan et al. Jun 1955 A
2714510 Oppenlander et al. Aug 1955 A
2717782 Droll Sep 1955 A
2727747 Semisch, Jr. Dec 1955 A
2731271 Brown Jan 1956 A
2747877 Howard May 1956 A
2755090 Aldrich Jul 1956 A
2757005 Nothaft Jul 1956 A
2760779 Ogden et al. Aug 1956 A
2770459 Wilson et al. Nov 1956 A
2778643 Williams Jan 1957 A
2778644 Stephenson Jan 1957 A
2782040 Matter Feb 1957 A
2790641 Adams Apr 1957 A
2793863 Liebelt May 1957 A
2815214 Hall Dec 1957 A
2821399 Heinoo Jan 1958 A
2914215 Neidig Nov 1959 A
2937739 Levy May 1960 A
2950005 MacDonald Aug 1960 A
RE24986 Stephenson May 1961 E
3067885 Kohler Dec 1962 A
3107096 Osborn Oct 1963 A
3124674 Edwards et al. Mar 1964 A
3131935 Gronneberg May 1964 A
3147978 Sjostrand Sep 1964 A
D200652 Fisk Mar 1965 S
3222071 Lang Dec 1965 A
3235741 Plaisance Feb 1966 A
3288308 Gingher Nov 1966 A
3305237 Granius Feb 1967 A
3312473 Friedman et al. Apr 1967 A
3452509 Hauer Jul 1969 A
3530968 Palmer Sep 1970 A
3588116 Miura Jun 1971 A
3589730 Slay Jun 1971 A
3595388 Castaldi Jul 1971 A
3597076 Hubbard et al. Aug 1971 A
3618933 Roggenstein et al. Nov 1971 A
3627331 Erickson Dec 1971 A
3666270 Mazur May 1972 A
3680853 Houghton et al. Aug 1972 A
3690670 Cassady et al. Sep 1972 A
3704938 Fanselow Dec 1972 A
3716238 Porter Feb 1973 A
3751041 Seifert Aug 1973 A
3761079 Azure, Jr. Sep 1973 A
3810627 Levy May 1974 A
D232953 Oguchi Sep 1974 S
3861261 Maxey Jan 1975 A
3897954 Erickson et al. Aug 1975 A
3899178 Watanabe Aug 1975 A
3909002 Levy Sep 1975 A
3929339 Mattioli Dec 1975 A
3944077 Green Mar 1976 A
3944230 Fineman Mar 1976 A
3949219 Crouse Apr 1976 A
3968364 Miller Jul 1976 A
4023705 Reiner et al. May 1977 A
4033590 Pic Jul 1977 A
4072930 Lucero et al. Feb 1978 A
4088265 Garczynski May 1978 A
4151410 McMillan et al. Apr 1979 A
4159581 Lichtenberg Jul 1979 A
4162649 Thornton Jul 1979 A
4166615 Noguchi et al. Sep 1979 A
4232861 Maul Nov 1980 A
4280690 Hill Jul 1981 A
4283709 Lucero et al. Aug 1981 A
4310160 Willette et al. Jan 1982 A
4339134 Macheel Jul 1982 A
4339798 Hedges et al. Jul 1982 A
4361393 Nato Nov 1982 A
4368972 Naramore Jan 1983 A
4369972 Parker Jan 1983 A
4374309 Walton Feb 1983 A
4377285 Kadlic Mar 1983 A
4385827 Naramore May 1983 A
4388994 Suda et al. Jun 1983 A
4397469 Carter, III Aug 1983 A
4421312 Delgado et al. Dec 1983 A
4421501 Scheffer Dec 1983 A
D273962 Fromm May 1984 S
D274069 Fromm May 1984 S
4467424 Hedges et al. Aug 1984 A
4494197 Troy et al. Jan 1985 A
4497488 Plevyak et al. Feb 1985 A
4512580 Matviak Apr 1985 A
4513969 Samsel, Jr. Apr 1985 A
4515367 Howard May 1985 A
4531187 Uhland Jul 1985 A
4534562 Cuff et al. Aug 1985 A
4549738 Greitzer Oct 1985 A
4566782 Britt et al. Jan 1986 A
4575367 Karmel Mar 1986 A
4586712 Lorber et al. May 1986 A
4659082 Greenberg Apr 1987 A
4662637 Pfeiffer May 1987 A
4662816 Fabrig May 1987 A
4667959 Pfeiffer et al. May 1987 A
4741524 Bromage May 1988 A
4750743 Nicoletti Jun 1988 A
4755941 Bacchi Jul 1988 A
4759448 Kawabata Jul 1988 A
4770412 Wolfe Sep 1988 A
4770421 Hoffman Sep 1988 A
4807884 Breeding Feb 1989 A
4822050 Normand et al. Apr 1989 A
4832342 Plevyak et al. May 1989 A
4858000 Lu Aug 1989 A
4861041 Jones et al. Aug 1989 A
4876000 Mikhail Oct 1989 A
4900009 Kitahara et al. Feb 1990 A
4904830 Rizzuto Feb 1990 A
4921109 Hasuo et al. May 1990 A
4926327 Sidley May 1990 A
4948134 Suttle et al. Aug 1990 A
4951950 Normand et al. Aug 1990 A
4969648 Hollinger et al. Nov 1990 A
4993587 Abe Feb 1991 A
4995615 Cheng Feb 1991 A
5000453 Stevens et al. Mar 1991 A
5004218 Sardano et al. Apr 1991 A
5039102 Miller Aug 1991 A
5067713 Soules et al. Nov 1991 A
5078405 Jones et al. Jan 1992 A
5081487 Hoyer et al. Jan 1992 A
5096197 Embury Mar 1992 A
5102293 Schneider Apr 1992 A
5118114 Tucci Jun 1992 A
5121192 Kazui Jun 1992 A
5121921 Friedman et al. Jun 1992 A
5146346 Knoll Sep 1992 A
5154429 LeVasseur Oct 1992 A
5179517 Sarbin et al. Jan 1993 A
5197094 Tillery et al. Mar 1993 A
5199710 Lamle Apr 1993 A
5209476 Eiba May 1993 A
5224712 Laughlin et al. Jul 1993 A
5240140 Huen Aug 1993 A
5248142 Breeding Sep 1993 A
5257179 DeMar Oct 1993 A
5259907 Soules et al. Nov 1993 A
5261667 Breeding Nov 1993 A
5267248 Reyner Nov 1993 A
5275411 Breeding Jan 1994 A
5276312 McCarthy Jan 1994 A
5283422 Storch et al. Feb 1994 A
5288081 Breeding Feb 1994 A
5299089 Lwee Mar 1994 A
5303921 Breeding Apr 1994 A
5344146 Lee Sep 1994 A
5356145 Verschoor Oct 1994 A
5362053 Miller Nov 1994 A
5374061 Albrecht Dec 1994 A
5377973 Jones et al. Jan 1995 A
5382024 Blaha Jan 1995 A
5382025 Sklansky et al. Jan 1995 A
5390910 Mandel et al. Feb 1995 A
5397128 Hesse et al. Mar 1995 A
5397133 Penzias Mar 1995 A
5416308 Hood et al. May 1995 A
5431399 Kelley Jul 1995 A
5431407 Hofberg et al. Jul 1995 A
5437462 Breeding Aug 1995 A
5445377 Steinbach Aug 1995 A
5470079 LeStrange et al. Nov 1995 A
D365853 Zadro Jan 1996 S
5489101 Moody Feb 1996 A
5515477 Sutherland May 1996 A
5524888 Heidel Jun 1996 A
5531448 Moody Jul 1996 A
5544892 Breeding Aug 1996 A
5575475 Steinbach Nov 1996 A
5584483 Sines et al. Dec 1996 A
5586766 Forte et al. Dec 1996 A
5586936 Bennett et al. Dec 1996 A
5605334 McCrea, Jr. Feb 1997 A
5613912 Slater Mar 1997 A
5632483 Garczynski et al. May 1997 A
5636843 Roberts Jun 1997 A
5651548 French et al. Jul 1997 A
5655961 Acres et al. Aug 1997 A
5655966 Werdin, Jr. et al. Aug 1997 A
5669816 Garczynski et al. Sep 1997 A
5676231 Legras et al. Oct 1997 A
5676372 Sines et al. Oct 1997 A
5681039 Miller Oct 1997 A
5683085 Johnson et al. Nov 1997 A
5685543 Gamer Nov 1997 A
5690324 Otomo et al. Nov 1997 A
5692748 Frisco et al. Dec 1997 A
5695189 Breeding et al. Dec 1997 A
5701565 Morgan Dec 1997 A
5707286 Carlson Jan 1998 A
5707287 McCrea, Jr. Jan 1998 A
5711525 Breeding Jan 1998 A
5718427 Cranford et al. Feb 1998 A
5719288 Sens et al. Feb 1998 A
5720484 Hsu Feb 1998 A
5722893 Hill et al. Mar 1998 A
5735525 McCrea, Jr. Apr 1998 A
5735724 Udagawa Apr 1998 A
5735742 French Apr 1998 A
5743798 Adams et al. Apr 1998 A
5768382 Schneier et al. Jun 1998 A
5770533 Franchi Jun 1998 A
5770553 Kroner et al. Jun 1998 A
5772505 Garczynski et al. Jun 1998 A
5779546 Meissner et al. Jul 1998 A
5781647 Fishbine et al. Jul 1998 A
5785321 van Putten et al. Jul 1998 A
5788574 Ornstein et al. Aug 1998 A
5791988 Nomi Aug 1998 A
5802560 Joseph et al. Sep 1998 A
5803808 Strisower Sep 1998 A
5810355 Trilli Sep 1998 A
5813326 Salomon Sep 1998 A
5813912 Shultz Sep 1998 A
5814796 Benson Sep 1998 A
5836775 Hiyama et al. Nov 1998 A
5839730 Pike Nov 1998 A
5845906 Wirth Dec 1998 A
5851011 Lott Dec 1998 A
5867586 Liang Feb 1999 A
5879233 Stupero Mar 1999 A
5883804 Christensen Mar 1999 A
5890717 Rosewarne et al. Apr 1999 A
5892210 Levasseur Apr 1999 A
5909876 Brown Jun 1999 A
5911626 McCrea, Jr. Jun 1999 A
5919090 Mothwurf Jul 1999 A
D412723 Hachuel et al. Aug 1999 S
5936222 Korsunsky Aug 1999 A
5941769 Order Aug 1999 A
5944310 Johnson et al. Aug 1999 A
D414527 Tedham Sep 1999 S
5957776 Hoehne Sep 1999 A
5974150 Kaish et al. Oct 1999 A
5989122 Roblejo Nov 1999 A
5991308 Fuhrmann et al. Nov 1999 A
6015311 Benjamin et al. Jan 2000 A
6019368 Sines et al. Feb 2000 A
6019374 Breeding Feb 2000 A
6039650 Hill Mar 2000 A
6050569 Taylor Apr 2000 A
6053695 Longoria et al. Apr 2000 A
6061449 Candelore et al. May 2000 A
6068258 Breeding et al. May 2000 A
6069564 Hatano et al. May 2000 A
6071190 Weiss et al. Jun 2000 A
6093103 McCrea, Jr. Jul 2000 A
6113101 Wirth Sep 2000 A
6117012 McCrea, Jr. Sep 2000 A
D432588 Tedham Oct 2000 S
6126166 Lorson et al. Oct 2000 A
6131817 Miller Oct 2000 A
6139014 Breeding et al. Oct 2000 A
6149154 Grauzer et al. Nov 2000 A
6154131 Jones, II et al. Nov 2000 A
6165069 Sines et al. Dec 2000 A
6165072 Davis et al. Dec 2000 A
6183362 Boushy Feb 2001 B1
6186895 Oliver Feb 2001 B1
6196416 Beagle Mar 2001 B1
6200218 Lindsay Mar 2001 B1
6210274 Carlson Apr 2001 B1
6213310 Wennersten et al. Apr 2001 B1
6217447 Lofink et al. Apr 2001 B1
6234900 Cumbers May 2001 B1
6236223 Brady et al. May 2001 B1
6250632 Albrecht Jun 2001 B1
6254002 Litman Jul 2001 B1
6254096 Grauzer et al. Jul 2001 B1
6254484 McCrea, Jr. Jul 2001 B1
6257981 Acres et al. Jul 2001 B1
6267248 Johnson et al. Jul 2001 B1
6267648 Katayama et al. Jul 2001 B1
6267671 Hogan Jul 2001 B1
6270404 Sines Aug 2001 B2
6272223 Carlson Aug 2001 B1
6293546 Hessing et al. Sep 2001 B1
6293864 Romero Sep 2001 B1
6299167 Sines Oct 2001 B1
6299534 Breeding et al. Oct 2001 B1
6299536 Hill Oct 2001 B1
6308886 Benson et al. Oct 2001 B1
6313871 Schubert Nov 2001 B1
6325373 Breeding et al. Dec 2001 B1
6334614 Breeding Jan 2002 B1
6341778 Lee Jan 2002 B1
6342830 Want et al. Jan 2002 B1
6346044 McCrea, Jr. Feb 2002 B1
6361044 Block Mar 2002 B1
6386973 Yoseloff May 2002 B1
6402142 Warren et al. Jun 2002 B1
6403908 Stardust et al. Jun 2002 B2
6443839 Stockdale et al. Sep 2002 B2
6446864 Kim et al. Sep 2002 B1
6454266 Breeding et al. Sep 2002 B1
6460848 Soltys et al. Oct 2002 B1
6464584 Oliver Oct 2002 B2
6490277 Tzotzkov Dec 2002 B1
6508709 Karmarkar Jan 2003 B1
6514140 Starch Feb 2003 B1
6517435 Soltys et al. Feb 2003 B2
6517436 Soltys et al. Feb 2003 B2
6520857 Soltys et al. Feb 2003 B2
6527271 Soltys et al. Mar 2003 B2
6530836 Soltys et al. Mar 2003 B2
6530837 Soltys et al. Mar 2003 B2
6532297 Lindquist Mar 2003 B1
6533276 Soltys et al. Mar 2003 B2
6533662 Soltys et al. Mar 2003 B2
6561897 Bourbour et al. May 2003 B1
6568678 Breeding et al. May 2003 B2
6579180 Soltys et al. Jun 2003 B2
6579181 Soltys et al. Jun 2003 B2
6581747 Charlier et al. Jun 2003 B1
6582301 Hill Jun 2003 B2
6582302 Romero Jun 2003 B2
6585586 Romero Jul 2003 B1
6585588 Hard Jul 2003 B2
6585856 Zwick et al. Jul 2003 B2
6588750 Grauzer et al. Jul 2003 B1
6588751 Grauzer et al. Jul 2003 B1
6595857 Soltys et al. Jul 2003 B2
6609710 Order Aug 2003 B1
6612928 Bradford et al. Sep 2003 B1
6616535 Nishizaki et al. Sep 2003 B1
6619662 Miller Sep 2003 B2
6622185 Johnson et al. Sep 2003 B1
6626757 Oliveras Sep 2003 B2
6629019 Legge et al. Sep 2003 B2
6629591 Griswold et al. Oct 2003 B1
6629889 Mothwurf Oct 2003 B2
6629894 Purton Oct 2003 B1
6637622 Robinson Oct 2003 B1
6638161 Soltys et al. Oct 2003 B2
6645068 Kelly et al. Nov 2003 B1
6645077 Rowe Nov 2003 B2
6651981 Grauzer et al. Nov 2003 B2
6651982 Grauzer et al. Nov 2003 B2
6651985 Sines Nov 2003 B2
6652379 Soltys et al. Nov 2003 B2
6655684 Grauzer et al. Dec 2003 B2
6655690 Osicwarek Dec 2003 B1
6658135 Morito et al. Dec 2003 B1
6659460 Blaha et al. Dec 2003 B2
6659461 Yoseloff Dec 2003 B2
6659875 Purton Dec 2003 B2
6663490 Soltys et al. Dec 2003 B2
6666768 Akers Dec 2003 B1
6671358 Seidman et al. Dec 2003 B1
6676127 Johnson et al. Jan 2004 B2
6676517 Beavers Jan 2004 B2
6680843 Farrow et al. Jan 2004 B2
6685564 Oliver Feb 2004 B2
6685567 Cockerille et al. Feb 2004 B2
6685568 Soltys et al. Feb 2004 B2
6688597 Jones Feb 2004 B2
6688979 Soltys et al. Feb 2004 B2
6690673 Jarvis Feb 2004 B1
6698756 Baker et al. Mar 2004 B1
6698759 Webb et al. Mar 2004 B2
6702289 Feola Mar 2004 B1
6702290 Buono-Correa et al. Mar 2004 B2
6709333 Bradford et al. Mar 2004 B1
6712696 Soltys et al. Mar 2004 B2
6719288 Hessing et al. Apr 2004 B2
6719634 Mishina et al. Apr 2004 B2
6722974 Sines Apr 2004 B2
6726205 Purton Apr 2004 B1
6732067 Powderly May 2004 B1
6733012 Bui et al. May 2004 B2
6733388 Mothwurf May 2004 B2
6746333 Onda et al. Jun 2004 B1
6747560 Stevens, III Jun 2004 B2
6749510 Giobbi Jun 2004 B2
6758751 Soltys et al. Jul 2004 B2
6758757 Luciano, Jr. et al. Jul 2004 B2
6769693 Huard et al. Aug 2004 B2
6774782 Runyon et al. Aug 2004 B2
6789801 Snow Sep 2004 B2
6802510 Haber Oct 2004 B1
6804763 Stockdale et al. Oct 2004 B1
6808173 Snow Oct 2004 B2
6827282 Silverbrook Dec 2004 B2
6834251 Fletcher Dec 2004 B1
6840517 Snow Jan 2005 B2
6842263 Saeki Jan 2005 B1
6843725 Nelson Jan 2005 B2
6848616 Tsirline et al. Feb 2005 B2
6848844 McCue, Jr. et al. Feb 2005 B2
6848994 Knust et al. Feb 2005 B1
6857961 Soltys et al. Feb 2005 B2
6874784 Promutico et al. Apr 2005 B1
6874786 Bruno Apr 2005 B2
6877657 Ranard et al. Apr 2005 B2
6877748 Patroni et al. Apr 2005 B1
6886829 Hessing et al. May 2005 B2
6889979 Blaha et al. May 2005 B2
6893347 Zilliacus et al. May 2005 B1
6899628 Leen et al. May 2005 B2
6902167 Webb Jun 2005 B2
6905121 Timpano Jun 2005 B1
6923446 Snow Aug 2005 B2
6938900 Snow Sep 2005 B2
6941180 Fisher et al. Sep 2005 B1
6950948 Neff Sep 2005 B2
6955599 Bourbour et al. Oct 2005 B2
6957746 Martin et al. Oct 2005 B2
6959925 Baker et al. Nov 2005 B1
6960134 Hartl et al. Nov 2005 B2
6964612 Soltys et al. Nov 2005 B2
6986514 Snow Jan 2006 B2
6988516 Debaes Jan 2006 B2
7011309 Soltys et al. Mar 2006 B2
7020307 Hinton et al. Mar 2006 B2
7028598 Teshima Apr 2006 B2
7029009 Grauzer et al. Apr 2006 B2
7036818 Grauzer et al. May 2006 B2
7046458 Nakayama May 2006 B2
7046764 Kump May 2006 B1
7048629 Sines May 2006 B2
7059602 Grauzer et al. Jun 2006 B2
7066464 Blad et al. Jun 2006 B2
7068822 Scott Jun 2006 B2
7073791 Grauzer et al. Jul 2006 B2
7079010 Champlin Jul 2006 B2
7084769 Bauer et al. Aug 2006 B2
7089420 Durst et al. Aug 2006 B1
D527900 Dewa Sep 2006 S
7106201 Tuttle Sep 2006 B2
7113094 Garber et al. Sep 2006 B2
7114718 Grauzer et al. Oct 2006 B2
7124947 Starch Oct 2006 B2
7128652 Lavoie et al. Oct 2006 B1
7137627 Grauzer et al. Nov 2006 B2
7139108 Andersen et al. Nov 2006 B2
7140614 Snow Nov 2006 B2
7162035 Durst et al. Jan 2007 B1
7165769 Crenshaw et al. Jan 2007 B2
7165770 Snow Jan 2007 B2
7175522 Hartl Feb 2007 B2
7186181 Rowe Mar 2007 B2
7201656 Darder Apr 2007 B2
7202888 Tecu et al. Apr 2007 B2
7203841 Jackson et al. Apr 2007 B2
7213812 Schubert May 2007 B2
7222852 Soltys May 2007 B2
7222855 Sorge May 2007 B2
7231812 Lagare Jun 2007 B1
7234698 Grauzer et al. Jun 2007 B2
7237969 Bartman Jul 2007 B2
7243148 Keir et al. Jul 2007 B2
7243698 Siegel Jul 2007 B2
7246799 Snow Jul 2007 B2
7255344 Grauzer et al. Aug 2007 B2
7255351 Yoseloff et al. Aug 2007 B2
7255642 Sines et al. Aug 2007 B2
7257630 Cole et al. Aug 2007 B2
7261294 Grauzer et al. Aug 2007 B2
7264241 Schubert et al. Sep 2007 B2
7264243 Yoseloff et al. Sep 2007 B2
7277570 Armstrong Oct 2007 B2
7278923 Grauzer et al. Oct 2007 B2
7294056 Lowell et al. Nov 2007 B2
7297062 Gatto et al. Nov 2007 B2
7300056 Gioia et al. Nov 2007 B2
7303473 Rowe Dec 2007 B2
7303475 Britt et al. Dec 2007 B2
7309065 Yoseloff et al. Dec 2007 B2
7316609 Dunn et al. Jan 2008 B2
7316615 Soltys et al. Jan 2008 B2
7322576 Grauzer et al. Jan 2008 B2
7331579 Snow Feb 2008 B2
7334794 Snow Feb 2008 B2
7338044 Grauzer et al. Mar 2008 B2
7338362 Gallagher Mar 2008 B1
7341510 Bourbour et al. Mar 2008 B2
D566784 Palmer Apr 2008 S
7357321 Yoshida Apr 2008 B2
7360094 Neff Apr 2008 B2
7367561 Blaha et al. May 2008 B2
7367563 Yoseloff et al. May 2008 B2
7367565 Chiu May 2008 B2
7367884 Breeding et al. May 2008 B2
7374170 Grauzer et al. May 2008 B2
7384044 Grauzer et al. Jun 2008 B2
7387300 Snow Jun 2008 B2
7389990 Mourad Jun 2008 B2
7390256 Soltys et al. Jun 2008 B2
7399226 Mishra Jul 2008 B2
7407438 Schubert et al. Aug 2008 B2
7413191 Grauzer et al. Aug 2008 B2
7434805 Grauzer et al. Oct 2008 B2
7436957 Fisher et al. Oct 2008 B1
7448626 Fleckenstein Nov 2008 B2
7458582 Snow Dec 2008 B2
7461843 Baker et al. Dec 2008 B1
7464932 Darling Dec 2008 B2
7464934 Schwartz Dec 2008 B2
7472906 Shai Jan 2009 B2
7478813 Hofferber et al. Jan 2009 B1
7500672 Ho Mar 2009 B2
7506874 Hall Mar 2009 B2
7510186 Fleckenstein Mar 2009 B2
7510190 Snow Mar 2009 B2
7510194 Soltys et al. Mar 2009 B2
7510478 Benbrahim et al. Mar 2009 B2
7513437 Douglas Apr 2009 B2
7515718 Nguyen et al. Apr 2009 B2
7523935 Grauzer et al. Apr 2009 B2
7523936 Grauzer et al. Apr 2009 B2
7523937 Fleckenstein Apr 2009 B2
7525510 Beland et al. Apr 2009 B2
7537216 Soltys et al. May 2009 B2
7540497 Tseng Jun 2009 B2
7540498 Crenshaw et al. Jun 2009 B2
7549643 Quach Jun 2009 B2
7554753 Wakamiya Jun 2009 B2
7556197 Yoshida Jul 2009 B2
7556266 Blaha et al. Jul 2009 B2
7575237 Snow Aug 2009 B2
7578506 Lambert Aug 2009 B2
7584962 Breeding et al. Sep 2009 B2
7584963 Krenn et al. Sep 2009 B2
7584966 Snow Sep 2009 B2
7591728 Gioia et al. Sep 2009 B2
7593544 Downs Sep 2009 B2
7594660 Baker et al. Sep 2009 B2
7597623 Grauzer et al. Oct 2009 B2
7644923 Dickinson et al. Jan 2010 B1
7661676 Smith et al. Feb 2010 B2
7666090 Hettinger Feb 2010 B2
7669852 Baker et al. Mar 2010 B2
7669853 Jones Mar 2010 B2
7677565 Grauzer et al. Mar 2010 B2
7677566 Krenn et al. Mar 2010 B2
7686681 Soltys et al. Mar 2010 B2
7699694 Hill Apr 2010 B2
7735657 Johnson Jun 2010 B2
7740244 Ho Jun 2010 B2
7744452 Cimring et al. Jun 2010 B2
7753373 Grauzer et al. Jul 2010 B2
7753374 Ho Jul 2010 B2
7753798 Soltys Jul 2010 B2
7758425 Poh et al. Jul 2010 B2
7762554 Ho Jul 2010 B2
7764836 Downs Jul 2010 B2
7766332 Grauzer et al. Aug 2010 B2
7766333 Stardust Aug 2010 B1
7769232 Downs Aug 2010 B2
7769853 Nezamzadeh Aug 2010 B2
7773749 Durst et al. Aug 2010 B1
7780529 Rowe et al. Aug 2010 B2
7784790 Grauzer et al. Aug 2010 B2
7804982 Howard et al. Sep 2010 B2
7846020 Walker et al. Dec 2010 B2
7867080 Nicely et al. Jan 2011 B2
7890365 Hettinger Feb 2011 B2
7900923 Toyama et al. Mar 2011 B2
7901285 Tran et al. Mar 2011 B2
7908169 Hettinger Mar 2011 B2
7909689 Lardie Mar 2011 B2
7931533 LeMay et al. Apr 2011 B2
7933448 Downs, III Apr 2011 B2
7946586 Krenn et al. May 2011 B2
7967294 Blaha et al. Jun 2011 B2
7976023 Hessing et al. Jul 2011 B1
7988152 Sines Aug 2011 B2
7988554 LeMay et al. Aug 2011 B2
7995196 Fraser Aug 2011 B1
8002638 Grauzer et al. Aug 2011 B2
8011661 Stasson Sep 2011 B2
8016663 Soltys et al. Sep 2011 B2
8021231 Walker et al. Sep 2011 B2
8025294 Grauzer et al. Sep 2011 B2
8038521 Grauzer et al. Oct 2011 B2
RE42944 Blaha et al. Nov 2011 E
8057302 Wells et al. Nov 2011 B2
8062134 Kelly et al. Nov 2011 B2
8070574 Grauzer et al. Dec 2011 B2
8092307 Kelly Jan 2012 B2
8092309 Bickley Jan 2012 B2
8109514 Toyama Feb 2012 B2
8141875 Grauzer et al. Mar 2012 B2
8150158 Downs, III Apr 2012 B2
8171567 Fraser et al. May 2012 B1
8210536 Blaha et al. Jul 2012 B2
8221244 French Jul 2012 B2
8251293 Nagata et al. Aug 2012 B2
8267404 Grauzer et al. Sep 2012 B2
8270603 Durst et al. Sep 2012 B1
8287347 Snow et al. Oct 2012 B2
8287386 Miller et al. Oct 2012 B2
8319666 Weinmann et al. Nov 2012 B2
8337296 Grauzer et al. Dec 2012 B2
8342525 Scheper et al. Jan 2013 B2
8342526 Sampson Jan 2013 B1
8342529 Snow Jan 2013 B2
8353513 Swanson Jan 2013 B2
8381918 Johnson Feb 2013 B2
8419521 Grauzer et al. Apr 2013 B2
8429229 Sepich et al. Apr 2013 B2
8444147 Grauzer et al. May 2013 B2
8444489 Lian et al. May 2013 B2
8469360 Sines Jun 2013 B2
8475252 Savage et al. Jul 2013 B2
8480088 Toyama et al. Jul 2013 B2
8485527 Sampson et al. Jul 2013 B2
8490973 Yoseloff et al. Jul 2013 B2
8498444 Sharma Jul 2013 B2
8505916 Grauzer et al. Aug 2013 B2
8511684 Grauzer et al. Aug 2013 B2
8512146 Gururajan et al. Aug 2013 B2
8548327 Hirth et al. Oct 2013 B2
8556263 Grauzer et al. Oct 2013 B2
8579289 Rynda et al. Nov 2013 B2
8602416 Toyama Dec 2013 B2
8616552 Czyzewski et al. Dec 2013 B2
8628086 Krenn et al. Jan 2014 B2
8651485 Stasson Feb 2014 B2
8662500 Swanson Mar 2014 B2
8695978 Ho Apr 2014 B1
8702100 Snow et al. Apr 2014 B2
8702101 Scheper et al. Apr 2014 B2
8720891 Hessing et al. May 2014 B2
8758111 Lutnick Jun 2014 B2
8777710 Grauzer et al. Jul 2014 B2
8820745 Grauzer et al. Sep 2014 B2
8844930 Sampson Sep 2014 B2
8899587 Grauzer et al. Dec 2014 B2
8919775 Wadds et al. Dec 2014 B2
9101821 Snow Aug 2015 B2
9251661 Tammesoo Feb 2016 B2
9266012 Grauzer Feb 2016 B2
9280866 Nayak et al. Mar 2016 B2
9474957 Haushalter et al. Oct 2016 B2
9504905 Kelly et al. Nov 2016 B2
9511274 Kelly Dec 2016 B2
9566501 Stasson et al. Feb 2017 B2
9731190 Sampson et al. Aug 2017 B2
20010036231 Easwar et al. Nov 2001 A1
20010036866 Stockdale et al. Nov 2001 A1
20010054576 Stardust et al. Dec 2001 A1
20020017481 Johnson et al. Feb 2002 A1
20020030425 Tiramani et al. Mar 2002 A1
20020045478 Soltys et al. Apr 2002 A1
20020045481 Soltys et al. Apr 2002 A1
20020063389 Breeding et al. May 2002 A1
20020068635 Hill Jun 2002 A1
20020070499 Breeding et al. Jun 2002 A1
20020094869 Harkham Jul 2002 A1
20020107067 McGlone et al. Aug 2002 A1
20020107072 Giobbi Aug 2002 A1
20020113368 Hessing et al. Aug 2002 A1
20020135692 Fujinawa Sep 2002 A1
20020142820 Bartlett Oct 2002 A1
20020155869 Soltys et al. Oct 2002 A1
20020163122 Vancura Nov 2002 A1
20020163125 Grauzer et al. Nov 2002 A1
20020187821 Soltys et al. Dec 2002 A1
20020187830 Rockdale et al. Dec 2002 A1
20030003997 Vuong et al. Jan 2003 A1
20030007143 McArthur et al. Jan 2003 A1
20030042673 Grauzer Mar 2003 A1
20030047870 Blaha et al. Mar 2003 A1
20030048476 Yamakawa Mar 2003 A1
20030052449 Grauzer et al. Mar 2003 A1
20030052450 Grauzer et al. Mar 2003 A1
20030064798 Grauzer et al. Apr 2003 A1
20030067112 Grauzer et al. Apr 2003 A1
20030071413 Blaha et al. Apr 2003 A1
20030073498 Grauzer et al. Apr 2003 A1
20030075865 Grauzer et al. Apr 2003 A1
20030075866 Blaha et al. Apr 2003 A1
20030087694 Starch May 2003 A1
20030090059 Grauzer et al. May 2003 A1
20030094756 Grauzer et al. May 2003 A1
20030151194 Hessing et al. Aug 2003 A1
20030195025 Hill Oct 2003 A1
20040015423 Walker et al. Jan 2004 A1
20040036214 Baker et al. Feb 2004 A1
20040067789 Grauzer et al. Apr 2004 A1
20040100026 Haggard May 2004 A1
20040108654 Grauzer et al. Jun 2004 A1
20040116179 Nicely et al. Jun 2004 A1
20040169332 Grauzer et al. Sep 2004 A1
20040180722 Giobbi Sep 2004 A1
20040224777 Smith et al. Nov 2004 A1
20040245720 Grauzer et al. Dec 2004 A1
20040259618 Soltys et al. Dec 2004 A1
20050012671 Bisig Jan 2005 A1
20050012818 Kiely et al. Jan 2005 A1
20050023752 Grauzer et al. Feb 2005 A1
20050026680 Gururajan Feb 2005 A1
20050035548 Yoseloff Feb 2005 A1
20050037843 Wells et al. Feb 2005 A1
20050040594 Krenn et al. Feb 2005 A1
20050051955 Schubert et al. Mar 2005 A1
20050051956 Grauzer et al. Mar 2005 A1
20050062227 Grauzer et al. Mar 2005 A1
20050062228 Grauzer et al. Mar 2005 A1
20050062229 Grauzer et al. Mar 2005 A1
20050082750 Grauzer et al. Apr 2005 A1
20050093231 Grauzer et al. May 2005 A1
20050104289 Grauzer et al. May 2005 A1
20050104290 Grauzer et al. May 2005 A1
20050110210 Soltys et al. May 2005 A1
20050113166 Grauzer et al. May 2005 A1
20050113171 Hodgson May 2005 A1
20050119048 Soltys Jun 2005 A1
20050121852 Soltys et al. Jun 2005 A1
20050137005 Soltys et al. Jun 2005 A1
20050140090 Breeding et al. Jun 2005 A1
20050146093 Grauzer et al. Jul 2005 A1
20050148391 Tain Jul 2005 A1
20050164759 Smith et al. Jul 2005 A1
20050164761 Tain Jul 2005 A1
20050192092 Breckner et al. Sep 2005 A1
20050206077 Grauzer et al. Sep 2005 A1
20050242500 Downs Nov 2005 A1
20050272501 Tran et al. Dec 2005 A1
20050277463 Knust et al. Dec 2005 A1
20050288083 Downs Dec 2005 A1
20050288086 Schubert et al. Dec 2005 A1
20060027970 Kyrychenko Feb 2006 A1
20060033269 Grauzer et al. Feb 2006 A1
20060033270 Grauzer et al. Feb 2006 A1
20060046853 Black Mar 2006 A1
20060063577 Downs, III et al. Mar 2006 A1
20060066048 Krenn et al. Mar 2006 A1
20060151946 Ngai Jul 2006 A1
20060181022 Grauzer et al. Aug 2006 A1
20060183540 Grauzer et al. Aug 2006 A1
20060189381 Daniel et al. Aug 2006 A1
20060199649 Soltys et al. Sep 2006 A1
20060205508 Green Sep 2006 A1
20060220312 Baker et al. Oct 2006 A1
20060220313 Baker et al. Oct 2006 A1
20060252521 Gururajan et al. Nov 2006 A1
20060252554 Gururajan et al. Nov 2006 A1
20060279040 Downs et al. Dec 2006 A1
20060281534 Grauzer et al. Dec 2006 A1
20070001395 Gioia et al. Jan 2007 A1
20070006708 Laakso Jan 2007 A1
20070015583 Tran Jan 2007 A1
20070018389 Downs, III Jan 2007 A1
20070045959 Soltys Mar 2007 A1
20070049368 Kuhn et al. Mar 2007 A1
20070057469 Grauzer et al. Mar 2007 A1
20070066387 Matsuno et al. Mar 2007 A1
20070069462 Downs, III et al. Mar 2007 A1
20070072677 Lavoie et al. Mar 2007 A1
20070102879 Stasson May 2007 A1
20070111773 Gururajan et al. May 2007 A1
20070148283 Harvey et al. Jun 2007 A1
20070184905 Gatto et al. Aug 2007 A1
20070197294 Gong Aug 2007 A1
20070197298 Rowe Aug 2007 A1
20070202941 Miltenberger et al. Aug 2007 A1
20070222147 Blaha et al. Sep 2007 A1
20070225055 Weisman Sep 2007 A1
20070233567 Daly Oct 2007 A1
20070238506 Ruckle Oct 2007 A1
20070259709 Kelly et al. Nov 2007 A1
20070267812 Grauzer et al. Nov 2007 A1
20070272600 Johnson Nov 2007 A1
20070278739 Swanson Dec 2007 A1
20070287534 Fleckenstein Dec 2007 A1
20070290438 Grauzer et al. Dec 2007 A1
20080004107 Nguyen et al. Jan 2008 A1
20080006997 Scheper et al. Jan 2008 A1
20080006998 Grauzer et al. Jan 2008 A1
20080022415 Kuo et al. Jan 2008 A1
20080032763 Giobbi Feb 2008 A1
20080039192 Laut Feb 2008 A1
20080039208 Abrink et al. Feb 2008 A1
20080096656 LeMay et al. Apr 2008 A1
20080111300 Czyzewski et al. May 2008 A1
20080113700 Czyzewski et al. May 2008 A1
20080136108 Polay Jun 2008 A1
20080143048 Shigeta Jun 2008 A1
20080176627 Lardie Jul 2008 A1
20080217218 Johnson Sep 2008 A1
20080234046 Kinsley Sep 2008 A1
20080234047 Nguyen Sep 2008 A1
20080248875 Beatty Oct 2008 A1
20080284096 Toyama et al. Nov 2008 A1
20080303210 Grauzer et al. Dec 2008 A1
20080315517 Toyama et al. Dec 2008 A1
20090026700 Shigeta Jan 2009 A2
20090048026 French Feb 2009 A1
20090054161 Schuber et al. Feb 2009 A1
20090072477 Tseng et al. Mar 2009 A1
20090121429 Walsh et al. Mar 2009 A1
20090091078 Grauzer et al. Apr 2009 A1
20090100409 Toneguzzo Apr 2009 A1
20090104963 Burman Apr 2009 A1
20090134575 Dickinson et al. May 2009 A1
20090140492 Yoseloff et al. Jun 2009 A1
20090166970 Rosh et al. Jul 2009 A1
20090176547 Katz Jul 2009 A1
20090179378 Amaitis et al. Jul 2009 A1
20090186676 Amaitis et al. Jul 2009 A1
20090189346 Krenn et al. Jul 2009 A1
20090191933 French Jul 2009 A1
20090194988 Wright et al. Aug 2009 A1
20090197662 Wright et al. Aug 2009 A1
20090224476 Grauzer et al. Sep 2009 A1
20090227318 Wright et al. Sep 2009 A1
20090227360 Gioia et al. Sep 2009 A1
20090250873 Jones Oct 2009 A1
20090253478 Walker et al. Oct 2009 A1
20090253503 Krise et al. Oct 2009 A1
20090267296 Ho et al. Oct 2009 A1
20090267297 Blaha et al. Oct 2009 A1
20090283969 Tseng et al. Nov 2009 A1
20090298577 Gagner et al. Dec 2009 A1
20090302535 Ho et al. Dec 2009 A1
20090302537 Ho et al. Dec 2009 A1
20090312093 Walker et al. Dec 2009 A1
20090314188 Toyama et al. Dec 2009 A1
20100013152 Grauzer Jan 2010 A1
20100038849 Scheper et al. Feb 2010 A1
20100048304 Boesen Feb 2010 A1
20100069155 Schwartz et al. Mar 2010 A1
20100178987 Pacey Jul 2010 A1
20100197410 Leen et al. Aug 2010 A1
20100234110 Clarkson Sep 2010 A1
20100240440 Szrek et al. Sep 2010 A1
20100244376 Johnson Sep 2010 A1
20100244382 Snow Sep 2010 A1
20100252992 Sines Oct 2010 A1
20100255899 Paulsen Oct 2010 A1
20100276880 Grauzer et al. Nov 2010 A1
20100311493 Miller et al. Dec 2010 A1
20100311494 Miller et al. Dec 2010 A1
20100314830 Grauzer et al. Dec 2010 A1
20100320685 Grauzer Dec 2010 A1
20110006480 Grauzer Jan 2011 A1
20110012303 Kourgiantakis et al. Jan 2011 A1
20110024981 Tseng Feb 2011 A1
20110052049 Rajaraman et al. Mar 2011 A1
20110062662 Ohta Mar 2011 A1
20110078096 Bounds Mar 2011 A1
20110079959 Hartley Apr 2011 A1
20110105208 Bickley May 2011 A1
20110109042 Rynda May 2011 A1
20110130185 Walker Jun 2011 A1
20110130190 Hamman et al. Jun 2011 A1
20110159952 Kerr Jun 2011 A1
20110159953 Kerr Jun 2011 A1
20110165936 Kerr Jul 2011 A1
20110172008 Alderucci Jul 2011 A1
20110183748 Wilson et al. Jul 2011 A1
20110230268 Williams Sep 2011 A1
20110269529 Baerlocher Nov 2011 A1
20110272881 Sines Nov 2011 A1
20110285081 Stasson Nov 2011 A1
20110287829 Clarkson et al. Nov 2011 A1
20120015724 Ocko et al. Jan 2012 A1
20120015725 Ocko et al. Jan 2012 A1
20120015743 Lam et al. Jan 2012 A1
20120015747 Ocko et al. Jan 2012 A1
20120021835 Keller et al. Jan 2012 A1
20120034977 Kammler Feb 2012 A1
20120062745 Han et al. Mar 2012 A1
20120074646 Grauzer et al. Mar 2012 A1
20120091656 Blaha et al. Apr 2012 A1
20120095982 Lennington et al. Apr 2012 A1
20120161393 Krenn et al. Jun 2012 A1
20120175841 Grauzer Jul 2012 A1
20120181747 Grauzer et al. Jul 2012 A1
20120187625 Downs, III et al. Jul 2012 A1
20120242782 Huang Sep 2012 A1
20120286471 Grauzer et al. Nov 2012 A1
20120306152 Krishnamurty et al. Dec 2012 A1
20130020761 Sines Jan 2013 A1
20130023318 Abrahamson Jan 2013 A1
20130085638 Weinmann et al. Apr 2013 A1
20130099448 Scheper et al. Apr 2013 A1
20130109455 Grauzer et al. May 2013 A1
20130132306 Kami et al. May 2013 A1
20130147116 Stasson Jun 2013 A1
20130161905 Grauzer et al. Jun 2013 A1
20130228972 Grauzer et al. Sep 2013 A1
20130241147 McGrath Sep 2013 A1
20130300059 Sampson et al. Nov 2013 A1
20130337922 Kuhn Dec 2013 A1
20140027979 Stasson et al. Jan 2014 A1
20140094239 Grauzer et al. Apr 2014 A1
20140103606 Grauzer et al. Apr 2014 A1
20140138907 Rynda et al. May 2014 A1
20140145399 Krenn et al. May 2014 A1
20140171170 Krishnamurty et al. Jun 2014 A1
20140175724 Huhtala et al. Jun 2014 A1
20140183818 Czyzewski et al. Jul 2014 A1
20150021242 Johnson Jan 2015 A1
20150069699 Blazevic Mar 2015 A1
20150196834 Snow Jul 2015 A1
20150238848 Kuhn et al. Aug 2015 A1
20170157499 Krenn et al. Jun 2017 A1
Foreign Referenced Citations (80)
Number Date Country
2383667 Jan 1969 AU
5025479 Mar 1980 AU
697805 Oct 1998 AU
757636 Feb 2003 AU
2266555 Sep 1996 CA
2284017 Sep 1998 CA
2612138 Dec 2006 CA
2051521 Jan 1990 CN
1824356 Aug 2006 CN
2848303 Dec 2006 CN
2855481 Jan 2007 CN
101025603 Aug 2007 CN
200954370 Oct 2007 CN
200987893 Dec 2007 CN
101099896 Jan 2008 CN
101127131 Feb 2008 CN
101134141 Mar 2008 CN
201085907 Jul 2008 CN
201139926 Oct 2008 CN
100571826 Dec 2009 CN
1771077 Jun 2010 CN
102125756 Jul 2011 CN
102170944 Aug 2011 CN
101783011 Dec 2011 CN
102847311 Jan 2013 CN
202983149 Jun 2013 CN
24952 Feb 2013 CZ
3807127 Sep 1989 DE
2757341 Sep 1998 DE
177514 Feb 2000 EP
1502631 Feb 2005 EP
1713026 Oct 2006 EP
1194888 Aug 2009 EP
2228106 Sep 2010 EP
1575261 Aug 2012 EP
2375918 Jul 1978 FR
337147 Sep 1929 GB
414014 Jul 1934 GB
672616 May 1952 GB
10063933 Mar 1998 JP
11045321 Feb 1999 JP
2000251031 Sep 2000 JP
2001327647 Nov 2001 JP
2002165916 Jun 2002 JP
2003-154320 May 2003 JP
2003250950 Sep 2003 JP
2005198668 Jul 2005 JP
2008246061 Oct 2008 JP
4586474 Nov 2010 JP
M359356 Jun 2009 TW
I345476 Jul 2011 TW
8700764 Feb 1987 WO
9221413 Dec 1992 WO
9528210 Oct 1995 WO
9607153 Mar 1996 WO
9710577 Mar 1997 WO
9814249 Apr 1998 WO
9840136 Sep 1998 WO
9943404 Sep 1999 WO
9952610 Oct 1999 WO
9952611 Oct 1999 WO
200051076 Aug 2000 WO
156670 Aug 2001 WO
178854 Oct 2001 WO
205914 Jan 2002 WO
3026763 Apr 2003 WO
2004067889 Dec 2004 WO
2004112923 Dec 2004 WO
2006031472 Mar 2006 WO
2006039308 Apr 2006 WO
2008005286 Jan 2008 WO
2008006023 Jan 2008 WO
2008091809 Jul 2008 WO
2009067758 Jun 2009 WO
2009137541 Nov 2009 WO
2010001032 Jan 2010 WO
2010052573 May 2010 WO
2010055328 May 2010 WO
2010117446 Oct 2010 WO
2013019677 Feb 2013 WO
Non-Patent Literature Citations (105)
Entry
Australian Examination Report for Australian Application No. 2013324026, dated Oct. 20, 2016, 3 pages.
Shuffle Master, Inc. (1996). Let It Ride, the Tournament, User Guide, 72 pages.
Chinese Office Action from Chinese Application No. 201380060943.7, dated May 18, 2015, 25 pages with English translation.
Chinese Office Action from Chinese Application No. 201380060943.7, dated Feb. 2, 2018, 18 pages with English translation.
U.S. Appl. No. 15/276,476, filed Sep. 26, 2016, titled “Devices, Systems, and Related Methods for Real-Time Monitoring and Display of Related Data for Casino Gaming Devices”, to Nagaragatta et al., 36 pages.
U.S. Appl. No. 15/365,610, filed Nov. 30, 2016, titled “Card Handling Devices and Related Assemblies and Components”, to Helsen et al., 62 pages.
Genevieve Orr, CS-449: Neural Networks Willamette University, http://www.willamette.edu/˜gorr/classes/cs449/intro.html (4 pages), Fall 1999.
http://www.google.com/search?tbm=pts&q=Card+handling+devicve+with+input+and+outpu . . . Jun. 8, 2012.
http://www.ildado.com/casino_glossary.html, Feb. 1, 2001, p. 1-8.
https://web.archive.org/web/19991004000323/http://travelwizardtravel.com/majon.htm, Oct. 4, 1999, 2 pages.
http://www.google.com/search?tbm=pts&q=shuffling+zone+onOopposite+site+of+input+. . . Jul. 18, 2012.
Litwiller, Dave, CCD vs. CMOS: Facts and Fiction reprinted from Jan. 2001 Issue of Photonics Spectra, Laurin Publishing Co. Inc. (4 pages).
Malaysian Patent Application Substantive Examination Adverse Report—Malaysian Patent Application Serial No. PI 20062710, dated May 9, 2009, 4 pages.
PCT International Preliminary Examination Report for International Patent Application No. PCT/US02/31105 dated Jul. 28, 2004, 9 pages.
PCT International Search Report for International Application No. PCT/US2003/015393, dated Oct. 6, 2003, 2 pages.
PCT International Search Report for PCT/US2005/034737 dated Apr. 7, 2006, 1 page. (WO06/039308).
PCT International Search Report for PCT/US2007/022894, dated 11 Jun. 2008, 3 pages.
PCT International Search Report and Written Opinion, PCT/US2012/48706, dated Oct. 16, 2012, 12 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT/US2010/001032, dated Jun. 16, 2010, 11 pages.
PCT International Search Report and Written Opinion for PCT/US07/15035, dated Sep. 29, 2008, 6 pages.
PCT International Search Report and Written Opinion for PCT/US07/15036, dated Sep. 23, 2008, 6 pages.
PCT International Search Report and Written Opinion, PCT Application No. PCT/US2015/051038, dated Jan. 22, 2016, 11 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT/US2008/007069, dated Sep. 8, 2008, 10 pages.
PCT International Search Report and Written Opinion, PCT Application No. PCT/US2015/022158, dated Jun. 17, 2015, 13 pages.
PCT International Search Report and Written Opinion for International Application No. PCT/US2007/023168, dated Sep. 12, 2008, 8 pages.
PCT International Search Report and Written Opinion, PCT Application No. PCT/US2015/040196, dated Jan. 15, 2016, 20 pages.
PCT International Search Report and Written Opinion, PCT Application No. PCT/US2013/062391, dated Dec. 17, 2013, 13 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT/US05/31400, dated Sep. 25, 2007, 12 pages.
PCT International Search Report and Written Opinion, PCT Application No. PCT/US2015/025420, dated Oct. 2, 2015, 15 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT/US13/59665, dated Apr. 25, 2014, 21 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT/IB2013/001756, dated Jan. 10, 2014, 7 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT/US11/59797, dated Mar. 27, 2012, 14 pages.
PCT International Search Report and Written Opinion for International Application No. PCT/US2007/022858, dated Mar. 7, 2008, 7 pages.
PCT International Search Report and Written Opinion for International Patent Application No. PCT/US2006/22911, dated Jun. 1, 2007, 6 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT/GB2011/051978, dated Jan. 17, 2012, 11 pages.
Philippines Patent Application Formality Examination Report—Philippines Patent Application No. 1-2006-000302, dated Jun. 13, 2006.
Press Release for Alliance Gaming Corp., Jul. 26, 2004—Alliance Gaming Announces Contract with Galaxy Macau for New MindPlay Baccarat Table Technology, 2 pages, http://biz.yahoo.com/prnews.
Scarne's Encyclopedia of Games by John Scarne, 1973, “Super Contract Bridge”, p. 153.
Shuffle Master Gaming, Service Manual, ACE™ Single Deck Card Shuffler, (1998), 63 pages.
Shuffle Master Gaming, Service Manual, Let It Ride Bonus® With Universal Keypad, 112 pages, © 2000 Shuffle Master, Inc.
Service Manual/User Manual for Single Deck Shufflers: BG1, BG2 and BG3 by Shuffle Master ©1997, 151 pages.
Singapore Patent Application Examination Report—Singapore Patent Application No. SE 2008 01914 A, dated Jun. 18, 2008, 9 pages.
SHFL Entertainment, Inc., Opening Claim Construction Brief, filed in Nevada District Court Case No. 2:12-cv-01782 with exhibits, Aug. 8, 2013, p. 1-125.
Shuffle Master's Reply Memorandum in Support of Shuffle Master's Motion for Preliminary Injunction for Shuffle Master, Inc. vs. VendingData Corporation, in the U.S. District Court, District of Nevada, No. CV-S-04-1373-JCM-LRL, Nov. 29, 2004.
Statement of Relevance of Cited References, Submitted as Part of a Third-Party Submission Under 37 CFR 1.290 on Dec. 7, 2012 (12 pages).
Google Search for card handling device with storage area, card removing system pivoting arm and processor; http://www.google.com/?tbrn=pts&hl=en; Jul. 28, 2012, 2 pages.
Tracking the Tables, by Jack Bularsky, Casino Journal, May 2004, vol. 17, No. 5, pp. 44-47.
United States Court of Appeals for the Federal Circuit Decision, decided Dec. 27, 2005, for Preliminary Injunction for Shuffle Master, Inc. vs. VendingData Corporation, In the U.S. District Court, District of Nevada, No. CV-S-04-1373-JCM-LRL.
VendingData Corporation's Answer and Counterclaim Jury Trial Demanded for Shuffle Master, Inc. vs. VendingData Corporation, In the U.S. District Court, District of Nevada, No. CV-S-04-1373-JCM-LRL, Oct. 25, 2004.
VendingData Corporation's Opposition to Shuffle Master Inc.'s Motion for Preliminary Injunction for Shuffle Master, Inc. vs. VendingData Corporation, In the U.S. District Court, District of Nevada, No. CV-S-04-1373-JCM-LRL, Nov. 12, 2004.
VendingData Corporation's Responses to Shuffle Master, Inc.'s First Set of Interrogatories for Shuffle Master, Inc. vs. VendingData Corporation, In the U.S. District Court, District of Nevada, No. CV-S-04-1373-JCM-LRL, Mar. 14, 2005.
1/3″ B/W CCD Camera Module EB100 by EverFocus Electronics Corp., Jul. 31, 2001, 3 pages.
“ACE, Single Deck Shuffler,” Shuffle Master, Inc., (2005), 2 pages.
ADVANSYS, “Player Tracking,” http://advansys.si/products/tablescanner/player-tracking/ [Sep. 23, 2016 1:41:34 PM], 4 pages.
Australian Examination Report for Australian Application No. 2008202752, dated Sep. 25, 2009, 2 pages.
Australian Examination Report for Australian Application No. 2010202856, dated Aug. 11, 2011, 2 pages.
Australian Provisional Patent Application for Australian Patent Application No. PM7441, filed Aug. 15, 1994, Applicants: Rodney G. Johnson et al., Title: Card Handling Apparatus, 13 pages.
“Automatic casino card shuffle,” Alibaba.com, (last visited Jul. 22, 2014), 2 pages.
Bally Systems Catalogue, Ballytech.com/systems, 2012, 13 pages.
Canadian Office Action for CA 2,580,309 dated Mar. 20, 2012 (6 pages).
Canadian Office Action for Canadian Application No. 2,461,726, dated Jul. 19, 2010, 3 pages.
Canadian Office Action for Canadian Application No. 2,461,726, dated Dec. 11, 2013, 3 pages.
Christos Stergiou and Dimitrios Siganos, “Neural Networks,” http://www.doc.ic.ac.uk/˜nd/surprise_96/journal/vol4/cs11/report.html (13 pages), Dec. 15, 2011.
Complaint filed in the matter of SHFL entertainment, Inc. v. DigiDeal Corporation, U.S. District Court, District of Nevada, Civil Action No. CV 2:12-cv-01782-GMC-VCF, Oct. 10, 2012, 62 pages.
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, scan of color pages for clarity, Part 18 of 23 (color copies from Binder 1).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 1 of 23 (Master Index and Binder 1, 1 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 2 of 23 (Master Index and Binder 1, 2 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 3 of 23 (Binder 2, 1 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 4 of 23 (Binder 2, 2 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, scan of color pages for clarity, Part 19 of 23 (color copies from Binder 3).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 5 of 23 (Binder 3, 1 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 6 of 23 (Binder 3, 2 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, scan of color pages for clarity, Part 20 of 23 (color copies from Binder 4).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 7 of 23 (Binder 4, 1 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 8 of 23 (Binder 4, 2 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, scan of color pages for clarity, Part 21 of 23 (color copies from Binder 6).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 9 of 23 (Binder 5 having no contents; Binder 6, 1 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 10 of 23 (Binder 6, 2 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 11 of 23 (Binder 7, 1 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 12 of 23 (Binder 7, 2 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 13 of 23 (Binder 8, 1 of 5).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, scan of color pages for clarity, Part 22 of 23 (color copies from Binder 8, part 1 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, scan of color pages for clarity, Part 23 of 23 (color copies from Binder 8, part 2 of 2).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 14 of 23 (Binder 8, 2 of 5).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 15 of 23 (Binder 8, 3 of 5).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 16 of 23 (Binder 8, 4 of 5).
Documents submitted in the case of Shuffle Master, Inc. v. Card Austria, et al., Case No. CV-N-0508-HDM-(VPC) (Consolidated with Case No. CV-N-02-0244-ERC-(RAM)), May 6, 2003, Part 17 of 23 (Binder 8, 5 of 5).
DVD labeled Exhibit 1. This is a DVD taken by Shuffle Master personnel of the live operation of a CARD One2Six Shuffler (Oct. 7, 2003). DVD sent to Examiner by US Postal Service.
DVD labeled Morrill Decl. Ex. A is (see Binder 4-1, p. 149/206, Morrill Decl., para. 2.): A video (16 minutes) that the attorney for CARD, Robert Morrill, made to describe the Roblejo prototype card shuffler. DVD sent to Examiner by US Postal Service.
DVD labeled Solberg Decl. Ex. C, which is not a video at all, is (see Binder 4-1, p. 34/206, Solberg Decl., para. 8): Computer source code for operating a computer-controlled card shuffler (an early Roblejo prototype card shuffler) and descriptive comments of how the code works. DVD sent to Examiner by US Postal Service.
DVD labeled Luciano Decl. Ex. K is (see Binder 2-1, p. 215/237, Luciano Decl., para. 14): A video demonstration (11 minutes) of a Luciano Packaging prototype shuffler. DVD sent to Examiner by US Postal Service.
European Search Report for European Application No. 12 152 303, dated Apr. 16, 2012, 3 pages.
European Patent Application Search Report—European Patent Application No. 06772987.1, dated Dec. 10, 2009, 5 pages.
European Examination Report for European Application No. 02 780 410, dated Jan. 25, 2010, 5 pages.
European Examination Report for European Application No. 02 780 410, dated Aug. 9, 2011, 4 pages.
“Error Back propagation,” http://willamette.edu˜gorr/classes/cs449/backprop.html (4 pages), Nov. 13, 2008.
“i-Deal,” Bally Technologies, Inc., (2014), 2 pages.
“Shufflers—SHFL entertainment,” Gaming Concepts Group, (2012), 6 pages.
“TAG Archives: Shuffle Machine,” Gee Wiz Online, (Mar. 25, 2013), 4 pages.
Weisenfeld, Bernie; Inventor betting on shuffler; Courier-Post; Sep. 11, 1990; 1 page.
Solberg, Halvard; Deposition; Shuffle Tech International v. Scientific Games Corp., et al. 1:15-cv-3702 (N.D. Ill.); Oct. 18, 2016; pp. 187, 224-246, 326-330, 338-339, 396; Baytowne Reporting; Panama City, FL.
Prototype Glossary and Timelines; Shuffle Tech International v. Scientific Games Corp., et al. 1:15-cv-3702 (N.D. Ill.); undated; pp. 1-4.
Olsen, Eddie; Automatic Shuffler ‘ready’ for Atlantic City experiment; Blackjack Confidential; Jul./Aug. 1989; pp. 6-7.
Gros, Roger; New Card Management System to Be Tested at Bally's Park Place; Casino Journal; Apr. 1989; 5 pages.
Gola, Steve; Deposition; Shuffle Tech International v. Scientific Games Corp., et al. 1:15-cv-3702 (N.D. Ill.); Oct. 13, 2016; pp. 1, 9-21, 30-69, 150-167, 186-188, 228-231, 290-315, 411; Henderson Legal Services, Inc.; Washington, DC.
Related Publications (1)
Number Date Country
20170173449 A1 Jun 2017 US
Continuations (1)
Number Date Country
Parent 14022160 Sep 2013 US
Child 15369312 US
Continuation in Parts (1)
Number Date Country
Parent 13631658 Sep 2012 US
Child 14022160 US