The present invention generally relates to an amusement game in which several players operate remote-controlled game objects, such as cars. The game may be coin-operated, but otherwise unattended. More specifically, the present disclosure relates to an amusement game that allows several of the game objects to be player controlled while those not player controlled are controlled by a control unit.
Presently, many different types of amusement games that include player-controlled movable game objects, such as cars, are available. One commercially available and successful amusement game is shown in U.S. Pat. No. 7,402,106. In the amusement game shown and described in the '106 patent, a series of cars are directed along a playfield by players positioned at one of a plurality of control stations. During game play, if fewer than the maximum number of players are involved, a control unit operates the remaining cars so that all of the cars are involved in each race. Although the control unit in that amusement game functions well in operating the computer-controlled cars during a race, the amusement game requires a very large number of sensing devices positioned both above the playfield and along the inner and outer perimeter edges of the playfield to determine the current position of each of the computer-controlled cars. Because of the large number of sensors required to determine the position of the cars during game play, such an amusement game is both expensive to manufacture and difficult to maintain. Additionally, the information regarding the position and orientation of the gaming pieces is inherently low resolution, which greatly limits the ability of the control unit to manipulate game objects on the playfield.
The present invention relates to an amusement game, and a method of operating an amusement game, that includes an image sensing device used to monitor game play and relay images to a control unit so that the control unit can control the operation of at least one of the game objects during game play. The amusement game includes one or more image sensing devices positioned such that the image sensing device can view the entire playfield of the amusement game. The image sensing device is in operative communication with the control unit and generates image scans of the playfield at a predetermined frame rate. During operation of the game, each of the image scans may include a visual representation of the game objects as the game objects move over the playfield. Based upon the position of the game objects on the playfield, the control unit can control the operation of at least one of the game objects.
In one embodiment, the control unit records a reference image of the playfield prior to the beginning of game play. The reference image shows the playfield before any game object is present. From the reference image, a mask can be derived that defines the area to be searched for cars.
After game play begins, the control unit records a series of sequential image scans and determines the position of the game objects within the current image scan. Preferably, the control unit subtracts the reference image from the current image scan such that only the game objects are left within the composite image. Based upon the composite image, the control unit identifies the location of each of the game objects along the playfield. Preferably, each of the game objects has a different color and the control unit distinguishes between the game objects and determines the position of each of the game objects based upon a color analysis algorithm. Once different blocks of color have been identified by the control unit, the control unit defines the outer edges of the color blocks and calculates the center of mass for each of the color blocks.
Once the location of each of the color blocks has been identified, the control unit determines the angle of orientation of each of the color blocks. Based upon the angle of orientation and the location of the color block along the playfield, the control unit determines the proper speed and steering angle for the game object to move the game object along the playfield. Once these parameters have been calculated, the control unit relays this information to each of the game objects under computer control to guide the game object along the playfield.
The drawings illustrate the best mode presently contemplated of carrying out the invention. In the drawings:
The object of the game is to drive the cars around the oval track as many times as possible during the playing time allowed. Each time a car completes a lap, the player is credited with one lap. The lap counts of the four cars are shown on the computer scoreboard 20. A computer-generated announcer's voice announces, through speakers 24, the progress of the race and the numbers of the cars in each position. At the end of the time allowed for a game, the car with the most laps is declared the winner.
In the game of the present disclosure, at the end of each race when the playing time is up, all of the cars are driven by the computer control unit to a Start/Finish Line, where the cars generally line up in position for the next race. Then, during the next race, the cars that are not being driven by a paying player are driven by the control unit as “drones”. The ability of the control unit to operate the cars not assigned to a paying player makes the game more interesting and challenging for the players, and prevents the cars from being in the way as stationary “obstacles” on the track. The computer driven drones typically drive laps around the oval track. If the computer controlled cars encounter an obstacle, or are hit by another car and are knocked out of position, the control unit automatically re-orients the drone cars and the cars resume making laps along with the paying players.
For small children and other players who have not acquired the skill needed for competitive racing, the control unit provides an option of computer-assisted driving. In the preferred embodiment of the present invention, three different skill levels are supported, although more or fewer levels are contemplated.
In the Beginner level, the paying player has control of the car's forward and reverse speed, but the control unit controls the car's steering system. Players can move the steering wheel 16 to the left and right, but the steering input is modified by the control unit to help the player. The control unit thus enables the players to drive laps around the track simply by operating the throttle 18. In the preferred embodiment of the invention, in the straightaways the player is allowed some limited side-to-side movement, to move toward the inside or outside guardrails, but not enough movement to run into the guardrails. If the car is knocked completely out of line by another car, the control unit may give the player control of the steering function long enough to get the car re-oriented.
In the Intermediate level, the player must enter the turns under their own control, but the control unit assists in straightening the car out of the turns until the car is proceeding properly down the next straightaway. The length of time the control unit retains control in a straightaway can be set by a game operator to allow for different skill levels at different game installation locations.
In the Expert skill level, players have full control of the steering at all times with no computer-assisted driving. In the Expert level, the maximum forward speed is also set to be the highest since it is assumed that expert players can either handle the car at full speed or are skilled enough to adjust their speed as necessary without help from the computer.
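By way of illustration only, the three assistance levels can be viewed as a blend of the player's steering input with a steering correction computed by the control unit. The sketch below is not taken from the actual control unit; the function names, the side-to-side limit, and the blend weights are assumptions chosen for the example.

    # Illustrative sketch of skill-based steering assistance. All names, limits,
    # and blend weights are assumptions, not values from the actual control unit.
    BEGINNER, INTERMEDIATE, EXPERT = 0, 1, 2

    def assisted_steering(level, player_input, correction, in_straightaway):
        """Blend the player's steering input (-1.0..+1.0) with the control
        unit's correction (-1.0..+1.0) according to the selected skill level."""
        if level == EXPERT:
            # Expert: full player control of the steering at all times.
            return player_input
        if level == BEGINNER:
            # Beginner: the control unit steers; the player is allowed only a
            # limited side-to-side offset, and only in the straightaways.
            offset = max(-0.2, min(0.2, player_input)) if in_straightaway else 0.0
            return max(-1.0, min(1.0, correction + offset))
        # Intermediate: the player drives the turns; the control unit helps
        # straighten the car once it is back on a straightaway.
        if in_straightaway:
            return max(-1.0, min(1.0, 0.5 * player_input + 0.5 * correction))
        return player_input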
The control unit of the amusement game allows several players to compete at different skill levels in the same race. The computer-assisted driving helps the less-skilled players without giving them an undue or unfair advantage over players who drive as expert drivers. This ensures a fun experience for players of all ages and skill levels.
The player controls consist of steering wheel 16 and throttle mechanisms 18, which provide inputs from the control stations 14 to the computer control unit 40, as shown in
As illustrated in
In the preferred embodiment, the image sensing device 34 is a digital image sensor, such as either a CCD or CMOS image sensor or camera. In the embodiment shown, a CCD or CMOS image sensor is utilized to generate the image scans that are relayed to the control unit 28 through the communication line 38. However, it is contemplated that various other digital image sensors, or other types of analog image sensors, could be utilized while operating within the scope of the present disclosure. As an example, it is contemplated that the image sensing device can process the image scans prior to sending information to the control unit 28. In such an embodiment, the image sensing device 34 would send results to the control unit 40, such as the x, y coordinates of the game object location, rather than the entire raw video image, thereby reducing the bandwidth requirements of the communication line between the image sensing device and the control unit and reducing the processing requirement for the control unit.
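A rough comparison illustrates the bandwidth savings of such on-sensor processing. The frame size, bit depth, object count, and record layout below are assumptions chosen only for the example.

    # Hypothetical comparison of raw video versus per-object results.
    frame_width, frame_height, bytes_per_pixel, frames_per_second = 640, 480, 3, 30
    raw_bytes_per_second = frame_width * frame_height * bytes_per_pixel * frames_per_second

    # A compact result record per game object: id, x, y, angle (2 bytes each).
    objects, bytes_per_record = 4, 8
    result_bytes_per_second = objects * bytes_per_record * frames_per_second

    print(raw_bytes_per_second)     # 27648000 bytes per second of raw video
    print(result_bytes_per_second)  # 960 bytes per second of object coordinates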
In the case of the image sensing device 34 shown in
During operation of the CCD or CMOS camera, electrical signals are generated that have levels corresponding to the amount and color of light received by the respective photoelectric conversion elements of the CCD or CMOS camera. The electrical signals are received by the control unit 28 and analyzed as will be described below.
Although the embodiment describes utilizing only a single image sensing device 34, it is contemplated that multiple CCD or CMOS cameras could be combined to operate as the image sensing device, depending upon the size of the playfield 20 and resolution required by the amusement game. Further, the use of multiple image sensing devices 34 allows the concept of the present disclosure to be utilized in various different types of games, such as multiple player games that include separate and distinct playfields for each player. In such an embodiment, each playfield may include its own image sensing device and a single control unit could receive the visual images and conduct the game accordingly.
Alternatively, multiple image sensing devices may be required when the size of the playfield is much larger than the viewing field of any individual image sensing device. Likewise, the use of multiple cameras for a single playfield allows for “stereo” images and/or three-dimensional tracking of the movement of the game object. The use of multiple image sensing devices allows the concept to be utilized with other types of amusement games.
Referring now to
As illustrated in
The image sensing device of the present disclosure creates the electronic image scans at a rate as low as ten frames per second during game play, although such a low frame rate may limit the speed of the cars. In one embodiment of the disclosure, it is contemplated that frame rates of 30 to 60 frames per second, or more, can be utilized to resolve high-speed object motion and to reduce or eliminate blurring. These frame rates are well within current imaging and processing technology capability. The mask image 46 shown in
Referring now to
For the control unit of the amusement game to control the operation of one or more of the game objects 44, the control unit must utilize image processing techniques to identify both the position of the game objects 44 on the playfield 20 and the direction of movement of the game objects along the playfield.
In the embodiment illustrated in
Although four different colors for the game objects are described in the present embodiment, it is also contemplated that each of the game objects could include another type of distinguishing characteristic that would allow the game objects to be distinguished from each other utilizing image processing techniques. As an example, each of the four game objects could include a different geometric shape included on a top portion of the game object. In any event, each of the game objects includes a distinguishing characteristic that allows an image processing technique to distinguish between the game objects in an image scan similar to that shown in
Although various types of image processing techniques are known that could be utilized to isolate the position of the game object relative to the playfield in each of the image scans, in the embodiment of the disclosure shown in the Figures, the system utilizes an image subtraction method. Specifically, the control unit records the image scan 52 shown in
Once each pixel of the entire screen image has been classified as described above, the control unit determines the position of each of the colored game objects by first defining a game object block for each color. Once a block of color has been identified in the composite image, the control unit creates a bounding box for each of the game objects. Since each of the game objects has a different color, the control unit is able to create a bounding box for each of the game objects within the composite image 54.
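One possible implementation of this step, assuming the background-removed composite image is available as a NumPy array, is sketched below. The per-channel threshold and the color-dominance test are assumptions for illustration, not the values used by the actual control unit.

    import numpy as np

    def find_object(composite_rgb, channel, threshold=100):
        """Locate the color block whose dominant channel (0=R, 1=G, 2=B) exceeds
        the threshold; return its bounding box and center of mass (in pixels)."""
        dominant = composite_rgb[:, :, channel].astype(int)
        others = composite_rgb.sum(axis=2).astype(int) - dominant
        block = (dominant > threshold) & (dominant > others)  # pixels of this color
        ys, xs = np.nonzero(block)
        if xs.size == 0:
            return None                                       # object not in view
        bounding_box = (xs.min(), ys.min(), xs.max(), ys.max())
        center_of_mass = (xs.mean(), ys.mean())
        return bounding_box, center_of_mass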
Referring now to
Once the location of each of the game objects has been identified in the image scans 52 shown in
As stated previously, the control unit can identify the position of the game object on the playfield by utilizing image subtraction and color identification. Further, the bounding box and the center point 58 of each of the game objects allows the control unit to determine the angular orientation of the game object. In the embodiment shown in
In the embodiment illustrated in
In one embodiment, the control unit utilizes a target angle of 0° to control the object in a straightaway and gradually adjusts the position of the wheels to guide the game object around the corners of the track as illustrated.
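The specific manner in which the angle of orientation is obtained depends on the illustrated embodiment; a common technique is a principal-axis calculation over the pixels of the color block, sketched below purely as an example of such a method.

    import numpy as np

    def block_orientation(block_mask):
        """Return the angle (in degrees) of the major axis of a boolean pixel
        mask, computed from second-order central image moments."""
        ys, xs = np.nonzero(block_mask)          # assumes a non-empty block
        dx, dy = xs - xs.mean(), ys - ys.mean()
        mu20, mu02, mu11 = (dx * dx).mean(), (dy * dy).mean(), (dx * dy).mean()
        return np.degrees(0.5 * np.arctan2(2 * mu11, mu20 - mu02))

Because a principal axis is ambiguous by 180 degrees, a practical implementation would also resolve which end of the axis is the front of the car, for example from the recent direction of travel or from a marker on the car body.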
Set forth below is a portion of the control algorithm utilized by the control unit to control the operation of one of the game objects along the straight portion of the playfield, where the steering range is −127 (hard left) to +127 (hard right):
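The original listing is not reproduced here; the following is a minimal sketch of a proportional steering rule consistent with the description, where the target angle on a straightaway is 0°. The gain, deadband, and sign convention are illustrative assumptions only.

    def straightaway_steering(orientation_error_degrees, gain=4.0, deadband=2.0):
        """Map the car's heading error on a straightaway (target angle of 0
        degrees) to a steering command from -127 (hard left) to +127 (hard right)."""
        if abs(orientation_error_degrees) < deadband:
            return 0                                  # close enough: steer straight
        command = int(gain * orientation_error_degrees)
        return max(-127, min(127, command))           # clamp to the steering range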
Other values, as well as values modified by the current speed or the car's position on the playfield, may also be used.
The portion of the control algorithm set forth above controls the movement of the game object along the straight portions of the playfield shown. Various different control algorithms can be utilized to direct each of the computer-controlled “drone” game objects along the playfield depending upon various parameters. As an example, the speed and steering functions of the computer-controlled cars can be adjusted depending upon the ability level of the other players engaging in the game play. If the control unit determines that the players have relatively high skill, the control algorithm can be adjusted to increase the speed of the drone cars and to cause the drone cars to take a more aggressive line around the playfield. This type of algorithm makes the drones less predictable and more fun to race, since the speed of the drones can be adjusted in real time to closely match that of the fastest players (or, possibly, to stay just below that of the slowest players).
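By way of example only, such a real-time adjustment could track the lap pace of the human players and nudge the drone throttle toward the pace of the fastest player. The step size and throttle range below are assumptions for illustration.

    def adjust_drone_throttle(drone_throttle, drone_lap_time, best_player_lap_time,
                              step=2, max_throttle=127):
        """Nudge a drone's throttle so its lap pace approaches that of the
        fastest human player (a smaller lap time means a faster pace)."""
        if drone_lap_time > best_player_lap_time:     # drone is slower: speed up
            return min(max_throttle, drone_throttle + step)
        if drone_lap_time < best_player_lap_time:     # drone is faster: ease off
            return max(0, drone_throttle - step)
        return drone_throttle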
Referring back to
In the embodiment shown in the above description and illustrated in
During game play, the control unit can compare the position of the car on the playfield and the orientation of the car in the current image scan to the position and orientation of the car in a past image scan. Comparing the location and orientation across multiple image scans allows the control unit to determine the direction of movement of each of the game objects during game play. Further, the comparison from one image scan to the next allows the control unit to determine the speed of travel and to identify the position of the computer-controlled cars relative to those being player controlled.
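One straightforward way to obtain the direction and speed of travel is to difference the center point of a game object between two successive image scans, as sketched below; the pixel-to-meter scale is an assumed calibration value.

    import math

    def motion_between_scans(prev_center, curr_center, frame_interval_s,
                             meters_per_pixel=0.002):
        """Return (speed in m/s, heading in degrees) from the center point of a
        game object in two successive image scans."""
        dx = curr_center[0] - prev_center[0]
        dy = curr_center[1] - prev_center[1]
        speed = math.hypot(dx, dy) * meters_per_pixel / frame_interval_s
        heading = math.degrees(math.atan2(dy, dx))    # direction of travel
        return speed, heading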
In the above description, the RGB values are the actual camera-generated pixel values for each of the three colors. The RGBY values add Y, illustrating that more than three object colors can be distinguished using only the three captured color channels. Color measurement formats other than RGB, such as CMYK or HSV, can accomplish the required image processing tasks as well.
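As an illustration of distinguishing four object colors from the three captured channels, each pixel can simply be assigned to the nearest of four reference colors. The reference values below, including the use of yellow as the fourth color, are assumptions and would be tuned to the actual cars and lighting.

    import numpy as np

    # Assumed reference colors (R, G, B); the fourth color is detected from the
    # same three captured channels.
    REFERENCE_COLORS = {
        "red":    (200, 40, 40),
        "green":  (40, 200, 40),
        "blue":   (40, 40, 200),
        "yellow": (200, 200, 40),
    }

    def classify_pixel(rgb):
        """Assign an (R, G, B) pixel to the nearest reference color."""
        pixel = np.array(rgb, dtype=float)
        distances = {name: np.linalg.norm(pixel - np.array(ref, dtype=float))
                     for name, ref in REFERENCE_COLORS.items()}
        return min(distances, key=distances.get)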
As described above, although image subtraction and region-of-interest masking are described as one type of image processing technique utilized to identify the position of the game object, various other types of image processing techniques can be utilized while operating within the scope of the present invention. Specifically, any type of image processing technique that can identify the tracking point of the game object can be utilized to determine the position of the game object relative to target areas defined on the playfield.
Although the preferred type of image sensor is a CCD or CMOS image sensor, it is also contemplated that a low-cost infrared camera can be utilized while operating within the scope of the present disclosure. A low-cost infrared camera can distinguish the game objects from the playfield to determine the location of each game object. In another alternate embodiment, a linear sensor array could be utilized where two-dimensional resolution is not required. Although various other embodiments, such as an IR camera and a linear array, are specifically set forth, it should be understood that various other types of image sensing devices could be utilized while operating within the scope of the present disclosure.
Initially, the control unit activates the image sensing device to view the playfield, as shown in step 62. Once the playfield has been viewed, the control unit records the image of the playfield as a reference image in step 64. In addition to the reference image, the control unit creates a mask image for the playfield, as shown in
Referring back to
Once the game play has begun, the control unit operates the image sensing device to create a series of sequential image scans of the playfield at a pre-defined rate, as shown in step 70. In the embodiment of the invention described, the image sensor operates to generate at least thirty images per second, although a higher or lower frame rate could be utilized while operating within the scope of the present disclosure.
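The frame rate matters because it sets how far a car can move between successive image scans. The car speed used in the short calculation below is an assumption chosen only for the example.

    # Distance traveled between successive image scans at an assumed car speed.
    car_speed_m_per_s = 1.5
    for frames_per_second in (10, 30, 60):
        displacement_cm = car_speed_m_per_s / frames_per_second * 100
        print(frames_per_second, round(displacement_cm, 1))
    # 10 fps -> 15.0 cm between scans; 30 fps -> 5.0 cm; 60 fps -> 2.5 cm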
For each of the image scans created by the image sensor, the control unit compares the image scan to the reference image in step 72. As described previously, one method of comparing the image scan to the reference image is to subtract the reference image from the current image scan to create a composite image scan in which the only remaining elements are the individual game objects. In step 73, the mask image is logically ANDed with the composite image to define the region of interest in which the cars are to be identified.
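Steps 72 and 73 can be expressed compactly with array operations, as sketched below; the difference threshold is an assumed value, and the mask is taken to be a Boolean array that is true inside the playfield region of interest.

    import numpy as np

    def isolate_game_objects(scan_rgb, reference_rgb, mask, diff_threshold=30):
        """Subtract the reference image from the current scan and logically AND
        the result with the region-of-interest mask."""
        difference = np.abs(scan_rgb.astype(int) - reference_rgb.astype(int))
        changed = difference.max(axis=2) > diff_threshold   # pixels that differ
        return changed & mask                                # limit to the playfield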
In step 74, the control unit identifies the location and the orientation of each game object utilizing the system and method previously described. In the preferred embodiment shown and described in the present disclosure, the location and orientation of each of the game objects is determined based upon a color sensing technique. In such an embodiment, each of the game objects has a different color such that the control unit can identify the location and identity of each of the objects based upon identifying blocks of color. However, it is contemplated that other methods can be utilized, such as including geometric shapes on each of the game objects such that the control unit can identify the location of each of the individual game objects based upon the geometric shape contained on the game object. Optical character recognition may also be used to read player numbers placed so as to be visible to the image sensing device.
Once the location and orientation of each of the game objects is identified, the control unit determines the desired throttle and steering position for each of the drones currently under computer control, as illustrated in step 76. As described previously, the control unit can utilize various algorithms to determine the speed and the aggressiveness of the steering to create game play that is both challenging for advanced players yet enjoyable for novice players.
Once the throttle and steering position control signals have been calculated, the control unit relays the signals to the drones, as shown in step 78. In the embodiment shown in
Referring back to
As can be understood by the flowchart of
If the computer control unit determines in step 80 that the game has been completed, the control unit operates to return all of the game objects to the Start/Finish Line, as indicated in step 82. Alternatively, the control unit directs all non-winning cars to an edge of the track, performs one or more victory laps, or other celebratory sequences, with the winning car, and then moves all cars in front of the corresponding player control stations. Once the game has been completed, the control unit takes over control of all of the game objects, even if one of the game objects was player controlled during game play. Based upon the control unit's control of the series of game objects, the control unit returns to step 66 to determine if another game needs to be played. Since the control unit returns each of the game objects to the Start/Finish Line, all of the game objects begin the next game play from a common position.
Although the embodiments shown in the Figures illustrate a racing game having a series of race cars, it is contemplated that various other types of amusement games could be utilized while operating within the scope of the present disclosure. As an example, it is contemplated that other games, such as soccer, hockey, horse racing, or other similar games in which a player controls the movement of a game object along a playfield, could be utilized within the scope of the present disclosure. In each of these alternate embodiments, the image sensing device monitors the movement and position of the game objects such that the control unit can analyze the image scans from the image sensing device and control one or more of the game objects during game play. The disclosure of the present invention is not meant to be limiting as to the types of amusement games possible, but rather is meant to be illustrative of currently contemplated amusement games that could operate within the scope of the present disclosure.
The present application is based on and claims priority to U.S. Provisional Patent Application Ser. No. 61/087,404 filed on Aug. 8, 2008.