1. Field of the Invention
The present invention relates to a gaming machine, and in particular a gaming machine comprising an input device using a virtual mouse.
2. Background Information
Gaming machines installed in arcades and casinos are generally remodeled at frequent intervals in order to continuously attract many players. Remodeling of gaming machines often requires replacement of the mechanisms thereof, such as mechanical reels and push buttons serving as input devices, in their entirety. Accordingly, mechanical gaming machines are being replaced with video gaming machines having few mechanical portions in order to facilitate frequent remodeling and maintenance thereof. For example, mechanical reels are replaced with video reels displayed in graphic form on a screen of an electronic display device. Push buttons separately assigned to types of bets and paylines, a spin button or lever, and the like, are replaced with virtual buttons displayed on a touch panel, which are assigned to various functions of the gaming machine by software. Remodeling of such a gaming machine generally requires only data updates, such as image data for use in the display on the screen and the touch panel, and data about the relationship between the virtual buttons displayed on the touch panel and the functions of the gaming machine.
In recent years, video gaming machines have been increasing in versatility. This is changing video gaming machines from specialized devices that conduct video games with limited content into multi-function devices capable of providing various services, not limited to games, much like personal computers. The increasing versatility requires input devices with easier operability and higher functionality, such as mouses and keyboards, than known input devices such as push buttons and touch panels.
Especially in casinos and arcades, gaming machines are used by a large number of players, and accordingly require a greater degree of ruggedization. However, it is difficult to sufficiently ruggedize input devices that are separate from the bodies of gaming machines, such as mouses and keyboards. Indeed, such input devices are required to withstand rough handling by players absorbed in games, and severe environmental conditions, e.g., various drinks spilled thereon and various dirt and soils gummed thereon. Higher levels of security are also required to protect such input devices from theft. As a result, the adoption of such input devices may increase the need for frequent maintenance, and therefore prevent further reductions in the cost of upkeep for gaming machines.
“Virtual mouses” are expected to resolve the above difficulties in using input devices on gaming machines. A virtual mouse device is a type of graphical user interface, which reproduces a virtual mouse, i.e., a graphic image of a mouse, on a touch panel (e.g., U.S. Patent Application Publication No. 2006/0034042). The touch panel detects fingers and a palm of a user that touch an area of a screen in which the virtual mouse is reproduced. When the user slides his/her fingers and palm on the screen as if to operate a real mouse, the device causes the virtual mouse to follow the fingers and palm within the screen. Since a virtual mouse does not have a real body, the device resists damage caused by rough handling and dirt. In addition, the virtual mouse cannot be stolen.
A prior art virtual mouse device uses a touch panel that typically detects changes in structure or stress caused by the press forces of a user's fingers and palm touching a screen. As long as the fingers and palm touch the screen, the device can determine the location of the virtual mouse. If all the fingers and palm are lifted from the screen, the device keeps the virtual mouse at its last location for a predetermined time. If neither a finger nor a palm is detected again during the predetermined time in the area where the virtual mouse is reproduced, the device then returns the virtual mouse to a default location. The predetermined time has to be appropriately long in order to prevent the virtual mouse from unintendedly returning to the default location each time the touch panel fails to detect the fingers and palm. On the other hand, the device is required to allow operations of the virtual mouse to emulate operations of a real mouse, in particular, the cyclical actions of a real mouse in which a user slides the mouse from a location, lifts it, and returns it to the location in turn in order to cause a mouse pointer to travel a long distance across a screen. A manageable emulation of these cyclical actions requires the virtual mouse to be quickly returned to the default location once the fingers and palm have been lifted from the screen. Accordingly, the device has to trade off the reduction of unintended returns to the default location against the manageable emulation of the cyclical actions. This prevents the operability of the virtual mouse from being further improved.
In view of the above, it will be apparent to those skilled in the art from this disclosure that there exists a need for an improved virtual mouse device, which can both reduce unintended returns of a virtual mouse to a default location, and cause the virtual mouse to respond more quickly. This invention addresses this need in the art as well as other needs, which will become apparent to those skilled in the art from this disclosure.
A virtual mouse device according to the present invention comprises a display unit, an image sensor unit, a virtual mouse controller unit, and an input unit. The display unit displays one or more images on a screen. The images preferably include images providing a user with information, images for decoration and visual effects, and icons linked to instructions or data to be entered into a host machine, which uses the virtual mouse device as an input device. The image sensor unit detects fingers or a palm of a user that move on or over a specific area on the screen. The image sensor unit preferably includes a matrix of pixels arranged in the specific area. Each pixel preferably includes a photodiode, a capacitor, and a switching transistor. In this case, the image sensor unit uses the photodiodes to capture light reflected from fingers or a palm of a user that move on or over the specific area, and converts the light to an electric signal. More preferably, the display unit and the image sensor unit are integrated into a single panel. In this case, the image sensor unit and the display unit preferably include arrays of capacitors and transistors implemented in the same substrate. The virtual mouse controller unit monitors the fingers or palm of the user that move on or over the specific area by using the image sensor unit, and causes a virtual mouse to follow the fingers or the palm within the specific area by using the display unit. If the fingers or the palm moves out of the specific area, the virtual mouse controller unit then returns the virtual mouse to a default location in the specific area. The input unit monitors the motion of the virtual mouse, and causes the display unit to move a pointer or cursor image, i.e., a mouse pointer or cursor, on the screen depending on the amount and direction of travel of the virtual mouse. The input unit preferably decodes an instruction or data from the relationship in location between the images and the mouse pointer or cursor on the screen.
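By way of a non-limiting illustration only, the cooperation of these four units may be sketched as follows. All class and method names, the normalized coordinate system, and the treatment of the whole screen as the specific area are assumptions introduced for this sketch and do not appear in this disclosure.

```python
# Illustrative sketch of the four units; all names and coordinates are assumptions.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class HandObservation:
    """Location of the detected fingers/palm in screen units, or absent."""
    x: Optional[float] = None
    y: Optional[float] = None

    @property
    def present(self) -> bool:
        return self.x is not None and self.y is not None


class ImageSensorUnit:
    """Captures light reflected from fingers or a palm over the specific area."""
    def observe(self) -> HandObservation:
        # A real unit would read the photodiode matrix; stubbed here.
        return HandObservation()


class DisplayUnit:
    """Displays images, the virtual mouse, and the mouse pointer on the screen."""
    def draw_virtual_mouse(self, x: float, y: float) -> None:
        pass

    def draw_pointer(self, x: float, y: float) -> None:
        pass


class VirtualMouseController:
    """Causes the virtual mouse to follow the hand within the specific area."""
    def __init__(self, sensor: ImageSensorUnit, display: DisplayUnit,
                 default_xy: Tuple[float, float] = (0.5, 0.5)) -> None:
        self.sensor, self.display = sensor, display
        self.default_xy = default_xy
        self.mouse_xy = default_xy

    def update(self) -> Tuple[float, float]:
        obs = self.sensor.observe()
        if obs.present and 0.0 <= obs.x <= 1.0 and 0.0 <= obs.y <= 1.0:
            self.mouse_xy = (obs.x, obs.y)       # follow the fingers or palm
        else:
            self.mouse_xy = self.default_xy      # return to the default location
        self.display.draw_virtual_mouse(*self.mouse_xy)
        return self.mouse_xy


class InputUnit:
    """Moves the mouse pointer by the travel of the virtual mouse."""
    def __init__(self, display: DisplayUnit) -> None:
        self.display = display
        self.pointer_xy = (0.5, 0.5)

    def on_mouse_travel(self, dx: float, dy: float) -> None:
        px, py = self.pointer_xy
        self.pointer_xy = (px + dx, py + dy)
        self.display.draw_pointer(*self.pointer_xy)
```

In this sketch the specific area is assumed to cover the whole normalized screen; the embodiments described below instead limit it to a mouse pad area.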
The image sensor unit can detect the location of fingers and a palm of a user, even if the fingers and palm are separated from the surface of the screen. Accordingly, the virtual mouse controller unit can determine the location of the virtual mouse with a high degree of reliability when all the fingers and palm are lifted from the screen temporarily or accidentally. This allows the virtual mouse to respond to the action of the fingers and palm with a higher degree of stability than a prior art virtual mouse that depends on detection of a user's fingers or palm by a touch panel.
If the fingers or palm moves out of the specific area, the virtual mouse controller unit then returns the virtual mouse to a default location. Here, the input unit keeps the mouse pointer or cursor at its last location. This allows a user to operate the virtual mouse in order to cause the mouse pointer or cursor to travel a long distance across the screen as follows. The user first moves his/her fingers or palm from the default location of the virtual mouse to the outside of the specific area in a desired direction. The virtual mouse then follows the fingers or palm from the default location, and returns to the default location when the fingers or palm moves out of the specific area. The user repeats the movement of his/her fingers or palm from the default location to the outside of the specific area. Thus, the virtual mouse device can allow the user to easily emulate the cyclical actions of a real mouse in which the user slides the mouse from a location, lifts it, and returns it to the location in turn. In particular, the virtual mouse can return to the default location more quickly than the prior art virtual mouse. Therefore, the virtual mouse device can improve the operability of the virtual mouse.
The display unit preferably comprises two or more separate screens, and the specific area preferably is placed on one of the screens. In this case, the input unit preferably causes the display unit to move the mouse pointer or cursor on one or more of the screens.
The virtual mouse preferably includes a virtual button or a virtual wheel. In this case, the virtual mouse controller unit preferably detects specific movements of one or more fingers detected by the image sensor unit, and the input unit preferably decodes a click of the virtual button or a roll of the virtual wheel from the specific movements of the fingers. In addition, the virtual mouse controller unit preferably causes the display unit to position the virtual button below the forefinger of the user that moves on or over the specific area. The virtual mouse controller unit can distinguish the forefinger from the other fingers easily regardless of whether the user uses the virtual mouse with his/her right or left hand, since the image sensor unit can detect the whole shape of the user's hand. This improves the operability of the virtual mouse.
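A non-limiting sketch of how such finger movements might be decoded into click and wheel events is given below; the frame representation, dictionary keys, and threshold values are assumptions and are not measured parameters of the device.

```python
# Illustrative event decoding from two successive hand observations;
# the dictionary keys and thresholds are assumptions.
def decode_events(prev_frame, curr_frame, tap_drop=0.3, wheel_step=0.05):
    """prev_frame/curr_frame: dicts with 'forefinger_height' (normalized
    height above the screen) and 'middle_finger_y' (position along the
    scroll axis of the virtual wheel)."""
    events = []
    # A quick downward motion of the forefinger is read as a click of the
    # virtual button placed below the forefinger.
    if prev_frame["forefinger_height"] - curr_frame["forefinger_height"] > tap_drop:
        events.append(("click", "virtual_button"))
    # A sliding middle finger is read as a roll of the virtual wheel.
    dy = curr_frame["middle_finger_y"] - prev_frame["middle_finger_y"]
    if abs(dy) > wheel_step:
        events.append(("wheel", 1 if dy > 0 else -1))
    return events
```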
The virtual mouse controller unit preferably determines the size or shape of a hand from the fingers or palm of the user detected by the image sensor unit, and then adjusts the size or shape of the virtual mouse depending on the determined size or shape of the hand. In particular, the virtual mouse controller unit preferably distinguishes whether the user operates the virtual mouse with his/her right or left hand, and then selects a right- or left-hand type of the virtual mouse. The virtual mouse controller unit preferably adjusts the size, shape, or location of the specific area on the screen depending on the determined size or shape of the hand.
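For illustration only, such an adjustment may be reduced to a small function; the reference hand width and the thumb-side test used below are assumptions.

```python
# Illustrative adjustment of the virtual mouse to the detected hand;
# the 0.18 reference width and the thumb-side test are assumptions.
def fit_virtual_mouse(hand_width, thumb_left_of_palm, reference_width=0.18):
    """hand_width: detected width in screen units; thumb_left_of_palm:
    True if the thumb lies to the left of the palm centroid (as for a
    right hand placed palm-down on the screen)."""
    scale = hand_width / reference_width
    hand_type = "right-hand" if thumb_left_of_palm else "left-hand"
    return {"scale": scale, "type": hand_type}
```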
The image sensor unit preferably detects fingers or a palm of a user that move on or over one or more optional areas on the screen. In this case, the virtual mouse controller unit preferably causes the display unit to initially display the optional areas on the screen. When the image sensor unit has detected fingers or a palm of a user within one of the optional areas, the virtual mouse controller unit preferably assigns the specific area to the optional area within which the image sensor unit has detected the fingers or palm of the user. This allows the user to select a desired optional area as the specific area. Furthermore, the virtual mouse controller unit preferably adjusts the shape of the virtual mouse depending on the location of the optional area to which the specific area has been assigned. For example, when there are optional areas on the right and left portions of the screen, most right-handed users select the right portion, and vice versa. Accordingly, when the right or left portion has been assigned to the specific area, the virtual mouse controller unit may select a right- or left-hand type of the virtual mouse, respectively.
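A non-limiting sketch of assigning the specific area to the optional area in which the hand is first detected is shown below; the rectangle coordinates (x0, y0, x1, y1, in screen units) are assumptions.

```python
# Illustrative assignment of the specific area; the optional area rectangles
# are assumptions.
OPTIONAL_AREAS = {
    "left":  (0.05, 0.55, 0.40, 0.95),
    "right": (0.60, 0.55, 0.95, 0.95),
}

def assign_specific_area(hand_x, hand_y):
    """Returns the selected area name and a suggested virtual mouse type."""
    for name, (x0, y0, x1, y1) in OPTIONAL_AREAS.items():
        if x0 <= hand_x <= x1 and y0 <= hand_y <= y1:
            mouse_type = "right-hand" if name == "right" else "left-hand"
            return name, mouse_type
    return None, None
```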
Alternatively, the virtual mouse controller unit may cause the display unit to initially display one or more options of virtual mouses on the screen. When the image sensor unit has detected fingers or a palm of a user within an area in which one of the options is displayed, the virtual mouse controller unit preferably assigns the virtual mouse to be actually used to the option that is displayed in the area within which the image sensor unit has detected the fingers or palm of the user. In addition, the virtual mouse controller unit preferably adjusts the location, size, or shape of the specific area depending on the initial location, size, or shape of the option to which the virtual mouse to be actually used has been assigned.
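This alternative of selecting among displayed virtual mouse options may be sketched similarly; the option rectangles and the margin below are again assumptions introduced only for illustration.

```python
# Illustrative selection of a displayed virtual mouse option; the specific
# area is derived from the option's bounds plus a margin (all values assumed).
MOUSE_OPTIONS = {
    "small_right": (0.62, 0.60, 0.80, 0.90),
    "large_left":  (0.10, 0.55, 0.35, 0.95),
}

def select_mouse_option(hand_x, hand_y, margin=0.05):
    for name, (x0, y0, x1, y1) in MOUSE_OPTIONS.items():
        if x0 <= hand_x <= x1 and y0 <= hand_y <= y1:
            specific_area = (x0 - margin, y0 - margin, x1 + margin, y1 + margin)
            return name, specific_area
    return None, None
```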
These and other objects, features, aspects and advantages of the present invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses a preferred embodiment of the present invention.
Referring now to the attached drawings which form a part of this original disclosure:
Selected embodiments of the present invention will now be explained with reference to the drawings. It will be apparent to those skilled in the art from this disclosure that the following descriptions of the embodiments of the present invention are provided for illustration only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
A virtual mouse device according to an embodiment of the present invention is preferably installed in a gaming machine located in a casino or an amusement arcade. Referring to
Referring to
Referring to
The mouse pointer 2F and the virtual mouse 2G are reproduced on the input screen 2A. The mouse pointer 2F can travel across the input screen 2A in response to actions of the virtual mouse 2G. More specifically, the amount and direction of the travel of the mouse pointer 2F are determined by those of the motion of the virtual mouse 2G. By placing the mouse pointer 2F at each graphic element, a player can select the graphic element. Here, some graphic elements 1C may be placed on the game screen 1A, and the mouse pointer 2F may jump into the game screen 1A as shown in
When the gaming machine 10 conducts a slot game, for example, a player first guesses on which payline a winning combination of symbols will appear, and then uses the virtual mouse 2G to place the mouse pointer 2F at buttons linked to a desired payline and a desired amount of a bet, and clicks the buttons. After that, the player again uses the virtual mouse 2G to place the mouse pointer 2F at a button linked to the function of spinning the video reels 1B, and clicks the button. Then, the video reels 1B start spinning, and will stop in turn after a predetermined time. If a winning combination appears on the payline on which the player has placed a bet, the player will win an amount of a payout that depends on the amount of the bet and the type of the winning combination.
Referring to
Referring to
The main display unit 1 reproduces the game screen 1A shown in
The game controller unit 3 is preferably comprised of a microcomputer including a CPU, a ROM, and a RAM. The game controller unit 3 is preferably installed in the body of the main display unit 1 or the sub-display unit 2 shown in
For example, the game controller unit 3 conducts a slot game as follows. A player first enters cash or monetary data into the gaming machine 10 in a well-known manner to store credits in the gaming machine 10. The game controller unit 3 causes the main display unit 1 to display the video reels 1B on the game screen 1A, and causes the sub-display unit 2 to display graphic elements 2B-2E on the input screen 2A. The player uses the mouse pointer 2F and the virtual mouse 2G to select one or more paylines and an amount of a bet to be placed on each selected payline. For example, an amount of a bet is displayed in a window 2B, and incremented or decremented at each click of an icon 2C. Each button 2E is assigned to a payline. When a button 2E is clicked, the corresponding payline will be selected. The virtual mouse device 4 monitors the relationship in location between the graphic elements 2B-2E and the mouse pointer 2F, and accepts each pair of a payline and an amount of a bet selected by the player. The game controller unit 3 receives the selected pairs of a payline and an amount of a bet from the virtual mouse device 4, and then decreases the credits by the amount of each bet. In addition, the game controller unit 3 may display the amounts of the bets, the available credits, and the selected paylines on the display units 1 and 2. When the player has clicked a button 1C to cue the video reels 1B for the start of a spin as shown in
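Purely by way of illustration, the bookkeeping of bets and credits described above might look as follows; the function names, the payout rule, and the data layout are assumptions and do not represent the actual game program.

```python
# Illustrative credit bookkeeping for the bet-placement flow; all names,
# the payout rule, and the data layout are assumptions.
def place_bets(credits, selected_bets):
    """selected_bets: list of (payline_id, bet_amount) pairs accepted
    through the virtual mouse device 4."""
    total = sum(amount for _, amount in selected_bets)
    if total > credits:
        raise ValueError("insufficient credits")
    return credits - total


def settle(credits, selected_bets, winning_payline, payout_multiplier):
    """Adds a payout for each bet placed on the winning payline."""
    for payline, amount in selected_bets:
        if payline == winning_payline:
            credits += amount * payout_multiplier
    return credits
```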
The virtual mouse device 4 serves as a graphical user interface by using the mouse pointer 2F and the virtual mouse 2G. Referring to
The image sensor unit 41 preferably includes an array of CMOS sensors that are arranged in a transparent film laminated on the mouse pad area 2H. Referring to
On the mouse pad area 2H in the input screen 2A as shown in
Preferably, the FETs T1-T4 and the photodetector PD shown in
The image sensor unit 41 detects not only the presence or absence of a player's hand that touches the surface of the mouse pad area 2H, but also changes in distances of portions of the hand from the surface of the mouse pad area 2H. Referring to
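Purely for illustration, one plausible way to convert the detected intensity distribution into a rough height map is sketched below; the inverse-square optical model and its constants are assumptions and are not taken from this disclosure.

```python
# Illustrative conversion of reflected-light intensity to height above the
# surface; the optical model i = i_contact / (1 + (h / h0)**2) and its
# constants are assumptions.
def estimate_height_map(intensity, i_contact=1.0, h0=0.02):
    """intensity: 2-D list of normalized pixel intensities (0..1);
    brighter pixels are assumed to lie closer to the surface."""
    heights = []
    for row in intensity:
        heights.append([
            h0 * max(i_contact / max(i, 1e-6) - 1.0, 0.0) ** 0.5
            for i in row
        ])
    return heights
```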
The virtual mouse controller unit 42 is preferably comprised of a microcomputer including a CPU, a ROM, and a RAM. The virtual mouse controller unit 42 is preferably separated from the game controller unit 3, or alternatively, may be integrated into the game controller unit 3. The virtual mouse controller unit 42 is preferably installed in the body of the sub-display unit 2 shown in
The virtual mouse controller unit 42 monitors the fingers or palm of a player's hand that move on or over the mouse pad area 2H by using the image sensor unit 41, and causes the virtual mouse 2G to follow the fingers or the palm within the mouse pad area 2H by using the sub-display unit 2 as follows. The virtual mouse controller unit 42 first receives from the image sensor unit 41 the distribution of intensity of the light reflected from the hand, and decodes a location, size, and shape of the hand from the received distribution. Here, the virtual mouse controller unit 42 preferably stores one or more models of an average hand in advance, and determines whether or not an image decoded from the distribution of light intensity matches any model. If the image matches a model, the virtual mouse controller unit 42 then recognizes the image as a hand. The virtual mouse controller unit 42 next causes the sub-display unit 2 to display the virtual mouse 2G at the decoded location of the hand. In particular, the virtual mouse controller unit 42 can adjust the position, size, and shape of the virtual mouse 2G, e.g., by scaling and deforming, on the basis of the decoded location, size, and shape of the hand, so that the virtual mouse 2G fits in the hand as shown in
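A non-limiting sketch of this model matching and placement is given below; the pixel-agreement score, the threshold, and the binary image representation are assumptions.

```python
# Illustrative matching of a decoded hand image against stored hand models,
# and computation of the centroid used to place the virtual mouse 2G.
# The agreement score, threshold, and binary image format are assumptions.
def recognize_hand(binary_image, hand_models, threshold=0.8):
    """binary_image and each model: 2-D lists of 0/1 pixels of equal size."""
    best_name, best_score = None, 0.0
    total = len(binary_image) * len(binary_image[0])
    for name, model in hand_models.items():
        agree = sum(
            1 for img_row, mod_row in zip(binary_image, model)
            for a, b in zip(img_row, mod_row) if a == b
        )
        score = agree / total
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None


def hand_centroid(binary_image):
    """Centroid of the hand pixels, below which the virtual mouse is drawn."""
    xs = ys = n = 0
    for y, row in enumerate(binary_image):
        for x, value in enumerate(row):
            if value:
                xs, ys, n = xs + x, ys + y, n + 1
    return (xs / n, ys / n) if n else None
```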
The image sensor unit 41 can detect fingers and a palm separated from the surface of the mouse pad area 2H. Accordingly, the virtual mouse controller unit 42 can determine the location of the virtual mouse 2G with a high degree of reliability when all the fingers and palm are lifted from the mouse pad area 2H temporarily or accidentally. This allows the virtual mouse 2G to respond to the action of the fingers and palm with a higher degree of stability than a prior art virtual mouse that depends on detection of a user's fingers or palm by a touch panel.
The virtual mouse controller unit 42 preferably stores one or more types of virtual mouse images, one of which is actually used as the virtual mouse 2G. Preferably, sizes, shapes, or designs vary with the types of virtual mouse images. The virtual mouse controller unit 42 selects a virtual mouse image of an appropriate type as the virtual mouse 2G on the basis of the decoded location, size, and shape of the hand. As shown in
The virtual mouse controller unit 42 can detect specific movements of the fingers or palm of a player's hand, i.e., specific changes in position or shape of the fingers or the palm on or over the mouse pad area 2H, by using the image sensor unit 41. Referring to
In addition, the virtual mouse controller unit 42 may decode a pattern of fingerprints or veins of a player's hand from the distribution of intensity of the light reflected from the hand, which has been detected by the image sensor unit 41. The decoded pattern of fingerprints or veins of the player's hand will be used in verification of the player by the virtual mouse controller unit 42 or another similar computer unit linked to the virtual mouse controller unit 42.
The input unit 43 is preferably comprised of a microcomputer including a CPU, a ROM, and a RAM. The input unit 43 is preferably integrated into the virtual mouse controller unit 42, or alternatively, may be integrated into the game controller unit 3, or separated from both the controller units 42 and 3. The input unit 43 is preferably installed in the body of the sub-display unit 2 shown in
The input unit 43 preferably controls the sub-display unit 2 to display a desired design of the input screen 2A including the graphic elements 2B-2E shown in
On the other hand, the input unit 43 preferably receives information about graphic elements, e.g., the button 1C shown in
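As a non-limiting illustration, decoding an instruction or data from the relationship in location between the mouse pointer and the graphic elements may amount to a simple hit test; the element table and instruction names below are assumptions.

```python
# Illustrative hit test: the instruction linked to a graphic element is
# decoded when the pointer is clicked within the element's bounding box.
# The element table and instruction names are assumptions.
GRAPHIC_ELEMENTS = {
    "button_1C": (0.40, 0.80, 0.60, 0.95, "start_spin"),
    "button_2E": (0.10, 0.10, 0.30, 0.20, "select_payline_1"),
}

def decode_instruction(pointer_x, pointer_y, event):
    if event != "click":
        return None
    for _name, (x0, y0, x1, y1, instruction) in GRAPHIC_ELEMENTS.items():
        if x0 <= pointer_x <= x1 and y0 <= pointer_y <= y1:
            return instruction
    return None
```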
The virtual mouse controller unit 42 preferably limits the mouse pad area 2H to a portion of the input screen 2A, and displays the virtual mouse 2G only where it overlaps the mouse pad area 2H. Here, the boundaries of the mouse pad area may not be displayed, like the mouse pad area 2H shown in
If a player's fingers or palm moves out of the mouse pad area 22H across a boundary thereof as shown in
STEP S21: the virtual mouse controller unit 42 detects a player's fingers or palm moving on or over the mouse pad area 22H, by using the image sensor unit 41.
STEP S22: the virtual mouse controller unit 42 determines whether or not the fingers or palm are located within the mouse pad area 22H. Here, the virtual mouse controller unit 42 preferably determines that the fingers or palm are not located within the mouse pad area 22H in one of the following cases: when half or more of the virtual mouse 22G is positioned outside the mouse pad area 22H; when a predetermined reference portion of the virtual mouse 22G is positioned outside the mouse pad area 22H; or when the image sensor unit 41 fails to detect any fingers or palm. If the fingers or palm have been located within the mouse pad area 22H, the process goes to the step S23; otherwise the process goes to the step S24.
STEP S23: the virtual mouse controller unit 42 causes the sub-display unit 21 to display the virtual mouse 22G at the detected location of the fingers or palm.
STEP S24: the virtual mouse controller unit 42 returns the virtual mouse 22G to a default location in the mouse pad area 22H. In this case, the virtual mouse controller unit 42 preferably informs the input unit 43 of the return of the virtual mouse 22G.
The virtual mouse controller unit 42 repeats the steps S21-S24. Limiting the mouse pad area and automatically returning the virtual mouse from the outside to the inside of the mouse pad area facilitate control of the virtual mouse, since the virtual mouse is prevented from overlapping other graphic elements included in the input screen (cf.
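Purely for illustration, the steps S21-S24 may be summarized by the following loop; the helper objects correspond to the hypothetical sketch given earlier, the notify_return_to_default method is an additional assumption, and the rectangular area test is likewise an assumption.

```python
# Illustrative loop over steps S21-S24; the sensor/display objects reuse the
# earlier hypothetical sketch, and input_unit is assumed to provide a
# notify_return_to_default() method.
def tracking_loop(sensor, display, mouse_pad_area, default_xy, input_unit):
    while True:
        hand = sensor.observe()                                      # S21
        if hand.present and inside(mouse_pad_area, hand.x, hand.y):  # S22
            display.draw_virtual_mouse(hand.x, hand.y)               # S23
        else:
            display.draw_virtual_mouse(*default_xy)                  # S24
            input_unit.notify_return_to_default()


def inside(area, x, y):
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1
```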
The virtual mouse controller unit 42 preferably adjusts the size, shape, and location of the mouse pad area 2H or 22H on the basis of the detected location, size, and shape of the player's hand. For example, when a larger hand has been detected on or over the mouse pad area, the virtual mouse controller unit 42 enlarges the mouse pad area, and vice versa. In addition, when a right or left hand has been detected, the virtual mouse controller unit 42 positions the mouse pad area at the right or left portion of the input screen, respectively. Alternatively, the virtual mouse controller unit 42 may allow a player to manually adjust the size, shape, and location of the mouse pad area by using the virtual mouse and the input screen.
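By way of illustration only, such an adjustment of the mouse pad area might be reduced to the following; the scale factor, clamping limits, and margins are assumptions.

```python
# Illustrative sizing and placement of the mouse pad area from the detected
# hand; the scale factor, clamping limits, and margins are assumptions.
def adjust_mouse_pad(hand_width, handedness, screen_w=1.0, screen_h=1.0):
    side = min(screen_w, screen_h) * min(max(2.5 * hand_width, 0.25), 0.6)
    x0 = screen_w - side - 0.05 if handedness == "right" else 0.05
    y0 = screen_h - side - 0.05
    return (x0, y0, x0 + side, y0 + side)
```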
As long as the virtual mouse 22G moves within the mouse pad area 22H as shown in
STEP S31: the input unit 43 detects the amount and direction of each travel of the reference point of the virtual mouse 22G from the information received from the virtual mouse controller unit 42.
STEP S32: the input unit 43 checks whether the virtual mouse 22G has been returned to a default location according to information received from the virtual mouse controller unit 42. If the virtual mouse 22G has not been returned to the default location, the process goes to the step S33; otherwise the process goes to the step S34.
STEP S33: the input unit 43 causes the display units 21 and 22 to move the mouse pointer 22F on the game screen 21A and the input screen 22A depending on the amount and direction of each travel of the reference point of the virtual mouse 22G.
STEP S34: the input unit 43 keeps the mouse pointer 22F at the last location.
STEP S35: the input unit 43 checks whether any event, e.g., a click of any mouse button or a roll of a mouse wheel, has been received from the virtual mouse controller unit 42. If an event has occurred, the process goes to the step S36; otherwise the process returns to the step S31.
STEP S36: the input unit 43 decodes an instruction or data from the relationship in location between the graphic elements and the mouse pointer 22F on the game screen 21A or the input screen 22A. The input unit 43 then informs the game controller unit 3 or the virtual mouse controller unit 42 of the decoded instructions or data, and thereby the decoded instructions or data are entered into the controller unit 3 or 42.
When a player repeats the movement of his/her fingers or palm from the default location of the virtual mouse 22G to the outside of the mouse pad area 22H, the steps S31-S35 are repeated. This allows the player to operate the virtual mouse 22G in order to cause the mouse pointer 22F to travel a long distance across one or both of the game screen 21A and the input screen 22A. Thus, the virtual mouse device 4 can allow the player to easily emulate the cyclical actions of a real mouse in which the player slides the mouse from a location, lifts it, and returns it to the location in turn. In particular, the virtual mouse 22G can return to the default location more quickly than any prior art virtual mouse. Therefore, the virtual mouse device 4 can improve the operability of the virtual mouse 22G.
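The steps S31-S36 may likewise be summarized, purely for illustration, by the following loop; the message format from the virtual mouse controller unit and the decode callback are assumptions.

```python
# Illustrative loop over steps S31-S36; the message format and the
# decode_and_enter callback are assumptions.
def pointer_loop(controller_messages, pointer_xy, decode_and_enter):
    """controller_messages: iterable of dicts such as
    {'dx': 0.01, 'dy': 0.0, 'returned': False, 'event': None}."""
    px, py = pointer_xy
    for msg in controller_messages:
        if not msg["returned"]:                      # S32 -> S33: move the pointer
            px += msg["dx"]
            py += msg["dy"]
        # S32 -> S34: when the virtual mouse has been returned to its default
        # location, the pointer is simply kept at its last location.
        if msg.get("event"):                         # S35
            decode_and_enter(px, py, msg["event"])   # S36: decode and enter
    return px, py
```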
When the image sensor unit 41 has not detected a player's fingers or palm on or over the mouse pad area for a predetermined time, the virtual mouse controller unit 42 preferably erases the virtual mouse. In that case, if the image sensor unit 41 subsequently detects a player's hand placed on or over the mouse pad area, the virtual mouse controller unit 42 again reproduces a virtual mouse of an appropriate size and shape below the hand in the mouse pad area as described above.
The virtual mouse device 4 preferably executes initialization at power-on, or, after the image sensor unit 41 has not detected a player's fingers or palm on or over the mouse pad area for a predetermined time, in one of the following cases: when the virtual mouse device 4 has accepted an instruction to stop a game or cash out all credits and the game controller unit 3 has finished changing all the credits to cash or monetary data; or when a predetermined time has elapsed after the credits stored in the gaming machine have been reduced to zero while neither cash nor monetary data has been newly added. Note that the virtual mouse device 4 does not execute initialization as long as the image sensor unit 41 can detect a player's fingers or palm on or over the mouse pad area. Even if no credits are stored in the gaming machine, there is a possibility that a player will enter additional cash or monetary data into the gaming machine while the player stays at the gaming machine.
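A non-limiting sketch combining the above idle-time conditions into a single check is given below; the timing limits and parameter names are assumptions.

```python
# Illustrative initialization check for the idle case; the timing limits
# and parameter names are assumptions.
def should_initialize(hand_detected, idle_seconds, cashout_finished,
                      credits, zero_credit_seconds,
                      idle_limit=30.0, zero_credit_limit=60.0):
    if hand_detected:
        return False   # never initialize while a hand is detected over the pad
    if idle_seconds < idle_limit:
        return False
    return cashout_finished or (
        credits == 0 and zero_credit_seconds >= zero_credit_limit)
```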
At the start of game play, the game controller unit 3 and the virtual mouse device 4 preferably display invitational screens on the game screen 1A and the input screen 2A, respectively. In particular, the virtual mouse device 4 displays either type of invitational screens shown in
Referring to
Referring to
At the start of game play, the virtual mouse device 4 may verify a player by using a pattern of fingerprints or veins of the player's hand that the virtual mouse controller unit 42 has decoded from images captured by the image sensor unit 41.
The virtual mouse device 4 may cause the virtual mouse 2G or 22G to follow a barcode or a matrix code (or two-dimensional barcode) printed or displayed on a surface of an object, e.g., a card or a mobile phone, instead of a player's hand.
In understanding the scope of the present invention, the term “configured” as used herein to describe a component, section or part of a device includes hardware and/or software that is constructed and/or programmed to carry out the desired function. In understanding the scope of the present invention, the term “comprising” and its derivatives, as used herein, are intended to be open ended terms that specify the presence of the stated features, elements, components, groups, integers, and/or steps, but do not exclude the presence of other unstated features, elements, components, groups, integers and/or steps. The foregoing also applies to words having similar meanings such as the terms, “including”, “having” and their derivatives. Also, the terms “part,” “section,” “portion,” “member” or “element” when used in the singular can have the dual meaning of a single part or a plurality of parts. Finally, terms of degree such as “substantially”, “about” and “approximately” as used herein mean a reasonable amount of deviation of the modified term such that the end result is not significantly changed. For example, these terms can be construed as including a deviation of at least ±5% of the modified term if this deviation would not negate the meaning of the word it modifies.
While only selected embodiments have been chosen to illustrate the present invention, it will be apparent to those skilled in the art from this disclosure that various changes and modifications can be made herein without departing from the scope of the invention as defined in the appended claims. Furthermore, the foregoing descriptions of the embodiments according to the present invention are provided for illustration only, and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.