A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
The present invention relates generally to a gaming apparatus and methods for playing wagering games, and more particularly, to a gaming system offering more accurate feedback based on gestures made by a player in game play.
Gaming terminals, such as slot machines, video poker machines and the like, have been a cornerstone of the gaming industry for several years. Generally, the popularity of such machines with players is dependent on the likelihood (or perceived likelihood) of winning money at the machine and the intrinsic entertainment value of the machine relative to other available gaming options.
Consequently, shrewd operators strive to employ the most entertaining and exciting machines available because such machines attract frequent play and, hence, increase profitability to the operator. In the competitive gaming machine industry, there is a continuing need for gaming machine manufacturers to produce new types of games, or enhancements to existing games, which will attract frequent play by enhancing the entertainment value and excitement associated with the game.
One concept that has been successfully employed to enhance the entertainment value of a game is that of a “secondary” or “bonus” game which may be played in conjunction with a “basic” game. The bonus game may comprise any type of game, either similar to or completely different from the basic game, and is entered upon the occurrence of a selected event or outcome of the basic game. Such a bonus game produces a significantly higher level of player excitement than the basic game because it provides a greater expectation of winning than the basic game.
Gaming machines have also utilized a variety of input devices for receiving input from a player, such as buttons and touch screen devices. However, these input devices are limited in that they can receive only one input at a time from the player. For example, if a player touches a single-point sensing device, such as a single-point touch screen, at two distinct points simultaneously, the touch-screen driver provides only one coordinate, corresponding either to one of the two distinct points or to a single average point between them. The inability of the player to interact with the gaming machine and other players by providing multiple simultaneous inputs has been a significant disadvantage of prior gaming machines. To address such issues, multi-point touch displays have recently been introduced. Such devices allow player gestures to be interpreted across a wider range of motions and therefore increase player immersion in the game. However, one issue with such interactive devices is inaccurate modeling of the players' actions, where gestures may be misinterpreted or one gesture may be construed as multiple gestures. Further, multi-point inputs may not accurately reflect a player's actions. An inaccurate reflection of a player gesture results in player frustration or player manipulation of the inaccurate device.
While these player appeal features provide some enhanced excitement relative to other known games, there is a continuing need to develop new features for gaming machines to satisfy the demands of players and operators. Therefore, an interactive interface that more accurately interprets player gestures would be desirable.
It has been observed by the inventors that a problem associated with interpreting gestures is that when a player makes a gesture, depending on the handedness of the player, there tends to be a trailing off of the gesture toward the end of the motion. As a result, the gesture the player actually intended to make can differ from the gesture actually sensed by the gesture-sensing hardware and software. For example, a right-handed player may tend to trail off to the right toward the end of a gesture, skewing the direction of the gesture toward the right. Aspects of the present disclosure are directed to ascertaining the intended trajectory and other characteristics of a gesture based on the actual gesture made by the player. In a wagering game context, it is particularly important to ensure that the intended gesture of the player is captured, for example, to ensure that an intended wager amount is inputted or to reassure the player that the gesture is accurately selecting a wagering game object.
A gaming terminal for playing a wagering game, the gaming terminal comprising: a controller; a touch surface for actuation by a player gesture associated with an input to the wagering game; a sensor array underlying the touch surface to sense the motion of the gesture, the sensor array coupled to the controller, wherein the controller converts the sensed motion to corresponding gesture data indicative of the gesture made by the player, and determines from at least a portion of the gesture data a trajectory of an intended gesture that differs from the gesture made by the player; and a display coupled to the controller to display movement of an object image during the wagering game based on the trajectory of the intended gesture.
The controller can determine the trajectory from the tangent of a portion of a curved path of the gesture.
The controller can determine the trajectory based on a degree of curvature of an anticipated arc from the gesture.
The motion can include a pullback motion, wherein the controller calculates the trajectory based on acceleration of the pullback motion.
The determination of the trajectory can include breaking the gesture into segments corresponding to sensors of the sensor array underlying the touch surface, measuring the acceleration of the gesture over each segment, and determining the trajectory based on the segment having the fastest measured acceleration.
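The segment-based determination described above can be sketched as follows. This is a minimal illustration only; the function name, the sampled-point input format, and the use of mean acceleration magnitude per segment are assumptions, since the disclosure does not specify an implementation.

```python
# Hypothetical sketch: split a sampled gesture into segments, estimate the
# acceleration within each segment by finite differences, and take the
# direction of travel of the fastest-accelerating segment as the trajectory.
from math import atan2, degrees

def intended_trajectory(points, timestamps, num_segments=4):
    """points: [(x, y), ...] sampled along the gesture; timestamps: seconds.
    Returns the trajectory angle in degrees of the segment having the
    highest mean acceleration magnitude."""
    n = len(points)
    seg_len = max(n // num_segments, 2)
    best_accel, best_angle = -1.0, 0.0
    for start in range(0, n - seg_len + 1, seg_len):
        seg = points[start:start + seg_len]
        ts = timestamps[start:start + seg_len]
        # speed between consecutive samples
        speeds = []
        for i in range(1, len(seg)):
            dt = (ts[i] - ts[i - 1]) or 1e-9
            dx, dy = seg[i][0] - seg[i - 1][0], seg[i][1] - seg[i - 1][1]
            speeds.append((dx * dx + dy * dy) ** 0.5 / dt)
        # mean magnitude of acceleration (change in speed) within the segment
        accels = [abs(speeds[i] - speeds[i - 1]) / ((ts[i + 1] - ts[i]) or 1e-9)
                  for i in range(1, len(speeds))]
        mean_accel = sum(accels) / len(accels) if accels else 0.0
        if mean_accel > best_accel:
            best_accel = mean_accel
            dx = seg[-1][0] - seg[0][0]
            dy = seg[-1][1] - seg[0][1]
            best_angle = degrees(atan2(dy, dx))
    return best_angle
```

In this sketch, a gesture that starts horizontally at constant speed but then accelerates sharply upward would yield a near-vertical trajectory, reflecting the portion of the motion the player drove hardest.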
The gaming terminal can further comprise a memory storing the gesture data as gesture values in a table having a plurality of trajectories each associated with a different set of predetermined gesture values, wherein the controller selects one of the trajectories from the table based on a comparison of the gesture values with the predetermined gesture values.
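The table comparison described above can be sketched as a nearest-match lookup. The function name, the tuple encoding of gesture values, and the squared-Euclidean distance metric are hypothetical choices; the disclosure does not specify how the comparison is performed.

```python
# Hypothetical sketch: compare measured gesture values against sets of
# predetermined gesture values stored in a table, and select the trajectory
# associated with the closest-matching set.
def select_trajectory(gesture_values, table):
    """gesture_values: tuple of measured values (e.g., angle, speed).
    table: dict mapping tuples of predetermined gesture values to
    trajectories. Returns the trajectory whose predetermined values are
    nearest (squared Euclidean distance) to the measured values."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    entry = min(table.items(), key=lambda kv: dist(kv[0], gesture_values))
    return entry[1]
```

For example, with stored entries at 0, 45, and 90 degrees, a measured gesture of roughly 40 degrees would select the 45-degree trajectory.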
The trajectory can be calculated based on the distance of the gesture on the touch surface and how much space an arc formed by the gesture occupies.
The touch surface can include a launch boundary defining a zone where the gesture is sensed.
The controller can determine a deceleration motion in the gesture, wherein the controller interprets the deceleration as canceling the input from the gesture.
The controller can sense any break in contact in the motion from the touch surface and terminate the input of the gesture.
The touch surface can include a defined area of possible output in the array, wherein the gesture is calculated based on the sensors of the sensor array within the area, and all contact points of the gesture outside the area are disregarded to constrain the maximum angle of the gesture.
The touch surface can include a physical feature defining a point where the gesture releases the object image on the display.
The gaming machine can further comprise an audio output device coupled to the controller, the audio output device producing an audio output in response to the received gesture.
The gaming machine can further comprise a physical actuation device, the physical actuation device producing a physical actuation in response to the received gesture.
The display can display indications of the resulting trajectory of the gesture relating to the object image.
A method of determining an intended gesture from an actual gesture made in a wagering game, comprising: receiving gesture data indicative of an actual gesture made by a player within a defined coordinate space at a gaming terminal on which a wagering game is displayed; displaying on the gaming terminal an object that is influenced by a gesture; determining from at least a portion of the gesture data an intended gesture that differs from the actual gesture based on a criterion; causing the object to be influenced by the intended gesture instead of the actual gesture; and, responsive to the causing, executing a wagering game function using the influenced object as an input.
The criterion can include whether at least a portion of the actual gesture falls within a predefined area, the determining including if the portion of the actual gesture falls within the predefined area, ignoring the portion of the actual gesture in determining the intended gesture.
The criterion can include a trajectory of the actual gesture, the determining being carried out by calculating a tangent of a curved portion of an initial part of the gesture to determine the trajectory of the actual gesture and using the determined trajectory as the trajectory of the intended gesture.
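One plausible sketch of the tangent-based determination above approximates the tangent of the initial curved portion with a finite difference over the first few samples, discarding any later trail-off. The window size and function name are assumptions.

```python
# Hypothetical sketch: estimate the tangent direction of the initial part of
# a gesture from its first few sampled points, and use that direction as the
# intended trajectory (ignoring trail-off toward the end of the motion).
from math import atan2, degrees

def initial_tangent(points, window=3):
    """points: [(x, y), ...] sampled along the gesture.
    Returns the tangent direction in degrees over the first `window`
    samples."""
    last = min(window, len(points) - 1)
    (x0, y0), (x1, y1) = points[0], points[last]
    return degrees(atan2(y1 - y0, x1 - x0))
```

For a right-handed player whose gesture starts straight up and then drifts rightward at the end, this sketch would still report a 90-degree (vertical) intended trajectory.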
The criterion can include whether the actual gesture is generally straight, the determining including determining a linear relationship between at least two points along the actual gesture responsive to the actual gesture being generally straight and using the linear relationship to determine the intended gesture.
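The linear relationship for a generally straight gesture can be illustrated with an ordinary least-squares fit through the sampled points, which smooths small wobbles into a single line. This is one of several ways to relate two or more points; the helper name is hypothetical.

```python
# Hypothetical sketch: fit a least-squares line y = slope * x + intercept
# through a generally straight gesture's sampled points, and use the fitted
# line as the intended gesture.
def fit_line(points):
    """points: [(x, y), ...]; returns (slope, intercept) of the
    least-squares line through the points."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n
```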
The criterion can include an acceleration of at least a portion of the actual gesture, the determining including defining a plurality of segments along the actual gesture, calculating in each of the segments the acceleration of the actual gesture within the segment, determining in which of the segments the calculated acceleration is the highest, determining a trajectory of the actual gesture in the segment determined to have the highest calculated acceleration, and using the trajectory to determine the intended gesture.
The criterion can include a change in acceleration of at least a portion of the actual gesture relative to other portions of the actual gesture, the determining including defining a plurality of segments along the actual gesture, calculating in each of the segments the acceleration of the actual gesture within the segment, determining in which of the segments the acceleration has the highest change relative to the acceleration of the actual gesture in the other segments, determining a trajectory of the actual gesture in the segment determined to have the highest change in calculated acceleration, and using the trajectory to determine the intended gesture.
The criterion can include whether a characteristic of the actual gesture corresponds to a value in a weighted table of values, the method further comprising selecting the value in the weighted table, and using the weighted table to determine an award for the wagering game based on a randomly determined winning outcome of the wagering game.
The characteristic can be an angle relative to a horizontal line within the defined coordinate space.
The criterion can include whether a characteristic of the actual gesture corresponds to a value in a weighted table of values, the method further comprising randomly selecting the weighted table or one of at least two weighted tables adjacent to the weighted table, wherein each of the weighted table and the at least two weighted tables has the same expected value but a different volatility, and using the randomly selected weighted table to determine an award for the wagering game based on a randomly determined winning outcome of the wagering game.
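The equal-expected-value, different-volatility scheme above can be sketched with a few small award tables. All awards and weights below are hypothetical; the disclosure does not specify table contents.

```python
# Hypothetical sketch: three weighted award tables (award: weight), each
# with the same expected value (7.5 credits here) but different volatility.
# One table is selected at random, then an award is drawn from it according
# to its weights.
import random

LOW_VOL = {5: 2, 10: 2}               # small spread around the mean
MID_VOL = {0: 1, 5: 1, 10: 1, 15: 1}  # moderate spread
HIGH_VOL = {0: 3, 30: 1}              # rare large award

def expected_value(table):
    """Weighted mean award of a table."""
    total = sum(table.values())
    return sum(award * weight for award, weight in table.items()) / total

def draw_award(rng=random):
    """Randomly select one of the equal-EV tables, then draw an award
    from it according to its weights."""
    table = rng.choice([LOW_VOL, MID_VOL, HIGH_VOL])
    awards = list(table)
    weights = list(table.values())
    return rng.choices(awards, weights=weights)[0]
```

Because every table pays 7.5 credits on average, the random table selection changes only how "streaky" the awards feel, not the long-run return.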
The method can further comprise sensing when the actual gesture has ended and concurrently providing haptic feedback to the player who made the actual gesture to indicate that the actual gesture was received.
The haptic feedback is carried out by actuating a solenoid positioned under a substrate on which the actual gesture is made.
The method can further comprise displaying a trail of the actual gesture that persists after the actual gesture has completed, and displaying an indication of the intended gesture overlaying the trail.
The wagering game function can include accepting an amount of a wager.
The method can further comprise displaying a plurality of wager amounts on a display of the gaming terminal, displaying an animation in which the influenced object follows a path defined by the intended gesture until the influenced object corresponds to a selected one of the wager amounts, and using the selected wager amount as a wager to play the wagering game.
The wagering game function can include determining an award associated with the wagering game, the method further comprising displaying a plurality of further objects on a display of the gaming terminal, each of the further objects corresponding to an award to be awarded to the player responsive to a randomly selected outcome of the wagering game satisfying a criterion, displaying an animation in which the influenced object follows a path defined by the intended gesture until the influenced object corresponds to a selected one of the further objects, and awarding the player the award associated with the selected one of the further objects.
The award can include (a) eligibility to play a further round of the wagering game or a bonus game, (b) an amount of credits, or (c) an enhancement parameter associated with the wagering game.
A computer program product comprising a computer readable medium having an instruction set borne thereby, the instruction set being configured to cause, upon execution by a controller, the acts of: receiving gesture data indicative of an actual gesture made by a player within a defined coordinate space at a gaming terminal on which a wagering game is displayed; displaying on the gaming terminal an object that is influenced by a gesture; determining from at least a portion of the gesture data an intended gesture that differs from the actual gesture based on a criterion; causing the object to be influenced by the intended gesture instead of the actual gesture; and, responsive to the causing, executing a wagering game function using the influenced object as an input.
The criterion can include whether at least a portion of the actual gesture falls within a predefined area, the determining including if the portion of the actual gesture falls within the predefined area, ignoring the portion of the actual gesture in determining the intended gesture.
The criterion can include a trajectory of the actual gesture, the determining being carried out by calculating a tangent of a curved portion of an initial part of the gesture to determine the trajectory of the actual gesture and using the determined trajectory as the trajectory of the intended gesture.
The criterion can include whether the actual gesture is generally straight, the determining including determining a linear relationship between at least two points along the actual gesture responsive to the actual gesture being generally straight and using the linear relationship to determine the intended gesture.
The criterion can include an acceleration of at least a portion of the actual gesture, the determining including defining a plurality of segments along the actual gesture, calculating in each of the segments the acceleration of the actual gesture within the segment, determining in which of the segments the calculated acceleration is the highest, determining a trajectory of the actual gesture in the segment determined to have the highest calculated acceleration, and using the trajectory to determine the intended gesture.
The criterion can include a change in acceleration of at least a portion of the actual gesture relative to other portions of the actual gesture, the determining including defining a plurality of segments along the actual gesture, calculating in each of the segments the acceleration of the actual gesture within the segment, determining in which of the segments the acceleration has the highest change relative to the acceleration of the actual gesture in the other segments, determining a trajectory of the actual gesture in the segment determined to have the highest change in calculated acceleration, and using the trajectory to determine the intended gesture.
The criterion can include whether a characteristic of the actual gesture corresponds to a value in a weighted table of values, the instruction set being further configured to cause the acts of selecting the value in the weighted table, and using the weighted table to determine an award for the wagering game based on a randomly determined winning outcome of the wagering game.
The characteristic can be an angle relative to a horizontal line within the defined coordinate space.
The criterion can include whether a characteristic of the actual gesture corresponds to a value in a weighted table of values, the instruction set being further configured to cause the acts of randomly selecting the weighted table or one of at least two weighted tables adjacent to the weighted table, wherein each of the weighted table and the at least two weighted tables has the same expected value but a different volatility, and using the randomly selected weighted table to determine an award for the wagering game based on a randomly determined winning outcome of the wagering game.
The instruction set can further be configured to cause the act of sensing when the actual gesture has ended and concurrently providing haptic feedback to the player who made the actual gesture to indicate that the actual gesture was received.
The haptic feedback can be carried out by actuating a solenoid positioned under a substrate on which the actual gesture is made.
The instruction set can further be configured to cause the acts of displaying a trail of the actual gesture that persists after the actual gesture has completed, and displaying an indication of the intended gesture overlaying the trail.
The wagering game function can accept an amount of a wager.
The instruction set can further be configured to cause the acts of displaying a plurality of wager amounts on a display of the gaming terminal, displaying an animation in which the influenced object follows a path defined by the intended gesture until the influenced object corresponds to a selected one of the wager amounts, and using the selected wager amount as a wager to play the wagering game.
The wagering game function can include determining an award associated with the wagering game, the instruction set being further configured to cause the acts of displaying a plurality of further objects on a display of the gaming terminal, each of the further objects corresponding to an award to be awarded to the player responsive to a randomly selected outcome of the wagering game satisfying a criterion, displaying an animation in which the influenced object follows a path defined by the intended gesture until the influenced object corresponds to a selected one of the further objects, and awarding the player the award associated with the selected one of the further objects.
The award can include (a) eligibility to play a further round of the wagering game or a bonus game, (b) an amount of credits, or (c) an enhancement parameter associated with the wagering game.
Additional aspects of the invention will be apparent to those of ordinary skill in the art in view of the detailed description of various embodiments, which is made with reference to the drawings, a brief description of which is provided below.
While the invention is susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. It should be understood, however, that the invention is not intended to be limited to the particular forms disclosed. Rather, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
Referring to
The gaming terminal 10 illustrated in
The primary display area 14 includes, in various aspects of the present concepts, a mechanical-reel display, a video display, or a combination thereof in which a transmissive video display is disposed in front of the mechanical-reel display to portray a video image in superposition over the mechanical-reel display. Further information concerning the latter construction is disclosed in U.S. Pat. No. 6,517,433 to Loose et al. entitled “Reel Spinning Slot Machine With Superimposed Video Image,” which is incorporated herein by reference in its entirety. The video display is, in various embodiments, a cathode ray tube (CRT), a high-resolution liquid crystal display (LCD), a plasma display, a light emitting diode (LED), a DLP projection display, an electroluminescent (EL) panel, or any other type of display suitable for use in the gaming terminal 10, or other form factor, such as is shown by way of example in
Video images in the primary display area 14 and/or the secondary display area 16 are rendered in two-dimensional (e.g., using Flash Macromedia™) or three-dimensional graphics (e.g., using Renderware™). In various aspects, the video images are played back (e.g., from a recording stored on the gaming terminal 10), streamed (e.g., from a gaming network), or received as a TV signal (e.g., either broadcast or via cable) and such images can take different forms, such as animated images, computer-generated images, or “real-life” images, either prerecorded (e.g., in the case of marketing/promotional material) or as live footage. The format of the video images can include any format including, but not limited to, an analog format, a standard digital format, or a high-definition (HD) digital format.
The player-input or user-input device(s) 26 include, by way of example, a plurality of buttons 36 on a button panel, as shown in
The information reader 24 (or information reader/writer) is preferably located on the front of the housing 12 and comprises, in at least some forms, a ticket reader, card reader, bar code scanner, wireless transceiver (e.g., RFID, Bluetooth, etc.), biometric reader, or computer-readable-storage-medium interface. As noted, the information reader may comprise a physical and/or electronic writing element to permit writing to a ticket, a card, or computer-readable-storage-medium. The information reader 24 permits information to be transmitted from a portable medium (e.g., ticket, voucher, coupon, casino card, smart card, debit card, credit card, etc.) to the information reader 24 to enable the gaming terminal 10 or associated external system to access an account associated with cashless gaming, to facilitate player tracking or game customization, to retrieve a saved-game state, to store a current-game state, to cause data transfer, and/or to facilitate access to casino services, such as is more fully disclosed, by way of example, in U.S. Patent Publication No. 2003/0045354, published on Mar. 6, 2003, entitled “Portable Data Unit for Communicating With Gaming Machine Over Wireless Link,” which is incorporated herein by reference in its entirety. The noted account associated with cashless gaming is, in some aspects of the present concepts, stored at an external system 46 (see
Turning now to
To provide gaming functions, the controller 42 executes one or more game programs comprising machine-executable instructions stored in local and/or remote computer-readable data storage media (e.g., memory 44 or other suitable storage device). The term computer-readable data storage media, or “computer-readable medium,” as used herein refers to any media/medium that participates in providing instructions to controller 42 for execution. The computer-readable medium comprises, in at least some exemplary forms, non-volatile media (e.g., optical disks, magnetic disks, etc.), volatile media (e.g., dynamic memory, RAM), and transmission media (e.g., coaxial cables, copper wire, fiber optics, radio frequency (RF) data communication, infrared (IR) data communication, etc.). Common forms of computer-readable media include, for example, a hard disk, magnetic tape (or other magnetic medium), a 2-D or 3-D optical disc (e.g., a CD-ROM, DVD, etc.), RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or solid state digital data storage device, a carrier wave, or any other medium from which a computer can read. By way of example, a plurality of storage media or devices are provided, a first storage device being disposed proximate the user interface device and a second storage device being disposed remotely from the first storage device, wherein a network is connected intermediate the first one and second one of the storage devices.
Various forms of non-transitory computer-readable media may be involved in carrying one or more sequences of one or more instructions to controller 42 for execution. By way of example, the instructions may initially be borne on a data storage device of a remote device (e.g., a remote computer, server, or system). The remote device can load the instructions into its dynamic memory and send the instructions over a telephone line or other communication path using a modem or other communication device appropriate to the communication path. A modem or other communication device local to the gaming terminal 10 or to an external system 46 associated with the gaming terminal can receive the data on the telephone line or conveyed through the communication path (e.g., via external systems interface 58) and output the data to a bus, which transmits the data to the system memory 44 associated with the controller 42, from which system memory the processor retrieves and executes the instructions.
Thus, the controller 42 is able to send and receive data, via carrier signals, through the network(s), network link, and communication interface. The data includes, in various examples, instructions, commands, program code, player data, and game data. As to the game data, in at least some aspects of the present concepts, the controller 42 uses a local random number generator (RNG) to randomly generate a wagering game outcome from a plurality of possible outcomes. Alternatively, the outcome is centrally determined using either an RNG or pooling scheme at a remote controller included, for example, within the external system 46.
As shown in the example of
As shown in the example of
As seen in
Communications between the controller 42 and both the peripheral components of the gaming terminal 10 and the external system 46 occur through input/output (I/O) circuit 56, which can include any suitable bus technologies, such as an AGTL+ frontside bus and a PCI backside bus. Although the I/O circuit 56 is shown as a single block, it should be appreciated that the I/O circuit 56 alternatively includes a number of different types of I/O circuits. Furthermore, in some embodiments, the components of the gaming terminal 10 can be interconnected according to any suitable interconnection architecture (e.g., directly connected, hypercube, etc.).
The I/O circuit 56 is connected to an external system interface or communication device 58, which is connected to the external system 46. The controller 42 communicates with the external system 46 via the external system interface 58 and a communication path (e.g., serial, parallel, IR, RC, 10bT, near field, etc.). The external system 46 includes, in various aspects, a gaming network, other gaming terminals, a gaming server, a remote controller, communications hardware, or a variety of other interfaced systems or components, in any combination. In yet other aspects, the external system 46 may comprise a player's portable electronic device (e.g., cellular phone, electronic wallet, etc.) and the external system interface 58 is configured to facilitate wireless communication and data transfer between the portable electronic device and the controller 42, such as by a near field communication path operating via magnetic field induction or a frequency-hopping spread spectrum RF signals (e.g., Bluetooth, etc.).
The gaming terminal 10 optionally communicates with external system 46 (in a wired or wireless manner) such that each terminal operates as a “thin client” having relatively less functionality, a “thick client” having relatively more functionality, or with any range of functionality therebetween (e.g., an “intermediate client”). In general, a wagering game includes an RNG for generating a random number, game logic for determining the outcome based on the randomly generated number, and game assets (e.g., art, sound, etc.) for presenting the determined outcome to a player in an audio-visual manner. The RNG, game logic, and game assets are contained within the gaming terminal 10 (“thick client” gaming terminal), the external systems 46 (“thin client” gaming terminal), or are distributed therebetween in any suitable manner (“intermediate client” gaming terminal).
Referring now to
In accord with various methods of conducting a wagering game on a gaming system in accord with the present concepts, the wagering game includes a game sequence in which a player makes a wager, such as through the money/credit detector 48, touch screen 38 soft key, button panel, or the like, and a wagering game outcome is associated with the wager. The wagering game outcome is then revealed to the player in due course following initiation of the wagering game. The method comprises the acts of conducting the wagering game using a gaming apparatus, such as the gaming terminal 10 depicted in
In the aforementioned method, for each data signal, the controller 42 is configured to process the electronic data signal, to interpret the data signal (e.g., data signals corresponding to a wager input), and to cause further actions associated with the interpretation of the signal in accord with computer instructions relating to such further actions executed by the controller. As one example, the controller 42 causes the recording of a digital representation of the wager in one or more storage devices (e.g., system memory 44 or a memory associated with an external system 46), the controller, in accord with associated computer instructions, causing the changing of a state of the data storage device from a first state to a second state. This change in state is, for example, effected by changing a magnetization pattern on a magnetically coated surface of a magnetic storage device, changing a magnetic state of a ferromagnetic surface of a magneto-optical disc storage device, or changing a state of transistors or capacitors in a volatile or a non-volatile semiconductor memory (e.g., DRAM). The noted second state of the data storage device comprises storage in the storage device of data representing the electronic data signal from the controller (e.g., the wager in the present example).
As another example, the controller 42 further, in accord with the execution of the instructions relating to the wagering game, causes the primary display 14 or other display device and/or other output device (e.g., speakers, lights, communication device, etc.), to change from a first state to at least a second state, wherein the second state of the primary display comprises a visual representation of the physical player input (e.g., an acknowledgement to a player), information relating to the physical player input (e.g., an indication of the wager amount), a game sequence, an outcome of the game sequence, or any combination thereof, wherein the game sequence in accord with the present concepts comprises acts described herein. The aforementioned executing of computer instructions relating to the wagering game is further conducted in accord with a random outcome (e.g., determined by the RNG) that is used by the controller 42 to determine the outcome of the game sequence, using a game logic for determining the outcome based on the randomly generated number. In at least some aspects, the controller 42 is configured to determine an outcome of the game sequence at least partially in response to the random parameter.
The basic-game screen 60 is displayed on the primary display area 14 or a portion thereof. In
In the illustrated embodiment of
As shown in the example of
Symbol combinations are evaluated in accord with various schemes such as, but not limited to, “line pays” or “scatter pays.” Line pays are evaluated left to right, right to left, top to bottom, bottom to top, or any combination thereof by evaluating the number, type, or order of symbols 90 appearing along an activated payline 30. Scatter pays are evaluated without regard to position or paylines and only require that such combination appears anywhere on the reels 62a-e. While an example with nine paylines is shown, a wagering game with no paylines, a single payline, or any plurality of paylines will also work with the enhancements described below. Additionally, though an embodiment with five reels is shown in
The gaming terminal 10 can include a multi-touch sensing system 100, such as the one shown in
As used herein, a “touch” or “touch input” does not necessarily mean that the player's finger or body part actually must physically contact or touch the multi-touch sensing device array 102 or other multi-touch sensing device. As is known via techniques such as via capacitive sensing techniques and other electromagnetic or optical techniques, the player's body need not actually physically touch or contact the multi-touch sensing device, but rather need only be placed in sufficient proximity to the multi-touch sensing device so as to be interpreted as a touch input.
The local controller 106 can be coupled to the controller 42, either directly or via the I/O circuit 56. The local controller 106 receives information outputted from the multi-touch sensing array 102 via the interface 104, where the outputted information is indicative of a multi-point gesture made relative to the multi-touch sensing array 102. In a specific aspect, the array 102 of multi-touch sensing system 100 includes input sensors 110 (shown in
Although a specific multi-touch sensing system 100 is shown in
As used herein, a multi-point gesture refers to a gesture that originates by touching simultaneously two or more points relative to the multi-touch sensing system 100. By “relative to” it is meant that the body need not actually physically touch any part of the multi-touch sensing array 102, but must be brought sufficiently near the array 102 so that a touch input can be detected. Such multi-point gestures can be bimanual (i.e., require use of both hands to create a “chording” effect) or multi-digit (i.e., require use of two or more fingers, as in rotation of a dial). Bimanual gestures may be made by the hands of a single player, or by different hands of different players, such as in a multi-player wagering game. By “simultaneously” it is meant that at some point in time, more than one point is touched. In other words, it is not necessary to touch two different points at the precise same moment in time. Rather, one point can be touched first, followed by a second point, so long as the first point remains touched as the second point is touched. In that sense, the first and second points are touched simultaneously. If contact must be removed from the first point before the second touch is capable of being sensed, then such a touch scheme would be deemed to be a single-touch scheme. For example, each individual input sensor 110a-110p in the array of input sensors 102 can detect only one touch input at a time, but the entire array 102 can detect multiple touches simultaneously.
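By way of a non-limiting illustration, the overlap-based notion of “simultaneously” described above may be sketched as follows; the helper name and timing values are hypothetical and not part of the specification:

```python
def touched_simultaneously(t1_down, t1_up, t2_down, t2_up):
    """Two touches count as "simultaneous" if their contact
    intervals overlap at any point in time, even when they do
    not begin at the same moment."""
    return max(t1_down, t2_down) < min(t1_up, t2_up)

# First point held from t=0.0 to t=2.0; second point touched at
# t=1.5 while the first remains touched: simultaneous.
print(touched_simultaneously(0.0, 2.0, 1.5, 3.0))  # True

# Contact removed from the first point before the second begins:
# only a single-touch scheme, not a multi-point gesture.
print(touched_simultaneously(0.0, 1.0, 1.5, 3.0))  # False
```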
An actual gesture is one physically made with one or both hands by a player of the wagering game in a defined coordinate space that is configured for sensing or detecting the actual gesture. A gesture sensing system captures the actual gesture and converts it into corresponding gesture data indicative of the actual gesture. The coordinate space can be a two- or three-dimensional space defined by coordinates in each dimension. The gesture data can include, for example, coordinates corresponding to a path taken by the actual gesture within the coordinate space, along with other optional characteristics such as, for example, any combination of direction, velocity, acceleration, and pressure.
An intended gesture, by contrast, is a gesture that is determined or calculated by an electronic controller under control of software or firmware on one or more tangible non-transitory medium/media and corresponds to an estimation or approximation of what the player actually intended to gesture, which can be different from the player's actual single- or multi-touch gesture. In particular but not exclusively, the intended gesture is configured to account for the unconscious and unintended trail-off that occurs depending on the player's handedness (either right-handedness or left-handedness), which can skew the path of the actual gesture especially toward the end of the gesture. When the gesture is used to launch a projectile, such as a coin or a ball, for example, at one or more targets, the trail-off effect could otherwise cause the projectile to hit a target that the player did not intend to aim for using existing gesture-sensing techniques. Aspects disclosed herein avoid this problem by estimating or approximating what the player actually intended to gesture based on, for example, a criterion or a characteristic of the actual gesture. As a result, the gesture accuracy is enhanced, increasing the player's satisfaction in the wagering game and imbuing in the player a sense of confidence that the wagering game is capturing the player's intended actions.
Turning now to
The size and resolution of the multi-touch sensing system 100 can be optimized for detecting multiple touch inputs, specifically associated with gestures made by a player in a wagering game with multiple fingers. For example, the multi-touch sensing system 100 is about 2 inches wide by about 3 inches long, and may have a fairly low resolution (e.g., a total of 16 individual input sensors 110). In other embodiments, the multi-touch sensing system 100 is divided in half (left to right) and implemented as two single-touch devices. Other methods of sensing multiple contacts with a multi-touch sensing device are described in PCT Application No. PCT/US2007/021625 [247079-512WOPT], filed on Oct. 10, 2007, assigned to WMS Gaming Inc., entitled “Multi-Player, Multi-Touch Table for Use in Wagering Game Systems.”
Preferably, the components of the multi-touch input system 100 are constructed so that they form a single unit. For example, the multi-touch sensing array 102, the local controller 106, the memory 108, and the interface 104 can be mounted on a common substrate, such as a PCB to form a compact device that can be easily installed as a component of the gaming terminal 10. In the illustrated example of
The multi-touch sensing system 100 optionally includes a thin, plastic overlay or substrate for protection and appearance. The overlay may include information, such as instructions for using the multi-touch sensing system 100, or a graphic, such as a coin, a token, a dart, a ball or other graphics related to a wagering game. The multi-touch sensing system 100 can be located on a panel of the gaming terminal 10 with other input devices 26, as shown in
Another type of multi-touch sensing system 100 that is suitable for interpreting gestures is a multi-touch display, such as the 3M™ multi-touch display, which is suitable both as a display for the primary display area 14 and for sensing gestures.
As described above with respect to
Optionally, the local controller 106 can determine whether the gesture data received from the multi-point sensing system 100 corresponds to any of a plurality of gesture classification codes stored in the memory 108. If a valid gesture is determined (i.e., the gesture data corresponds to one of the plurality of gesture classification codes), the local controller 106 communicates the classification code to the CPU 42. This communication may occur over a USB connection, for example, though any other suitable wired or wireless connection techniques are contemplated. If no valid gesture is determined, the local controller 106 may communicate an error code to the CPU 42, so that the game may instruct the player to try again or take some other appropriate action. Another option is for the local controller to simply ignore the attempted input, thereby freeing the CPU 42 to perform other tasks relating to the wagering game. An advantage of having a separate local controller 106 filter only valid gestures is that the CPU 42 is not burdened by having to check every gesture made relative to the multi-touch sensing system 100 to determine whether it recognizes the gesture. In some implementations, such burdening of the controller 42 can prevent it from processing other tasks and functions related to the wagering game. In this sense, the local controller 106 acts as a “filter,” allowing only valid gestures to be passed to the controller 42, such that when the CPU receives a classification code from the local controller 106, the controller 42 can analyze that classification code to determine what function related to the wagering game to perform.
Thus, rather than providing the raw coordinate data of the gesture, e.g., the X and Y locations of each touch input, continuously to the CPU 42, the local controller 106 takes the burden of interpreting the gesture data outputted by array of input sensors 110 via the interface 104 and classifies the gesture data according to a predetermined number of valid gestures. However, in other implementations, this filtering option can be eliminated.
The local controller 106 can include a predetermined classification system stored in the memory 108, where the predetermined classification system includes a plurality of gesture classification codes, each code representing a distinct combination of characteristics relating to the multi-point gesture. The predetermined classification system can recognize a finite number of valid gestures. Further, the local controller 106 interprets gestures so as to more accurately match the sensed gesture with the stored classification codes. Alternately, any function disclosed herein that is carried out by the local controller 106 can be carried out by the CPU 42 and/or the external system(s) 46.
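The valid-gesture filtering performed by the local controller 106 may be sketched as follows; the particular code values, characteristic labels, and error code are hypothetical, chosen only to illustrate the filter behavior described above:

```python
# Hypothetical gesture classification codes; the specification
# requires only that a finite set of valid codes exists.
VALID_CODES = {
    ("slow", "straight"): 0x01,
    ("slow", "left"):     0x02,
    ("fast", "straight"): 0x03,
    ("fast", "left"):     0x04,
}
ERROR_CODE = 0xFF

def classify(speed, direction):
    """Local-controller filter: pass only valid gestures to the
    CPU as a classification code; otherwise report an error code
    (or the caller may simply ignore the attempt)."""
    return VALID_CODES.get((speed, direction), ERROR_CODE)

print(hex(classify("fast", "left")))      # 0x4
print(hex(classify("fast", "diagonal")))  # 0xff
```

Because only a code (or an error code) crosses to the CPU 42, the raw coordinate stream never has to be interpreted by the CPU.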
Alternately, instead of organizing the rows and columns of the table with different gesture characteristics, the local controller 106 in other aspects can determine one characteristic at a time relating to the multi-point gesture. For example, the local controller 106 can determine a speed characteristic relating to the multi-point gesture, and if the speed corresponds to a predetermined classification code for the speed characteristic, the local controller 106 communicates that code to the controller 42. In addition, the local controller 106 determines a direction characteristic relating to the multi-point gesture, and if the direction corresponds to a predetermined classification code for the direction characteristic, the local controller 106 communicates that code to the controller 42. In other words, there may be two separate tables of classification codes, one for speed and the other for direction, and these individual codes are communicated by the local controller 106 to the controller 42. While this is more cumbersome and less desirable, it is contemplated as an alternative way of detecting gestures while still achieving an objective of transferring the burden of detecting gestures away from the CPU 42 to the local controller 106. In other implementations, the CPU 42 can receive the gesture data and interpret the gesture data to determine an intended path of an actual gesture.
The controller 106 can access the memory 108 for determining characteristics corresponding to any particular predetermined gesture classification codes and their respective inputs to a wagering game. The system memory 44 can also include a similar table storing the predetermined gesture classification codes. In the exemplary table described above, the predetermined classification system includes five levels of a speed characteristic relating to the multi-point gesture and five levels of a direction characteristic relating to the multi-point gesture, for a total of 25 different gesture-related codes corresponding to different combinations of speed and direction. It is contemplated that more or fewer levels of speed or direction or other characteristics (such as pressure and/or acceleration) can be incorporated into the classification system.
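A minimal sketch of combining a 5-level speed characteristic and a 5-level direction characteristic into one of 25 codes follows; the threshold values are assumptions, since the specification does not fix particular level boundaries:

```python
import bisect

# Assumed level boundaries: four thresholds yield five levels each.
SPEED_BOUNDS = [5.0, 15.0, 30.0, 60.0]     # e.g., pixels/second
ANGLE_BOUNDS = [-60.0, -20.0, 20.0, 60.0]  # e.g., degrees from vertical

def classification_code(speed, angle):
    """Quantize the sensed speed and direction into five levels
    each and combine them into one of 25 gesture codes."""
    speed_level = bisect.bisect(SPEED_BOUNDS, speed)      # 0..4
    direction_level = bisect.bisect(ANGLE_BOUNDS, angle)  # 0..4
    return speed_level * 5 + direction_level              # 0..24

print(classification_code(0.0, -90.0))   # 0  (slowest level, leftmost direction)
print(classification_code(100.0, 90.0))  # 24 (fastest level, rightmost direction)
```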
To generate the predetermined classification codes, algorithms for interpreting the raw gesture data from the multi-touch sensing system 100 can be developed iteratively. Various gestures are made relative to the multi-touch sensing system 100 to develop a range of speeds to correspond to a particular classification code. The algorithms can also be changed depending on the gesture being simulated. The raw gesture data can include coordinates within the coordinate space corresponding to the touched points, which together form a path or trajectory of the actual gesture.
Thus, instead of having an infinite number of possible gestures that may occur, only a finite number of valid gestures are available. This simplifies and reduces the information that is supplied to the controller 106, yet creates in the player the perception that there are an infinite number of possible gestures. Thus, according to a method, the player simulates a gesture relating to a wagering game, e.g., a wager input made by depositing a coin, by contacting the multi-point sensing device array 102 at two or more contact points simultaneously (e.g., points 120 and 150 in
For example, for a coin-throwing gesture, if the classification code indicates a slow speed and a straight spin direction, a first animation of the coin 140 in the display area 14 (shown in
The coin 140 is made to appear to move on the display area 14 in accordance with the gesture characteristics indicated by the corresponding gesture classification code, as shown in
The object depicted on the display area 14 or the secondary display area 16 in response to the communication of a classification code from the local controller 106 to the controller 42 is related to the wagering game. In other aspects, the object (such as the coin 140) is involved in the depiction of a randomly selected outcome of the wagering game. For example, the values on the faces of the coin 140 can indicate or reflect a randomly selected outcome.
An advantage of the classification system described above includes the handling of “outlier” contact points. For example, certain types of gestures, such as a downward gesture, a gesture that skips across the surface of the multi-touch sensing array 102 or the expanded array 500, etc., may cause the calculation algorithm to produce anomalous data, such as gestures in odd directions, with excessively high velocities, or with zero velocity. The classification system described herein would only allow valid gesture-related outputs to be provided to the controller 42. In some examples, a “bad” input may be classified as a benign gesture or may be rejected completely. Under these conditions, the local controller 106 may assign a classification code that relates to a maximum, a minimum, or another predefined code to avoid communicating information based on a “bad” or invalid gesture.
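Clamping a “bad” output to a maximum, minimum, or other predefined benign code may be sketched as follows; the code range and the choice of the minimum as the benign fallback are assumptions for illustration:

```python
MAX_CODE, MIN_CODE = 24, 0  # assumed bounds of the valid code range

def sanitize(code):
    """Map outlier outputs (e.g., a rejected gesture or an
    out-of-range code produced by impossibly high or zero
    velocity) onto a predefined benign code rather than passing
    invalid data to the CPU."""
    if code is None:
        return MIN_CODE                     # rejected completely -> benign minimum
    return max(MIN_CODE, min(MAX_CODE, code))  # clamp into the valid range

print(sanitize(99))    # 24 (clamped to the maximum)
print(sanitize(None))  # 0
```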
The local controller 106 allows more precise interpretation of gestures from the multi-touch system 100. Initial parameters may be stored in the memory 108 that define valid areas of the multi-touch sensing array 102 or 500. For example, in
In cases where a gesture involves moving an object such as depositing a coin or throwing a projectile, a zone of input can be defined for purposes of calculating the trajectory of the object affected by the gesture. For example, if the player is gesturing to pitch a coin, a zone of input may be defined as the area between the launch boundary 520 (defined as a line, for example, extending across the multi-touch sensor 500) in
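Restricting the trajectory calculation to sensed points inside such a zone of input might look like the following sketch; the coordinate values and boundary positions are assumed for illustration:

```python
def in_input_zone(y, lower=0.0, launch_boundary=10.0):
    """Only sensed points between the start area and the launch
    boundary (cf. boundary 520) contribute to the trajectory
    calculation; boundary coordinates are assumed values."""
    return lower <= y < launch_boundary

# A sensed path: the last point lies past the launch boundary
# and is excluded from the trajectory calculation.
path = [(0, 2.0), (1, 6.0), (2, 9.5), (3, 11.0)]
zone_points = [p for p in path if in_input_zone(p[1])]
print(len(zone_points))  # 3
```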
The controller 106 can be programmed to determine the trajectory of the object propelled by the gesture motion in a number of ways to ensure an accurate response to an intended gesture. A gesture such as throwing a coin can involve a pullback-and-release gesture that matches a predetermined action in the stored tables in the memory 108. The acceleration of the pullback-and-release gesture can be sensed and calculated to determine the trajectory imparted to the object by the intended gesture.
A gesture can be broken up into multiple gesture segments to determine the intended trajectory.
Another implementation for determining the intended trajectory of the gesture is to compute the tangent of the early portion of the curved path of the gesture based on data from sensors 510 in the early portion of the path from the starting point. After filtering to isolate these points, the tangent is calculated by the controller 106 to determine the intended trajectory. The intended trajectory can also be determined by an examination of the path of the gesture. A multi-dimensional array of input sensors 510, such as the array 500, allows the controller 106 to more accurately determine the curve of the motion of the gesture on the surface of the array 500. The curve of the launching motion of a gesture is determined by the controller 106 to calculate the intended trajectory. For example, a straighter launch in the actual gesture indicates a more linear intended trajectory. If the path of the actual gesture detected is more curved, the intended trajectory is deemed to be closer to the curve of the initial path of the gesture.
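One possible reading of the tangent computation is a least-squares direction fit over the early portion of the sensed path; the point count and the fitting method below are assumptions, not prescribed by the specification:

```python
import math

def intended_trajectory(points, early=4):
    """Estimate the intended trajectory as the tangent of the
    early portion of a curved gesture path: a least-squares
    direction over the first `early` filtered points."""
    xs, ys = zip(*points[:early])
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return math.atan2(sxy, sxx)  # tangent direction in radians

# A path that launches straight up-right, then trails off toward
# the end (handedness skew); the tangent ignores the trail-off.
path = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 3.5), (5, 3.2)]
print(round(math.degrees(intended_trajectory(path))))  # 45
```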
Similarly, the intended trajectory can be calculated based on the distance of the gesture on the multi-point touch array 102 or 500 and the amount of space the arc formed by the actual gesture occupies.
The local controller 106 can be instructed to determine when an actual gesture has been aborted and therefore does not require interpretation. For example, if a player's gesture decelerates rapidly at the end of the motion beyond a predetermined threshold, the controller 42 or 106 can determine that the player did not intend to make the gesture and cancel further processing of the gesture. In addition, if a player breaks contact with the sensors 110 in the multi-touch sensor array 102 or the sensors 510 in the sensor array 500, the controller 42 or 106 can determine that the gesture input has been canceled by the player.
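Abort detection based on end-of-motion deceleration and broken contact can be sketched as follows; the deceleration threshold value is assumed for illustration:

```python
def gesture_aborted(speeds, contact_held, decel_threshold=200.0):
    """Decide whether the player aborted the gesture: either
    contact with the sensor array was broken, or the speed
    dropped past a predetermined threshold at the end of the
    motion (threshold value assumed)."""
    if not contact_held:
        return True  # contact broken -> gesture canceled
    if len(speeds) >= 2 and (speeds[-2] - speeds[-1]) > decel_threshold:
        return True  # rapid end-of-motion deceleration -> aborted
    return False

print(gesture_aborted([300.0, 320.0, 50.0], contact_held=True))   # True
print(gesture_aborted([300.0, 320.0, 310.0], contact_held=True))  # False
```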
In an implementation that includes the sensing array 102 in
The interpretation of the gestures can be integrated into game play. For example, the player can use a gesture such as inserting a coin to input a wager in the gaming terminal 10 to play a wagering game thereon. A gesture by the player can be used to determine an outcome of a primary or bonus game, such as by throwing or launching an object at a selection element in a wagering game. A player may also be instructed to aim an object by making a gesture at moving targets to determine game outcomes or bonus awards or other enhancement parameters, including eligibility to play a bonus game. An example is shown in
Gestures that are incorporated into game play can determine outcomes in ways that enhance playability for a player. For example, rather than having a single table of outcomes correlated to the gesture stored in the memory 108, multiple tables can be used. For example, a weighted table of angular values may be used for matching the gesture. Adjacent tables can be selected for the same angular value, but such tables can have different volatility, which creates greater excitement for the players. The respective expected values associated with each of the tables can be the same. To determine which weighted table to use, an initial angle of a gesture relative to a horizontal line (e.g., coincident with the line 610 in
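The relationship between equal expected value and different volatility can be illustrated with two hypothetical award tables for the same angular value; the award amounts and probabilities below are invented for illustration only:

```python
import random

# Two hypothetical award tables for one angular value: both pay
# 2.0 credits per play on average, but with different volatility.
LOW_VOLATILITY  = {1: 0.5, 2: 0.25, 4: 0.25}  # award -> probability
HIGH_VOLATILITY = {0: 0.8, 10: 0.2}

def expected_value(table):
    """Expected award of a weighted table (sum of award * prob)."""
    return sum(award * p for award, p in table.items())

def pick_table(rng):
    """Randomly select between adjacent tables matched to the
    same angular value of the gesture."""
    return rng.choice([LOW_VOLATILITY, HIGH_VOLATILITY])

print(expected_value(LOW_VOLATILITY))   # 2.0
print(expected_value(HIGH_VOLATILITY))  # 2.0
```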
The gaming terminal 10 can also include various sensory or haptic feedback to the player to enhance the gesture effect. As explained above, images of an object moving based on the sensed gesture can be displayed on the primary display area 14 indicating the result of the gesture. Sounds can be incorporated such as a coin-dragging sound during the gesture and stopping the sound when a release occurs in the gesture. Other sounds, such as the coin landing in an area may also be played during the gesture. Also, physical or haptic feedback in the form of a solenoid-driven motor underneath or behind the display 14 can be actuated to indicate when a coin release has occurred.
The gesture capture scheme carried out by the controller 42 or 106 can be used to assist the player in close situations. For example, the best possible throw result can be assigned to a gesture input by the controller 42 or 106. Additionally, the controller 106 in conjunction with the controller 42 can display graphics on the primary display area 14 to indicate the path of the intended trajectory resulting from the actual gesture, to assist the player in making more accurate gestures in future plays. The controller 42 can cause an animation to be displayed in which the influenced object (such as a coin) follows a path defined by the intended gesture until the influenced object corresponds to or interacts with a target, such as the targets 660, 662, 664, 666, 668, which can correspond to wager amounts, for example.
Although some examples described above have referred to dice or coin throwing or launching gestures, in other aspects, other types of gestures are contemplated. For example, a “stir/mix” gesture is contemplated for stirring and/or mixing objects. The player uses one or more fingers to show how fast, in what direction, etc. an object is being spun and/or mixed. Additionally, a “card reveal” gesture is made by using two fingers, such as an index finger and a thumb, for example, to indicate a player picking up cards from a surface. Other possible gestures may include “ball toss,” “dart throw,” and the like. The “ball toss” and “dart throw” gestures approximate ball tossing and dart throwing motions using the player's fingers. The player can control the spin direction of the ball or dart in a similar manner as with the dice throw by lifting one finger before the other finger. The player can also control the speed with which the ball or dart is thrown by controlling the speed with which the fingers are moved across the sensor array.
The criterion can include whether at least a portion of the actual gesture falls within a predefined area (e.g., 512 or below line 610). If the portion of the actual gesture falls within the predefined area, the controller 42 ignores the portion of the actual gesture in determining the intended gesture. Alternately, the criterion can include a trajectory of the actual gesture. The controller 42 calculates a tangent of a curved portion of an initial part of the gesture to determine the trajectory of the actual gesture and uses the determined trajectory as the trajectory of the intended gesture. Alternately, the criterion can include whether the actual gesture is generally straight. The controller 42 determines a linear relationship between at least two points along the actual gesture responsive to the actual gesture being generally straight and uses the linear relationship to determine the intended gesture. Alternately, the criterion can include an acceleration of at least a portion of the actual gesture. The controller 42 defines multiple segments along the actual gesture (e.g., 630, 632, 634, 636, 638, 640) and calculates in each of the segments the acceleration of the actual gesture within the segment. The controller 42 determines in which of the segments the calculated acceleration is the highest, and determines a trajectory of the actual gesture in the segment determined to have the highest calculated acceleration. The controller 42 uses the trajectory to determine the intended gesture. Alternately, the criterion can include a change in acceleration of at least a portion of the actual gesture relative to other portions of the actual gesture. The controller 42 defines multiple segments (e.g., 630, 632, 634, 636, 638, 640) along the actual gesture and calculates in each of the segments the acceleration of the actual gesture within the segment.
The controller 42 determines in which of the segments the acceleration has the highest change relative to the acceleration of the actual gesture in the other segments and determines a trajectory of the actual gesture in the segment determined to have the highest change in calculated acceleration. The controller 42 uses the trajectory to determine the intended gesture. Alternately, the criterion can include whether a characteristic of the actual gesture corresponds to a value in a weighted table of values. The controller 42 selects the value in the weighted table and uses the weighted table to determine an award for the wagering game based on a randomly determined winning outcome of the wagering game. The characteristic can be an angle relative to a horizontal line (e.g., line 610) within the defined coordinate space (e.g., 512). Alternately, the criterion can include whether a characteristic of the actual gesture corresponds to a value in a weighted table of values. The controller 42 can randomly select the weighted table or one of at least two weighted tables adjacent to the weighted table. Each of the weighted tables has the same expected value but a different volatility. The controller 42 can use the randomly selected weighted table to determine an award for the wagering game based on a randomly determined winning outcome of the wagering game.
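Selecting the segment whose acceleration changes most relative to the other segments might be sketched as follows; the comparison against the mean of the other segments, and the sample values, are assumptions for illustration:

```python
def segment_of_interest(accels):
    """Given per-segment accelerations (cf. segments 630..640),
    return the index of the segment whose acceleration differs
    most from the average of the other segments; the trajectory
    within that segment is then used as the intended trajectory.
    Requires at least two segments."""
    best, best_change = 0, -1.0
    for i, a in enumerate(accels):
        others = accels[:i] + accels[i + 1:]
        change = abs(a - sum(others) / len(others))
        if change > best_change:
            best, best_change = i, change
    return best

# Accelerations measured in six segments: the fourth segment
# (index 3) shows the largest relative change.
print(segment_of_interest([1.0, 1.2, 1.1, 5.0, 1.3, 1.0]))  # 3
```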
The controller 42 can sense when the actual gesture has ended and concurrently provide haptic feedback to the player who made the actual gesture to indicate that the actual gesture was received. As mentioned above, this haptic feedback can coincide with a coin release, for example. The haptic feedback can be carried out by actuating a solenoid positioned under or behind a substrate on which the actual gesture is made.
The controller 42 can display a trail of the actual gesture that persists after the actual gesture has completed and display an indication of the intended gesture overlaying the trail. The wagering game function can be accepting an amount of a wager. The controller 42, 106 can display a plurality of wager amounts on a display of the gaming terminal and display an animation in which the influenced object follows a path defined by the intended gesture until the influenced object corresponds to or interacts with a selected one of the wager amounts. The controller 42 uses the selected wager amount as a wager to play the wagering game.
The wagering game function can alternately include determining an award associated with the wagering game. The controller 42 displays multiple further objects on a display of the gaming terminal. Each of the further objects corresponds to an award to be awarded to the player when a randomly selected outcome of the wagering game satisfies a criterion. The controller 42 displays an animation in which the influenced object follows a path defined by the intended gesture until the influenced object corresponds to a selected one of the further objects. The award associated with the selected one of the further objects is awarded to the player. The award can include (a) eligibility to play a further round of the wagering game or a bonus game, (b) an amount of credits, or (c) an enhancement parameter associated with the wagering game.
Each of these algorithms includes machine-readable instructions for execution by: (a) a processor, (b) a controller, and/or (c) any other suitable processing device. It will be readily understood that the system 100 includes such a suitable processing device, such as the controller 42, 106. Any algorithm disclosed herein may be embodied in software stored on a tangible non-transitory medium such as, for example, a flash memory, a CD-ROM, a floppy disk, a hard drive, a digital versatile disk (DVD), or other memory devices, but persons of ordinary skill in the art will readily appreciate that the entire algorithm and/or parts thereof could alternatively be executed by a device other than a controller and/or embodied in firmware or dedicated hardware in a well-known manner (e.g., it may be implemented by an application specific integrated circuit (ASIC), a programmable logic device (PLD), a field programmable logic device (FPLD), discrete logic, etc.).
Each of these embodiments and obvious variations thereof is contemplated as falling within the spirit and scope of the claimed invention, which is set forth in the following claims.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 61/497,311, filed Jun. 15, 2011, entitled “Gesture Sensing Enhancement System for a Wagering Game,” which is incorporated herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3533628 | Fisher | Oct 1970 | A |
4357488 | Knighton et al. | Nov 1982 | A |
4484179 | Kasday | Nov 1984 | A |
4522399 | Nishikawa | Jun 1985 | A |
4715004 | Kabasawa et al. | Dec 1987 | A |
4746770 | McAvinney | May 1988 | A |
4763278 | Rajasekaran et al. | Aug 1988 | A |
4844475 | Saffer et al. | Jul 1989 | A |
4968877 | McAvinney et al. | Nov 1990 | A |
5133017 | Cain et al. | Jul 1992 | A |
5186460 | Fongeallaz et al. | Feb 1993 | A |
5259613 | Marnell, II | Nov 1993 | A |
5318298 | Kelly et al. | Jun 1994 | A |
5370399 | Liverance | Dec 1994 | A |
5444786 | Raviv | Aug 1995 | A |
5469193 | Giobbi et al. | Nov 1995 | A |
5469510 | Blind et al. | Nov 1995 | A |
5511148 | Wellner | Apr 1996 | A |
5524888 | Heidel | Jun 1996 | A |
5533727 | DeMar | Jul 1996 | A |
5542669 | Charron et al. | Aug 1996 | A |
5589856 | Stein et al. | Dec 1996 | A |
5655961 | Acres et al. | Aug 1997 | A |
5695188 | Ishibashi | Dec 1997 | A |
5704836 | Norton et al. | Jan 1998 | A |
5743798 | Adams et al. | Apr 1998 | A |
5762552 | Vuong et al. | Jun 1998 | A |
5770533 | Franchi | Jun 1998 | A |
5775993 | Fentz et al. | Jul 1998 | A |
5803810 | Norton et al. | Sep 1998 | A |
5807177 | Takemoto et al. | Sep 1998 | A |
5808567 | McCloud | Sep 1998 | A |
5816918 | Kelly et al. | Oct 1998 | A |
5828768 | Eatwell et al. | Oct 1998 | A |
5833538 | Weiss | Nov 1998 | A |
5851148 | Brune et al. | Dec 1998 | A |
5896126 | Shieh | Apr 1999 | A |
5941773 | Harlick | Aug 1999 | A |
5943043 | Furuhata et al. | Aug 1999 | A |
5946658 | Miyazawa et al. | Aug 1999 | A |
5971850 | Liverance | Oct 1999 | A |
5976019 | Ikeda et al. | Nov 1999 | A |
6067112 | Wellner et al. | May 2000 | A |
6068552 | Walker et al. | May 2000 | A |
6089663 | Hill | Jul 2000 | A |
6110041 | Walker et al. | Aug 2000 | A |
6162121 | Morro et al. | Dec 2000 | A |
6210167 | Nishiyama | Apr 2001 | B1 |
6217448 | Olsen | Apr 2001 | B1 |
6246395 | Goyins et al. | Jun 2001 | B1 |
6254483 | Acres | Jul 2001 | B1 |
6255604 | Tokioka et al. | Jul 2001 | B1 |
6280328 | Holch et al. | Aug 2001 | B1 |
6283860 | Lyons et al. | Sep 2001 | B1 |
6302790 | Brossard | Oct 2001 | B1 |
6308953 | Nagano | Oct 2001 | B1 |
6315666 | Mastera et al. | Nov 2001 | B1 |
6364314 | Canterbury | Apr 2002 | B1 |
6416411 | Tsukahara | Jul 2002 | B1 |
6422941 | Thorner et al. | Jul 2002 | B1 |
6471589 | Nagano | Oct 2002 | B1 |
6517433 | Loose et al. | Feb 2003 | B2 |
6530842 | Wells et al. | Mar 2003 | B1 |
6561908 | Hoke | May 2003 | B1 |
6607443 | Miyamoto et al. | Aug 2003 | B1 |
6620045 | Berman et al. | Sep 2003 | B2 |
6638169 | Wilder et al. | Oct 2003 | B2 |
6642917 | Koyama et al. | Nov 2003 | B1 |
6676514 | Kusuda et al. | Jan 2004 | B1 |
6677932 | Westerman | Jan 2004 | B1 |
6767282 | Matsuyama et al. | Jul 2004 | B2 |
6788295 | Inkster | Sep 2004 | B1 |
6819312 | Fish | Nov 2004 | B2 |
6856259 | Sharp | Feb 2005 | B1 |
6929543 | Ueshima et al. | Aug 2005 | B1 |
6932706 | Kaminkow | Aug 2005 | B1 |
6942571 | McAllister et al. | Sep 2005 | B1 |
6995752 | Lu | Feb 2006 | B2 |
7077009 | Lokhorst et al. | Jul 2006 | B2 |
7147558 | Giobbi | Dec 2006 | B2 |
7204428 | Wilson | Apr 2007 | B2 |
7254775 | Geaghan et al. | Aug 2007 | B2 |
7294059 | Silva et al. | Nov 2007 | B2 |
7331868 | Beaulieu et al. | Feb 2008 | B2 |
RE40153 | Westerman et al. | Mar 2008 | E |
7379562 | Wilson | May 2008 | B2 |
7397464 | Robbins et al. | Jul 2008 | B1 |
7411575 | Hill et al. | Aug 2008 | B2 |
7479065 | McAllister et al. | Jan 2009 | B1 |
7479949 | Jobs et al. | Jan 2009 | B2 |
7936341 | Weiss | May 2011 | B2 |
8147316 | Arezina et al. | Apr 2012 | B2 |
8312392 | Forutanpour et al. | Nov 2012 | B2 |
8348747 | Arezina et al. | Jan 2013 | B2 |
8727881 | Ansari et al. | May 2014 | B2 |
8732592 | Nielsen et al. | May 2014 | B2 |
20020003919 | Morimoto | Jan 2002 | A1 |
20020013173 | Walker et al. | Jan 2002 | A1 |
20020037763 | Idaka | Mar 2002 | A1 |
20020090990 | Joshi et al. | Jul 2002 | A1 |
20020097223 | Rosenberg | Jul 2002 | A1 |
20020142825 | Lark et al. | Oct 2002 | A1 |
20020142846 | Paulsen | Oct 2002 | A1 |
20020151349 | Joshi | Oct 2002 | A1 |
20020173354 | Winans et al. | Nov 2002 | A1 |
20030045354 | Giobbi | Mar 2003 | A1 |
20030054881 | Hedrick et al. | Mar 2003 | A1 |
20030067447 | Geaghan et al. | Apr 2003 | A1 |
20030114214 | Barahona et al. | Jun 2003 | A1 |
20040001048 | Kraus et al. | Jan 2004 | A1 |
20040029636 | Wells | Feb 2004 | A1 |
20040029637 | Hein, Jr. et al. | Feb 2004 | A1 |
20040038721 | Wells | Feb 2004 | A1 |
20040053695 | Mattice et al. | Mar 2004 | A1 |
20040063482 | Toyoda | Apr 2004 | A1 |
20040166930 | Beaulieu et al. | Aug 2004 | A1 |
20040166937 | Rothschild et al. | Aug 2004 | A1 |
20050059458 | Griswold et al. | Mar 2005 | A1 |
20050113163 | Mattice et al. | May 2005 | A1 |
20050202864 | Duhamel et al. | Sep 2005 | A1 |
20050212754 | Marvit et al. | Sep 2005 | A1 |
20050227217 | Wilson | Oct 2005 | A1 |
20050259378 | Hill et al. | Nov 2005 | A1 |
20060001652 | Chiu et al. | Jan 2006 | A1 |
20060010400 | Dehlin et al. | Jan 2006 | A1 |
20060025194 | McInerny et al. | Feb 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060031786 | Hillis et al. | Feb 2006 | A1 |
20060033724 | Chaudhri et al. | Feb 2006 | A1 |
20060073891 | Holt | Apr 2006 | A1 |
20060101354 | Hashimoto et al. | May 2006 | A1 |
20060164399 | Cheston et al. | Jul 2006 | A1 |
20060284874 | Wilson | Dec 2006 | A1 |
20060294247 | Hinckley et al. | Dec 2006 | A1 |
20070093290 | Winans et al. | Apr 2007 | A1 |
20070124370 | Nareddy et al. | May 2007 | A1 |
20070152984 | Ording et al. | Jul 2007 | A1 |
20070177803 | Elias et al. | Aug 2007 | A1 |
20070201863 | Wilson et al. | Aug 2007 | A1 |
20070236460 | Young et al. | Oct 2007 | A1 |
20070236478 | Geaghan et al. | Oct 2007 | A1 |
20070247435 | Benko et al. | Oct 2007 | A1 |
20070270203 | Aida | Nov 2007 | A1 |
20080076506 | Nguyen et al. | Mar 2008 | A1 |
20080158145 | Westerman | Jul 2008 | A1 |
20080158146 | Westerman | Jul 2008 | A1 |
20080158147 | Westerman et al. | Jul 2008 | A1 |
20080158168 | Westerman et al. | Jul 2008 | A1 |
20080158169 | O'Connor et al. | Jul 2008 | A1 |
20080158174 | Land et al. | Jul 2008 | A1 |
20080163130 | Westerman | Jul 2008 | A1 |
20080180654 | Bathiche et al. | Jul 2008 | A1 |
20080204426 | Hotelling et al. | Aug 2008 | A1 |
20080211766 | Westerman et al. | Sep 2008 | A1 |
20080211775 | Hotelling et al. | Sep 2008 | A1 |
20080211783 | Hotelling et al. | Sep 2008 | A1 |
20080211784 | Hotelling et al. | Sep 2008 | A1 |
20080211785 | Hotelling et al. | Sep 2008 | A1 |
20080231610 | Hotelling et al. | Sep 2008 | A1 |
20080231611 | Bathiche et al. | Sep 2008 | A1 |
20080300055 | Lutnick et al. | Dec 2008 | A1 |
20080309631 | Westerman et al. | Dec 2008 | A1 |
20080309634 | Hotelling et al. | Dec 2008 | A1 |
20090002327 | Wilson et al. | Jan 2009 | A1 |
20090002344 | Wilson et al. | Jan 2009 | A1 |
20090005165 | Arezina et al. | Jan 2009 | A1 |
20090021489 | Westerman et al. | Jan 2009 | A1 |
20090118001 | Kelly et al. | May 2009 | A1 |
20090118006 | Kelly et al. | May 2009 | A1 |
20090143141 | Wells et al. | Jun 2009 | A1 |
20090191946 | Thomas et al. | Jul 2009 | A1 |
20090197676 | Baerlocher et al. | Aug 2009 | A1 |
20090325691 | Loose | Dec 2009 | A1 |
20100124967 | Lutnick et al. | May 2010 | A1 |
20100130280 | Arezina et al. | May 2010 | A1 |
20100313146 | Nielsen et al. | Dec 2010 | A1 |
20100328201 | Marvit et al. | Dec 2010 | A1 |
20110050569 | Marvit et al. | Mar 2011 | A1 |
20110118013 | Mattice et al. | May 2011 | A1 |
20110264272 | Wu et al. | Oct 2011 | A1 |
20120051596 | Darnell et al. | Mar 2012 | A1 |
20120113111 | Shiki et al. | May 2012 | A1 |
20120139857 | Terebkov et al. | Jun 2012 | A1 |
20120219196 | Dekel | Aug 2012 | A1 |
20120249443 | Anderson et al. | Oct 2012 | A1 |
20120309477 | Mayles et al. | Dec 2012 | A1 |
20120329553 | Gagner et al. | Dec 2012 | A1 |
20130165215 | Arezina et al. | Jun 2013 | A1 |
Number | Date | Country |
---|---|---|
199943487 | Mar 2000 | AU |
309946 | Apr 1989 | EP |
1269120 | Oct 1989 | JP |
5-31254 | Feb 1993 | JP |
8083144 | Mar 1996 | JP |
8190453 | Jul 1996 | JP |
8241161 | Sep 1996 | JP |
10-277213 | Oct 1998 | JP |
2000010733 | Jan 2000 | JP |
WO9730416 | Aug 1997 | WO |
WO9919855 | Apr 1999 | WO |
WO 0105477 | Jan 2001 | WO |
WO 0133905 | May 2001 | WO |
WO 0224288 | Mar 2002 | WO |
WO 0240921 | May 2002 | WO |
WO2006020305 | Feb 2006 | WO |
WO2007003928 | Jan 2007 | WO |
WO2008095132 | Oct 2008 | WO |
WO2008017077 | Dec 2008 | WO |
Entry |
---|
Apple, “iPhone User Guide”, iPhone iOS 3.1, released Sep. 2009, 217 pages. |
Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface, by Wayne Westerman; 363 pages (Spring 1999). |
A Multi-Touch Three Dimensional Touch-Sensitive Tablet; CHI'85 Proceedings; pp. 21-25 (Apr. 1985). |
The Sensor Frame Graphic Manipulator Final Report (Sensor Frame) 28 pages; (printed on Feb. 6, 2009). |
The Design of a GUI Paradigm based on Tablets, Two-Hands, and Transparency; Gordon Kurtenbach, George Fitzmaurice, Thomas Baudel, and Bill Buxton; 8 pages; (printed on Feb. 6, 2009). |
SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces, by Jun Rekimoto, Interaction Laboratory; 8 pages; (printed on Feb. 6, 2009). |
Single-Handed Interaction Techniques for Multiple Pressure-Sensitive Strips by Gábor Blaskó, Steven Feiner; 4 pages; (printed on Feb. 6, 2009). |
A Multi-finger Interface for Performance Animation of Deformable Drawings; Tomer Moscovich, Takeo Igarashi, Jun Rekimoto, Kentaro Fukuchi, John F. Hughes; 2 pages; (printed on Feb. 6, 2009). |
Precise Selection Techniques for Multi-Touch Screens; Hrvoje Benko, Andrew D. Wilson, and Patrick Baudisch; 10 pages; (printed on Feb. 6, 2009). |
ThinSight: Versatile Multi-touch Sensing for Thin Form-factor Displays; Steve Hodges, Shahram Izadi, Alex Butler, Alban Rrustemi and Bill Buxton; 10 pages; (printed on Feb. 6, 2009). |
Written Opinion corresponding to co-pending International Patent Application Serial No. PCT/US2007/021625, United States Patent Office, dated Sep. 15, 2008, 3 pages. |
International Search Report corresponding to co-pending International Patent Application Serial No. PCT/US2007/021625, United States Patent Office, dated Sep. 15, 2008, 2 pages. |
Web pages printed from http://multi-touchscreen.com/microsoft-surface-video-multi-touch-jeff-han-apple-bill-gates.html; (downloaded Aug. 24, 2009); 7 pages. |
Web pages printed from http://www.jazzmutant.com/lemur_overview.php; (downloaded Aug. 24, 2009); 2 pages. |
Web pages printed from http://www.merl.com/projects/DiamondTouch/; (downloaded Aug. 24, 2009); 5 pages. |
Web pages printed from http://www.merl.com/projects/?proj_area=Off+the+Desktop+Interaction+and+Dis; (Downloaded Aug. 24, 2009); 1 page. |
Web pages printed from http://www.merl.com/projects/diamondspin/; (Downloaded Aug. 24, 2009); 2 pages. |
Web pages printed from http://kioskmarketplace.com/article.php?id=12284&na=1; (Downloaded Aug. 25, 2009); 5 pages. |
An Overview of Optical-Touch Technologies; Ian Maxwell; 5 pages; (dated Dec. 2007). |
Freescale Semiconductor, E-field Keyboard Designs, Michael Steffen; 6 pages; (dated Sep. 2007). |
Texas Instruments, PCB-Based Capacitive Touch Sensing with MSP430; Zack Albus; 25 pages; (dated Jun. 2007—Revised Oct. 2007). |
Planet Analog, The art of capacitive touch sensing; Mark Lee, Cypress Semiconductor Corp.; 5 pages; (dated Mar. 1, 2006). |
Weinert, Joe, Entertainment Vehicles, IGWB New '97 Games, pp. 11, 12 and 15-18 (Mar. 1997). |
Written Opinion corresponding to co-pending International Patent Application Serial No. PCT/US2007/010048, United States Patent Office, dated Jun. 10, 2008, 3 pages. |
International Search Report corresponding to co-pending International Patent Application No. PCT/US2007/010048, United States Patent Office, dated Jun. 10, 2008, 2 pages. |
http://www.mrl.nyu.edu/˜jhan/ftirsense/index.html; 2 pages, (downloaded Oct. 7, 2008). |
http://ds.advancedmn.com/article.php?artid=3395; 3 pages (downloaded Oct. 7, 2008). |
http://us.gizmodo.com/gadgets/portable-media/apple-touchscreen-patent-documentation-154248.php; 11 pages (downloaded Oct. 7, 2008). |
http://loop.worldofapple.com/archives/2006/02/08/multi-touch-interaction-video/; 19 pages, (downloaded Oct. 7, 2008). |
http://www.pcmag.com/article2/0,1895,1918674,00.asp; 4 pages, (downloaded Oct. 7, 2008). |
Number | Date | Country | |
---|---|---|---|
20120322527 A1 | Dec 2012 | US |
Number | Date | Country | |
---|---|---|---|
61497311 | Jun 2011 | US |