Determination of advertisement based on player physiology

Information

  • Patent Grant
  • Patent Number
    10,421,010
  • Date Filed
    Monday, March 17, 2014
  • Date Issued
    Tuesday, September 24, 2019
Abstract
A method, apparatus, and system of selective advertising include a gaming server to receive a request to play a game of chance from a gaming device, transmit gaming data to the gaming device, the gaming data associated with the requested game of chance, and receive player data, the player data associated with the player's eye movement, gestures, or change in state. The server may also be configured to analyze the player data, determine a context associated with the player data, and initiate an action based on the determined context.
Description
BACKGROUND OF THE INVENTION

Gaming devices are ubiquitous in casinos and other gambling establishments. These devices, often in the form of slot machines, allow a user to place a wager on a game of chance and/or skill. Players of slot machines are captive audiences for the duration of a play session that could last hours.


However, gaming establishments are always looking for ways to captivate players so that they stay at the establishment and keep playing the gaming devices. Gaming establishments are also always looking for ways to generate income, and advertisements (ads) from third-party advertisers may be one way to generate additional income. Yet unlike broadcast TV, radio, newspapers, and the Internet, gaming establishments realize no meaningful revenue from advertisements. Countless hours of legal and regulated video content are served up to captive slot players each year over the roughly 600 million trips they make, but no meaningful advertisement revenue is realized.


Some attempts have been made to insert advertisements on gaming devices by replacing the reel symbols with pictures or logos of the products being advertised. These attempts proved to be failures, as the games became confusing and distracting to players. They also alienated players, as there is no incentive to play on a heavily advertised gaming device when a nearby gaming device offers a pure entertainment experience without the distractions and delays associated with sponsored advertisements.


OVERVIEW

The present disclosure relates generally to advertisements. More particularly, the present disclosure relates generally to determining advertisements to be displayed on gaming machines. Even more particularly, the present disclosure relates generally to determining advertisements to be displayed on gaming machines based on a player's physical body movement, such as eye movement.


In one embodiment, a system of selective advertising comprises a gaming server configured to: receive a request to play a game of chance from a gaming device; transmit gaming data to the gaming device, the gaming data associated with the requested game of chance; receive player data, the player data associated with the player's eye movement, gestures, or change in state; analyze the player data; determine a context associated with the player data; and initiate an action based on the determined context. The gaming device is configured to receive the gaming data and display the gaming data on a display of the gaming device. A sensor proximate the gaming device may be configured to detect player data, the player data including at least data based on player eye movement, gestures, or change in state.


In another embodiment, a method for selecting advertisements based on player physiology comprises: transmitting, by a gaming device, a request to play a game of chance; receiving gaming data at the gaming device, the gaming data associated with the requested game of chance; determining, by a sensor, whether there are changes in the player's eye movement; determining, by the sensor, whether there is player movement; recording the eye movement changes if it is determined that there are such changes; and recording the player movement if it is determined that there is player movement.


In still another embodiment, a method for selecting advertisements comprises: receiving, by a gaming server, a request to play a game of chance; transmitting gaming data to a gaming device, the gaming data associated with the requested game of chance; receiving player data, the player data including at least player gesture movement, eye movement, or state change; analyzing the player data; determining at least one advertisement based upon the analyzed player data; transmitting the at least one advertisement to the gaming device; determining a context associated with the player data; and initiating an action based on the determined context.


The present invention provides other hardware configured to perform the methods of the invention, as well as software stored in a machine-readable medium (e.g., a tangible storage medium) to control devices to perform these methods. These and other features will be presented in more detail in the following detailed description of the invention and the associated figures.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate one or more example embodiments and, together with the description of example embodiments, serve to explain the principles and implementations.


In the drawings:



FIG. 1 illustrates an example gaming system.



FIG. 2 illustrates an example method for determining advertisements to display on a gaming device.



FIG. 3 illustrates another example method for determining advertisements to display on a gaming device.



FIG. 4 illustrates yet another example method for determining advertisements to display on a gaming device.



FIG. 5 illustrates an exemplary computer system.



FIG. 6 is a block diagram of an example computing system.





DESCRIPTION OF EXAMPLE EMBODIMENTS

Embodiments are described herein in the context of determination of advertisement based on player physiology. The following detailed description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following detailed description to refer to the same or like parts.


In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of this disclosure.


In accordance with the present invention, the components, process steps, and/or data structures may be implemented using various types of operating systems, computing platforms, computer programs, and/or general purpose machines. In addition, those of ordinary skill in the art will recognize that devices of a less general purpose nature, such as hardwired devices, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), or the like, may also be used without departing from the scope and spirit of the inventive concepts disclosed herein.



FIG. 1 illustrates an example gaming system. The gaming system 100 includes a plurality of gaming devices. Gaming devices may be gaming machines 104 or mobile gaming devices 108. Mobile gaming devices 108 may be any portable devices capable of playing games (e.g. games of chance, video games, and the like), such as, for example, portable telephones, laptops, computers, notebooks, tablets, media players, and the like. Although illustrated with one gaming machine 104 and one mobile gaming device 108, this is not intended to be limiting, as there may be a plurality of gaming machines 104 and a plurality of mobile gaming devices 108. The gaming devices 104, 108 may be configured to communicate with a gaming server 112 via network 102. The gaming devices 104, 108 may communicate with gaming server 112 via any wired or wireless connection. Wireless connections may use any known wireless method, such as 3G wireless technology, 4G wireless technology, Bluetooth, wireless universal serial bus, near-field magnetic communication, FireWire, WiMax, LTE, IEEE 802.11x technology, radio frequency, a narrow-beam infrared (IR) sensor, an RFID tag, a vibration sensor, or any other known wireless method.


The gaming machine 104 may have a sensor 106, and mobile gaming device 108 may have a sensor 110. Although illustrated with one sensor 106, 110, this is not intended to be limiting, as gaming machine 104 and mobile gaming device 108 may have a plurality of sensors to track different movements and states of the player. Sensors 106, 110 may be configured to determine or track movement, the mental state, or the physical state of the player. In one embodiment, sensors 106, 110 may be configured to determine or track the gaze or direction of a human eye. For example, if the player is playing a car racing game that has many billboards (e.g. billboards advertising food, liquor, electronics, entertainment, tournaments, and the like), the sensor 106, 110 may track where on the screen the player is looking and whether the player is looking at the billboards while driving the racing car. In another embodiment, the sensor may track whether the player is looking at an avatar associated with playing certain games of chance. In yet another embodiment, the sensor may determine that the player is looking at the spin button on gaming devices 104, 108.


The sensors 106, 110 may be any known sensors designed to determine or track the gaze of an eye, such as a camera. Additionally, any known eye movement tracking method may be used to determine or track the gaze of the eye, such as, for example, the method described in U.S. Pat. No. 7,986,816 entitled, “METHODS AND SYSTEMS FOR MULTIPLE FACTOR AUTHENTICATION USING GAZE TRACKING AND IRIS SCANNING”. Once player eye movement is detected by the sensor 106, 110, the detected eye movement data and the location of the gaze (e.g. pictures or images of the player's eye) may be transmitted to the gaming server 112, via network 102, for analysis.


In the same or another embodiment, sensors 106, 110 may be configured to determine or track gestures or facial expressions of the player. For example, if the player wins a bonus, the player may smile; if the player loses at a game of chance, the player may turn the smile into a frown. The sensors 106, 110 may be any known sensors designed to determine or detect gestures and facial expressions, such as a camera; a well-known gesture sensor is Microsoft's Kinect® 3D sensor. Additionally, any known gesture detection method may be used to determine or track human gestures. Once a player gesture, facial expression, or movement is detected by the sensor 106, 110, the detected gesture data (e.g. pictures or images of the player's face, body, hands, and the like) may be transmitted to the gaming server 112, via network 102, for analysis.


In still the same or another embodiment, sensors 106, 110 may be configured to track the state of a player. The state of a player may be any biological or physiological state, for example, body temperature, oxygen level, heart rate, skin color (e.g. facial), breathing rate, and any other biological or physiological state. For example, sensors 106, 110 may detect that a player has a lower than normal temperature, a low oxygen level, and a high heart rate. In another example, sensors 106, 110 may detect a higher than normal temperature reading and a fast heart rate from the player. In one embodiment, sensors 106, 110 may be an infrared camera that continuously monitors a player's face for temperature changes in various areas and dynamically compares the changes to a preset baseline measurement to quantify the player's state changes. In another embodiment, sensors 106, 110 may be a camera that captures the player's facial expression in visible light and dynamically compares the variation to a baseline measurement to quantify the player's state change (emotional expression, in this case). Whenever an optical sensor, such as a camera, is used, the analysis of state change can be done by comparing variations between successive video frames using known digital image processing methods such as pixel subtraction, genetic programming, and the like.
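
As a concrete illustration of the frame-comparison approach just described, the following minimal Python/NumPy sketch quantifies a state change by pixel subtraction between successive frames; the function names, the baseline handling, and the threshold value are illustrative assumptions, not part of the patent.

    import numpy as np

    def state_change_score(prev_frame, curr_frame):
        """Quantify change between two successive grayscale video frames
        (uint8 arrays of identical shape) by pixel subtraction.
        Returns the mean absolute per-pixel difference (0 = identical)."""
        diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
        return float(diff.mean())

    def exceeds_baseline(score, baseline, threshold=5.0):
        """Flag a state change when the frame-difference score deviates from
        the player's preset baseline by more than an assumed threshold."""
        return abs(score - baseline) > threshold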


The sensors 106, 110 may be any known sensors designed to determine or detect biological or physiological changes in a player, such as, for example, a temperature sensor, a conductivity/resistivity sensor, a pH sensor, a salinity sensor, a camera, and the like. In one embodiment, the physiological sensor may be positioned on the gaming machine in a location known to be in contact with the player, such as the “spin” button, “cash out” button, and the like. In another embodiment, the physiological sensor may be an IR camera configured to detect human temperature or a visible-light camera configured to detect facial expressions. Additionally, any known biological or physiological change detection method may be used. Once a change in state of the player is detected by the sensor 106, 110, the detected state change data (e.g. a thermal image of the player's face, heartbeat, detection of sweaty or clammy hands, facial expressions, and the like) may be transmitted to the gaming server 112, via network 102, for analysis.


Gaming server 112 may receive the eye movement data, gesture data, and/or state change data and analyze the data using analysis server 118. Analysis server 118 may have a gesture database 120, a movement database 122, and a state database 124. Each database 120, 122, 124 may be configured to store associated data (e.g. gesture database 120 may store gesture data; movement database 122 may store eye movement data; state database 124 may store state change data, baselines, and thresholds) and at least one associated context. The associated context may be an extrapolation, determination, inference, or anticipation of what the player may desire, want, or feel.
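
Purely by way of illustration, the mapping from stored observations to at least one associated context might be organized as lookup tables like the following Python sketch; every table entry and name here is hypothetical, not taken from the patent.

    # Hypothetical lookup tables, one per database (gesture 120, movement 122, state 124).
    GESTURE_CONTEXTS = {
        "wipe_forehead": ["hot"],
        "hands_fixed_on_wheel": ["concentrating", "excited"],
    }
    MOVEMENT_CONTEXTS = {
        "gaze_at_beer_billboard": ["desires_beer", "thirsty", "hot"],
    }
    STATE_CONTEXTS = {
        "heart_rate_high": ["excited"],
        "temperature_above_baseline": ["hot", "possible_fever"],
    }

    def contexts_for(observations):
        """Collect every context associated with the detected observations."""
        tables = {**GESTURE_CONTEXTS, **MOVEMENT_CONTEXTS, **STATE_CONTEXTS}
        found = set()
        for observation in observations:
            found.update(tables.get(observation, []))
        return found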


EXAMPLE 1

The examples described herein are for illustrative purposes only and are not intended to be limiting. For example, although illustrated with a car racing game, any other non-gaming game (e.g. a video game) or game of chance may be played, such as Keno, blackjack, poker, and the like.


A player playing a car racing game may gaze at a billboard advertising beer. The sensor 106, 110 may detect this eye movement and transmit the eye movement data to the gaming server 112 for analysis using the analysis server 118. Analysis server 118 may analyze the eye movement data using movement database 122, player tracking database 128, games database 130, advertisement database 116, etc. Movement database 122 may have a table mapping eye movements to at least one associated context. In this example, eye movements looking at beer may be associated with the context of having a desire to drink beer, a feeling of thirst, or being hot.


Simultaneously, before, or after detecting the eye movement, sensor 106, 110 may detect the player wiping sweat off his forehead. The sensor 106, 110 may also determine that the player does not remove his hands from the steering wheel (or any other input device, such as a joystick, trackball, and the like) often, other than to wipe his forehead. This combination of gestures may be detected and transmitted to gaming server 112 for analysis by analysis server 118. Analysis server 118 may analyze the gesture movement using gesture database 120. Gesture database 120 may have a table mapping gestures to at least one associated context. In this example, wiping sweat off a forehead may be associated with the context of being hot. Not moving his hands from the steering wheel often may be associated with the context of excitement and joy from playing the racing game and concentrating hard.


Simultaneously, before, or after detecting the eye movement and/or gesture, sensor 106, 110 may detect that the temperature of the player has changed by five degrees, that the player's hands are clammy, and that his breathing and heart rate have increased. This state change data may be detected and transmitted to gaming server 112 for analysis by analysis server 118. Analysis server 118 may analyze the state change data using state database 124 in the context of the games database 130 and the player tracking database 128, for instance. State database 124 may have a table mapping different state changes to at least one associated context. In this example, an increase in temperature may be associated with the context of being hot, sick, or running a fever, and high breathing and heart rates may be associated with the context of being excited playing the game. Similarly, slow blinking of the eyelids may be associated with the context of being tired, and a smiling facial expression with happiness.


Based on each associated context, individually or collectively, analysis server 118 may determine that the player is excited about playing the racing game because he has been at the game for 30 minutes, pushing for the pole position in the final 5 laps, and is therefore hot. Thus, in one example, the gaming server may determine that gaming establishment personnel should provide the player with a beer or another drink. In another example, the gaming server 112 may determine that the action to take is to lower the temperature in that part of the gaming establishment (e.g. turn on an air conditioner or fan near the location of the player). By lowering the temperature, the gaming establishment helps ensure that the player will remain and continue to play at the gaming machine 104, 108 rather than leave because it is too hot.


In still another example, analysis server 118 may determine that since the player is wiping sweat off his forehead, has a decreased body temperature, has clammy hands, and his breathing and heart rate have increased, the player may be close to having a heart attack. Thus, gaming server 112 may then determine that the action to take is to notify the player to stop playing the game by displaying a recommended “Cease Play” message on a display of gaming device 104, 108. Alternatively, gaming server 112 may slow the game down or end the game play without the player's knowledge. Gaming server 112 may also notify or alert gaming establishment personnel of the potential health emergency.


Gaming server 112 may also be configured to help determine additional advertisements to display on a display of gaming devices 104, 108. Gaming server 112 may be configured to communicate with player tracking server 126. Player tracking server 126 may track any desired player information or preferences, such as accumulated points, likes and dislikes, the player's demographic, what games the player prefers or plays often, what tournaments the player likes to play, what drinks the player often or previously ordered, and the like. By comparing the analysis obtained from analysis server 118 with the player preferences obtained from player tracking server 126, gaming server 112 may determine the optimal type of advertisements to display on a display of the gaming devices 104, 108. For example, if analysis server 118 determines that the player is hot and thirsty, and gaming server 112 learns from player tracking server 126 that the player likes to drink Budweiser® beer, gaming server 112 may inform gaming establishment personnel to provide the player with a Budweiser® beer. Additionally, at a convenient time (e.g. between gaming sessions, when no input from the player has been detected within a pre-defined period of time, or per any other predetermined criterion or time), an advertisement for Budweiser® beer may pop up on a display of the gaming device.
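
A minimal sketch of this comparison step, assuming the inferred contexts and the player-tracking preferences arrive as simple Python structures; the function name and the preference keys are hypothetical.

    def choose_advertisement(contexts, preferences):
        """Intersect inferred contexts with player-tracking preferences to
        pick an advertisement, mirroring the Budweiser example above."""
        if "thirsty" in contexts or "hot" in contexts:
            # Fall back to a generic beverage ad when no brand preference is known.
            return preferences.get("favorite_drink_ad", "generic_beverage_ad")
        return None

    # Example: analysis says the player is hot; tracking says he likes Budweiser.
    ad = choose_advertisement({"hot", "excited"}, {"favorite_drink_ad": "budweiser_ad"})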


EXAMPLE 2

The player may be playing a poker game on his mobile gaming device 108 while sitting in the sports bar and drinking a glass of soda. While playing the poker game, sensor 110 may track the direction of the player's eye gaze. When analyzed in the context of the player's location information from player location database 132 and the venue layout database 134, the player's gaze may be mapped to a screen in the sports bar displaying a football game. The eye movement data may be transmitted to gaming server 112. Sensor 110 may also determine that the player tilted his head while his hand, holding a glass, moved upward. Lastly, sensor 110 may further determine that the player suddenly stood up and that his heart rate increased slightly. The gesture data and state change data may be transmitted to the gaming server 112. Player location database 132 may contain the location of the player within the venue or gaming establishment. Data in the player location database 132 may be updated periodically as desired, such as every 30 seconds, every 1-20 minutes, every 30 minutes, every hour, and the like. Venue layout database 134 may contain data about the layout of the venue, such as the location of the pool, the location of each gaming machine, the locations of restaurants and bars, and the location of any other desired area.


Gaming server 112 may then transmit the received data to analysis server 118 for analysis. Analysis server 118 may compare the received eye movement data to data stored in movement database 122. Analysis server 118 may determine that, in the context of the player's current location, time, and gaze direction, eye movement to a display showing a football game may be associated with the context of enjoying watching and/or playing football. Analysis server 118 may then compare the received gesture data to data stored in gesture database 120. Analysis server 118 may determine that a tilted head with a hand moving upward while holding a glass may be associated with the context of finishing a drink. In other words, the player may have finished his drink.


Analysis server 118 may then compare the received state change data to data stored in state database 124. Analysis server 118 may determine that suddenly standing up combined with an increase in heart rate may be associated with a win from playing the game of chance.


Based on the analysis obtained from analysis server 118, gaming server 112 may determine that the player may need another drink and inform gaming establishment personnel to send another drink to the player. Advertising server 114 may also determine that advertisements associated with football and/or football-related advertisements (e.g. beer, sports drinks, athletic products, and the like) should be pushed to and displayed on a display of the mobile gaming device 108. In another embodiment, the advertisements may be any other type of advertisement for the gaming establishment, such as advertisements in anticipation of upcoming events related to the gaming establishment and/or the game of chance being played on the gaming device (e.g. new game of chance themes, upcoming bonuses to be received, jackpot size, future poker tournaments, the ability to double down in future hands of the current game session, and the like).


EXAMPLE 3

The gaming establishment may anticipate and/or determine the probability of eye movement (e.g. based on tests, statistics, probability, or any other method of probability or determination) to specific areas of the gaming devices 104, 108. For example, the gaming establishment may determine that a person's gaze or eye movement is highest at the lower right-hand corner of the gaming device 104, 108, where the “Spin” button is located. Each area or zone of the gaming device 104, 108 may be assigned an advertisement price based on the probability of a player's eye gaze. For example, in areas with a higher probability of a player's gaze, the price for displaying or presenting an advertisement may be high. In areas with a lower probability of a player's gaze, such as the bill collector, the price for displaying or presenting an advertisement on or proximate the bill collector may be low.
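
As an illustrative sketch only, pricing in proportion to gaze probability might look like the following; the zone names, probability values, base rate, and linear pricing rule are assumptions, since the patent does not fix a formula.

    # Assumed gaze-probability map for screen zones, e.g. derived from tests
    # or statistics as described above.
    ZONE_GAZE_PROBABILITY = {
        "spin_button": 0.45,     # lower right-hand corner, highest probability
        "reels": 0.35,
        "bill_collector": 0.05,  # rarely looked at, so priced low
    }

    BASE_RATE_PER_IMPRESSION = 0.10  # assumed base price per ad impression

    def zone_price(zone):
        """Price an ad impression in proportion to the zone's gaze probability."""
        return BASE_RATE_PER_IMPRESSION * ZONE_GAZE_PROBABILITY.get(zone, 0.0)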


A player may begin playing a Wheel Of Fortune® game of chance on gaming device 104, 108. Sensor 106, 110 may detect the player's gaze at the lower right-hand corner of the gaming device 104, 108, where the “Spin” button is located, at the start of the game and periodically during the gaming session. The eye movement data, lingering time, the game being played, the current time, the location of the player, the profile of the player, and the like may be transmitted to the advertising server 114, via gaming server 112, to determine what advertisement should be displayed on the gaming device 104, 108 and how much the advertiser should be charged per ad impression.


Advertising server 114 may have an advertisement database 116 configured to store advertisement data received from third-party advertisers. Stored advertisement data may include the method of advertisement (e.g. whether the advertisement is an image, video, audio, or a combination of the above), the fee for the advertisement, and when the advertisement should be transmitted to the gaming device 104, 108 based on the received eye movement data. For example, based upon the eye movement data received from sensor 106, 110, advertising server 114 may determine that an advertisement for Verizon® should be displayed on gaming device 104, 108.
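
One plausible shape for an entry in advertisement database 116, sketched as a Python dataclass; the field names are assumptions drawn from the description above, not a schema given in the patent.

    from dataclasses import dataclass

    @dataclass
    class AdvertisementRecord:
        """Hypothetical entry in advertisement database 116."""
        advertiser: str            # third-party advertiser, e.g. "Verizon"
        media_type: str            # "image", "video", "audio", or a combination
        fee_per_impression: float  # fee paid for the advertisement
        trigger: str               # eye-movement condition for transmitting the ad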


Although gaming server 112, analysis server 118, player tracking server 126, and advertising server 114 are illustrated as separate servers, this is not intended to be limiting as they may be incorporated into one gaming server rather than separate servers.



FIG. 2 illustrates an example method for determining advertisements to display on a gaming device. Method 200 may begin with a gaming server receiving a request to play a game of chance at 202. The game of chance may be played on a gaming device, such as gaming machine 104 or mobile gaming device 108 illustrated in FIG. 1. Mobile gaming devices may be any portable devices capable of playing games (e.g. games of chance, video games, and the like), such as, for example, portable telephones, laptops, computers, notebooks, tablets, media players, and the like. Although illustrated and described with a request to play a game of chance, it will be appreciated that this invention may apply to non-gaming games, such as video games.


The gaming devices may be configured to communicate with a gaming server via any wired or wireless connection. Wireless connections may use any known wireless method, such as 3G wireless technology, 4G wireless technology, Bluetooth, wireless universal serial bus, near-field magnetic communication, FireWire, WiMax, IEEE 802.11x technology, radio frequency, a narrow-beam infrared (IR) sensor, an RFID tag, a vibration sensor, or any other known wireless method.


The gaming server may transmit gaming data at 204 to initiate the game of chance on the gaming device. The gaming data may be associated with the requested game of chance. For example, if a game of blackjack is requested, the gaming data may be associated with a blackjack game.


The player's movement, state, location, activity, physiological changes, facial expressions, and/or gestures may be received at 206. The player's movement, state, and/or gestures may be recorded using any known sensor on the gaming device. For example, a camera may be used to track the gaze of the player's eyes, facial expressions, or gestures. Any known biological and/or physiological sensor may be used to detect the state of the player, such as a temperature sensor, an IR camera for temperature imaging, a camera to sense facial emotions and gestures, and the like. The state of a player may be any biological or physiological state, for example, body temperature, oxygen level, heart rate, breathing rate, and any other biological or physiological state. In one embodiment, an infrared camera may continuously monitor a player's face for temperature changes in various areas and dynamically compare the changes to a preset baseline measurement to quantify the player's state changes. In another embodiment, a camera may capture the player's facial expression in visible light and dynamically compare the variation to a baseline measurement to quantify the player's state change (emotional expression, in this case). Whenever an optical sensor, such as a camera, is used, the analysis of state change can be done by comparing variations between successive video frames using known digital image processing methods such as pixel subtraction, genetic programming, and the like.


The received eye movement, state, player location, venue characteristics, player game activity, player demographic, and gesture data may be analyzed at 208. The gaming server may have a database storing data related to eye movement, player state, and gesture movements. The received data may be compared with data stored in the database in order to match the received data with an associated context. The associated context may be an extrapolation, determination, or anticipation of what the player may desire, want, or feel. For example, the gaming server may receive data indicating that the player's eyes gazed at a picture of a beer and that the player's body temperature has increased.


The gaming server may then determine whether any action should be taken at 210. If no action is determined to be warranted at 210, method 200 may end; otherwise, the gaming server may initiate the action at 212. In one example, the gaming server may determine that gaming establishment personnel should provide the player with a beer or another drink and send a notification to the personnel accordingly. In another example, the gaming server may determine that the action to take is to lower the temperature in that part of the gaming establishment (e.g. turn on an air conditioner or fan near the location of the player). By lowering the temperature, the gaming establishment helps ensure that the player will remain and continue to play at the gaming machine rather than leave because it is too hot.
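
A minimal sketch of the decision at 210, assuming the analyzed contexts arrive as a set of labels; the labels and action names are hypothetical.

    def decide_action(contexts):
        """Step 210 sketched: map inferred contexts to an action to initiate
        at 212, or return None so that method 200 ends."""
        if "possible_health_emergency" in contexts:
            return "display_cease_play_and_alert_personnel"
        if "hot" in contexts and "thirsty" in contexts:
            return "notify_personnel_to_send_drink"
        if "hot" in contexts:
            return "lower_temperature_near_player"
        return None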



FIG. 3 illustrates another example method for determining advertisements to display on a gaming device. Method 300 may begin with a gaming server receiving a request to play a game at 302. The game may be played on a gaming device, such as gaming machine 104 or mobile gaming device 108 illustrated in FIG. 1. Mobile gaming devices may be any portable devices capable of playing games (e.g. games of chance, video games, and the like), such as, for example, portable telephones, laptops, computers, notebooks, tablets, media players, and the like. Although illustrated and described with a request to play a game of chance, it will be appreciated that this invention may apply to non-gaming games, such as video games.


The gaming devices may be configured to communicate with a gaming server via any wired or wireless connection. Wireless connections may use any known wireless method, such as 3G wireless technology, 4G wireless technology, Bluetooth, wireless universal serial bus, near-field magnetic communication, FireWire, WiMax, IEEE 802.11x technology, radio frequency, a narrow-beam infrared (IR) sensor, an RFID tag, a vibration sensor, or any other known wireless method.


The gaming server may transmit gaming data at 304 to initiate the game of chance on the gaming device. The gaming data may be associated with the requested game of chance. For example, if a game of blackjack is requested, the gaming data may be associated with a blackjack game.


The player's eye movement data may be received at 306. The player's eye movement may be tracked and/or determined using any known sensor on the gaming device. For example, a camera may be used to track the movement or gaze of the player's eyes. The location of the received eye movement data, gazing duration, screen location, gaze direction, and the like may be determined at 308. In one embodiment, the location of the eye movement data may be determined relative to the gaming device. In other words, the gaming server may determine where on the gaming machine the player gazed. In another embodiment, the location of the eye movement data may be determined relative to a screen or display at the bar within the gaming establishment.
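
Determining the location at 308 might reduce to resolving a raw gaze coordinate against known regions of the display, as in this sketch; the zone rectangles and names are hypothetical.

    # Assumed screen-zone rectangles in pixels: (x0, y0, x1, y1).
    SCREEN_ZONES = {
        "spin_button": (900, 600, 1024, 700),  # lower right-hand corner
        "reels": (100, 100, 900, 550),
    }

    def locate_gaze(x, y):
        """Resolve a gaze coordinate to the named region being looked at,
        or None if the gaze falls outside every known zone."""
        for name, (x0, y0, x1, y1) in SCREEN_ZONES.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None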


The gaming establishment may anticipate and/or determine the probability of eye movement (e.g. based on tests, statistics, probability, or any other method of probability or determination) to specific areas of the gaming device. For example, the gaming establishment may determine that a person's gaze or eye movement is highest at the lower right-hand corner of the gaming device, where the “Spin” button is located. Each area of the gaming device may be assigned an advertisement price based on the probability of a player's eye gaze at 312. For example, in areas with a higher probability of a player's gaze, the price for the advertisement may be high. In areas with a lower probability of a player's gaze, such as the bill collector, the price for the advertisement may be low.


An advertisement may then be transmitted to the gaming device at 314. The gaming server may determine what advertisements to display on the gaming device based on the fee paid by third-party advertisers, the highest bid received in an ad-placement auction, or any other criteria.
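
Under the highest-bid criterion mentioned above, the selection at 314 could be as simple as the following sketch; the bid values and ad identifiers are hypothetical.

    def select_ad(bids):
        """Choose the advertisement with the highest fee or auction bid for
        transmission at 314; returns None when no bids were received."""
        return max(bids, key=bids.get) if bids else None

    # Example: two advertisers bid on the same placement.
    winner = select_ad({"verizon_ad": 0.12, "budweiser_ad": 0.09})  # "verizon_ad"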



FIG. 4 illustrates yet another example method for determining advertisements to display on a gaming device. Method 400 may begin with a gaming device transmitting a request to play a game at 402 to a server, such as gaming server 112 illustrated in FIG. 1. The gaming device may be any computing device capable of playing games (e.g. games of chance, video games, and the like), such as, for example, portable telephones, laptops, computers, notebooks, tablets, media players, and the like.


Gaming data may be received at 404 in response to the request to play a game. The gaming data may cause the game to be initiated on the gaming device. For example, if the request was to play Keno, the gaming data may be associated with Keno, and a Keno game may then be initiated on the gaming device.


The gaming device may have at least one sensor to track different movements and states of the player. A determination may be made at 406 whether eye movement of the player is detected. If eye movement is detected at 406, the eye movement may be recorded at 408. For example, if the player is playing a car racing game that has many billboards (e.g. billboards advertising food, liquor, electronics, entertainment, tournaments, and the like), the sensor may track whether the player is looking at the billboards while driving the racing car. Still further, the sensor may determine which billboard the player is looking at if more than one is displayed simultaneously. In another embodiment, the sensor may track whether the player is looking at an avatar associated with playing certain games of chance, such as poker. In yet another embodiment, the sensor may determine that the player is looking at the spin button on the gaming device.


The sensor may be any known sensor designed to determine or track the gaze of an eye, such as a camera. Additionally, any known eye movement tracking method may be used to determine or track the gaze of the eye.


If no eye movement is detected at 406, a determination may be made at 410 whether a player gesture is detected. The gaming device may have at least one sensor configured to determine or track gestures from the player. If a player gesture is detected at 410, the player gesture may be recorded at 412. For example, if the player wins a bonus, the player may smile; if the player loses at a game of chance, the player may turn the smile into a frown. The sensor may be any known sensor designed to determine or detect gestures, such as a camera. Additionally, any known gesture or emotion detection method may be used to determine or track human gestures.


If no gesture or emotion is detected at 410, a determination may be made at 414 whether a change in the player's state is detected. The state of a player may be any biological or physiological state, for example, body temperature, oxygen level, heart rate, breathing rate, facial expression, and any other biological or physiological state. If no state change is detected at 414, the method may continue at step 420, as further discussed below. However, if a state change is detected at 414, the state change may be recorded at 416. For example, the sensor may detect that a player has a lower than normal temperature, a low oxygen level, and a high heart rate. In another example, the sensor may detect a higher than normal temperature reading and a fast heart rate from the player.


The sensors used may be any known sensors designed to determine or detect biological or physiological changes in a player, such as, for example, a temperature sensor, a pH sensor, a conductivity sensor, a salinity sensor, and the like. In one embodiment, the temperature sensor may be positioned on the gaming machine in a location known to be in contact with the player, such as the “spin” button, “cash out” button, and the like. In another embodiment, the temperature sensor may be an IR (infrared) camera configured to detect human temperature. Additionally, any known biological or physiological change detection method may be used. In one embodiment, an infrared camera may continuously monitor a player's face for temperature changes in various areas and dynamically compare the changes to a preset baseline measurement to quantify the player's state changes. In another embodiment, a camera may capture the player's facial expression in visible light and dynamically compare the variation to a baseline measurement to quantify the player's state change (e.g. emotional expression, in this case). Whenever an optical sensor, such as a camera, is used, the analysis of state change can be done by comparing variations between successive video frames using known digital image processing methods such as pixel subtraction, genetic programming, and the like.


Those of ordinary skill in the art will now realize that the above steps 408, 410, and 414 may be performed simultaneously or individually. The sensors may be used alone or in combination. Additionally, each sensor may be invoked at various times throughout game play. For example, if a player wipes sweat from his forehead, the state sensor may check how tired the player is (e.g. check the player's heart rate, body temperature, and the like), and the gaming server may decide to slow down the game and introduce a refreshing beverage image on a billboard. The eye sensor may determine whether the player gazes at the billboard, and a notification may be sent to the gaming establishment to provide a drink to the player if it is determined that the player looked at the billboard.
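
The combined flow in this example might be wired together as follows; the sensor and server interfaces, threshold values, and billboard identifier are all assumptions for illustration, not interfaces defined by the patent.

    def on_sweat_wipe_detected(state_sensor, gaming_server):
        """A gesture event invokes the state sensor, and the gaming server
        reacts to the combined reading, as in the example above."""
        heart_rate = state_sensor.read_heart_rate()         # beats per minute
        temperature = state_sensor.read_body_temperature()  # degrees Celsius
        if heart_rate > 100 or temperature > 37.5:          # assumed fatigue thresholds
            gaming_server.slow_down_game()
            gaming_server.show_billboard("refreshing_beverage")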


The recorded data (e.g. eye movement, gesture, and state change) may be transmitted to a server at 418. The server may be, for example, gaming server 112 or advertisement server 114 illustrated in FIG. 1. A determination of whether an advertisement is received is made at 420. An advertisement associated with (or not associated with) the recorded data may be received at the gaming device. If no advertisement is received at 420, the method may end. However, if an advertisement is received at 420, the advertisement may be presented on a display of the gaming device at 422. For example, if the server determines that the player was gazing at an advertisement for beer, the advertisement may be for Budweiser® beer. In another embodiment, the advertisement may be priced based on the location or zone where the beer advertisement was displayed, as discussed in detail above with reference to FIG. 3.


In one embodiment, recorded data may be stored on the gaming server for future and/or immediate analysis. For example, through game play, the gaming server may determine that the user felt excitement (e.g. detecting a smile, eyes becoming wider, heart rate increasing, and the like), felt frustration (e.g. detecting a frown, rolling eyes, hands placed in a fist, hands banging on the gaming device or a nearby object, the player shouting unhappy and/or curse words, and the like), and any other emotions or states of the player. Analysis of the player's state throughout game play may help the gaming establishment provide better customer service to its players, provide more customized advertisements to the players, help determine products to provide to an establishment's customers, and the like.


For example, a “frustration” state may be recorded and correlated to the moment the player's “flappy bird” crashed into a post. Saving a game state at the time a player state is recorded is new and fertile ground for future development.
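
A sketch of saving a game-state snapshot whenever a player state is recorded, so the two can later be correlated; the serialization interface and file format are assumptions for illustration only.

    import json
    import time

    def record_player_state(game, player_state, path):
        """Append a snapshot pairing the detected player state (e.g.
        "frustration") with the game state that accompanied it."""
        snapshot = {
            "timestamp": time.time(),
            "player_state": player_state,
            "game_state": game.serialize(),  # assumed game-provided serializer
        }
        with open(path, "a") as f:
            f.write(json.dumps(snapshot) + "\n")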



FIG. 5 illustrates an exemplary computer system 500 suitable for use with at least one embodiment of the invention. The methods, processes and/or graphical user interfaces discussed above can be provided by a computer system. The computer system 500 includes a display monitor 502 having a single or multi-screen display 504 (or multiple displays), a cabinet 506, a keyboard 508, and a mouse 510. The mouse 510 is representative of one type of pointing device. The cabinet 506 houses a processing unit (or processor), system memory and a hard drive (not shown). The cabinet 506 also houses a drive 512, such as a DVD, CD-ROM or floppy drive. The drive 512 can also be a removable hard drive, a Flash or EEPROM device, etc. Regardless, the drive 512 may be utilized to store and retrieve software programs incorporating computer code that implements some or all aspects of the invention, data for use with the invention, and the like. Although CD-ROM 514 is shown as an exemplary computer readable storage medium, other computer readable storage media including floppy disk, tape, Flash or EEPROM memory, memory card, system memory, and hard drive may be utilized. In one implementation, a software program for the computer system 500 is provided in the system memory, the hard drive, the drive 512, the CD-ROM 514 or other computer readable storage medium and serves to incorporate the computer code that implements some or all aspects of the invention.



FIG. 6 is a block diagram of an example computing system. The computing system 600 may be the gaming server 112, gaming machine 104, mobile gaming device 108, analysis server 118, player tracking server 126, advertising server 114 illustrated in FIG. 1, or any other server or computing device used to carry out the various embodiments disclosed herein. The computing system 600 may include a processor 602 that pertains to a microprocessor or controller for controlling the overall operation of the computing system 600. The computing system 600 may store any type of data and information as discussed above in a file system 604 and a cache 606. The file system 604 is, typically, a storage disk or a plurality of disks. The file system 604 typically provides high capacity storage capability for the computing system 600. However, since the access time to the file system 604 is relatively slow, the computing system 600 can also include a cache 606. The cache 606 is, for example, Random-Access Memory (RAM) provided by semiconductor memory. The relative access time to the cache 606 is substantially shorter than for the file system 604. However, the cache 606 does not have the large storage capacity of the file system 604. Further, the file system 604, when active, consumes more power than does the cache 606. The computing system 600 also includes a RAM 620 and a Read-Only Memory (ROM) 622. The ROM 622 can store programs, utilities or processes to be executed in a non-volatile manner. The RAM 620 provides volatile data storage, such as for the cache 606.


The computing system 600 also includes a user input device 608 that allows a user of the computing system 600 to interact with the computing system 600. For example, the user input device 608 can take a variety of forms, such as a button, keypad, dial, and the like. Still further, the computing system 600 includes a display 610 (screen display) that can be controlled by the processor 602 to display information, such as a list of upcoming appointments, to the user. A data bus 611 can facilitate data transfer between at least the file system 604, the cache 606, the processor 602, and the CODEC 612.


In one embodiment, the computing system 600 serves to store a plurality of player tracking and/or third party advertiser data in the file system 604. When a user desires to have the computing system display a particular advertisement, a list of the various third party advertisers may be displayed on the display 610.


The computing system 600 may also include a network/bus interface 616 that couples to a data link 618. The data link 618 allows the computing system 600 to couple to a host computer or data network, such as the Internet. The data link 618 can be provided over a wired connection or a wireless connection. In the case of a wireless connection, the network/bus interface 616 can include a wireless transceiver.


While embodiments and applications of this invention have been shown and described, it would be apparent to those skilled in the art having the benefit of this disclosure that many more modifications than mentioned above are possible without departing from the inventive concepts herein.

Claims
  • 1. A system of selective advertising, comprising: a gaming server configured to: receive a request to play a game of chance from a gaming device configured to receive a wager to play the game of chance; transmit gaming data to the gaming device, the gaming data associated with the requested game of chance; receive player data, the player data associated with an eye movement of the player, at least one gesture of the player, or at least one change in state of the player; analyze the player data; determine a context associated with the player data; determine an action to take based on the determined context associated with the player data; transmit a notification to a gaming establishment associated with the gaming device, the notification to the gaming establishment including at least the determined action to take; determine at least one non-gaming advertisement associated with the context; and transmit the at least one non-gaming advertisement to the gaming device for display on a display of the gaming device; the gaming device configured to: receive the gaming data; display the gaming data on a display of the gaming device; and display the at least one non-gaming advertisement on a portion of the display; and a sensor proximate the gaming device, the sensor configured to detect player data, the player data including at least data based on the eye movement of the player, the at least one gesture of the player, or the at least one change in state of the player.
  • 2. The system of claim 1, wherein the determined action is a player recommended action, and wherein the player recommended action is displayed on a portion of the display of the gaming device.
  • 3. The system of claim 1, further comprising: determine a location on the gaming device associated with the eye movement of the player; determine at least one advertisement associated with the determined location; and transmit the at least one advertisement to the gaming device for display on the display of the gaming device.
  • 4. The system of claim 1, wherein the transmit the at least one non-gaming advertisement further comprises: determine a location to display the at least one non-gaming advertisement; and assign an advertisement price for the displayed at least one non-gaming advertisement based on the displayed location.
  • 5. The system of claim 3, wherein the transmit the at least one advertisement further comprises: determine a location to display the advertisement; and assign an advertisement price for the advertisement based on the determined location to display the advertisement.
  • 6. The system of claim 1, wherein the determined action to take is associated with a service provided at the gaming establishment.
  • 7. The system of claim 1, wherein the at least one change in state of the player includes biological or physical state changes.
  • 8. A method for selecting advertisements based on player physiology, comprising: transmitting, by a gaming device configured to receive a wager, a request to play a game of chance; receiving gaming data at the gaming device, the gaming data associated with the requested game of chance; determining, by a sensor, if there are player eye movement changes from a first position to a second location; determining, by the sensor, if there is player movement; recording the player eye movement changes from the first position to the second location if it is determined that there are player eye movement changes; recording the player movement if it is determined that there is player movement; transmitting the recorded player eye movement changes and the recorded player movements to a server; and receiving a non-gaming advertisement for display on a display of the gaming device, the received non-gaming advertisement based on at least the recorded player eye movement changes to the second location.
  • 9. The method of claim 8, wherein the received non-gaming advertisement for display on the display of the gaming device is based on at least the recorded player movement.
  • 10. The method of claim 9, further comprising: determining a location to display the non-gaming advertisement on the display of the gaming device; and assigning an advertisement price for the non-gaming advertisement based on the determined location to display the non-gaming advertisement.
  • 11. The method of claim 8, wherein the sensor is positioned proximate to or on the gaming device.
  • 12. The method of claim 8, further comprising determining if there is a change in a physiology of the player; and transmitting the change in the physiology of the player to the server if it is determined that there is a change in the physiology of the player.
  • 13. The method of claim 12, further comprising receiving an advertisement for display on the display of the gaming device, the received advertisement associated with the change in the physiology of the player.
  • 14. A method for selecting advertisements, comprising: receiving, by a gaming server, a request to play a game of chance from a gaming device configured to receive a wager; transmitting gaming data to the gaming device, the gaming data associated with the requested game of chance; receiving player data, the player data including at least a gesture of the player, an eye movement of the player, or a state change of the player; analyzing the player data; determining at least one non-gaming advertisement based upon the analyzed player data; transmitting the at least one non-gaming advertisement to the gaming device; determining a context associated with the player data; determining an action to take based on the determined context associated with the player data; and transmitting a notification to a gaming establishment associated with the gaming device, the notification to the gaming establishment including at least the determined action to take.
  • 15. The method of claim 14, wherein the action based on the determined context includes at least transmitting the notification for display on a display of the gaming device.
  • 16. The method of claim 14, wherein the transmitting the at least one non-gaming advertisement further comprises: determining a location on a display of the gaming device to present the at least one non-gaming advertisement; and assigning an ad price based on the display location of the at least one non-gaming advertisement.
  • 17. The method of claim 16, wherein the determining the location on the display of the gaming device further comprises analyzing a probability of the eye movement of the player on the display of the gaming device to determine the location to present the at least one advertisement.
  • 18. The method of claim 14, further comprising determining at least one player interest based upon player tracking data, wherein the at least one non-gaming advertisement is based on the at least one player interest.
  • 19. The method of claim 14, wherein the gaming device is a portable gaming device.
  • 20. The method of claim 14, wherein analyzing the player data further comprises: storing the player data for analysis at a later date.
CROSS-REFERENCE TO OTHER APPLICATION

This application claims priority to U.S. Provisional Patent Application No. 61/789,332, filed Mar. 15, 2013, and entitled “DETERMINATION OF ADVERTISEMENT BASED ON PLAYER PHYSIOLOGY”, which is hereby incorporated by reference herein.

US Referenced Citations (528)
Number Name Date Kind
2033638 Koppl Mar 1936 A
2062923 Nagy Dec 1936 A
4741539 Sutton et al. May 1988 A
4948138 Pease et al. Aug 1990 A
5067712 Georgilas Nov 1991 A
5429361 Raven et al. Jul 1995 A
5489103 Okamoto Feb 1996 A
5630757 Gagin May 1997 A
5655961 Acres et al. Aug 1997 A
5704835 Dietz, II Jan 1998 A
5727786 Weingardt Mar 1998 A
5833537 Barrie Nov 1998 A
5919091 Bell et al. Jul 1999 A
5947820 Morro et al. Sep 1999 A
5997401 Crawford Dec 1999 A
6001016 Walker et al. Dec 1999 A
6039648 Guinn et al. Mar 2000 A
6059289 Vancura May 2000 A
6089977 Bennett Jul 2000 A
6095920 Sudahiro Aug 2000 A
6110041 Walker et al. Aug 2000 A
6142872 Walker et al. Nov 2000 A
6146273 Olsen Nov 2000 A
6165071 Weiss Dec 2000 A
6231445 Acres May 2001 B1
6270412 Crawford et al. Aug 2001 B1
6290600 Glasson Sep 2001 B1
6293866 Walker et al. Sep 2001 B1
6353390 Beri et al. Mar 2002 B1
6364768 Acres et al. Apr 2002 B1
6404884 Marwell et al. Jun 2002 B1
6416406 Duhamel Jul 2002 B1
6416409 Jordan Jul 2002 B1
6443452 Brune Sep 2002 B1
6491584 Graham et al. Dec 2002 B2
6505095 Kolls Jan 2003 B1
6508710 Paravia et al. Jan 2003 B1
6561900 Baerlocker et al. May 2003 B1
6592457 Frohm et al. Jul 2003 B1
6612574 Cole et al. Sep 2003 B1
6620046 Rowe Sep 2003 B2
6641477 Dietz, II Nov 2003 B1
6645078 Mattice Nov 2003 B1
6719630 Seelig et al. Apr 2004 B1
6749510 Globbi Jun 2004 B2
6758757 Luciano, Jr. et al. Jul 2004 B2
6773345 Walker et al. Aug 2004 B2
6778820 Tendler Aug 2004 B2
6780111 Cannon et al. Aug 2004 B2
6799032 McDonnell et al. Sep 2004 B2
6800027 Giobbi et al. Oct 2004 B2
6804763 Stockdale et al. Oct 2004 B1
6811486 Luciano, Jr. Nov 2004 B1
6843725 Nelson Jan 2005 B2
6846238 Wells Jan 2005 B2
6848995 Walker et al. Feb 2005 B1
6852029 Baltz et al. Feb 2005 B2
6869361 Sharpless et al. Mar 2005 B2
6875106 Weiss et al. Apr 2005 B2
6884170 Rowe Apr 2005 B2
6884172 Lloyd et al. Apr 2005 B1
6902484 Idaka Jun 2005 B2
6908390 Nguyen et al. Jun 2005 B2
6913532 Bearlocher et al. Jul 2005 B2
6923721 Luciano et al. Aug 2005 B2
6935958 Nelson Aug 2005 B2
6949022 Showers Sep 2005 B1
6955600 Glavich et al. Oct 2005 B2
6971956 Rowe et al. Dec 2005 B2
6984174 Cannon et al. Jan 2006 B2
6997803 LeMay et al. Feb 2006 B2
7018292 Tracy et al. Mar 2006 B2
7032115 Kashani Apr 2006 B2
7033276 Walker et al. Apr 2006 B2
7035626 Luciano Apr 2006 B1
7037195 Schneider et al. May 2006 B2
7048628 Schneider May 2006 B2
7048630 Berg et al. May 2006 B2
7063617 Brosnan et al. Jun 2006 B2
7076329 Kolls Jul 2006 B1
7089264 Guido et al. Aug 2006 B1
7094148 Bearlocher et al. Aug 2006 B2
7105736 Laakso Sep 2006 B2
7111141 Nelson Sep 2006 B2
7144321 Mayeroff Dec 2006 B2
7152783 Charrin Dec 2006 B2
7169041 Tessmer et al. Jan 2007 B2
7169052 Beaulieu et al. Jan 2007 B2
7175523 Gilmore et al. Feb 2007 B2
7181228 Boesch Feb 2007 B2
7182690 Giobbi et al. Feb 2007 B2
RE39644 Alcorn et al. May 2007 E
7243104 Bill Jul 2007 B2
7247098 Bradford et al. Jul 2007 B1
7259718 Patterson et al. Aug 2007 B2
7275989 Moody Oct 2007 B2
7285047 Gielb et al. Oct 2007 B2
7311608 Danieli Dec 2007 B1
7314408 Cannon et al. Jan 2008 B2
7316615 Soltys et al. Jan 2008 B2
7316619 Nelson Jan 2008 B2
7318775 Brosnan et al. Jan 2008 B2
7326116 O'Donovan et al. Feb 2008 B2
7330108 Thomas Feb 2008 B2
7346358 Wood et al. Mar 2008 B2
7355112 Laakso Apr 2008 B2
7384338 Rothschild et al. Jun 2008 B2
7387571 Walker et al. Jun 2008 B2
7393278 Gerson et al. Jul 2008 B2
7396990 Lu et al. Jul 2008 B2
7415426 Williams et al. Aug 2008 B2
7425177 Rodgers et al. Sep 2008 B2
7427234 Soltys et al. Sep 2008 B2
7427236 Kaminkow et al. Sep 2008 B2
7427708 Ohmura Sep 2008 B2
7431650 Kessman Oct 2008 B2
7448949 Kaminkow et al. Nov 2008 B2
7500913 Baerlocher Mar 2009 B2
7510474 Carter Mar 2009 B2
7513828 Nguyen et al. Apr 2009 B2
7519838 Suurballe Apr 2009 B1
7559838 Walker et al. Jul 2009 B2
7563167 Walker et al. Jul 2009 B2
7572183 Olivas et al. Aug 2009 B2
7585222 Muir Sep 2009 B2
7602298 Thomas Oct 2009 B2
7607174 Kashchenko et al. Oct 2009 B1
7611409 Muir et al. Nov 2009 B2
7637810 Amaitis et al. Dec 2009 B2
7644861 Alderucci et al. Jan 2010 B2
7653757 Fernald et al. Jan 2010 B1
7693306 Konami Apr 2010 B2
7699703 Muir Apr 2010 B2
7722453 Lark et al. May 2010 B2
7758423 Foster et al. Jul 2010 B2
7771271 Walker et al. Aug 2010 B2
7780529 Rowe et al. Aug 2010 B2
7780531 Englman et al. Aug 2010 B2
7785192 Canterbury et al. Aug 2010 B2
7811172 Asher et al. Oct 2010 B2
7819749 Fish Oct 2010 B1
7822688 Labron Oct 2010 B2
7828652 Nguyen et al. Nov 2010 B2
7828654 Carter Nov 2010 B2
7828661 Fish Nov 2010 B1
7850528 Wells Dec 2010 B2
7874919 Paulsen et al. Jan 2011 B2
7877798 Saunders et al. Jan 2011 B2
7883413 Paulsen Feb 2011 B2
7892097 Muir et al. Feb 2011 B2
7909692 Nguyen et al. Mar 2011 B2
7909699 Parrott et al. Mar 2011 B2
7918728 Nguyen et al. Apr 2011 B2
7927211 Rowe et al. Apr 2011 B2
7927212 Hedrick et al. Apr 2011 B2
7951008 Wolf et al. May 2011 B2
8057298 Nguyen et al. Nov 2011 B2
8057303 Rasmussen Nov 2011 B2
8087988 Nguyen et al. Jan 2012 B2
8117608 Slettehaugh Feb 2012 B1
8133113 Nguyen Mar 2012 B2
8182326 Speers et al. May 2012 B2
8210927 Hedrick Jul 2012 B2
8221245 Walker Jul 2012 B2
8226459 Barrett Jul 2012 B2
8226474 Nguyen et al. Jul 2012 B2
8231456 Zielinski Jul 2012 B2
8235803 Loose et al. Aug 2012 B2
8282475 Nguyen et al. Oct 2012 B2
8323099 Durham et al. Dec 2012 B2
8337290 Nguyen et al. Dec 2012 B2
8342946 Amaitis Jan 2013 B2
8393948 Allen et al. Mar 2013 B2
8403758 Homik Mar 2013 B2
8430745 Agarwal et al. Apr 2013 B2
8461958 Saenz Jun 2013 B2
8469813 Joshi Jun 2013 B2
8529345 Nguyen Sep 2013 B2
8602875 Nguyen Dec 2013 B2
8613655 Kisenwether Dec 2013 B2
8613659 Nelson et al. Dec 2013 B2
8696470 Nguyen Apr 2014 B2
8745417 Huang et al. Jun 2014 B2
8858323 Nguyen et al. Oct 2014 B2
8864586 Nguyen Oct 2014 B2
8942995 Kerr Jan 2015 B1
9039507 Allen et al. May 2015 B2
9235952 Nguyen Jan 2016 B2
9292996 Davis et al. Mar 2016 B2
9325203 Nguyen Apr 2016 B2
9466171 Hornik Oct 2016 B2
9483901 Nguyen Nov 2016 B2
9486697 Chung Nov 2016 B2
9486704 Chung Nov 2016 B2
9576425 Nguyen Feb 2017 B2
9626826 Nguyen Apr 2017 B2
9666021 Nguyen May 2017 B2
9672686 Nguyen Jun 2017 B2
9741205 Nguyen Aug 2017 B2
9811973 Nguyen Nov 2017 B2
9814970 Nguyen Nov 2017 B2
9842462 Nguyen Dec 2017 B2
9875606 Nguyen Jan 2018 B2
9875609 Nguyen Jan 2018 B2
20010004607 Olsen Jun 2001 A1
20010016516 Takatsuka Aug 2001 A1
20010024971 Brossard Sep 2001 A1
20010047291 Garahi Nov 2001 A1
20020006822 Krintzman Jan 2002 A1
20020042295 Walker et al. Apr 2002 A1
20020111210 Luciano, Jr. et al. Aug 2002 A1
20020111213 McEntee et al. Aug 2002 A1
20020113369 Weingardt Aug 2002 A1
20020116615 Nguyen et al. Aug 2002 A1
20020133418 Hammond et al. Sep 2002 A1
20020137217 Rowe et al. Sep 2002 A1
20020142825 Lark et al. Oct 2002 A1
20020147047 Letovsky et al. Oct 2002 A1
20020147049 Carter, Sr. Oct 2002 A1
20020151366 Walker et al. Oct 2002 A1
20020152120 Howington Oct 2002 A1
20020167536 Valdes et al. Nov 2002 A1
20020183105 Cannon et al. Dec 2002 A1
20030001338 Bennett et al. Jan 2003 A1
20030008696 Abecassis et al. Jan 2003 A1
20030027635 Walker et al. Feb 2003 A1
20030064805 Wells Apr 2003 A1
20030064807 Walker et al. Apr 2003 A1
20030092480 White et al. May 2003 A1
20030100361 Sharpless et al. May 2003 A1
20030103965 Jung et al. Jun 2003 A1
20030104860 Cannon et al. Jun 2003 A1
20030104865 Itkis et al. Jun 2003 A1
20030148809 Nelson Aug 2003 A1
20030162588 Brosnan et al. Aug 2003 A1
20030195024 Slattery Oct 2003 A1
20030199295 Vancura Oct 2003 A1
20030224852 Walker et al. Dec 2003 A1
20030224854 Joao Dec 2003 A1
20040002386 Wolfe et al. Jan 2004 A1
20040005919 Walker et al. Jan 2004 A1
20040023709 Beaulieu et al. Feb 2004 A1
20040023716 Gauselmann Feb 2004 A1
20040038736 Bryant Feb 2004 A1
20040048650 Mierau et al. Mar 2004 A1
20040068460 Feeley Apr 2004 A1
20040082385 Silva et al. Apr 2004 A1
20040106449 Walker et al. Jun 2004 A1
20040127277 Walker Jul 2004 A1
20040127290 Walker et al. Jul 2004 A1
20040137987 Nguyen et al. Jul 2004 A1
20040147308 Walker et al. Jul 2004 A1
20040152508 Lind Aug 2004 A1
20040214622 Atkinson Oct 2004 A1
20040224753 O'Donovan et al. Nov 2004 A1
20040256803 Ko Dec 2004 A1
20040259633 Gentles et al. Dec 2004 A1
20050003890 Hedrick et al. Jan 2005 A1
20050004980 Vadjinia Jan 2005 A1
20050026696 Hashimoto et al. Feb 2005 A1
20050054446 Kammler Mar 2005 A1
20050101376 Walker et al. May 2005 A1
20050101383 Wells May 2005 A1
20050130728 Nguyen et al. Jun 2005 A1
20050137014 Vetelainen Jun 2005 A1
20050181865 Luciano Aug 2005 A1
20050181870 Nguyen et al. Aug 2005 A1
20050181875 Hoehne Aug 2005 A1
20050187020 Amaitis et al. Aug 2005 A1
20050202875 Murphy et al. Sep 2005 A1
20050209002 Blythe et al. Sep 2005 A1
20050221881 Lannert Oct 2005 A1
20050223219 Gatto et al. Oct 2005 A1
20050239546 Hedrick Oct 2005 A1
20050255919 Nelson Nov 2005 A1
20050273635 Wilcox et al. Dec 2005 A1
20050277471 Russell et al. Dec 2005 A1
20050282637 Gatto et al. Dec 2005 A1
20060009283 Englman et al. Jan 2006 A1
20060036874 Cockerille Feb 2006 A1
20060046822 Kaminkow et al. Mar 2006 A1
20060046830 Webb Mar 2006 A1
20060046849 Kovacs Mar 2006 A1
20060068893 Jaffe et al. Mar 2006 A1
20060073869 LeMay et al. Apr 2006 A1
20060073897 Englman et al. Apr 2006 A1
20060079317 Flemming et al. Apr 2006 A1
20060148551 Walker et al. Jul 2006 A1
20060189382 Muir et al. Aug 2006 A1
20060217170 Roireau Sep 2006 A1
20060217193 Walker et al. Sep 2006 A1
20060247028 Brosnan et al. Nov 2006 A1
20060247035 Rowe et al. Nov 2006 A1
20060252530 Oberberger et al. Nov 2006 A1
20060253481 Guido et al. Nov 2006 A1
20060281525 Borissov Dec 2006 A1
20060281541 Nguyen et al. Dec 2006 A1
20060287106 Jensen Dec 2006 A1
20070004510 Underdahl et al. Jan 2007 A1
20070026935 Wolf et al. Feb 2007 A1
20070026942 Kinsley Feb 2007 A1
20070054739 Amaitis et al. Mar 2007 A1
20070060254 Muir Mar 2007 A1
20070060306 Amaitis et al. Mar 2007 A1
20070060319 Block et al. Mar 2007 A1
20070060358 Amaitis et al. Mar 2007 A1
20070077981 Hungate et al. Apr 2007 A1
20070087833 Feeney et al. Apr 2007 A1
20070087834 Moser et al. Apr 2007 A1
20070093299 Bergeron Apr 2007 A1
20070129123 Eryou et al. Jun 2007 A1
20070149279 Norden et al. Jun 2007 A1
20070149286 Bemmel Jun 2007 A1
20070159301 Hirt et al. Jul 2007 A1
20070161402 Ng et al. Jul 2007 A1
20070184896 Dickerson Aug 2007 A1
20070184904 Lee Aug 2007 A1
20070191109 Crowder et al. Aug 2007 A1
20070207852 Nelson et al. Sep 2007 A1
20070207854 Wolf et al. Sep 2007 A1
20070238505 Okada Oct 2007 A1
20070241187 Alderucci et al. Oct 2007 A1
20070248036 Nevalainen Oct 2007 A1
20070257430 Hardy et al. Nov 2007 A1
20070259713 Fiden et al. Nov 2007 A1
20070259717 Mattice et al. Nov 2007 A1
20070270213 Nguyen et al. Nov 2007 A1
20070275777 Walker et al. Nov 2007 A1
20070275779 Amaitis et al. Nov 2007 A1
20070281782 Amaitis et al. Dec 2007 A1
20070281785 Amaitis et al. Dec 2007 A1
20070298873 Nguyen et al. Dec 2007 A1
20080015032 Bradford et al. Jan 2008 A1
20080020824 Cuddy et al. Jan 2008 A1
20080032787 Low et al. Feb 2008 A1
20080070652 Nguyen et al. Mar 2008 A1
20080070681 Marks et al. Mar 2008 A1
20080076505 Nguyen Mar 2008 A1
20080076506 Nguyen et al. Mar 2008 A1
20080076548 Paulsen Mar 2008 A1
20080076572 Nguyen et al. Mar 2008 A1
20080096650 Baerlocher Apr 2008 A1
20080102956 Burman et al. May 2008 A1
20080102957 Burman et al. May 2008 A1
20080113772 Burrill et al. May 2008 A1
20080119267 Denlay May 2008 A1
20080139306 Lutnick Jun 2008 A1
20080146321 Parente Jun 2008 A1
20080150902 Edpalm et al. Jun 2008 A1
20080153583 Huntley et al. Jun 2008 A1
20080161110 Campbell Jul 2008 A1
20080167106 Lutnick et al. Jul 2008 A1
20080182667 Davis et al. Jul 2008 A1
20080200251 Alderucci Aug 2008 A1
20080207307 Cunningham, II et al. Aug 2008 A1
20080214258 Brosnan et al. Sep 2008 A1
20080215319 Lu Sep 2008 A1
20080234047 Nguyen Sep 2008 A1
20080238610 Rosenberg Oct 2008 A1
20080248849 Lutnick Oct 2008 A1
20080252419 Batchelor Oct 2008 A1
20080254878 Saunders et al. Oct 2008 A1
20080254881 Lutnick et al. Oct 2008 A1
20080254883 Patel et al. Oct 2008 A1
20080254891 Saunders et al. Oct 2008 A1
20080254892 Saunders et al. Oct 2008 A1
20080254897 Saunders et al. Oct 2008 A1
20080263173 Weber et al. Oct 2008 A1
20080300058 Sum et al. Dec 2008 A1
20080305864 Kelly et al. Dec 2008 A1
20080305865 Kelly et al. Dec 2008 A1
20080305866 Kelly et al. Dec 2008 A1
20080311994 Amaitis et al. Dec 2008 A1
20080318669 Buchholz Dec 2008 A1
20080318686 Crowder et al. Dec 2008 A1
20090005165 Arezina et al. Jan 2009 A1
20090011822 Englman Jan 2009 A1
20090029766 Lutnick et al. Jan 2009 A1
20090054149 Brosnan et al. Feb 2009 A1
20090077396 Tsai et al. Mar 2009 A1
20090088258 Saunders et al. Apr 2009 A1
20090098925 Gagner et al. Apr 2009 A1
20090104977 Zielinski Apr 2009 A1
20090104983 Okada Apr 2009 A1
20090118002 Lyons May 2009 A1
20090118013 Finnimore et al. May 2009 A1
20090118022 Lyons et al. May 2009 A1
20090124366 Aoki et al. May 2009 A1
20090124390 Seelig et al. May 2009 A1
20090131151 Harris et al. May 2009 A1
20090132163 Ashley et al. May 2009 A1
20090137255 Ashley et al. May 2009 A1
20090138133 Buchholz et al. May 2009 A1
20090149245 Fabbri Jun 2009 A1
20090149261 Chen et al. Jun 2009 A1
20090153342 Thorn Jun 2009 A1
20090156303 Kiely et al. Jun 2009 A1
20090176578 Herrmann et al. Jul 2009 A1
20090191962 Hardy et al. Jul 2009 A1
20090197684 Arezina et al. Aug 2009 A1
20090216547 Canora et al. Aug 2009 A1
20090219901 Bull et al. Sep 2009 A1
20090221342 Katz et al. Sep 2009 A1
20090227302 Abe Sep 2009 A1
20090239666 Hall et al. Sep 2009 A1
20090264190 Davis et al. Oct 2009 A1
20090271287 Halpern Oct 2009 A1
20090275410 Kisenwether et al. Nov 2009 A1
20090275411 Kisenwether et al. Nov 2009 A1
20090282469 Lynch Nov 2009 A1
20090298468 Hsu Dec 2009 A1
20100002897 Keady Jan 2010 A1
20100004058 Acres Jan 2010 A1
20100016069 Herrmann Jan 2010 A1
20100056248 Acres Mar 2010 A1
20100062833 Mattice et al. Mar 2010 A1
20100062840 Herrmann et al. Mar 2010 A1
20100079237 Falk Apr 2010 A1
20100081501 Carpenter et al. Apr 2010 A1
20100081509 Burke Apr 2010 A1
20100099499 Amaitis et al. Apr 2010 A1
20100106612 Gupta Apr 2010 A1
20100120486 DeWaal May 2010 A1
20100124967 Lutnick et al. May 2010 A1
20100130276 Fiden May 2010 A1
20100160035 Herrmann Jun 2010 A1
20100160043 Fujimoto et al. Jun 2010 A1
20100178977 Kim et al. Jul 2010 A1
20100197383 Rader et al. Aug 2010 A1
20100197385 Aoki et al. Aug 2010 A1
20100203955 Sylla Aug 2010 A1
20100203963 Allen Aug 2010 A1
20100227662 Speers et al. Sep 2010 A1
20100227670 Arezina et al. Sep 2010 A1
20100227671 Laaroussi Sep 2010 A1
20100227687 Speers et al. Sep 2010 A1
20100234091 Baerlocher et al. Sep 2010 A1
20100279764 Allen et al. Nov 2010 A1
20100323780 Acres Dec 2010 A1
20100325703 Etchegoyen Dec 2010 A1
20110009181 Speers et al. Jan 2011 A1
20110039615 Acres Feb 2011 A1
20110065492 Acres Mar 2011 A1
20110105216 Cohen May 2011 A1
20110111827 Nicely et al. May 2011 A1
20110111843 Nicely et al. May 2011 A1
20110111860 Nguyen May 2011 A1
20110118010 Brune May 2011 A1
20110159966 Gura et al. Jun 2011 A1
20110183732 Block Jul 2011 A1
20110183749 Allen Jul 2011 A1
20110207525 Allen Aug 2011 A1
20110212711 Scott Sep 2011 A1
20110212767 Barclay et al. Sep 2011 A1
20110223993 Allen et al. Sep 2011 A1
20110263318 Agarwal et al. Oct 2011 A1
20110306400 Nguyen Dec 2011 A1
20110306426 Novak et al. Dec 2011 A1
20120015709 Bennett et al. Jan 2012 A1
20120028703 Anderson et al. Feb 2012 A1
20120028718 Barclay et al. Feb 2012 A1
20120034968 Watkins et al. Feb 2012 A1
20120046110 Amaitis Feb 2012 A1
20120094769 Nguyen et al. Apr 2012 A1
20120100908 Wells Apr 2012 A1
20120108319 Caputo et al. May 2012 A1
20120122561 Hedrick May 2012 A1
20120122567 Gangadharan et al. May 2012 A1
20120122584 Nguyen May 2012 A1
20120122590 Nguyen May 2012 A1
20120172130 Acres Jul 2012 A1
20120184362 Barclay et al. Jul 2012 A1
20120184363 Barclay et al. Jul 2012 A1
20120190426 Acres Jul 2012 A1
20120194448 Rothkopf Aug 2012 A1
20120208618 Frerking Aug 2012 A1
20120231885 Speer, II Sep 2012 A1
20120239566 Everett Sep 2012 A1
20120322563 Nguyen et al. Dec 2012 A1
20120330740 Pennington et al. Dec 2012 A1
20130005433 Holch Jan 2013 A1
20130005443 Kosta Jan 2013 A1
20130005453 Nguyen et al. Jan 2013 A1
20130059650 Sylla et al. Mar 2013 A1
20130065668 LeMay Mar 2013 A1
20130281188 Guinn Mar 2013 A1
20130104193 Gatto et al. Apr 2013 A1
20130132745 Schoening et al. May 2013 A1
20130185559 Morel Jul 2013 A1
20130196756 Nguyen Aug 2013 A1
20130196776 Nguyen Aug 2013 A1
20130210513 Nguyen Aug 2013 A1
20130210514 Nguyen Aug 2013 A1
20130210530 Nguyen Aug 2013 A1
20130225279 Patceg Aug 2013 A1
20130225282 Williams et al. Aug 2013 A1
20130252730 Joshi Sep 2013 A1
20130316808 Nelson Nov 2013 A1
20130337889 Gagner Dec 2013 A1
20140006129 Heath Jan 2014 A1
20140057716 Massing et al. Feb 2014 A1
20140087862 Burke Mar 2014 A1
20140094295 Nguyen Apr 2014 A1
20140094316 Nguyen Apr 2014 A1
20140100955 Osotio Apr 2014 A1
20140121005 Nelson May 2014 A1
20140179431 Nguyen Jun 2014 A1
20140274309 Nguyen Sep 2014 A1
20140274319 Nguyen Sep 2014 A1
20140274320 Nguyen Sep 2014 A1
20140274342 Nguyen Sep 2014 A1
20140274357 Nguyen Sep 2014 A1
20140274360 Nguyen Sep 2014 A1
20140274367 Nguyen Sep 2014 A1
20140274388 Nguyen Sep 2014 A1
20150089595 Telles Mar 2015 A1
20150133223 Carter May 2015 A1
20150143543 Phegade Aug 2015 A1
20170116819 Nguyen Apr 2017 A1
20170116823 Nguyen Apr 2017 A1
20170144071 Nguyen May 2017 A1
20170148259 Nguyen May 2017 A1
20170148261 Nguyen May 2017 A1
20170148263 Nguyen May 2017 A1
20170206734 Nguyen Jul 2017 A1
20170228979 Nguyen Aug 2017 A1
20170243440 Nguyen Aug 2017 A1
20170337770 Nguyen Nov 2017 A1
Foreign Referenced Citations (11)
Number Date Country
2033638 May 1980 GB
2062923 May 1981 GB
2096376 Oct 1982 GB
2097570 Nov 1982 GB
2335524 Sep 1999 GB
12005000454 May 2007 PH
WO 05073933 Aug 2005 WO
WO 2008027621 Mar 2008 WO
WO 2009026309 Feb 2009 WO
WO 2009062148 May 2009 WO
WO 2010017252 Feb 2010 WO
Non-Patent Literature Citations (203)
Entry
Benston, Liz, “Harrah's Launches iPhone App; Caesars Bypasses Check-in,” Las Vegas Sun, Las Vegas, NV, Jan. 8, 2010.
Finnegan, Amanda, “Casinos Connecting with Customers via iPhone Apps,” Las Vegas Sun, Las Vegas, NV, May 27, 2010.
Gaming Today Staff, “Slots showcased at 2009 National Indian Gaming Assoc.,” GamingToday.com, Apr. 14, 2009.
Green, Marian, “Testing Texting,” Casino Journal, Mar. 2, 2009.
Hasan, Ragib, et al., “A Survey of Peer-to-Peer Storage Techniques for Distributed File Systems,” National Center for Supercomputing Applications, Department of Computer Science, University of Illinois at Urbana-Champaign, Jun. 27, 2005.
Jones, Trahern, “Telecom-equipped drones could revolutionize wireless market,” azcentral.com, http://www.azcentral.com/business/news/articles/20130424telecom-equipped-drones-could-revolutionize-wireless-market.html, downloaded Jul. 2, 2013, 2 pages.
Yancey, Kitty Bean, “Navigate Around Vegas with New iPhone Apps,” USA Today, Jun. 3, 2010.
IAPS, Daily Systems LLC, 2010.
U.S. Appl. No. 12/945,888, filed Nov. 14, 2010.
U.S. Appl. No. 12/945,889, filed Nov. 14, 2010.
U.S. Appl. No. 13/622,702, filed Sep. 19, 2012.
U.S. Appl. No. 13/800,917, filed Mar. 13, 2013.
U.S. Appl. No. 13/296,182, filed Nov. 15, 2011.
U.S. Appl. No. 13/801,234, filed Mar. 13, 2013.
U.S. Appl. No. 13/801,171, filed Mar. 13, 2013.
U.S. Appl. No. 13/843,192, filed Mar. 15, 2013.
U.S. Appl. No. 13/843,087, filed Mar. 15, 2013.
U.S. Appl. No. 13/632,743, filed Oct. 1, 2012.
U.S. Appl. No. 13/632,828, filed Oct. 1, 2012.
U.S. Appl. No. 13/833,953, filed Mar. 15, 2013.
U.S. Appl. No. 12/619,672, filed Nov. 16, 2009.
U.S. Appl. No. 13/801,121, filed Mar. 13, 2013.
U.S. Appl. No. 12/581,115, filed Oct. 17, 2009.
U.S. Appl. No. 13/801,076, filed Mar. 13, 2013.
U.S. Appl. No. 12/617,717, filed Nov. 12, 2009.
U.S. Appl. No. 13/633,118, filed Oct. 1, 2012.
U.S. Appl. No. 12/797,610, filed Jun. 10, 2010.
U.S. Appl. No. 13/801,256, filed Mar. 13, 2013.
U.S. Appl. No. 12/757,968, filed Apr. 9, 2010.
U.S. Appl. No. 12/797,616, filed Jun. 10, 2010.
U.S. Appl. No. 13/557,063, filed Jul. 24, 2012.
U.S. Appl. No. 13/833,116, filed Mar. 15, 2013.
U.S. Appl. No. 13/801,271, filed Mar. 13, 2013.
Office Action for U.S. Appl. No. 12/945,888 dated Apr. 10, 2012.
Final Office Action for U.S. Appl. No. 12/945,888 dated Sep. 21, 2012.
Advisory Action for U.S. Appl. No. 12/945,888 dated Jan. 30, 2013.
Office Action for U.S. Appl. No. 12/581,115 dated Dec. 20, 2011.
Final Office Action for U.S. Appl. No. 12/581,115 dated Sep. 13, 2012.
Notice of Allowance for U.S. Appl. No. 12/581,115 dated May 24, 2013.
Office Action for U.S. Appl. No. 12/619,672 dated Dec. 20, 2011.
Final Office Action for U.S. Appl. No. 12/619,672 dated Nov. 6, 2012.
Office Action for U.S. Appl. No. 12/619,672 dated Mar. 7, 2013.
Office Action for U.S. Appl. No. 12/617,717 dated Oct. 4, 2011.
Office Action for U.S. Appl. No. 12/617,717 dated Apr. 4, 2012.
Advisory Action for U.S. Appl. No. 12/617,717 dated Jun. 12, 2012.
Office Action for U.S. Appl. No. 12/617,717 dated Jun. 17, 2013.
Office Action for U.S. Appl. No. 12/797,610 dated Dec. 8, 2011.
Final Office Action for U.S. Appl. No. 12/797,610 dated Jun. 6, 2012.
Office Action for U.S. Appl. No. 12/797,610 dated Feb. 26, 2013.
Office Action for U.S. Appl. No. 12/757,968, dated May 9, 2012.
Final Office Action for U.S. Appl. No. 12/757,968, dated Nov. 29, 2012.
Office Action for U.S. Appl. No. 12/757,968, dated Apr. 25, 2013.
Office Action for U.S. Appl. No. 12/797,616 dated Mar. 15, 2012.
Final Office Action for U.S. Appl. No. 12/797,616 dated Oct. 13, 2012.
Office Action for U.S. Appl. No. 12/797,616 dated Feb. 13, 2013.
Final Office Action for U.S. Appl. No. 12/797,616 dated May 8, 2013.
Office Action for U.S. Appl. No. 13/296,182 dated Dec. 5, 2012.
Brochure, 5000 Ft. Inc., 1 page, Nov. 2010.
Frontier Fortune game, email notification, MGM Resorts Intl., Aug. 9, 2013.
“Getting Back in the Game: Geolocation Can Ensure Compliance with New iGaming Regulations”, White Paper, Quova, Inc., 2010.
Notice of Allowance for U.S. Appl. No. 12/619,672, dated Aug. 23, 2013.
Office Action for U.S. Appl. No. 13/633,118, dated Sep. 20, 2013.
Office Action for U.S. Appl. No. 13/801,256, dated Jul. 2, 2013.
Notice of Allowance for U.S. Appl. No. 12/619,672, dated Oct. 3, 2013.
Notice of Allowance for U.S. Appl. No. 12/757,968, dated Oct. 11, 2013.
Final Office Action for U.S. Appl. No. 12/797,610, dated Jul. 10, 2013.
Notice of Allowance for U.S. Appl. No. 12/757,968, dated Dec. 18, 2013.
Office Action for U.S. Appl. No. 12/945,889, dated Dec. 18, 2013.
Office Action for U.S. Appl. No. 13/632,828, dated Jul. 30, 2013.
Restriction Requirement for U.S. Appl. No. 13/801,256, dated Dec. 30, 2013.
Office Action for U.S. Appl. No. 13/801,171, dated Dec. 26, 2013.
Office Action for U.S. Appl. No. 13/801,234, dated Jan. 10, 2014.
Final Office Action for U.S. Appl. No. 13/296,182, dated Feb. 12, 2014.
Office Action for U.S. Appl. No. 12/617,717, dated Feb. 25, 2014.
Office Action for U.S. Appl. No. 13/801,076, dated Mar. 28, 2014.
Final Office Action for U.S. Appl. No. 13/633,118, dated Apr. 3, 2014.
Office Action for U.S. Appl. No. 13/843,192, dated Apr. 3, 2014.
Office Action for U.S. Appl. No. 13/632,743, dated Apr. 10, 2014.
Office Action for U.S. Appl. No. 13/801,121, dated Apr. 11, 2014.
Final Office Action for U.S. Appl. No. 12/945,889, dated Jun. 30, 2014.
Notice of Allowance for U.S. Appl. No. 12/617,717, dated Jul. 14, 2014.
Office Action for U.S. Appl. No. 13/801,121, dated Sep. 24, 2014.
Office Action for U.S. Appl. No. 13/801,171, dated Sep. 22, 2014.
Office Action for U.S. Appl. No. 13/801,234, dated Oct. 1, 2014.
Office Action for U.S. Appl. No. 13/801,271, dated Oct. 31, 2014.
Final Office Action for U.S. Appl. No. 13/843,192, dated Oct. 21, 2014.
Office Action for U.S. Appl. No. 13/632,743, dated Oct. 23, 2014.
Office Action for U.S. Appl. No. 12/945,889, dated Oct. 23, 2014.
Office Action for U.S. Appl. No. 13/632,828, dated Nov. 7, 2014.
Office Action for U.S. Appl. No. 12/797,610, dated Dec. 15, 2014.
Final Office Action for U.S. Appl. No. 12/945,889, dated Feb. 12, 2015.
Final Office Action for U.S. Appl. No. 13/801,171, dated Mar. 16, 2015.
Office Action for U.S. Appl. No. 13/833,116, dated Mar. 27, 2015.
Office Action for U.S. Appl. No. 13/632,828, dated Apr. 10, 2015.
Final Office Action for U.S. Appl. No. 13/801,121, dated Apr. 21, 2015.
Final Office Action for U.S. Appl. No. 13/557,063, dated Apr. 28, 2015.
Office Action for U.S. Appl. No. 13/296,182, dated Jun. 5, 2015.
Office Action for U.S. Appl. No. 13/843,192, dated Jun. 19, 2015.
Office Action for U.S. Appl. No. 12/797,610, dated Jul. 14, 2015.
Final Office Action for U.S. Appl. No. 13/833,953, dated Jul. 17, 2015.
Notice of Allowance for U.S. Appl. No. 12/945,889, dated Jul. 22, 2015.
Office Action for U.S. Appl. No. 12/797,616, dated Aug. 10, 2015.
Final Office Action for U.S. Appl. No. 13/801,234, dated Aug. 14, 2015.
Final Office Action for U.S. Appl. No. 13/833,116, dated Sep. 24, 2015.
Office Action for U.S. Appl. No. 13/801,121, dated Oct. 2, 2015.
Office Action for U.S. Appl. No. 14/017,150, dated Oct. 7, 2015.
Office Action for U.S. Appl. No. 14/017,159, dated Oct. 7, 2015.
Office Action for U.S. Appl. No. 13/801,271 dated Oct. 19, 2015.
Office Action for U.S. Appl. No. 14/211,536 dated Oct. 19, 2015.
Final Office Action for U.S. Appl. No. 13/632,828, dated Oct. 22, 2015.
Office Action for U.S. Appl. No. 14/217,066, dated Dec. 17, 2015.
Notice of Allowance for U.S. Appl. No. 13/557,063, dated Dec. 23, 2015.
Final Office Action for U.S. Appl. No. 13/843,192, dated Dec. 30, 2015.
Office Action for U.S. Appl. No. 13/801,076, dated Jan. 11, 2016.
Office Action for U.S. Appl. No. 12/945,888, dated Jan. 22, 2016.
Final Office Action for U.S. Appl. No. 12/797,616, dated Jun. 12, 2016.
Office Action for U.S. Appl. No. 13/800,917, dated Feb. 25, 2016.
Advisory Action for U.S. Appl. No. 13/632,828, dated Feb. 25, 2016.
Office Action for U.S. Appl. No. 13/843,087, dated Feb. 25, 2016.
Office Action for U.S. Appl. No. 13/801,234, dated Mar. 8, 2016.
Final Office Action for U.S. Appl. No. 13/801,271, dated Mar. 11, 2016.
Office Action for U.S. Appl. No. 13/622,702, dated Mar. 22, 2016.
Final Office Action for U.S. Appl. No. 13/633,118, dated Mar. 24, 2016.
Final Office Action for U.S. Appl. No. 14/189,948, dated Apr. 6, 2016.
Final Office Action for U.S. Appl. No. 12/797,610, dated Apr. 21, 2016.
Final Office Action for U.S. Appl. No. 14/017,150, dated Apr. 26, 2016.
Final Office Action for U.S. Appl. No. 13/801,121, dated May 11, 2016.
Final Office Action for U.S. Appl. No. 14/017,159, dated Jun. 6, 2016.
Office Action for U.S. Appl. No. 13/801,171, dated Jun. 6, 2016.
Office Action for U.S. Appl. No. 13/843,192, dated Jun. 9, 2016.
Final Office Action for U.S. Appl. No. 12/945,888, mailed Jun. 28, 2016.
Notice of Allowance for U.S. Appl. No. 13/833,953, dated Jul. 6, 2016.
Final Office Action for U.S. Appl. No. 13/801,171, dated May 21, 2014.
Final Office Action for U.S. Appl. No. 13/801,234, dated May 22, 2014.
Office Action for U.S. Appl. No. 14/211,536, dated Jul. 13, 2016.
Notice of Allowance for U.S. Appl. No. 13/801,076, dated Jul. 11, 2016.
Office Action for U.S. Appl. No. 13/296,182, dated Jul. 20, 2016.
Restriction Requirement for U.S. Appl. No. 13/296,182, dated Oct. 12, 2012.
Advisory Action for U.S. Appl. No. 13/296,182, dated May 8, 2014.
Office Action for U.S. Appl. No. 13/296,182, dated Dec. 23, 2015.
Advisory Action for U.S. Appl. No. 13/843,192, dated May 8, 2014.
Office Action for U.S. Appl. No. 14/217,066, dated Dec. 22, 2016.
Office Action for U.S. Appl. No. 14/017,159, dated Sep. 23, 2016.
Office Action for U.S. Appl. No. 13/632,743, dated Sep. 23, 2016.
Final Office Action for U.S. Appl. No. 13/801,234, dated Oct. 14, 2016.
Final Office Action for U.S. Appl. No. 13/843,087, dated Oct. 13, 2016.
Final Office Action for U.S. Appl. No. 13/622,702, dated Oct. 13, 2016.
Office Action for U.S. Appl. No. 14/189,948, dated Nov. 7, 2016.
Final Office Action for U.S. Appl. No. 14/211,536, dated Mar. 14, 2014.
Notice of Allowance for U.S. Appl. No. 13/833,116, dated Oct. 11, 2016.
Notice of Allowance for U.S. Appl. No. 13/801,271, dated Dec. 2, 2016.
Notice of Allowance for U.S. Appl. No. 12/797,610, dated Dec. 7, 2016.
Notice of Allowance for U.S. Appl. No. 13/632,828, dated Dec. 16, 2016.
Final Office Action for U.S. Appl. No. 13/801,171, dated Dec. 19, 2016.
Notice of Allowance for U.S. Appl. No. 14/211,536, dated Dec. 28, 2016.
Notice of Allowance for U.S. Appl. No. 13/801,256, dated Jan. 20, 2017.
Office Action for U.S. Appl. No. 13/800,917, dated Feb. 3, 2017.
Final Office Action for U.S. Appl. No. 12/797,616, dated Feb. 10, 2017.
Office Action for U.S. Appl. No. 12/945,888, dated Feb. 28, 2017.
Final Office Action for U.S. Appl. No. 14/189,948, dated Mar. 17, 2017.
Office Action for U.S. Appl. No. 15/400,840, dated Mar. 10, 2017.
Notice of Allowance for U.S. Appl. No. 13/801,121, dated Mar. 29, 2017.
Office Action for U.S. Appl. No. 15/270,333, dated Mar. 30, 2017.
Office Action for U.S. Appl. No. 15/402,945, dated Apr. 5, 2017.
Office Action for U.S. Appl. No. 15/271,488, dated Apr. 19, 2017.
Final Office Action for U.S. Appl. No. 14/217,066, dated Apr. 21, 2017.
Office Action for U.S. Appl. No. 14/216,986 dated Apr. 26, 2017.
Office Action for U.S. Appl. No. 13/801,171, mailed Jun. 14, 2017.
Office Action for U.S. Appl. No. 14/017,159, dated Jun. 29, 2017.
Notice of Allowance for U.S. Appl. No. 15/270,333, dated Jul. 5, 2017.
Final Office Action for U.S. Appl. No. 13/800,917, dated Jul. 13, 2017.
Notice of Allowance for U.S. Appl. No. 13/801,234, dated Jul. 5, 2017.
Notice of Allowance for U.S. Appl. No. 14/217,066, dated Jul. 14, 2017.
Final Office Action for U.S. Appl. No. 14/518,909, dated Jul. 19, 2017.
Final Office Action for U.S. Appl. No. 13/801,121, dated Sep. 15, 2016.
Advisory Action for U.S. Appl. No. 13/801,121, dated Jul. 17, 2015.
Advisory Action for U.S. Appl. No. 13/801,121, dated Jul. 19, 2016.
Notice of Allowance for U.S. Appl. No. 15/293,751, dated Aug. 4, 2017.
Advisory Action for U.S. Appl. No. 14/189,948, dated Jul. 28, 2017.
Final Office Action for U.S. Appl. No. 13/801,256, mailed Aug. 15, 2014.
Final Office Action for U.S. Appl. No. 13/801,256, dated Feb. 18, 2015.
Advisory Action for U.S. Appl. No. 13/801,256, dated Dec. 5, 2014.
Office Action for U.S. Appl. No. 13/801,256, dated Jan. 12, 2016.
Final Office Action for U.S. Appl. No. 13/801,256, dated Aug. 16, 2016.
Office Action for U.S. Appl. No. 13/801,256, dated Aug. 18, 2017.
Office Action for U.S. Appl. No. 13/622,702, dated Aug. 31, 2017.
Office Action for U.S. Appl. No. 12/945,888, dated Sep. 1, 2017.
Office Action for U.S. Appl. No. 14/017,150, dated Sep. 7, 2017.
Notice of Allowance for U.S. Appl. No. 14/189,948, dated Sep. 13, 2017.
Office Action for U.S. Appl. No. 15/138,086, dated Oct. 19, 2017.
Notice of Allowance for U.S. Appl. No. 15/402,945 dated Nov. 21, 2017.
Final Office Action for U.S. Appl. No. 13/801,171, dated Dec. 13, 2017.
Final Office Action for U.S. Appl. No. 15/271,488, dated Dec. 21, 2017.
Office Action for U.S. Appl. No. 15/671,133, dated Dec. 22, 2017.
Final Office Action for U.S. Appl. No. 14/216,986, dated Dec. 26, 2017.
Restriction Requirement for U.S. Appl. No. 15/427,307, dated Jan. 17, 2018.
Office Action for U.S. Appl. No. 15/798,363, dated Jan. 26, 2018.
Office Action for U.S. Appl. No. 15/427,291, dated Jan. 29, 2018.
Final Office Action for U.S. Appl. No. 14/017,159, dated Feb. 1, 2018.
Final Office Action for U.S. Appl. No. 13/622,702, dated Feb. 22, 2018.
Office Action for U.S. Appl. No. 15/811,654, dated Feb. 22, 2018.
Final Office Action for U.S. Appl. No. 13/622,702, dated Feb. 27, 2018.
Final Office Action for U.S. Appl. No. 15/427,308, dated Mar. 19, 2018.
Related Publications (1)
Number Date Country
20140274388 A1 Sep 2014 US
Provisional Applications (1)
Number Date Country
61789332 Mar 2013 US