Determining targeting information based on a predictive targeting model

Information

  • Patent Grant
  • Patent Number
    10,423,983
  • Date Filed
    Tuesday, September 16, 2014
  • Date Issued
    Tuesday, September 24, 2019
Abstract
A targeting system based on a predictive targeting model built from observed behavioral data including visit data, user profile and/or survey data, and geographic features associated with a geographic region. The predictive targeting model analyzes the observed behavioral data and the geographic features data to predict a conversion rate for every cell in a square grid of predefined size overlaid on the geographic region. The conversion rate of a cell indicates the likelihood that any random user in that cell will perform a targeted behavior.
Description
BACKGROUND

As the popularity of mobile devices has soared among consumers worldwide, the potential for targeting advertising content to users of mobile devices has also increased. For example, advertisers can obtain information about the current location of a user of a mobile device and use that information along with information about nearby businesses to send targeted advertisements to the user's mobile device. By way of another example, advertisers can also deliver a specific advertisement to a mobile device of any user who comes within a certain radius of a point of interest.





BRIEF DESCRIPTION OF THE DRAWINGS

The details of embodiments of a system and method of determining targeting information based on a predictive targeting model (hereinafter the “targeting system”) are set forth in the accompanying drawings and description below. Further embodiments, implementations and advantages of the disclosed targeting system will be apparent from the following detailed description, drawings and claims.



FIG. 1 is a block diagram illustrating an example environment in which the targeting system can operate.



FIG. 2 is a block diagram illustrating an overview of running an advertisement campaign using targeting information provided by the targeting system.



FIG. 3 is a block diagram illustrating generation of targeting information based on an analysis of behavioral data, place data and census data using a targeting model to predict a conversion rate of each cell in a grid on a geographic region.



FIG. 4 is a block diagram illustrating example program modules of a targeting server of the targeting system.



FIG. 5 is a block diagram illustrating an analysis of behavioral visitation data.



FIG. 6A is a logic flow diagram illustrating an example method of generating geographic features data by the targeting system.



FIG. 6B is a logic flow diagram illustrating an example method of generating predicted conversion rate per cell by the targeting system.



FIG. 7 is a logic flow diagram illustrating an example method of identifying targeting information based on targeting criteria.



FIG. 8 is a logic flow diagram illustrating an example method of identifying locations or users that are likely to convert, based on a particular campaign.



FIG. 9 is a logic flow diagram illustrating an example method of identifying a set of users associated with a targeted behavior.





DETAILED DESCRIPTION

Overview


Embodiments of the present disclosure include a system and method of determining targeting information based on a predictive targeting model (hereinafter the “targeting system”). The targeting model is based on observed behavioral data including visit data, user profile data and/or survey data, and geographic features associated with a geographic region. The targeting model predicts a conversion rate or likelihood that any random user in a small geographic area within the geographic region will perform a targeted behavior. The latitude and longitude coordinates of at least some of the geographic areas within the geographic region that are associated with high conversion rates are then provided by the targeting system as targeting information to advertisers, publishers and/or advertisement networks for use in targeting advertisements to customers in those locations. In some embodiments, instead of or in addition to the latitude and longitude coordinates of the locations with high conversion rates, the targeting system can identify one or more unique identifiers of the users in those locations with high conversion rates. Advertisers and publishers can then target against users matching the unique identifiers, where those identifiers may lack any personally identifying information.


In some embodiments, location data collected from a panel of users (“panelists”) can provide information about the places the panelists visited, and the timing and duration of such visits. Based on the place visit data, information such as a panelist's home location, work location, regularly visited grocery store, and so on can be inferred. Moreover, user profile data can provide information about age, gender, ethnicity and/or other attributes of the panelists, while survey data can provide information about preferences of the panelists. The behavioral information can be projected onto a geographic region sub-divided into geographic units or cells, with each cell having a set of geographic features. The predictive targeting model can then take into account the behavioral information projected onto the cells to identify opportunities for advertisers and publishers for targeting advertisements.


For example, consider a brand (e.g., 24 Hour Fitness) that wants to know where its potential customers are in a geographic area (e.g., Washington State) in order to target advertisements against those locations. The targeting system can analyze the behavioral data of users to identify the places the users visited and the geographic feature data associated with the geographic area. The targeting system can then use the targeting model to determine that users who go to 24 Hour Fitness (“24 Hour Fitness user group”) are more likely to go to Jamba Juice compared to the overall population. The locations of Jamba Juice can then be used to target ads related to 24 Hour Fitness because the users who live, work or visit those locations have a higher affinity for 24 Hour Fitness. Thus, rather than waiting until a user is in proximity to a 24 Hour Fitness to send an advertisement to the user's device, the targeting system enables a publisher or an advertiser to target advertisements against the locations that have a high affinity for 24 Hour Fitness regardless of whether the user is close to a 24 Hour Fitness or far away from it.
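
By way of a non-limiting illustration (not part of the disclosed embodiments), the following Python sketch shows one way such an affinity, or lift, could be computed from panel visit data; the pandas-based approach, the toy data and all column names are assumptions made only for this example.

```python
import pandas as pd

# Hypothetical visit log: one row per (user_id, brand) visit.
visits = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 3, 4, 5],
    "brand":   ["24 Hour Fitness", "Jamba Juice", "24 Hour Fitness", "Jamba Juice",
                "Jamba Juice", "Walmart", "Walmart", "Target"],
})

# Users who visited the anchor brand form the "24 Hour Fitness user group".
gym_users = set(visits.loc[visits["brand"] == "24 Hour Fitness", "user_id"])

def visit_rate(brand, users=None):
    """Share of the given users (or of all panelists) observed at the brand."""
    pool = set(visits["user_id"]) if users is None else users
    visited = set(visits.loc[visits["brand"] == brand, "user_id"]) & pool
    return len(visited) / len(pool)

# Affinity (lift): how much more likely gym-goers are to visit Jamba Juice
# than the overall panel. Values above 1 indicate a positive affinity.
lift = visit_rate("Jamba Juice", gym_users) / visit_rate("Jamba Juice")
print(f"Jamba Juice lift among 24 Hour Fitness visitors: {lift:.2f}")
```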


In some embodiments, the targeting model is not just based on which businesses users visited, but also on a category of each business. For example, the targeting model can take into account not just Jamba Juice, but also other businesses in the beverage category that are visited by the 24 Hour Fitness user group. In some embodiments, the affinities are not necessarily with respect to businesses and/or businesses in the same category, but also demographics and/or other features of the geographic area. For example, in addition to the 24 Hour Fitness user group having an affinity for Jamba Juice, the user group may also skew male. The targeting model can take into account demographics of the geographic area to identify cells that have a greater male population than female as being locations at which advertisements related to 24 Hour Fitness should be served. By way of another example, if an advertiser wants to target 13-17 year olds for a new animated feature, the targeting model would consider users that belong to the targeted age group and visits performed by the targeted age group users to identify locations against which advertisements related to the new animated feature can be targeted. In some embodiments, panelists can be asked survey questions such as “Do you like animated movies?” The targeting model would then consider users in the targeted age group that responded “yes” to the survey question and the visits performed by those users to identify locations that can be targeted against for advertisements related to the animated feature.


In some embodiments, a targeting model can also have a temporal component. For example, consider a targeted behavior of visiting a Walmart store in the morning. The targeting model would take into account place visits of users in the morning hours of 9 am to 12 noon to identify locations in a geographic region where users are likely to visit the Walmart store in the morning.


In some embodiments, the targeting model can be a look-alike model that enables advertisers to target users who look like their established or known customers. For example, the targeting model can be used to identify the locations of customers who look like the people who go to 24 Hour Fitness and who are likely to sign up for a new membership. To implement the look-alike model, the panelists can be segmented into two groups, the first group including panelists who have been to a gym in the last 30 days and the second group including panelists who have not. The behavior of the second group is considered by the targeting model because the first group of panelists likely already has a gym membership. The targeting model can then use the behavior of the second group of users to identify locations that have high affinity for gyms and/or users who are likely to sign up for a new gym membership.


Various embodiments and implementations of the targeting system will now be described. The following description provides specific details for a thorough understanding and an enabling description of these embodiments and implementations. One skilled in the art will understand, however, that the embodiments and implementations may be practiced without many of these details. Additionally, some well-known structures or functions may not be shown or described in detail, so as to avoid unnecessarily obscuring the relevant description of the various embodiments and implementations. The terminology used in the description presented below is intended to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with a detailed description of certain specific embodiments and implementations of the targeting system.


Suitable System


The targeting system can be implemented in a suitable computing environment 100 illustrated in FIG. 1. Aspects, embodiments and implementations of the targeting system will be described in the general context of computer-executable instructions, such as routines executed by a general-purpose computer, a personal computer, a server, or other computing systems. The targeting system can also be embodied in a special-purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions explained in detail herein. Indeed, the terms “computer” and “computing device,” as used generally herein, refer to devices that have a processor and non-transitory memory, like any of the above devices, as well as any data processor or any device capable of communicating with a network. Data processors include programmable general-purpose or special-purpose microprocessors, programmable controllers, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), or the like, or a combination of such devices. Computer-executable instructions may be stored in memory, such as random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such components. Computer-executable instructions may also be stored in one or more storage devices, such as magnetic or optical-based disks, flash memory devices, or any other type of non-volatile storage medium or non-transitory medium for data. Computer-executable instructions may include one or more program modules, which include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types.


Embodiments of the targeting system can be implemented in distributed computing environments, where tasks or modules are performed by remote processing devices, which are linked through a communications network, such as a Local Area Network (“LAN”), Wide Area Network (“WAN”), or the Internet. In a distributed computing environment, program modules or subroutines may be located in both local and remote memory storage devices. Aspects of the targeting system described herein may be stored or distributed on tangible, non-transitory computer-readable media, including magnetic and optically readable and removable computer discs, stored in firmware in chips (e.g., EEPROM chips), an array of devices (e.g., Redundant Array of Independent Disks (RAID)), solid state memory devices (e.g., solid state drives (SSD), Universal Serial Bus (USB)), and/or the like. Alternatively, aspects of the targeting system may be distributed electronically over the Internet or over other networks (including wireless networks). Those skilled in the relevant art will recognize that portions of the targeting system may reside on one or more server computers, while corresponding portions reside on a client computer. Data structures and transmission of data particular to aspects of the targeting system are also encompassed within the scope of the present disclosure.


A targeting system 115 embodied in a targeting server 120 can operate in the example environment 100 illustrated in FIG. 1. The targeting server 120 can be coupled to one or more databases and/or database tables represented as a database 125. The environment 100 includes panelist devices 150, which can be devices of panelists of the targeting system that report location and other information to the targeting system 115 and/or an inference pipeline system 130. The inference pipeline system 130 is shown as including an analytics server 135 and a database 140. The panelist devices 150 can be any type of client device capable of measuring and reporting its location data. Typically, a client application is installed on a panelist device 150 to facilitate the location data collection and reporting. The inference pipeline system 130 and the data collection system are described in detail in related application Ser. Nos. 13/405,182 and 13/405,190, both filed on Feb. 12, 2012, which are expressly incorporated by reference herein.


The environment 100 can also include one or more advertisers 105 (or content providers in general) that wish to provide advertisements or other non-promotional content to user devices 145 of users for consumption. Typically, advertisers 105 engage publishers 110 to run advertisement (“ad”) campaigns. The advertisers 105 and the publishers 110 can be computing systems in the form of one or more servers. User devices 145 can include any computing devices, such as but not limited to: a desktop computer, a laptop computer, a tablet, a mobile device or a feature phone. In some embodiments, user devices 145 can also include a television, a radio or any other electronic communication media through which content can be delivered to users for consumption. In some embodiments, the environment 100 can also include print media such as newspapers, publications, or the like. The user devices 145, via their network interfaces, connect to and/or communicate with networks 125, either directly or via wireless routers or cell towers. Networks 125 can include wired and wireless, private networks and public networks (e.g., the Internet). Network interfaces employ connection protocols such as direct connect, Ethernet, wireless connection such as IEEE 802.11a-n/ac, and the like to connect to networks 125. Some client devices may be equipped with transceiver circuitry to handle radio communications to wirelessly communicate with nearby cell towers or base stations using wireless mobile telephone standards, such as Global System for Mobile Communications (GSM), CDMA (Code Division Multiple Access), General Packet Radio Service (GPRS), and/or the like.


As illustrated, the publishers 110 can communicate with the targeting server 120 via the networks 125 to request targeting information for ad campaigns of advertisers 105. As described before, the targeting information can be latitude/longitude coordinates of locations having high conversion rates and/or one or more unique identifiers of users that are more likely to engage in a targeted behavior. The publishers 110 can then use the targeting information to target ads to user devices 145 of users.



FIG. 2 is a block diagram illustrating an overview of running an advertisement campaign using targeting information provided by the targeting system.


As illustrated in the diagram 200, an advertiser 105 can engage a publisher 110 to run an ad campaign 220. Generally, the ad campaign 220 can have one or more criteria that dictate how the ads are to be delivered (e.g., on web, mobile web, mobile applications), in what formats and whether the ads should be geo-targeted (e.g., country, region, metro area, city, postal code, latitude/longitude coordinates), for example. In accordance with the present disclosure, the publisher 110 can provide one or more targeting criteria 225 for the ad campaign 220 to the targeting system 115. The targeting system 115 can use behavioral data 174 collected from panelists 150 and extract geographic features from places and census data. The targeting system 115 can utilize a machine learning algorithm (e.g., based on a supervised learning model) to train a targeting model 210 using observed conversion rates determined from a random sample of the behavioral data 174 and the geographic features of geographic units (e.g., cells) and use the trained targeting model 210 to predict conversion rates for geographic units. The conversion rates for geographic units can then be further processed and ranked to identify a portion of the location coordinates 230 with the highest conversion rates that can be used for targeting the ad campaign. In some embodiments, instead of or in addition to the location coordinates, user profiles or unique identifiers can be provided as targeting information to the publisher 110, including anonymous identifiers that lack any personally identifiable information (“PII”), e.g., where the publisher has access to only the identifier.


The publisher 110 can then run the ad campaign 220 based on the targeting information. For example, the publisher can identify user devices 205A-D as being located at the targeting location coordinates 230 and can then send ads 235 to these user devices 205, and receive impressions 240 from the user devices 205. By way of another example, the publisher 110 can identify the users 205 as having user profiles matching the targeting user profiles 230 and can send ads 235 to the user devices 205A-D of the users 205 (even though the publisher lacks any specific PII for each user).


Example Processing



FIG. 3 is a block diagram illustrating generation of targeting information based on an analysis of behavioral data, place data and census data using a targeting model to predict a conversion rate of each cell in a grid on a geographic region.


The targeting system 115 uses observed behavioral data 174 in determining targeting information. The observed behavioral data 174 can include place visit data 302 that links a user to a place at an instance of time. For example, place visit data 302 can indicate that a user visited a Walmart store in Shoreline, Wash. on Mar. 2, 2012 from 9 am-11:30 am. The observed behavioral data 174 can also include demographic profile data 306. For example, the demographic profile data 306 can indicate that the user is a male and his ethnicity is Hispanic. The observed behavioral data 174 can also include survey data 308 (e.g., how the user answered a specific survey question). The observed behavioral data 174 can be collected from the panelists of the targeting system 115 and can be organized and processed before being used as input data to the targeting model 155.


The targeting system 115 can also use place data 310 and census data 315. The place data 310 can include information about places (e.g., latitude/longitude coordinates of places, place names, business categories, and/or the like). The census data 315 can include, but is not limited to: population density, proportion of population at various income buckets, proportion of married individuals, proportion of males at various age buckets, proportion of females at various age buckets, proportion of males at various education buckets, proportion of females at various education buckets, and/or the like. The targeting system 115 can extract features or distinct attributes from the place data 310 and/or census data 315 that can quantitatively describe each cell (e.g., cells 325) in a geographic region 320. Generally, a sample of users and their corresponding behavioral data and feature data are used in training the predictive targeting model 155, which when applied to the overall set of behavioral data 174 and the feature data, can output a predicted conversion rate 160 for each cell (e.g., cells 325) in the geographic region 320 as illustrated. The predicted conversion rate 160 for each cell can then be used to identify targeting location coordinates 330 and/or targeting user profiles or unique user identifiers 335.


Example Programming Modules



FIG. 4 is a block diagram illustrating example modules of a targeting server 120 of the targeting system 115. In some embodiments, the targeting server 120 can include various computer-executable program modules stored in a memory 402. Examples of the program modules can include a behavior data analyzer 405, a geographic features data generator 410, a geographic features data labeling module 415, a model training module 420 that includes a training dataset generator 422 for building a predictive model 425, a model application engine 430, a reporting module 432 and/or a cell aggregator 435. The targeting server 120 can also include additional modules (e.g., communication modules, user interface modules and so on) that have not been described herein.


In some embodiments, the targeting server 120 receives one or more targeting criteria 404 from a client (e.g., a publisher, an advertiser) as input. Such targeting criteria can include a targeted behavior (e.g., target customers who go to Walmart), a demographic criterion (e.g., males), a geographic region of interest (e.g., a state, a country or any arbitrary geographic area), etc. The targeting server 120, via the geographic features data generator 410, can divide the geographic area of interest using a grid into cells (e.g., 250 m by 250 m square cells) and generate feature data for each cell. The feature data can include, for example, a total number of places, distances to nearby businesses of particular brands and distances to nearby categories of businesses, demographic features (e.g., percent of males, percent of females, percent of different ethnicities) and/or the like, and can be stored in a database table in the database 125. While generally shown and described as using a two-dimensional grid, the system may also employ a three-dimensional grid. Such a grid can be useful in dense urban locations, such as Hong Kong, where the system benefits from understanding which floor of a high-rise building a user visits, and which business on that floor is being visited. In such an embodiment, the system uses not only latitude and longitude data, but also altitude data.
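
The following is a minimal, hypothetical sketch of mapping a reported coordinate to a two-dimensional or three-dimensional grid cell; the cell size, the assumed floor height and the function name are illustrative assumptions, not details taken from the disclosed system.

```python
import math

CELL_SIZE_DEG = 0.005   # roughly 500 m of latitude; the description also mentions 250 m cells
FLOOR_HEIGHT_M = 4.0    # assumed floor height for the optional third dimension

def cell_id(lat, lon, altitude_m=None):
    """Map a coordinate to a grid cell identifier.

    Returns a (row, col) tuple for a two-dimensional grid, or
    (row, col, level) when altitude data is available (three-dimensional grid).
    """
    row = math.floor(lat / CELL_SIZE_DEG)
    col = math.floor(lon / CELL_SIZE_DEG)
    if altitude_m is None:
        return (row, col)
    level = math.floor(altitude_m / FLOOR_HEIGHT_M)
    return (row, col, level)

# A visit reported with altitude lands in a 3-D cell, which can distinguish
# businesses on different floors of the same high-rise building.
print(cell_id(22.2819, 114.1589))          # 2-D cell
print(cell_id(22.2819, 114.1589, 52.0))    # 3-D cell including a floor level
```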


In addition to the feature data, the targeting server 120 also considers observed behavioral data stored in a table in the database 125. The behavior data analyzer 405 can select a set of users for the targeting analysis. Normally, the set of users can include all the panelists of the targeting system. The panelists are users of the targeting system from whom geolocation data is collected. A data collection system can obtain user information, including location data, collected directly from the panelists. The data collection system obtains and validates location data from the panelists. The user information collected by the data collection system includes, for example, latitude, longitude, altitude determinations, sensor data (e.g., compass/bearing data, accelerometer or gyroscope measurements), user agent identifying information (e.g., device identifier, device name/model, operating system name/version), and other information that can be used to identify a panelist's location and activity. Additional details on the data collection system can be found in related application Ser. Nos. 13/405,182 and 13/405,190, both filed on Feb. 12, 2012, which are expressly incorporated by reference herein. In some embodiments, the set of users can include panelists matching one or more criteria. For example, the set of users can be users associated with a demographic feature or users who were asked a specific survey question. This set of users forms a global user set or group. The behavior data analyzer 405 then selects users who match a profile and/or a targeted behavior (e.g., answered a survey question a specific way) from the global user set to form a behavior matched user set. Referring to FIG. 5, the behavior match user set 510 is a subset of the global user set 505. The behavior data analyzer 405 can also identify all visits performed by the global user set as a global visit set, and a subset of the global visit set that matches the targeted behavior (e.g., visiting a Walmart store) and was performed by users in the behavior matched user set as a behavior match visit set. As illustrated in FIG. 5, the behavior match visit set 520 is a subset of the global visit set 515.
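
A minimal sketch of how the global user set, behavior matched user set, global visit set and behavior match visit set could be derived is shown below; the pandas-based representation, toy data and column names are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical panel data; columns are assumptions for illustration.
panelists = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "gender":  ["M", "F", "M", "F"],
})
visits = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 4],
    "brand":   ["Walmart", "Target", "Walmart", "Jamba Juice", "Walmart"],
    "cell_id": [(10, 4), (10, 5), (11, 4), (10, 4), (12, 7)],
})

# Global user set: panelists matching an optional profile criterion (here, males).
global_users = set(panelists.loc[panelists["gender"] == "M", "user_id"])

# Behavior matched user set: global users who performed the targeted behavior
# (here, visiting a Walmart store).
behavior_users = set(
    visits.loc[visits["brand"].eq("Walmart") & visits["user_id"].isin(global_users),
               "user_id"])

# Global visit set, and the behavior match visit set as a subset of it.
global_visits = visits[visits["user_id"].isin(global_users)]
behavior_visits = global_visits[global_visits["brand"].eq("Walmart")]

print(global_users, behavior_users)
print(len(global_visits), len(behavior_visits))
```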


The geographic features data labeling module 415 utilizes information relating to the global visit set and the behavior matched visit set to determine a behavior match metric for each user in the behavior matched user set. The behavior match metric for a user is the total number of days with at least one behavior match visit by the user. As illustrated in FIG. 5, the behavior match metric can be calculated by grouping the behavior match visit set by user id 525 and by day 532. A probability of at least one behavior match visit 530 for each day is calculated and the number of days with a behavior match visit is summed to obtain the total behavior match visit metric 535. A value of the behavior match metric corresponding to a user from the behavior matched user set can then be assigned to a key formed by the tuple (cell id, user id) corresponding to all the visits to various places by the global user set. For example, if the value of a behavior match metric of user “A” is 10, then that value can be mapped onto each visit by user “A” in the following manner:


(1) user A visited “Walmart” in cell id “2”→key: (2, A)=10


(2) user A visited “Target” in cell id “1”→key (1, A)=10


(3) user A visited “Whole Foods” in cell id “5”→key (5, A)=10


The geographic features data labeling module 415 can then join the key-value pair obtained from the observed behavioral data with the geographic feature data (generated by the geographic features data generator 410) using the cell id to label the geographic features data with the behavior matched metric. The result is a labeled feature vector table, with the key: (cell id, user id, [feature vector]) and the value: behavior matched metric. This labeled feature vector table can be stored in a database table of the database 125. This has the effect of layering the observed behavioral information on to the cells so that all of the cells in which a user was observed get labeled with a prediction of the probability that the user will perform the targeted behavior.
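
The labeling described above can be sketched as follows; this is an illustrative approximation (assuming a pandas representation and toy data), not the disclosed implementation.

```python
import pandas as pd

# Behavior match visit set with timestamps; the schema is an assumption for illustration.
behavior_visits = pd.DataFrame({
    "user_id": ["A", "A", "A", "B"],
    "date":    ["2014-03-01", "2014-03-01", "2014-03-04", "2014-03-02"],
})

# Behavior match metric: number of distinct days with at least one matching visit.
metric = (behavior_visits.groupby("user_id")["date"].nunique()
          .rename("metric").reset_index())

# Global visit set: every (cell_id, user_id) in which a user was observed.
global_visits = pd.DataFrame({
    "user_id": ["A", "A", "A", "B", "C"],
    "cell_id": [2, 1, 5, 3, 2],
})

# Geographic feature vectors keyed by cell id (toy feature columns).
features = pd.DataFrame({
    "cell_id": [1, 2, 3, 5],
    "n_places": [12, 40, 7, 3],
    "walmart_min_distance": [800.0, 150.0, 2500.0, 600.0],
})

# Label each (cell_id, user_id) key with the user's metric (0 if no matching visits),
# then join on cell id to obtain the labeled feature vector table.
labeled = (global_visits
           .merge(metric, on="user_id", how="left").fillna({"metric": 0})
           .merge(features, on="cell_id", how="left"))
print(labeled)
```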


The training dataset generator 422 of the model training module can generate a training data set for training a two-level predictive model 425 to predict a conversion rate for each cell, which is the probability that a random user observed in a cell will perform the targeted behavior. The training dataset generator 422 can take a random sampling of labeled feature vectors from the labeled feature vector table for the training. In some embodiments, prior to sampling, the training dataset generator 422 can perform a thresholding to exclude certain labeled feature vectors (e.g., labeled feature vectors with a place distance feature that exceeds 100 km) from the initial dataset.


At the first level, the model training module 420 can use the labeled feature data from the labeled feature vector table, including the observed behavior match metric, to train a model to predict a value of the observed behavior match metric which corresponds to a visit probability. At the second level, the system trains a statistical model (e.g., a linear regression model) using an observed conversion rate of a cell to predict a conversion rate of the cell for a given visit probability and in some embodiments, a geo-fence feature (e.g., the distance from Walmart or radius around Walmart). The model training module 420 can calculate the observed conversion rate as the average value of the behavior match metric over all users observed in a given cell in the training dataset. The statistical model can thus predict the average number of times the targeted behavior was performed by users observed in a given cell.


The model application engine 430 can apply the two-level predictive model 425 to a set of feature data (e.g., the full set of feature data) to generate a predicted conversion rate for each cell. The model application engine 430 can apply the first level of the model 425 to predict visit probabilities and the second level of the model 425 to predict the conversion rates. In some embodiments, the model application engine 430 can rank the cells based on conversion rates and select the top n number or x percent of cells as cells with high affinity for the targeted behavior. The reporting module 432 can then report locations (e.g., latitude and longitude coordinates) corresponding to the high affinity cells as targeting information 440. In some embodiments, instead of or in addition to the locations corresponding to the high affinity cells, one or more unique identifiers of users observed in these high affinity cells can be reported as targeting information 440. In some embodiments, the model application engine 430 can combine conversion rates with predictions of number of impressions served per cell (e.g., via another model) to generate a combined score. The cells can then be ranked based on the combined score and the top performing cell coordinates can be provided as targeting information 440. In some embodiments, the model training module 420 and the model application engine 430 can include or use only the first level and pass the predicted visit probabilities to the cell aggregator 435.


In some embodiments, the cell aggregator 435 can aggregate the predicted conversion rates to allow for a targeting area that is larger than a cell. The cell aggregator 435 can calculate an aggregate conversion rate of each cell by summing conversion rates of the cell and neighboring cells within a pre-defined distance from the cell. The cell aggregator 435 can retain only those cells with the highest aggregate conversion rates within a smaller radius. The reporting module 432 can then report locations corresponding to the retained cells as targeting information 440.
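
A minimal sketch of such neighborhood aggregation over a grid of predicted conversion rates is shown below; the square neighborhood, the toy values and the NumPy representation are assumptions for illustration.

```python
import numpy as np

# Predicted conversion rates on a small grid; values are made up for illustration.
rates = np.array([
    [0.02, 0.05, 0.01],
    [0.04, 0.10, 0.03],
    [0.01, 0.06, 0.02],
])

def aggregate(rates, radius_cells=1):
    """Sum each cell's rate with the rates of neighboring cells within radius_cells."""
    rows, cols = rates.shape
    out = np.zeros_like(rates)
    for r in range(rows):
        for c in range(cols):
            r0, r1 = max(0, r - radius_cells), min(rows, r + radius_cells + 1)
            c0, c1 = max(0, c - radius_cells), min(cols, c + radius_cells + 1)
            out[r, c] = rates[r0:r1, c0:c1].sum()
    return out

agg = aggregate(rates)
# Retain the cell with the highest aggregate rate as a targeting location.
best = np.unravel_index(np.argmax(agg), agg.shape)
print(agg.round(2), "best cell:", best)
```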


As described above, location data (e.g., 2-dimensional or 3-dimensional location data) can be used for indexing targeting predictions. In some embodiments, in addition to the location data, user agent identifying information, such as but not limited to operating system name/version, device name, model and/or identifier, and/or the like, can be used for indexing targeting predictions.



FIG. 6A is a logic flow diagram illustrating an example method of generating geographic features data by the targeting system.


In some embodiments, the targeting system 115 generates a geographic feature set that includes a list of vectors of feature data from place data 605 and/or census data 635. The place data 605 and the census data 635 can be stored in the database 125. Each row in the geographic feature set can correspond to a cell (e.g., 0.005 degree×0.005 degree section on the map of the globe) and each column in a vector includes a feature extracted from the place data and/or the census data.


For each cell, the targeting system 115 can retrieve places that are within a predetermined distance from the cell (e.g., ~750 meters from the center of the cell) at block 610. The targeting system 115 can then extract nearby place features from the resulting data set at block 620. Examples of nearby place features that can be extracted include, but are not limited to, the following (see the sketch after this list):


1. Total number of places


2. Total number of places from each business category (e.g., the number of coffee shops within 750 meters of the cell)


3. Ratio of total number of places from each business category to total number of places (“category proportion”)
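
The sketch below illustrates how these nearby place features could be computed for a single cell; the schema and toy data are assumptions for illustration only.

```python
import pandas as pd

# Places retrieved within ~750 meters of one cell's center (assumed schema).
nearby = pd.DataFrame({
    "place_id": [1, 2, 3, 4, 5],
    "category": ["coffee shop", "coffee shop", "grocery", "gym", "grocery"],
})

total_places = len(nearby)
category_counts = nearby["category"].value_counts()
category_proportion = (category_counts / total_places).add_suffix("_proportion")

# One row of nearby-place features for the cell: total count, per-category
# counts, and per-category proportions.
cell_features = pd.concat([
    pd.Series({"total_places": total_places}),
    category_counts.add_prefix("n_"),
    category_proportion,
])
print(cell_features)
```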


At block 615, the targeting system can extract, for each cell, place distance features by calculating the distance to the nearest place of each brand and business category. This calculation can be expensive, and some of the computational cost can be reduced by using the following example methodology (illustrated in the sketch after the example row below):


1. For each place, map or assign the place to all cells that are within 100 km of the place.


2a. For each of these cells, calculate the distance from the center of the cell to the place.


2b. Emit or generate a key-value pair. The key is the tuple (cell id, business id, category id) and the value is the calculated distance from 2a. The business id identifies a brand (e.g., McDonald's or Starbucks) and the category id identifies a type of business (e.g., a cafe or a grocery store).


3. For each (cell id, business id, category id) tuple, find the minimum distance from the cell (e.g., center of the cell) to the business id, and emit the key-value pair (cell id, (business id, category id, minimum distance)).


4. For each cell, calculate the following features:


4a. Minimum distance to each business id


4b. Minimum distance to each category id


For example, if there are three business ids (McDonald's, KFC and Walmart) associated with a cell id, after step 4, the following example row can be generated:


(cell id, mcdonalds_min_distance, walmart_min_distance, kfc_min_distance, fast_food_min_distance, retail_min_distance)
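
The example methodology above can be sketched as follows; the haversine distance function, the toy places and cell centers are assumptions for illustration, and a production pipeline would typically run these steps as distributed map and reduce operations.

```python
from collections import defaultdict
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# Hypothetical places and cell centers (ids and coordinates are made up).
places = [  # (business_id, category_id, lat, lon)
    ("mcdonalds", "fast_food", 47.61, -122.33),
    ("walmart",   "retail",    47.66, -122.30),
]
cell_centers = {(9522, -24466): (47.612, -122.332), (9530, -24460): (47.650, -122.300)}

# Steps 1-3: map each place to cells within 100 km, emit
# ((cell, business, category) -> distance) pairs, keep the minimum per key.
min_dist = defaultdict(lambda: float("inf"))
for business_id, category_id, plat, plon in places:
    for cid, (clat, clon) in cell_centers.items():
        d = haversine_m(clat, clon, plat, plon)
        if d <= 100_000:
            key = (cid, business_id, category_id)
            min_dist[key] = min(min_dist[key], d)

# Step 4a: minimum distance per (cell, business id).
print({k: round(v) for k, v in min_dist.items()})

# Step 4b: reduce further to the minimum distance per (cell, category id).
cat_min = defaultdict(lambda: float("inf"))
for (cid, business_id, category_id), d in min_dist.items():
    cat_min[(cid, category_id)] = min(cat_min[(cid, category_id)], d)
print({k: round(v) for k, v in cat_min.items()})
```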


At block 640, the targeting system 115 can retrieve census data 635 and calculate demographic features. The census data 635 can include, for example, a block group, census tract, county census tables and/or the like. Each census aggregation region can have its geographic shape associated with it. The targeting system 115 can calculate demographic features using the following example methodology:


1. For each census region, calculate a vector of features from the census table:


1a. population density


1b. proportion of population at various income buckets


1c. proportion of married individuals


1d. proportion of males at various age buckets


1e. proportion of females at various age buckets


1f. proportion of males at various education buckets


1g. proportion of females at various education buckets


2. For each cell inside each census region, generate a key-value pair. The key is the cell id. The value is the tuple (aggregation size, feature vector).


3. For each cell, find the feature vector associated with the smallest aggregation size and generate the key value pair (cell id, feature vector).


The nearby place features (from block 620), the place distance features (from block 615) and the demographic features (from block 640) are then merged together at block 650 to obtain a set of geographic features 655.
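
A minimal sketch of selecting the demographic feature vector from the smallest census aggregation region containing a cell (step 3 of the methodology above) follows; the region sizes and feature values are made up for illustration.

```python
# Candidate census feature vectors for one cell, keyed by aggregation region size
# (e.g., block group < tract < county). Values are made up for illustration.
candidates = [
    # (aggregation_size_sq_km, feature_vector)
    (2.0,   {"population_density": 4100.0, "prop_married": 0.38, "prop_male_18_34": 0.21}),
    (15.0,  {"population_density": 2600.0, "prop_married": 0.45, "prop_male_18_34": 0.18}),
    (600.0, {"population_density": 900.0,  "prop_married": 0.51, "prop_male_18_34": 0.16}),
]

# Step 3: keep the feature vector from the smallest aggregation region that
# contains the cell, since it describes the cell most precisely.
smallest_size, demographic_features = min(candidates, key=lambda kv: kv[0])
print(smallest_size, demographic_features)
```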



FIG. 6B is a logic flow diagram illustrating an example method of generating predicted conversion rate per cell by the targeting system.


In some embodiments, the targeting system 115 receives user visits data 660, user profiles 662 and/or survey data 664. In some embodiments, the targeting system 115 can consider all the users of the targeting system for analysis. Alternatively, at block 666, the targeting system can filter users based on one or more criteria. The one or more criteria can include, for example, survey data 664 and/or user profile data 662 (e.g., users matching a demographic criterion). The set of users that match the filter criteria forms the global user group or set.


At block 668, the targeting system 115 can tag users that match a targeted behavior. The tagged users then form the behavior matched user group or set. In this instance, the targeted behavior can be unrelated to store or place visits and may be related to, for example, survey data 664. For example, if a user answered a specific survey question a specific way, then that user can be in the behavior matched user group. In some embodiments, if only place visit constraints are specified as a targeted behavior, then the functionality of block 668 is optional because the behavior matched user group would be the same as the global user group.


At block 670, the targeting system 115 filters the user visits data 660 using the global user group determined from block 666 to obtain a set of all visits performed by each user in the global user group (“global visit set”). At block 672, the targeting system 115 tags the visits in the global visit set that match a targeted behavior performed by each user in the behavior match user set. This subset of the global visit set is the behavior match visit set. For example, if the targeted behavior is visiting a Walmart, all visits with a high probability of being at Walmart can be part of the behavior match visit set.


At block 674, the targeting system 115 labels geographic features 655. The following example methodology can be used to label the geographic features 655.


1. Group the behavior match visit set by user and by day. For each day, calculate the probability of at least one behavior match visit. Sum this value across days to get a total number of days with a behavior match visit (“total behavior match visits”).


2. For each visit in the global visit set, generate a key-value pair. The key can be the tuple (cell id, user id) and the value can be the user's total behavior match visits calculated in (1). If a user has no visits in the behavior match visit set, this value will be 0.


3. Join the key-value pairs in (2) with the geographic features 655 by cell id.


Each key-value pair is a row in a labeled feature vector table and the label is the total behavior match visits value.


At block 676, the targeting system 115 trains statistical models to obtain trained models 678. To train the statistical models, a training data set is first selected from a random sample of users and their corresponding behaviors and features. The sampled data set is then used for training the two-level statistical model to predict a conversion rate for each cell.


In the first level, based on the geographic feature data and the total behavior match visits (observed), a classifier (e.g., Random Forest Classifier) is used to predict whether the total behavior match visits is non-zero. This classification can be performed as a non-linear feature extraction step to combine the high-dimensional geographic feature vectors (e.g., a large number of features) into a single number, i.e., the visit probability, that is smaller than the original feature set but retains most of its characteristics.


In the second level, the visit probability and in some cases a geo-fence feature can be used to train a linear regression model to predict the conversion rate of a given cell. In some embodiments, the geo-fence feature can be a log-transformed distance to a target location. To train this regression model, the observed conversion rate is computed. In some embodiments, the observed conversion rate can be calculated as the average value of total behavior match visits over all users observed in a cell as seen in the sampled training data. The visit probability and geo-fence feature are then regressed on to these observed conversion rates.


In some embodiments, the targeting system 115 can apply the trained models 678 to each cell in the full geographic feature set at block 680 to generate a predicted conversion rate for each cell 684. This can be done by sequentially applying the first and second levels of the model to obtain the predicted visit probabilities and conversion rates, respectively.
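
A minimal sketch of the two-level model on synthetic data is shown below; the use of scikit-learn's RandomForestClassifier and LinearRegression matches the example model types named above, but the synthetic features, labels and ranking step are assumptions for illustration, not the disclosed implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic labeled feature vectors: rows are (cell, user) observations,
# X is the geographic feature vector, y is the total behavior match visits.
n, n_features = 500, 20
X = rng.normal(size=(n, n_features))
y = rng.poisson(lam=np.clip(X[:, 0], 0, None))           # toy label
cell_ids = rng.integers(0, 50, size=n)
geofence = np.log1p(rng.uniform(100, 50_000, size=n))     # log-transformed distance

# Level 1: classifier predicting whether the behavior match metric is non-zero;
# its predicted probability serves as the "visit probability" feature.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y > 0)
visit_prob = clf.predict_proba(X)[:, 1]

# Observed conversion rate per cell: average metric over users observed in the cell.
cell_rate = {c: y[cell_ids == c].mean() for c in np.unique(cell_ids)}
observed_rate = np.array([cell_rate[c] for c in cell_ids])

# Level 2: linear regression from (visit probability, geo-fence feature)
# to the observed conversion rate.
reg = LinearRegression().fit(np.column_stack([visit_prob, geofence]), observed_rate)

# Application: predict a conversion rate per cell and rank the cells.
predicted = reg.predict(np.column_stack([visit_prob, geofence]))
per_cell = {c: predicted[cell_ids == c].mean() for c in np.unique(cell_ids)}
top_cells = sorted(per_cell, key=per_cell.get, reverse=True)[:5]
print("highest-affinity cells:", top_cells)
```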


In some embodiments, the targeting system 115 can aggregate cells at block 682. Aggregating the cells can include aggregating the predicted conversion rates to allow for larger targeting radii. In some embodiments, the targeting system 115 can calculate the aggregated conversion rate at each cell by considering every cell within a specified distance (e.g., number of meters). The targeting system can then retain cells with the highest aggregated conversion rate or score within a smaller radius.



FIG. 7 is a logic flow diagram illustrating an example method of identifying targeting information based on targeting criteria.


As illustrated, at block 705, the targeting system receives targeting criteria. In some embodiments, the targeting criteria can include a targeted behavior and a geographic region. The targeting criteria can be received from an advertiser, a publisher or any other entity that desires to identify people and/or places that have a high propensity to perform the targeted behavior. The targeted behavior can be any behavior of interest, for example, visiting a store or signing up for an event. In some embodiments, the targeting criteria can also include information about demographic profiles and/or survey data. For example, an advertiser may be interested in knowing where women in the age group 40-50 that have a high affinity for gyms may be located. The targeting system 115 can then consider the age group and gender of the panelists along with visits to gyms to predict those locations. The geographic region can be any region of interest (e.g., the USA, Washington, North West).


At block 710, the targeting system 115 can segment the geographic region into cells using a grid. Each cell has a cell identifier. At block 715, the targeting system 115 can receive or retrieve behavioral information associated with users. The behavioral information can include time-stamped place visit data corresponding to places visited by the users. At block 720, the targeting system 115 can calculate a behavior match metric for each cell based on the behavioral information. At block 725, the targeting system 115 can receive or retrieve feature data for each cell. The feature data can be generated using a separate process. The targeting system 115 can label the feature data for each cell using the corresponding behavior match metric to obtain labeled feature data at block 730. A set of the labeled feature data can then be used by the targeting system 115 at block 735 to train a model for predicting a conversion rate for each cell. At block 740, after the model has been trained, a set of the feature data (e.g., the unlabeled feature data) can be analyzed by the trained model to predict a conversion rate of each cell. At block 745, the targeting system 115 can identify targeting information based on the conversion rates of the cells. The targeting information can be in the form of people (e.g., unique identifiers or profile characteristics of users in cells with high conversion rates), places (e.g., location coordinates of cells with high conversion rates) or both in various embodiments.



FIG. 8 is a logic flow diagram illustrating an example method of identifying locations or users that are likely to convert, based on a particular campaign.


The targeting system 115, in some embodiments, gathers, receives or retrieves behavioral data at block 805. At block 810, the targeting system 115 receives, gathers or retrieves sets of attributes associated with an array of locations in a geographic region. Each set of attributes can be high-dimensional data (e.g., 2,000 or more attributes or features). The targeting system 115 can process the gathered data to identify at least one of locations (block 815) or users (block 820) that are likely to convert, based on a particular campaign. In some embodiments, both locations and users can be identified and each, alone or in combination, can be used to target promotional content or non-promotional content associated with the particular campaign (or similar campaign) to other users at block 825.


In some embodiments, the targeting system 115 can identify a set of users associated with a targeted behavior and/or a targeting model using an example method illustrated in FIG. 9.


Referring to FIG. 9, the targeting system 115 can build a set of targeting models (e.g., one for each of the top 30 businesses and 20 categories) at block 902. At block 904, the targeting system 115 can normalize the set of targeting models. In some embodiments, normalizing can include calculating a mean (or another statistical measure) of predicted conversion rates for each model and using the mean to calculate, for each cell, a normalized score. The normalized score, in some embodiments, can be calculated as a ratio of a conversion rate of the cell and the mean conversion rate. At block 906, the targeting system 115 can calculate a score threshold. The score threshold, in some embodiments, can be calculated by determining the Nth percentile (e.g., 95th, 98th) normalized score across the full set of models. At block 908, the targeting system can receive or gather data over a period of time (e.g., 1 week, 1 month). The gathered data can include, for example, a user identifier, a coordinate, and a timestamp. The gathered data can be mapped to corresponding cell ids based on the coordinates at block 910. At block 912, the targeting system 115 can group the data by cell id and generate a table of data at block 914. The process for generating the table of data can include determining, for each cell id, and for each of the normalized models (each having a model id) associated with the cell id, a normalized score for the model id and generating a tuple of the user id, model id, normalized score and timestamp. For each user id 916 in the table of data, the targeting system 115 can determine if the user has been observed on more than a threshold number of days at decision block 918. If false, the targeting system 115 evaluates the next user id at block 940. If true, the targeting system 115 can calculate, for each model id 920 associated with the user id, a mean normalized score at block 922. At decision block 934, the targeting system 115 can determine if the mean normalized score is greater than the score threshold (from block 906). If true, the targeting system 115 can tag the user id with the model id at block 936. If another model id is associated with the user id as determined at decision block 938, the targeting system 115 can repeat the process with the next model id 942; otherwise the next user id 940, if available, can be evaluated. In some embodiments, user profiles or unique identifiers associated with the identified user ids that are tagged with a particular model id can be provided as targeting information to a publisher associated with the model id.
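
A simplified sketch of the normalization, thresholding and user-tagging flow of FIG. 9 follows; the toy model scores, the observation-day threshold and the data layout are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Predicted conversion rates per cell for several targeting models (toy values).
models = {"m_24hr_fitness": rng.random(1000), "m_coffee": rng.random(1000)}

# Normalize each model: a cell's score is its rate divided by the model's mean rate.
normalized = {m: rates / rates.mean() for m, rates in models.items()}

# Score threshold: the 95th percentile normalized score across all models.
threshold = np.percentile(np.concatenate(list(normalized.values())), 95)

# Observed user data mapped to cells (user_id, cell_id, day); the schema is assumed.
obs = pd.DataFrame({
    "user_id": [7, 7, 7, 8],
    "cell_id": [3, 981, 12, 40],
    "day":     ["d1", "d2", "d3", "d1"],
})

MIN_DAYS = 3
tags = []
for user_id, rows in obs.groupby("user_id"):
    if rows["day"].nunique() < MIN_DAYS:
        continue  # skip users observed on too few days
    for model_id, scores in normalized.items():
        mean_score = scores[rows["cell_id"].to_numpy()].mean()
        if mean_score > threshold:
            tags.append((user_id, model_id))  # tag the user with this model
print(tags)
```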


Conclusion


The above Detailed Description of embodiments of the targeting system 115 is not intended to be exhaustive or to limit the embodiments to the precise form disclosed above. While specific examples for the embodiments are described above for illustrative purposes, various equivalent modifications are possible within the scope of the embodiments, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative implementations may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative combinations or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed or implemented in parallel, or may be performed at different times.


In general, the terms used in the following claims should not be construed to limit the embodiments to the specific examples disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the embodiments encompasses not only the disclosed examples, but also all equivalent ways of practicing or implementing the embodiments under the claims.

Claims
  • 1. A method for use by at least one data processing device, the method comprising: receiving, from a client device, targeting criteria that comprises at least a targeted behavior and an identification of a geographic region; segmenting the geographic region identified by the targeting criteria into a three-dimensional grid that comprises a plurality of cells based on latitude data, longitude data, and altitude data, the plurality of cells including a cell that encompasses a location within the region, the cell comprising a cell identifier; receiving behavioral information associated with at least a user, wherein the behavioral information includes location data that comprises at least device altitude data, and temporal data, the location data identifying the location encompassed by the cell; calculating a behavior match metric for the cell based on the behavioral information projected onto the geographic region, the behavior match metric indicating a number of visits by the user to the location represented by the cell; assigning the behavioral match metric to a key that comprises a tuple that comprises the cell identifier of the cell and a user identifier of the user; retrieving feature data for the cell, the feature data indicating attributes of the location encompassed by the cell; generating a labeled feature vector table based on the key and the attributes of the location encompassed by the cell; training a model for predicting a conversion rate of the cell based on the labeled feature vector table, wherein the conversion rate provides a probability of the user performing the targeted behavior within the cell; applying the model to the feature data to predict the conversion rate of the cell; and presenting targeting information based on the conversion rate of the cell to the client device.
  • 2. The method of claim 1, wherein the cell is a first cell, and the identifying the targeting information based on the conversion rate of the first cell further comprises: ranking the first cell among the plurality of cells based on the conversion rate; and selecting the first cell based on the ranking of the first cell among the plurality of cells.
  • 3. The method of claim 2, further comprising providing latitude and longitude coordinates corresponding to each cell among the plurality of cells as the targeting information to an entity from which the targeting criteria were received.
  • 4. The method of claim 2, further comprising: determining one or more unique identifiers of users observed in the first cell, in response to the selecting the first cell; and providing the one or more unique identifiers as the targeting information to an entity from which the targeting criteria were received.
  • 5. The method of claim 1, wherein the identifying the targeting information based on the conversion rate of the cell further comprises: calculating an aggregate conversion rate for the cell based on one or more conversion rates of the plurality of cells within a predefined distance from the cell; selecting cells from among the plurality of cells with higher aggregate conversion rates than others; and providing latitude and longitude coordinates corresponding to the selected cells as the targeting information to an entity from which the targeting criteria were received.
  • 6. The method of claim 1, wherein the behavioral information further includes survey data and demographic profile data associated with the multiple users.
  • 7. The method of claim 6, further comprising: identifying a global user set from the multiple users based on at least one of the survey data or the demographic profile data; wherein the global user set includes the multiple users when no criteria for identifying users based on the survey data or the demographic profile data is provided.
  • 8. The method of claim 7, further comprising: identifying a behavior match user set from the global user set based on the survey data; wherein the behavior match user set is the same as the global user set when no criteria for identifying the behavior match user set based on the survey data is provided.
  • 9. The method of claim 8, further comprising: identifying a global visit set as a subset of all visits performed by users in the global user set; and identifying a subset of the global visit set that matches the targeted behavior and was performed by a user in the behavior match user set as behavior match visit set.
  • 10. The method of claim 9, wherein calculating the behavior match metric for each cell based on the behavioral information further comprises: grouping the behavior match visit set by a user and by day; for each user in the behavior match user set, calculating a total number of days on which the user performed the targeted behavior, wherein the total number of days corresponds to the behavior match metric.
  • 11. The method of claim 1, wherein labeling the feature data for the cell includes using the cell identifier of the cell to join a key-value pair having a tuple of the cell identifier and a user identifier as a key and a behavior match metric of the cell as a value with another key-value pair having the cell identifier as a key and the feature data as a value.
  • 12. The method of claim 1, further comprising: generating the feature data for the cell, wherein the feature data includes: nearby place features related to all places that are within a predefined distance from each cell; place distance features related to minimum distances from the cell to each business identifier and each category identifier respectively that are within a predefined distance from the cell; and demographic features for the cell obtained from census data.
  • 13. The method of claim 1, wherein training the model for predicting the conversion rate of the cell based on the set of the labeled feature data further comprises: training a classifier using the set of the labeled feature data to predict a visit probability that provides an indication of a likelihood that a user observed in the cell would perform the targeted behavior; and training a statistical model using the visit probability and an observed conversion rate aggregated across all users in the cell to predict the conversion rate for the cell.
  • 14. The method of claim 13, wherein applying the model to the feature data to predict the conversion rate of the cell further comprises: applying the trained classifier on the feature data to predict a visit probability; and applying the statistical model on the visit probability to predict the conversion rate for the cell.
  • 15. The method of claim 14, further comprising: randomly sampling users to obtain the set of the labeled feature data from the feature data for training the model.
  • 16. The method of claim 1, wherein the targeted behavior includes visiting a place or performing an activity.
  • 17. A system comprising: memory;at least one processor in communication with the memory and configured to execute a plurality of instructions stored in the memory to:receive, from a client device, targeting criteria that comprises at least a targeted behavior and an identification of a geographic region;segment the geographic region identified by the targeting criteria into a three-dimensional grid that comprises a plurality of cells based on latitude data, longitude data, and altitude data, the plurality of cells including a cell that encompasses a location within the region, the cell comprising a cell identifier;receive behavioral information associated with at least a user, wherein the behavioral information includes location data that comprises at least device altitude data, and temporal data, the location data identifying the location encompassed by the cell;calculate a behavior match metric for the cell based on the behavioral information projected onto the geographic region, the behavior match metric indicating a number of visits by the user to the location represented by the cell;assign the behavioral match metric to a key that comprises a tuple that comprises the cell identifier of the cell and a user identifier of the user;retrieve feature data for the cell, the feature data indicating attributes of the location encompassed by the cell;generate a labeled feature vector table based on the key and the attributes of the location encompassed by the cell;train a model for predicting a conversion rate of the cell based on the labeled feature vector table, the conversion rate providing a probability of the user performing the targeted behavior within the cell;apply the model to the feature data to predict the conversion rate of the cell; andpresent targeting information based on the conversion rate of the cell to the client device.
  • 18. The system of claim 17, wherein the targeting information includes at least one of:
    latitude and longitude coordinates of the plurality of cells; or
    one or more unique identifiers associated with users observed in the plurality of cells.
  • 19. A non-transitory computer-readable medium storing computer-executable instructions that cause a machine to perform operations comprising:
    receiving, from a client device, targeting criteria that comprises at least a targeted behavior and an identification of a geographic region;
    segmenting the geographic region identified by the targeting criteria into a three-dimensional grid that comprises a plurality of cells based on latitude data, longitude data, and altitude data, the plurality of cells including a cell that encompasses a location within the region, the cell comprising a cell identifier;
    receiving behavioral information associated with at least a user, wherein the behavioral information includes location data that comprises at least device altitude data, and temporal data, the location data identifying the location encompassed by the cell;
    calculating a behavior match metric for the cell based on the behavioral information projected onto the geographic region, the behavior match metric indicating a number of visits by the user to the location represented by the cell;
    assigning the behavior match metric to a key that comprises a tuple that comprises the cell identifier of the cell and a user identifier of the user;
    retrieving feature data for the cell, the feature data indicating attributes of the location encompassed by the cell;
    generating a labeled feature vector table based on the key and the attributes of the location encompassed by the cell;
    training a model for predicting a conversion rate of the cell based on the labeled feature vector table, wherein the conversion rate provides a probability of the user performing the targeted behavior within the cell;
    applying the model to the feature data to predict the conversion rate of the cell; and
    presenting targeting information based on the conversion rate of the cell to the client device.
  • 20. The medium of claim 19, wherein the cell is a first cell, and the identifying the targeting information based on the conversion rate of the first cell further comprises:
    ranking the first cell among the plurality of cells based on the conversion rate; and
    selecting the first cell based on the ranking of the first cell among the plurality of cells.
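The system and computer-readable-medium claims above (claims 17 and 19) recite segmenting a geographic region into a three-dimensional grid keyed by latitude, longitude, and altitude data, and assigning a behavior match metric to a (cell identifier, user identifier) tuple. The following is a minimal Python sketch of those two steps only; the cell step sizes, the Observation fields, and the matches_behavior predicate are illustrative assumptions, not details taken from the specification.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Observation:
    """Hypothetical behavioral-information record; field names are assumptions."""
    user_id: str
    lat: float        # degrees
    lon: float        # degrees
    alt: float        # meters (device altitude data)
    timestamp: float  # epoch seconds (temporal data)


def cell_id(lat, lon, alt, lat_step=0.001, lon_step=0.001, alt_step=10.0):
    """Map a (lat, lon, alt) location to a three-dimensional grid cell identifier.

    The step sizes defining the predefined cell size are assumed values for
    illustration only.
    """
    return (int(lat // lat_step), int(lon // lon_step), int(alt // alt_step))


def behavior_match_metrics(observations, matches_behavior):
    """Count targeted-behavior matches per (cell identifier, user identifier) key.

    matches_behavior is an assumed predicate deciding whether one observation
    counts toward the targeted behavior (e.g. a visit to a place of interest).
    Observations that do not match still create a key with a metric of 0, so
    downstream training data contains negative examples as well.
    """
    counts = defaultdict(int)
    for obs in observations:
        key = (cell_id(obs.lat, obs.lon, obs.alt), obs.user_id)
        counts[key] += 1 if matches_behavior(obs) else 0
    return dict(counts)
```

Keying the metric by the (cell identifier, user identifier) tuple, rather than by the cell alone, is what later allows per-user labels to be joined back onto per-cell feature data.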
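Claims 11 and 12 describe per-cell feature data (nearby place features, minimum distances from the cell to each business and category identifier, and census demographics) and a join, on the cell identifier, between the tuple-keyed behavior metrics and the cell-keyed feature data. Below is a hedged sketch of how that feature generation and labeling join might look; the place and census inputs, the 500 m radius, and the haversine distance are assumptions for illustration.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def cell_features(cell_center, places, census_row, radius_m=500.0):
    """Build a feature dict for one cell.

    places is an iterable of (business_id, category_id, lat, lon) tuples and
    census_row is a dict of demographic values; both are assumed inputs.
    """
    lat, lon = cell_center
    nearby = [p for p in places if haversine_m(lat, lon, p[2], p[3]) <= radius_m]
    features = {"nearby_place_count": len(nearby)}
    for biz, cat, plat, plon in nearby:
        d = haversine_m(lat, lon, plat, plon)
        # Minimum distance from the cell to each business and category identifier.
        features[f"min_dist_biz_{biz}"] = min(d, features.get(f"min_dist_biz_{biz}", d))
        features[f"min_dist_cat_{cat}"] = min(d, features.get(f"min_dist_cat_{cat}", d))
    features.update(census_row)  # demographic features for the cell
    return features


def labeled_feature_table(behavior_metrics, features_by_cell):
    """Join tuple-keyed behavior metrics with cell-keyed feature data.

    The cell identifier inside the (cell_id, user_id) key performs the join,
    producing rows of (cell_id, user_id, feature_dict, label) -- a labeled
    feature vector table in the sense of the claims.
    """
    table = []
    for (cid, uid), metric in behavior_metrics.items():
        if cid in features_by_cell:
            table.append((cid, uid, features_by_cell[cid], metric))
    return table
```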
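Claims 13 through 15 recite a two-stage model: a classifier, trained on randomly sampled users, predicts a per-user visit probability, and a statistical model then maps that probability, aggregated across users in a cell, to the observed conversion rate. The sketch below uses scikit-learn's LogisticRegression and LinearRegression as stand-ins for the claimed classifier and statistical model; those estimator choices, the non-zero-metric labeling rule, and the sample size are assumptions rather than details stated in the claims.

```python
import random
from collections import defaultdict

import numpy as np
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LinearRegression, LogisticRegression


def train_two_stage_model(labeled_table, observed_cell_conversion, sample_users=10000):
    """Train the classifier (stage 1) and the statistical model (stage 2).

    observed_cell_conversion maps cell_id -> observed conversion rate aggregated
    across all users in the cell, and is an assumed input.
    """
    # Randomly sample users to obtain the training subset (claim 15).
    users = list({uid for _cid, uid, _feats, _metric in labeled_table})
    sampled = set(random.sample(users, min(sample_users, len(users))))
    rows = [r for r in labeled_table if r[1] in sampled]

    # Stage 1: classifier predicting a visit probability from labeled features.
    vec = DictVectorizer(sparse=False)
    X = vec.fit_transform([feats for _cid, _uid, feats, _metric in rows])
    y = np.array([1 if metric > 0 else 0 for _cid, _uid, _feats, metric in rows])
    clf = LogisticRegression(max_iter=1000).fit(X, y)

    # Stage 2: statistical model from per-cell mean visit probability to the
    # observed per-cell conversion rate.
    per_cell = defaultdict(list)
    for (cid, _uid, _feats, _metric), p in zip(rows, clf.predict_proba(X)[:, 1]):
        per_cell[cid].append(p)
    cells = [c for c in per_cell if c in observed_cell_conversion]
    X2 = np.array([[np.mean(per_cell[c])] for c in cells])
    y2 = np.array([observed_cell_conversion[c] for c in cells])
    reg = LinearRegression().fit(X2, y2)
    return vec, clf, reg


def predict_cell_conversion(vec, clf, reg, features_by_cell):
    """Apply the trained classifier and statistical model to every cell (claim 14)."""
    cids = list(features_by_cell)
    X = vec.transform([features_by_cell[c] for c in cids])
    visit_prob = clf.predict_proba(X)[:, 1]
    return dict(zip(cids, reg.predict(visit_prob.reshape(-1, 1))))
```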
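Finally, claims 18 and 20 cover ranking the cells by predicted conversion rate, selecting the top-ranked cells, and returning either cell coordinates or the unique identifiers of users observed in those cells as the targeting information. A small sketch under the same assumptions as the snippets above (integer three-dimensional cell indices and the tuple-keyed behavior metrics):

```python
def top_cells(predicted_rates, n=100):
    """Rank cells by predicted conversion rate and keep the top n."""
    ranked = sorted(predicted_rates.items(), key=lambda kv: kv[1], reverse=True)
    return [cid for cid, _rate in ranked[:n]]


def targeting_info(selected_cells, behavior_metrics, lat_step=0.001, lon_step=0.001):
    """Assemble targeting information for the selected cells.

    Returns approximate cell-center coordinates (recovered from the integer cell
    indices and the assumed step sizes) together with the unique identifiers of
    users observed in those cells.
    """
    selected = set(selected_cells)
    coords = [((i + 0.5) * lat_step, (j + 0.5) * lon_step) for (i, j, _k) in selected]
    users = sorted({uid for (cid, uid) in behavior_metrics if cid in selected})
    return {"cell_coordinates": coords, "user_identifiers": users}
```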
Related Publications (1)
Number Date Country
20160078485 A1 Mar 2016 US