Systems and methods for determining and displaying optimal associations of data items

Information

  • Patent Grant
  • Patent Number
    11,061,542
  • Date Filed
    Friday, May 31, 2019
  • Date Issued
    Tuesday, July 13, 2021
Abstract
Systems and methods are described for determining and displaying optimal associations of data items. Data items may include media content such as television programs, and may be associated with advertisements to be displayed during content consumption. A tool may process data regarding segments of viewers that have common characteristics, and further process data regarding viewers of particular data items, to identify degrees of association between individual segments of viewers and particular data items. The degrees of association between a particular data item and multiple segments of viewers, or between multiple data items and a particular segment of viewers, may be displayed in a user interface that identifies optimal associations between data items and advertisements based on the viewer segments having high degrees of association with the data item.
Description
TECHNICAL FIELD

The present disclosure relates to systems and techniques for accessing one or more databases and aggregating, analyzing, and displaying data in interactive user interfaces. More specifically, the present disclosure relates to systems and techniques for determining optimal associations of data items.


BACKGROUND

Entities, such as content providers, often associate data items of a first type (e.g., advertisements) with available slots in data items of a second type (e.g., a television program). It can be difficult, however, for an entity to determine optimal associations. For example, an entity may obtain data regarding consumption of provided content, such as ratings or other information. The data may further be categorized into segments, such as viewership by men aged 18-25, individuals with a gross annual income between $50,000 and $75,000, and the like. However, the segments in the obtained data may not correspond to segments of interest. For example, a particular advertiser may seek to present an advertisement to individuals who are planning to take a vacation in the near future, individuals who have recently had a child, or other market segments that are not captured in the ratings data.


An entity may also have access to other data sources that provide information regarding the characteristics of individuals or groups. However, there may be thousands of market segments that could potentially be identified from such data sources, and there may be thousands of potential pairings of advertisements and television programs. The level of analysis and data visualization required to identify optimal associations is thus beyond the capabilities of unaided humans and prior art systems.


SUMMARY

The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.


Disclosed herein are various systems and methods for allowing entities to determine and visualize optimal associations of media content (e.g., advertisements) with particular media programming (e.g., television programs, streaming media, online content, etc.). For example, a tool may be provided that identifies specific market segments of consumers (where a “segment” generally represents consumers sharing a particular demographic, behavioral, medical, and/or psychographic attribute, as discussed in more detail below) that are represented in the viewership of a particular television program. A segment may also be related to other attributes, such as shopping behavior via loyalty programs or pharmacy sources. For example, in some embodiments, consumer segments could include groups of consumers that “purchased [brand name] almond milk,” “were diagnosed with athlete's foot,” or “were prescribed [medication name]”.


The tool may determine, for example, that individuals who enjoy outdoor activities (e.g., individuals in an “outdoor activities” segment) and who plan to buy a car in the next six months are watching a particular television program. A segment may be associated with particular media programming, for example, if viewers in that segment watch more minutes of the particular media programming than viewers in other segments. Advertisements relevant to these segments (e.g., advertisements for automotive brands or products, such as vehicles with a roof rack or a large cargo capacity) may thus be optimally associated with available slots on the television program. In some embodiments, the tool may correlate information regarding viewership of television programs (e.g., ratings) with information regarding viewers (e.g., information collected through surveys or other sources) to identify segments of viewership at a high level of detail. The tool may thus provide detailed information regarding viewership that enables optimal associations.


The tool may provide user interfaces that present information regarding associations between content data items and viewer segments. It has been noted that design of computer user interfaces “that are useable and easily learned by humans is a non-trivial problem for software developers.” (Dillon, A. (2003) User Interface Design. MacMillan Encyclopedia of Cognitive Science, Vol. 4, London: MacMillan, 453-458.) The present disclosure describes various embodiments of interactive and dynamic user interfaces that are the result of significant development. This non-trivial development has resulted in the user interfaces described herein which may provide significant cognitive and ergonomic efficiencies and advantages over previous systems. The interactive and dynamic user interfaces include improved human-computer interactions that may provide reduced mental workloads, improved decision-making, reduced work stress, and/or the like, for a user. For example, user interaction with the interactive user interface via the inputs described herein may provide an optimized display of, and interaction with, graph data, image data, and/or other data, and may enable a user to more quickly and accurately access, navigate, assess, and digest the data than previous systems.


Further, the interactive and dynamic user interfaces described herein are enabled by innovations in efficient interactions between the user interfaces and underlying systems and components. For example, disclosed herein are improved methods of receiving user inputs (including methods of interacting with, and selecting, images, graphs, and other types of data), translation and delivery of those inputs to various system components, automatic and dynamic execution of complex processes in response to the input delivery, automatic interaction among various components and processes of the system, and automatic and dynamic updating of the user interfaces (to, for example, display the relevant data from various different applications and/or data sources). The interactions and presentation of data via the interactive user interfaces described herein may accordingly provide cognitive and ergonomic efficiencies and advantages over previous systems.


Various embodiments of the present disclosure provide improvements to various technologies and technological fields. For example, existing data aggregation and analysis technology is limited in various ways (e.g., limited in the types of applications or data sources the data may be drawn from, loss of data interactivity, etc.), and various embodiments of the disclosure provide significant improvements over such technology. Additionally, various embodiments of the present disclosure are inextricably tied to computer technology. In particular, various embodiments rely on detection of user inputs via graphical user interfaces, aggregation of data from different applications and data sources, and automatic processing, formatting, and display of the aggregated data via interactive graphical user interfaces. Such features and others (e.g., automatically determining an application or data source an inputted link is directed to, accessing the application or data source to retrieve and display the requested data, implementing interactivity of displayed data reflective of how the data would be displayed in its native application) are intimately tied to, and enabled by, computer technology, and would not exist except for computer technology. For example, the interactions with displayed data described below in reference to various embodiments cannot reasonably be performed by humans alone, without the computer technology upon which they are implemented. Further, the implementation of the various embodiments of the present disclosure via computer technology enables many of the advantages described herein, including more efficient interaction with, and presentation of, various types of electronic data.





BRIEF DESCRIPTION OF THE DRAWINGS

The following drawings and the associated descriptions are provided to illustrate embodiments of the present disclosure and do not limit the scope of the claims. Aspects and many of the attendant advantages of this disclosure will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:



FIG. 1 is a functional block diagram depicting an example network environment for implementing a segment association system in accordance with aspects of the present disclosure.



FIGS. 2A-2B are pictorial diagrams depicting example item association user interfaces in accordance with aspects of the present disclosure.



FIG. 3A is a flow diagram depicting an example segment association routine that may be implemented by a segment association system in accordance with aspects of the present disclosure.



FIG. 3B is a flow diagram depicting an example segment association display routine that may generate item association user interfaces in accordance with aspects of the present disclosure.



FIG. 4 is a block diagram depicting a general architecture of an example computing device for implementing a segment association system in accordance with aspects of the present disclosure.





DETAILED DESCRIPTION

Overview


As described above, it can be difficult for an entity to determine optimal associations between data items. For example, assigning a particular commercial to an available slot on a radio broadcast, podcast, or streaming media may be a better use of the slot than assigning a different commercial that is less relevant to the viewing audience. However, the entity may not be able to determine which of several media items (e.g., advertisements) are most relevant to viewers of numerous slots. The entity may have access to data, such as ratings, that provide general information about the viewing audience for a program. These data may allow the entity to eliminate assignments that are obviously suboptimal, such as product placements that target young adults in programs that are not popular with that demographic. However, the broad demographic categories provided by such data are not specific enough to allow determination of optimal associations.


The entity may also have access to data regarding individuals or households, such as surveys, activity logs, purchase histories, or other information. Such data may allow the entity to determine that a particular individual would be receptive to a particular advertisement. However, because the audience for mass media typically numbers in the millions, the entity cannot make optimal decisions by targeting individual viewers. Further, an unaided human cannot accurately sift through the sheer volume of data collected at the individual or household level to identify patterns and make optimal assignments.


Accordingly, systems and methods are described herein for providing tools that determine and display optimal assignments of data items. For example, a segment association system can process data regarding individuals or households to identify segments of viewers. Segments of viewers may have common characteristics. For example, individuals who all gave the same answer to a particular survey question (e.g., “do you have a valid passport?” or “have you purchased a mobile phone in the past twelve months?”) may be identified as a segment. Characteristics that define a segment may include, for example, recent activities, planned activities, food or drink preferences, professions, employment statuses, medical conditions, hobbies, political affiliations, or other such characteristics. In some embodiments, segments may be identified based on the answers to multiple survey questions (e.g., individuals whose answers indicated that they work more than 40 hours per week, frequently travel by air, and have not taken a vacation recently may be identified as a “business traveler” segment). In some embodiments, segments may be determined for an individual based on surveys of the individual's household. For example, the answers to a household survey may indicate that the household includes young children. One or more adults residing at the household may thus be identified as members of a “parents of young children” segment. In some embodiments, segments may be identified based on information derived from activities of individuals (e.g., an individual that makes a purchase at a specialty biking store every week may be associated with a bike enthusiast segment). The number of segments that could be identified may number in the tens of thousands, and may involve interdependencies and patterns that would be impossible for an unaided human to discern or properly interpret.
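
As a concrete illustration of this classification step, the following minimal Python sketch assigns viewers to segments from survey answers. The survey fields, answer values, and segment rules are hypothetical examples invented for this sketch, not data or rules from the disclosure.

```python
# Minimal sketch of segment population from survey responses.
# Survey questions, answer values, and segment rules are hypothetical.

surveys = {
    # viewer_id -> {question: answer}
    "viewer-001": {"own_or_rent": "rent", "hours_worked_per_week": 50,
                   "flights_past_year": 12, "vacation_past_year": False},
    "viewer-002": {"own_or_rent": "own", "hours_worked_per_week": 35,
                   "flights_past_year": 0, "vacation_past_year": True},
}

def populate_segments(surveys):
    """Return a mapping of segment name -> set of viewer IDs."""
    segments = {"renters": set(), "business travelers": set()}
    for viewer_id, answers in surveys.items():
        # Single-question segment: answered "rent" to the housing question.
        if answers.get("own_or_rent") == "rent":
            segments["renters"].add(viewer_id)
        # Multi-question segment: long hours, frequent flights, no recent vacation.
        if (answers.get("hours_worked_per_week", 0) > 40
                and answers.get("flights_past_year", 0) >= 6
                and not answers.get("vacation_past_year", True)):
            segments["business travelers"].add(viewer_id)
    return segments

print(populate_segments(surveys))
# {'renters': {'viewer-001'}, 'business travelers': {'viewer-001'}}
```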


The segment association system may further process data regarding the viewing habits of individuals, including at least some of the individuals who have been classified into segments. The segment association system may thus determine a degree of association between various segments and the data items that are viewed. For example, the segment association system may determine a degree of association between a first segment (e.g., people who have purchased a computer within the past twelve months) and each of a plurality of data items (e.g., the first segment could be scored with reference to a television program with a science fiction theme, as well as numerous other data items). The segment association system may further determine another degree of association between a second segment (e.g., people whose hobbies include cooking) and the same plurality of data items. The segment association system may then generate user interfaces that display the degrees of association between various segments and data items, and thereby enable interactive determination of optimal data item assignments. For example, the segment association system may indicate that the “people who have recently purchased a computer” segment has a higher degree of association with the sci-fi program than with a cooking program, and thus an advertisement targeting recent computer buyers would be optimally assigned to an advertising slot on the sci-fi program. The segment association system may identify and display degrees of association that would be unexpected or counterintuitive to an unaided human, and may identify and display degrees of association for segments that have no obvious affinity to any particular genre or category of data items. The segment association system may thereby enable optimal associations that would not otherwise be achieved.
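
To make the assignment decision concrete, a minimal sketch (the segment names, program names, and scores below are invented placeholders) can simply select, for an advertisement's target segment, the data item with the highest stored degree of association.

```python
# Sketch of choosing an advertising slot from precomputed degrees of
# association. The scores and program names are hypothetical.

# (segment, program) -> degree of association, however it was computed
association_scores = {
    ("recent computer buyers", "sci-fi program"): 0.82,
    ("recent computer buyers", "cooking program"): 0.31,
    ("hobby cooks", "sci-fi program"): 0.12,
    ("hobby cooks", "cooking program"): 0.77,
}

def best_slot_for_segment(target_segment, scores):
    """Return the program with the highest degree of association."""
    candidates = {program: score
                  for (segment, program), score in scores.items()
                  if segment == target_segment}
    return max(candidates, key=candidates.get)

print(best_slot_for_segment("recent computer buyers", association_scores))
# sci-fi program
```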


While the present disclosure may use television programs and advertisements as examples, it will be understood that the present disclosure is not limited to any particular medium or method of delivery. For example, data items may include radio broadcasts, webcasts, podcasts, streaming media, social media, and the like, and may be associated with network-based advertisements (e.g., advertisements viewed or heard in video games, social media, mobile applications, etc.), advertisements preceding or following movies, product placements, announcements or displays at sporting events, physical kiosks and displays, and so forth.


Terms

In order to facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.


Data item: An item of content that may be viewed, heard, or otherwise consumed. Data items may include audio content, video content, and/or other content. Examples of data items include television programs, radio programs, commercials, podcasts, webcasts, and the like.


Data Store: Any non-transient computer readable storage medium and/or device (or collection of data storage mediums and/or devices). Examples of data stores include, but are not limited to, optical disks (e.g., CD-ROM, DVD-ROM, etc.), magnetic disks (e.g., hard disks, floppy disks, etc.), memory circuits (e.g., solid state drives, random-access memory (RAM), etc.), and/or the like. Another example of a data store is a hosted storage environment that includes a collection of physical data storage devices that may be remotely accessible and may be rapidly provisioned as needed (commonly referred to as “cloud” storage).


Database: Any data structure (and/or combinations of multiple data structures) for storing and/or organizing data, including, but not limited to, relational databases (e.g., Oracle databases, MySQL databases, etc.), non-relational databases (e.g., NoSQL databases, etc.), in-memory databases, spreadsheets, comma separated values (CSV) files, eXtensible markup language (XML) files, text (TXT) files, flat files, and/or any other widely used or proprietary format for data storage. Databases are typically stored in one or more data stores. Accordingly, each database referred to herein (e.g., in the description herein and/or the figures of the present application) is to be understood as being stored in one or more data stores.


Provider: A provider of one or more data items, such as a television network, internet service provider, broadcaster, podcaster, etc. In some contexts, a provider may be referred to as a “network” or “television network.” It will be understood that such references are provided for purposes of example, and do not limit the present disclosure to a particular type of provider.


Viewer: A consumer of content, including multiple forms of content. Thus, the term “viewer” should be understood (unless the context requires otherwise) as including consumers of audio or other content formats.


Segment: A portion of a set of viewers that shares one or more characteristics or attributes. A segment may be defined in terms of demographic, geographic, psychographic, and/or other characteristics, such as behaviors or activities (past, present, or future), interests, hobbies, or other identifiable patterns.


Timeslot: A time interval associated with a data item, such as the day of the week and time of day at which a network television program is typically broadcast. Unless the context requires otherwise, “timeslot” may be used interchangeably with “data item” to reference content that airs on a particular day and time.


Example Embodiments


FIG. 1 is a block diagram of an example system 100 for optimal association of data items in accordance with aspects of the present disclosure. As shown, the example system 100 includes a client computing device 110, a segment association system 140, and data stores 150, 152, and 154, which may communicate with each other via a network 120.


The client computing device 110 can be any computing device that implements aspects of the present disclosure, and may include one or more software applications such as web browsers, mobile applications, messaging services, or other applications. In some embodiments, multiple client computing devices 110 may communicate with the segment association system 140. In any event, a user or users may interact with the example system 100 through any number of client computing devices 110.


The segment association system 140 can be a computing system configured to make associations between data items (e.g., television programs, streaming media, and the like, as discussed above) and segments of viewers that are of interest to advertisers. For example, the segment association system 140 can be a computer system configured to execute software or a set of programmable instructions that process viewer data, segment data, and viewership data to determine associations, and display the resulting associations in one or more user interfaces. In some embodiments, the segment association system 140 can be implemented using a computer system 400, as shown in FIG. 4 and described below.


The segment association system 140 can include one or more computing devices (e.g., server(s)), memory storing data and/or software instructions (e.g., database(s), memory device(s), etc.), and other known computing components. According to some embodiments, the segment association system 140 can include one or more networked computers that execute processing in parallel or use a distributed computing architecture. The segment association system 140 can be configured to communicate with one or more components of the system 100, and can be configured to securely provide information via an interface(s) accessible by users over a network (e.g., the Internet). For example, the segment association system 140 can include a web server that hosts a web page accessible through network 120. In some embodiments, the segment association system 140 can include an application server configured to provide data to one or more client applications executing on computing systems connected to the segment association system 140 via the network 120.


The data stores 150, 152, and 154 may illustratively be any computer-readable data stores that implement aspects of the present disclosure. For example, the data stores 150, 152, and 154 may be hard disc drives, solid state devices, flash drives, databases, lists, or any other non-transient computer-readable data stores. The viewership data store 150 may store information regarding viewership of particular data items. For example, the viewership data store 150 may store ratings or other data that identify individual viewers and the data items (or portions of data items) that they viewed during a particular time period. The viewer data store 152 may store information regarding individual viewers, such as individual or household survey responses, activity logs, or other data that enables classifying viewers into segments. The segment data store 154 may store information regarding viewer segments, and in some embodiments may store the results of classifying viewers into segments. In some embodiments, one or more of the data stores 150, 152, and 154 may be implemented as a single data store, such as a relational database.
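
Purely as an illustration of what these three data stores might hold, the record types below sketch one possible layout; the field names are assumptions made for this example, not a schema defined by the disclosure.

```python
# Illustrative record types for the viewership, viewer, and segment data
# stores (150, 152, and 154). All field names are assumptions for this sketch.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ViewingRecord:                 # viewership data store 150
    viewer_id: str                   # anonymized viewer identifier
    provider: str                    # e.g., a network or channel
    timeslot: str                    # e.g., "Tuesday 20:00"
    minutes_viewed: int
    period: str                      # e.g., "2018-Q1"

@dataclass
class ViewerRecord:                  # viewer data store 152
    viewer_id: str
    survey_answers: dict             # question -> answer
    household_id: Optional[str] = None

@dataclass
class SegmentRecord:                 # segment data store 154
    name: str                        # human-readable description
    criterion: str                   # e.g., a survey question/answer rule
    members: set = field(default_factory=set)   # viewer IDs in the segment
```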


The example system 100 further includes a network 120, through which the client computing device 110, segment association system 140, and data stores 150, 152, and 154 may communicate. The network 120 may illustratively be any wired or wireless network, including but not limited to a local area network (LAN), wide area network (WAN), Wi-Fi network, Bluetooth network, cellular network, mesh network, the Internet, or other network or networks. In some embodiments, the system 100 may include multiple networks 120. For example, the client computing device 110 and the segment association system 140 may communicate via the Internet, and the segment association system 140 and the data stores 150, 152, and 154 may communicate via a LAN.


It will be understood that FIG. 1 is provided for purposes of example, and that the system 100 may include more, fewer, or different configurations of devices than the example illustrated in FIG. 1. For example, one or more of the data stores 150, 152, and 154 may be implemented as components of the segment association system 140. As a further example, a server, proxy, or other device may serve as an intermediary between the client computing device 110 and the segment association system 140. The present disclosure is thus understood to include many embodiments beyond the example provided in FIG. 1.



FIG. 2A is a diagram of an example user interface 200 for displaying the viewer segments that are optimally associated with a data item in accordance with aspects of the present disclosure. In the illustrated example, the user interface 200 displays a number of selection controls 210, 212, 214, 216, and 218, which enable selection of a particular set of associations. For example, a time period selector 210 may be displayed, enabling selection of a particular time period. In various embodiments, time periods may include years, quarters, months, weeks, days, days of the week, or other periods, and may include irregular or variable time periods (e.g., “Spring Training,” “Regular Season,” “Playoffs,” etc.).


In some embodiments, multiple data sources may be available for ratings or other data. The user interface 200 may thus include a data source selector 212, which enables user selection of one of the data sources. In some embodiments, data sources may be tied to a particular time period. For example, individual or household surveys may be associated with a time period based on the dates upon which these surveys were taken, or viewership data may be associated with the quarter or month in which the viewing took place. In further embodiments, only the data sources that include data from a selected time period may be available for selection in the data source selector 212.


The user interface 200 may further include a content provider selector 214. Content providers may illustratively include any entity that provides data items, such as broadcast television networks, cable networks or channels, terrestrial and/or satellite radio networks, television stations, radio stations, webcasters, podcasters, online content subscription services, movie studios, and the like. In some embodiments, content providers may be organized into hierarchical relationships, such as a sports network, an entertainment network, and a news network that are all owned or operated by the same parent. In other embodiments, content may be organized and selected by content source (e.g., a particular studio or producer).


The user interface 200 may further include a day selector 216 and timeslot selector 218 to enable selection of a particular day and timeslot. In some embodiments, the day selector 216 and timeslot selector 218 may be combined into a single control. In other embodiments, a data item selector may be provided, and day(s) and timeslot(s) may be determined based on the selected data item. For example, a data item or items may be selected that air at irregular times (e.g., a miniseries or a series of sporting events), or a data item may be selected that is associated with multiple days and timeslots. In still further embodiments, a data item may not be associated with any particular day and time, or may only be loosely associated with a particular day and time. For example, a webcast or podcast may be nominally associated with the day and time when it is first made available, but may be viewed or consumed by different individuals at different times. In some embodiments, viewing times for streaming media may be grouped into categories such as “live” (e.g., viewing the streaming media at the time it is first made available), “live+same day,” “live+3 days,” “live+7 days,” and so forth, and a selector may be provided to allow selection of one of these categories.


Although FIG. 2A depicts the selection controls 210, 212, 214, 216, and 218 as drop-down menus, it will be understood that the present disclosure includes any user interface element or combination of elements that enables selection. For example, the user interface 200 may display a grid of timeslots and content providers, and enable selection of content providers and timeslots by selecting cells in the grid. As further examples, the user interface 200 may display checkboxes, radio buttons, lists, buttons, or other controls to enable selection of time periods, data sources, and the like. Additionally, in some embodiments, one or more of the selection controls 210, 212, 214, 216, and 218 may be omitted, combined, or displayed in a separate user interface.


The user interface 200 may further include a number of segments 230A-E that have been identified as having a strong association with the selected data item or timeslot. As described in more detail below with reference to FIG. 3A, a segment association system (such as the segment association system 140 of FIG. 1) may determine degrees of association between data items and segments of viewers. The user interface 200 may in turn display the segments 230A-E having the strongest degrees of association. The segments 230A-E may include human-readable descriptions of the segments, such as “redeemed frequent flyer miles in the past twelve months” or “has a membership at a warehouse club store.” The user interface 200 may further display a chart with bars 232A-E, which may indicate the degree of association between each of the segments 230A-E and the selected data item or timeslot. In some embodiments, the bars 232A-E may be labeled with affinity scores, legends, or other information to indicate the degree of association. In further embodiments, the bars 232A-E may be omitted and affinity scores may be presented in a table format or other format.
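
Behind such a display, the ranking might be computed roughly as in the sketch below, which keeps the five segments with the highest scores and scales the bar lengths to the strongest one. The segment names and scores are hypothetical.

```python
# Sketch of selecting the segments to chart for one data item/timeslot.
# Segment names and scores are hypothetical.

scores_for_timeslot = {
    "redeemed frequent flyer miles in the past twelve months": 4.1,
    "has a membership at a warehouse club store": 3.6,
    "purchased a mobile phone in the past twelve months": 2.9,
    "plans to buy a car in the next six months": 2.2,
    "enjoys outdoor activities": 1.8,
    "went to the supermarket in the past twelve months": 0.4,
}

def top_segments(scores, n=5):
    """Return the n segments with the highest degrees of association,
    plus a bar length normalized to the strongest segment."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n]
    strongest = ranked[0][1] if ranked else 1.0
    return [(name, score, score / strongest) for name, score in ranked]

for name, score, bar in top_segments(scores_for_timeslot):
    print(f"{bar:5.0%}  {score:4.1f}  {name}")
```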


While FIG. 2A depicts the bars 232A-E as a “top five” list of segments having the highest degrees of association with a particular data item or timeslot, it will be understood that the present disclosure is not limited to displaying a particular number of segments or displaying them in a particular order. For example, in some embodiments, the user interface 200 may include a control that enables selection of a segment or segments of interest, and the user interface 200 may then display the degrees of association for the selected segment or segments regardless of which segments have the highest degrees of association with the data item. In some embodiments, selection of a segment may provide additional information regarding the segment, such as demographics or income distributions of viewers in the segment.


Other variations on FIG. 2A are within the scope of the present disclosure. For example, although FIG. 2A depicts a user interface 200 that is displayed via a web browser, the present disclosure includes embodiments in which an interface is displayed by a dedicated application, such as a mobile or desktop application. As further examples, the user interface 200 may include visualizations other than bar charts, such as pie charts, tables, grids, and the like. As a still further example, the user interface 200 may display a graph with networks and time slots as axes, and display degrees of association between a particular segment and the graphed networks and timeslots as colors, shades, or other indicators. The depicted user interface 200 is thus understood to be illustrative and not limiting.



FIG. 2B is a diagram of an example user interface 250 for displaying the data items that are optimally associated with a viewer segment in accordance with aspects of the present disclosure. In the illustrated user interface 250, selection controls 210 and 212 are as previously discussed with reference to FIG. 2A. The user interface 250 further includes segment selection controls 252A-B, which enable selection of a particular segment or segments. In some embodiments, the segment selection controls 252A-B can be used to define a new, custom segment. For example, the segment selection controls 252A-B may enable defining a new segment that is a union or intersection of previously defined segments. The segment selection controls 252A-B may further enable creating a description for the new segment and adding the new segment to a segment data store (such as the segment data store 154 of FIG. 1). In other embodiments, a single segment selection control 252A may be provided and may enable selection of a particular segment.
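
A custom segment built as a union or intersection of existing segments can be sketched with ordinary set operations; the segment names and viewer IDs below are hypothetical.

```python
# Sketch of defining a custom segment as an intersection (or union) of
# previously defined segments. Segment names and viewer IDs are hypothetical.

segments = {
    "plans to buy a car in the next six months": {"v1", "v2", "v5"},
    "enjoys outdoor activities": {"v2", "v3", "v5", "v7"},
}

def combine_segments(segments, names, new_name, how="intersection"):
    """Add a new segment that is the union or intersection of named segments."""
    sets = [segments[n] for n in names]
    members = set.intersection(*sets) if how == "intersection" else set.union(*sets)
    segments[new_name] = members
    return members

print(combine_segments(
    segments,
    ["plans to buy a car in the next six months", "enjoys outdoor activities"],
    "outdoorsy car shoppers"))
# {'v2', 'v5'}  (set ordering may vary)
```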


The user interface 250 further identifies content provider timeslots 270A-E that have a high degree of association with the selected segment. In some embodiments, the user interface 250 may identify the data items that air in the timeslots instead of, or in addition to, identifying the timeslots. As with the user interface 200, the user interface 250 may include bars 272A-E that indicate the degree of association for each of the timeslots 270A-E, and may further include labels, legends, scores or other information regarding the degrees of association. In some embodiments, the user interface 250 may enable selection of data items or timeslots of interest, and may display degrees of association between the data items of interest and a specified segment or segments. In further embodiments, segments and timeslots may be displayed in other formats, such as grids or timelines, to enable analysis of degrees of association across multiple data items, multiple timeslots, multiple providers, and so forth.


Other variations on FIG. 2B are within the scope of the present disclosure. For example, although FIG. 2B depicts a user interface 250 that is displayed via a web browser, the present disclosure includes embodiments in which an interface is displayed by a dedicated application, such as a mobile or desktop application. As further examples, the user interface 250 may include additional information for each segment, such as universe estimates (e.g., the estimated size of this segment within a given population, such as the population of the United States), indices, rating scores, and the like. The depicted user interface 250 is thus understood to be illustrative and not limiting.



FIG. 3A is a flowchart depicting an illustrative routine 300 for determining degrees of association between data items and segments of viewers. The routine 300 may be carried out, for example, by the segment association system 140 depicted in FIG. 1, or various blocks of the routine 300 may be carried out by a component or components of the segment association system 140, such as the segment population engine 142 and the segment association engine 144. Depending on the embodiment, the routine 300 may include more and/or fewer blocks and the blocks may be carried out in an order different than illustrated.


At block 302, first data regarding viewer segments may be obtained. In some embodiments, as described above, all or part of the first data may be generated or aggregated based on data regarding individual viewers or households. For example, a segment may be defined by a criterion, such as “answered ‘rent’ to the survey question ‘do you own or rent your current home?’”, and the segment may be populated by identifying individuals who satisfy the criterion. As a further example, a segment may be generated that represents the union or intersection of other, previously defined segments. In some embodiments, a user interface may be provided for creating custom segments, and information received via the user interface may be used to generate custom segments from individual or household viewer data.


At block 304, second data may be obtained regarding viewings of data items. In some embodiments, the second data may be associated with a time period in which the data items were available for viewing, such as a calendar quarter (e.g., the first quarter of 2018). In further embodiments, the first data may be associated with the same time period or a corresponding time period. The second data may include information that enables mapping of individual viewers in the second data to the individuals who were surveyed to collect the first data. For example, a unique ID number may be assigned to an individual, and may be used to associate the individual's viewing habits with their survey responses. In some embodiments, the first data and the second data may be obtained from the same source, such as a third-party service that monitors individual viewers to generate ratings data. In several embodiments, the first data and the second data may be anonymized such that individuals cannot be personally identified.
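
The mapping between the two data sets can be sketched as a join on the shared, anonymized viewer identifier, accumulating minutes viewed per segment and timeslot; the record layout and values are hypothetical.

```python
# Sketch of joining viewing records to segment membership via a shared,
# anonymized viewer ID, accumulating minutes viewed per (segment, timeslot).
# Record fields and values are hypothetical.
from collections import defaultdict

viewings = [
    # (viewer_id, timeslot, minutes_viewed)
    ("v1", "Network A / Tue 20:00", 42),
    ("v2", "Network A / Tue 20:00", 18),
    ("v2", "Network B / Thu 21:00", 55),
]

segment_members = {
    "plans to buy a car in the next six months": {"v1", "v2"},
    "enjoys outdoor activities": {"v2"},
}

def segment_minutes(viewings, segment_members):
    """Return total minutes viewed keyed by (segment, timeslot)."""
    totals = defaultdict(int)
    for viewer_id, timeslot, minutes in viewings:
        for segment, members in segment_members.items():
            if viewer_id in members:
                totals[(segment, timeslot)] += minutes
    return dict(totals)

print(segment_minutes(viewings, segment_members))
```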


At block 306, an unprocessed data item (that is, a data item that the routine 300 has not previously processed) may be selected. At block 308, in some embodiments, one or more viewer segments associated with the data item may be identified. Illustratively, the first data may contain tens of thousands or even hundreds of thousands of segments, some of which may not be represented in the second data, or which may be represented in insufficient quantity. For example, the number of viewers of a particular program who are in the “have traveled outside the United States in the past twelve months” segment may not be statistically significant. A filtering function or other criterion may thus be used to identify a subset of segments that are well-represented in the second data. In other embodiments, block 308 may be omitted and all available viewer segments may be processed by the routine 300.
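
A simple filtering criterion is sketched below; the threshold of 100 distinct viewers is an arbitrary illustration, not a value taken from the disclosure.

```python
# Sketch of dropping segments that are too thinly represented in the viewing
# data to score meaningfully. The 100-viewer threshold is arbitrary.

def well_represented_segments(segment_members, viewers_in_ratings, min_viewers=100):
    """Keep only segments with at least min_viewers members in the ratings data.

    segment_members: dict of segment name -> set of viewer IDs
    viewers_in_ratings: set of viewer IDs present in the viewership data
    """
    return {
        name: members
        for name, members in segment_members.items()
        if len(members & viewers_in_ratings) >= min_viewers
    }
```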


At block 310, an unprocessed segment may be selected. At block 312, a degree of association may be determined between the selected segment and the selected data item. In some embodiments, the degree of association may be determined using a term frequency—inverse document frequency (“TF-IDF”) function, such as:







segmentMinutes_{p,q} × log(totalMinutes_q / totalSegmentMinutes_q)







In the above function, the term frequency (“TF”) segmentMinutes_{p,q} is the total number of minutes that the selected segment watched the selected data item or timeslot p (which is associated with a particular provider) during a time period q. In some embodiments, q is the time period associated with the second data (e.g., a calendar quarter). The inverse document frequency (“IDF”) is a logarithmic function that includes a numerator totalMinutes_q, which is the total number of minutes of available content during the time period q (regardless of how many people watched it or how often it was watched), and a denominator totalSegmentMinutes_q, which is the total number of minutes that individuals in the segment watched any data item during the time period q. The TF-IDF function thus quantifies the degree of association between the segment and the data item and acts as a weighting function. For example, a segment of “people who have been to the supermarket in the past twelve months” may be large relative to the total population of viewers, and thus the TF of the segment may be relatively high for any given data item. However, the relative size of the segment causes it to have a relatively low IDF, and so the function is only likely to identify a high degree of association between a data item and a large segment if the TF for the data item (that is, the total amount of time this segment spends viewing the data item) is exceptionally high. As a further example, a segment of “people whose hobbies include mountain climbing” may be relatively small compared to the total population, and thus the segment would have a relatively high IDF. However, the segment would have a relatively low TF in light of its relative size. The function thus identifies segments that are large enough to have a significant TF (and thus be large enough to be of interest to advertisers) but also small enough to have a significant IDF (and thus be targetable with specific advertisements). In various embodiments, the degree of association may be determined based on minutes viewed, distinct viewers, or similar criteria.
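
Under the definitions above, the scoring function reduces to a few lines. The sketch below uses the natural logarithm and hypothetical minute counts, since the disclosure fixes neither.

```python
# Sketch of the TF-IDF style degree of association described above.
# segment_minutes: minutes the segment watched timeslot p during period q (TF)
# total_minutes: total minutes of available content during period q
# total_segment_minutes: minutes the segment spent watching anything during q
import math

def degree_of_association(segment_minutes, total_minutes, total_segment_minutes):
    if total_segment_minutes == 0:
        return 0.0
    return segment_minutes * math.log(total_minutes / total_segment_minutes)

# Hypothetical numbers: a mid-sized segment that watches this timeslot heavily.
print(degree_of_association(
    segment_minutes=12_000,          # minutes the segment watched timeslot p
    total_minutes=500_000,           # minutes of content available in period q
    total_segment_minutes=40_000))   # minutes the segment watched overall in q
```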


At decision block 314, a determination may be made as to whether all of the segments identified at block 308 have been processed. If not, then the routine 300 branches to block 310, selects one of the remaining segments, and iterates through blocks 310, 312, and 314 until all of the segments have been processed. The routine 300 then branches to decision block 316, where a determination may be made as to whether all of the data items have been processed. If not, then the routine 300 branches to block 306 and iterates until all of the data items have been processed.


When all of the segments have been processed for all of the data items, the routine 300 branches to block 318, where a user interface may be generated and displayed to indicate the determined degrees of association between segments and data items. The routine 300 may generate, for example, the user interface 200 or the user interface 250 as depicted in FIGS. 2A and 2B. In some embodiments, block 318 may be omitted and the degrees of association may be stored in a data store for later access. For example, the degrees of association may be stored and then subsequently accessed via queries or API calls. In other embodiments, the routine may output a subset of the determined degrees of association for further analysis.



FIG. 3B is a flowchart depicting an illustrative routine 350 for generating user interfaces that display degrees of association between data items and segments of viewers. The routine 350 may be carried out, for example, by the segment association system 140 depicted in FIG. 1, or various blocks of the routine 350 may be carried out by a component or components of the segment association system 140, such as the user interface generator 146. Depending on the embodiment, the routine 350 may include more and/or fewer blocks and the blocks may be carried out in an order different than illustrated.


At block 352, data sources may be obtained and timeframes that have at least one available data source may be displayed. The timeframes may, for example, be calendar quarters, years, or other time periods as described above. At block 354, a user selection of a timeframe may be received.


At block 356, the available data sources for the selected timeframe may be obtained. A data source may be associated with a particular timeframe, as described above, or in some embodiments may be associated with multiple timeframes or common to all timeframes. In some embodiments, if more than one data source is available for the selected timeframe, the available data sources may be displayed. In these embodiments, at block 358, one or more of the available data sources may be selected.


At block 360, providers and timeslots for which data is available from the selected data source(s) may be displayed. For example, a list of television networks and corresponding programming schedules may be displayed. At block 362, user selection of a particular provider and timeslot may be received. In some embodiments, only a provider may be selected. In other embodiments, only a timeslot may be selected. For example, a user may request degrees of association for viewer segments for all programming on a particular network, or for a particular day and time across all networks.


At block 364, degrees of association between viewer segments and the selected provider(s) and timeslot(s) may be obtained and displayed. In some embodiments, the degrees of association may be displayed interactively, such that the user may select a particular degree of association and receive more information regarding the segment, the provider, the timeslot, or other information. The information displayed at block 364 may thus enable optimal assignments of advertisements to available advertising spots in data items, as described above.
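
The selection flow of routine 350 can be sketched roughly as follows; the data source catalog, timeframes, providers, and stored scores are hypothetical placeholders for whatever the system actually holds.

```python
# Rough sketch of the routine 350 selection flow: pick a timeframe, find the
# data sources available for it, then look up stored degrees of association
# for a chosen provider and timeslot. All names and values are hypothetical.

data_sources = {
    # data source -> timeframes it covers
    "panel ratings": {"2018-Q1", "2018-Q2"},
    "household surveys": {"2018-Q1"},
}

stored_scores = {
    # (data source, provider, timeslot, segment) -> degree of association
    ("panel ratings", "Network A", "Tue 20:00", "business travelers"): 3.4,
    ("panel ratings", "Network A", "Tue 20:00", "renters"): 1.1,
}

def sources_for_timeframe(timeframe):
    return [name for name, frames in data_sources.items() if timeframe in frames]

def associations_for_slot(source, provider, timeslot):
    rows = [(segment, score)
            for (src, prov, slot, segment), score in stored_scores.items()
            if (src, prov, slot) == (source, provider, timeslot)]
    return sorted(rows, key=lambda r: r[1], reverse=True)

print(sources_for_timeframe("2018-Q1"))
print(associations_for_slot("panel ratings", "Network A", "Tue 20:00"))
```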


Additional Implementation Details and Embodiments

Various embodiments of the present disclosure may be a system, a method, and/or a computer program product at any possible technical detail level of integration. The computer program product may include a computer readable storage medium (or mediums) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.


For example, the functionality described herein may be performed as software instructions are executed by, and/or in response to software instructions being executed by, one or more hardware processors and/or any other suitable computing devices. The software instructions and/or other executable code may be read from a computer readable storage medium (or mediums).


The computer readable storage medium can be a tangible device that can retain and store data and/or instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device (including any volatile and/or non-volatile electronic storage devices), a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a solid state drive, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.


Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.


Computer readable program instructions (also referred to herein as, for example, “code,” “instructions,” “module,” “application,” “software application,” and/or the like) for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages. Computer readable program instructions may be callable from other instructions or from themselves, and/or may be invoked in response to detected events or interrupts. Computer readable program instructions configured for execution on computing devices may be provided on a computer readable storage medium, and/or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution) that may then be stored on a computer readable storage medium. Such computer readable program instructions may be stored, partially or fully, on a memory device (e.g., a computer readable storage medium) of the executing computing device, for execution by the computing device. The computer readable program instructions may execute entirely on a user's computer (e.g., the executing computing device), partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.


Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.


These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart(s) and/or block diagram(s) block or blocks.


The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer may load the instructions and/or modules into its dynamic memory and send the instructions over a telephone, cable, or optical line using a modem. A modem local to a server computing system may receive the data on the telephone/cable/optical line and use a converter device including the appropriate circuitry to place the data on a bus. The bus may carry the data to a memory, from which a processor may retrieve and execute the instructions. The instructions received by the memory may optionally be stored on a storage device (e.g., a solid state drive) either before or after execution by the computer processor.


The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In addition, certain blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate.


It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions. For example, any of the processes, methods, algorithms, elements, blocks, applications, or other functionality (or portions of functionality) described in the preceding sections may be embodied in, and/or fully or partially automated via, electronic hardware such as application-specific processors (e.g., application-specific integrated circuits (ASICs)), programmable processors (e.g., field programmable gate arrays (FPGAs)), application-specific circuitry, and/or the like (any of which may also combine custom hard-wired logic, logic circuits, ASICs, FPGAs, etc. with custom programming/execution of software instructions to accomplish the techniques).


Any of the above-mentioned processors, and/or devices incorporating any of the above-mentioned processors, may be referred to herein as, for example, “computers,” “computer devices,” “computing devices,” “hardware computing devices,” “hardware processors,” “processing units,” and/or the like. Computing devices of the above embodiments may generally (but not necessarily) be controlled and/or coordinated by operating system software, such as Mac OS, iOS, Android, Chrome OS, Windows OS (e.g., Windows XP, Windows Vista, Windows 7, Windows 8, Windows 10, Windows Server, etc.), Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other suitable operating systems. In other embodiments, the computing devices may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.


For example, FIG. 4 is a block diagram that illustrates a computer system 400 upon which various embodiments may be implemented. Computer system 400 includes a bus 402 or other communication mechanism for communicating information, and a hardware processor 404, or multiple processors 404, coupled with bus 402 for processing information. Hardware processor(s) 404 may be, for example, one or more general purpose microprocessors.


Computer system 400 also includes a main memory 406, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 402 for storing information and instructions to be executed by processor 404. Main memory 406 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 404. Such instructions, when stored in storage media accessible to processor 404, render computer system 400 into a special-purpose machine that is customized to perform the operations specified in the instructions.


Computer system 400 further includes a read only memory (ROM) 408 or other static storage device coupled to bus 402 for storing static information and instructions for processor 404. A storage device 410, such as a magnetic disk, optical disk, solid state drive, USB thumb drive (flash drive), etc., is provided and coupled to bus 402 for storing information and instructions.


Computer system 400 may be coupled via bus 402 to a display 412, such as a cathode ray tube (CRT) or LCD display (or touchscreen), for displaying information to a computer user. An input device 414, including alphanumeric and other keys, is coupled to bus 402 for communicating information and command selections to processor 404. Another type of user input device is cursor control 416, such as a mouse, trackball, trackpad, or cursor direction keys for communicating direction information and command selections to processor 404 and for controlling cursor movement on display 412. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touchscreen without a cursor.


Computer system 400 may include a user interface module to implement a GUI that may be stored in a mass storage device as computer executable program instructions that are executed by the computing device(s). Computer system 400 may further, as described below, implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 400 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 400 in response to processor(s) 404 executing one or more sequences of one or more computer readable program instructions contained in main memory 406. Such instructions may be read into main memory 406 from another storage medium, such as storage device 410. Execution of the sequences of instructions contained in main memory 406 causes processor(s) 404 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.


Various forms of computer readable storage media may be involved in carrying one or more sequences of one or more computer readable program instructions to processor 404 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 400 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 402. Bus 402 carries the data to main memory 406, from which processor 404 retrieves and executes the instructions. The instructions received by main memory 406 may optionally be stored on storage device 410 either before or after execution by processor 404.


Computer system 400 also includes a communication interface 418 coupled to bus 402. Communication interface 418 provides a two-way data communication coupling to a network link 420 that is connected to a local network 422. For example, communication interface 418 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 418 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 418 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.


Network link 420 typically provides data communication through one or more networks to other data devices. For example, network link 420 may provide a connection through local network 422 to a host computer 424 or to data equipment operated by an Internet Service Provider (ISP) 426. ISP 426 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 428. Local network 422 and Internet 428 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 420 and through communication interface 418, which carry the digital data to and from computer system 400, are example forms of transmission media.


Computer system 400 can send messages and receive data, including program code, through the network(s), network link 420 and communication interface 418. In the Internet example, a server 430 might transmit a requested code for an application program through Internet 428, ISP 426, local network 422 and communication interface 418.
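As a hedged, non-limiting sketch (the URL and payload here are hypothetical and not part of this disclosure), the following shows one way a computer system might request and receive data through a network connection of the kind described above:

```python
# Minimal sketch only: retrieving data (for example, user interface data or
# program code) over a network, analogous to receiving data via a
# communication interface. The endpoint below is hypothetical.
from urllib.request import urlopen

with urlopen("http://localhost:8000/") as response:  # hypothetical server
    payload = response.read().decode("utf-8")

print(payload)
```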


The received code may be executed by processor 404 as it is received, and/or stored in storage device 410 or other non-volatile storage for later execution.


As described above, in various embodiments certain functionality may be accessible by a user through a web-based viewer (such as a web browser) or other suitable software program. In such implementations, the user interface may be generated by a server computing system and transmitted to a web browser of the user (e.g., running on the user's computing system). Alternatively, data (e.g., user interface data) necessary for generating the user interface may be provided by the server computing system to the browser, where the user interface may be generated (e.g., the user interface data may be executed by a browser accessing a web service and may be configured to render the user interfaces based on the user interface data). The user may then interact with the user interface through the web browser. User interfaces of certain implementations may be accessible through one or more dedicated software applications. In certain embodiments, one or more of the computing devices and/or systems of the disclosure may include mobile computing devices, and user interfaces may be accessible through such mobile computing devices (for example, smartphones and/or tablets).
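For illustration only, and assuming a hypothetical JSON payload of segment-to-program association scores (the endpoint, field names, and values are invented here and are not the disclosed implementation), a server computing system might provide user interface data that a browser-based viewer renders client-side:

```python
# Minimal sketch only: a server returning user interface data as JSON.
# The data item, segment names, and association scores are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

UI_DATA = {
    "program": "Example Program",  # hypothetical data item
    "segments": [
        {"name": "planning a vacation", "association": 0.82},
        {"name": "recently had a child", "association": 0.41},
    ],
}


class UIDataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the user interface data; a browser-side script could render
        # the degrees of association into the described user interface.
        body = json.dumps(UI_DATA).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), UIDataHandler).serve_forever()
```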


Many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the systems and methods can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the systems and methods should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the systems and methods with which that terminology is associated.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


The term “substantially” when used in conjunction with the term “real-time” forms a phrase that will be readily understood by a person of ordinary skill in the art. For example, it is readily understood that such language will include speeds at which no or little delay or waiting is discernible, or where any such delay is sufficiently short so as not to be disruptive, irritating, or otherwise vexing to a user.


Conjunctive language such as the phrase “at least one of X, Y, and Z,” or “at least one of X, Y, or Z,” unless specifically stated otherwise, is to be understood with the context as used in general to convey that an item, term, etc. may be either X, Y, or Z, or a combination thereof. For example, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.


The term “a” as used herein should be given an inclusive rather than exclusive interpretation. For example, unless specifically noted, the term “a” should not be understood to mean “exactly one” or “one and only one”; instead, the term “a” means “one or more” or “at least one,” whether used in the claims or elsewhere in the specification and regardless of uses of quantifiers such as “at least one,” “one or more,” or “a plurality” elsewhere in the claims or specification.


The term “comprising” as used herein should be given an inclusive rather than exclusive interpretation. For example, a general purpose computer comprising one or more processors should not be interpreted as excluding other computer components, and may possibly include such components as memory, input/output devices, and/or network interfaces, among others.


While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it may be understood that various omissions, substitutions, and changes in the form and details of the devices or processes illustrated may be made without departing from the spirit of the disclosure. As may be recognized, certain embodiments of the inventions described herein may be embodied within a form that does not provide all of the features and benefits set forth herein, as some features may be used or practiced separately from others. The scope of certain inventions disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. A system comprising: a data store configured to store computer-executable instructions; and a processor in communication with the data store, wherein the computer-executable instructions, when executed by the processor, configure the processor to perform operations including: accessing first data identifying a plurality of segments of viewers, wherein each segment of viewers in the plurality of segments of viewers corresponds to a characteristic associated with each viewer in the segment; accessing second data identifying a plurality of viewers of television programs, wherein individual viewers in the plurality of viewers viewed at least a portion of a television program that aired on a specified network at a specified date and time; determining, based at least in part on the plurality of segments of viewers and the plurality of viewers of television programs, a degree of association between the characteristic associated with a first segment of viewers and individual television programs; generating a user interface for display, wherein the user interface enables selection of one or more characteristics associated with viewers, and wherein the user interface includes the degree of association between the selected one or more characteristics and one or more television programs; and causing display of the user interface.
  • 2. The system of claim 1, wherein the operations further comprise identifying the first segment of viewers based at least in part on an analysis of the second data.
  • 3. The system of claim 1, wherein the operations further comprise determining, based at least in part on the plurality of segments of viewers and the plurality of viewers of television programs, a degree of association between a second segment of viewers and the television program.
  • 4. The system of claim 1, wherein the user interface indicates, for each segment of the plurality of segments of viewers, a respective degree of association between the segment of viewers and the television program.
  • 5. The system of claim 1, wherein the degree of association includes one or more of a numerical score, a probability, or a category.
  • 6. The system of claim 1, wherein the characteristic associated with the first segment of viewers comprises one or more of a recent activity, planned activity, food or drink preference, profession, employment status, medical condition, hobby, or political affiliation.
  • 7. A computer-implemented method comprising: obtaining first data identifying a plurality of segments of viewers, wherein each segment in the plurality of segments of viewers corresponds to a characteristic associated with each viewer in the segment; obtaining second data identifying a plurality of viewers of data items, wherein individual viewers in the plurality of viewers viewed at least a portion of a data item; determining, based at least in part on the plurality of segments of viewers and the plurality of viewers of data items, a degree of association between the characteristic associated with a first segment of viewers and the data item; generating a user interface for display, wherein the user interface enables selection of one or more characteristics associated with viewers, and wherein the user interface includes the degree of association between the selected one or more characteristics associated with the first segment of viewers and the data item; and causing display of the user interface.
  • 8. The computer-implemented method of claim 7 further comprising: obtaining a plurality of common characteristics of viewers; and identifying, based at least in part on the plurality of common characteristics of viewers, the first segment of viewers, wherein the user interface indicates degrees of association between the first segment of viewers and individual data items of a plurality of data items.
  • 9. The computer-implemented method of claim 7 further comprising determining that individual viewers of the plurality of viewers are associated with individual segments of the plurality of segments.
  • 10. The computer-implemented method of claim 7 further comprising: obtaining information identifying a network, a time of day, and a day of the week; and determining the data item based at least in part on the information.
  • 11. The computer-implemented method of claim 7, wherein the data item corresponds to one or more of a television program, streaming media, online content, time of day, day of the week, or programming network.
  • 12. The computer-implemented method of claim 7, wherein the data item corresponds to a plurality of dates and times.
  • 13. The computer-implemented method of claim 7 further comprising generating the first data from one or more data sources.
  • 14. The computer-implemented method of claim 13, wherein the one or more data sources include one or more of individual viewer surveys or household surveys.
  • 15. A non-transitory computer-readable medium storing computer-executable instructions that, when executed by a processor, configure the processor to perform operations including: determining, based at least in part on first data identifying a plurality of segments of viewers and second data identifying a plurality of viewers of data items, a degree of association between characteristics associated with each viewer in individual segments of the plurality of segments of viewers and a first data item of a plurality of data items; generating a first user interface for display, wherein the user interface enables selection of one or more characteristics associated with viewers, and wherein the first user interface includes a degree of association between the selected one or more characteristics associated with each viewer in the individual segments of the plurality of segments of viewers and the first data item of the plurality of data items; and causing display of the first user interface.
  • 16. The non-transitory computer-readable medium of claim 15, the operations further comprising: generating a second user interface for display, the second user interface indicating a degree of association between a first segment of viewers and individual data items of the plurality of data items; and causing display of the second user interface.
  • 17. The non-transitory computer-readable medium of claim 15, the operations further comprising determining one or more associations between individual segments in the first data and individual viewers in the second data.
  • 18. The non-transitory computer-readable medium of claim 15, wherein the first user interface displays degrees of association between individual segments of viewers and the first data item as a graph, and wherein the graph enables visual comparison of the degrees of association between the individual segments and the first data item.
  • 19. The non-transitory computer-readable medium of claim 15, wherein the first user interface displays one or more characteristics associated with a first segment of viewers.
  • 20. The non-transitory computer-readable medium of claim 15, wherein the first data identifies characteristics of individual viewers of the plurality of viewers of data items.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application No. 62/679,614, filed Jun. 1, 2018, which is hereby incorporated by reference in its entirety.
